Co-founder and co-director of UCLA’s Center for Critical Internet Inquiry shares expertise in episode on racial bias in search engines.
UCLA Associate Professor of Information Studies Safiya U. Noble was recently featured in “Search Engine Breakdown,” an episode of the PBS series NOVA. Noble, author of “Algorithms of Oppression: How Search Engines Enforce Racism,” shares her expertise in the episode alongside Latanya Sweeney, Daniel Paul Professor of the Practice of Government and Technology at the Harvard Kennedy School.
Noble, who is the co-founder and co-director of the UCLA Center for Critical Internet Inquiry, discussed how search engines categorize and classify information, reflecting and reinforcing racial, sexual, and other biases among their users.
“A lot of the content that comes back to us on the internet is in a cultural context of ranking,” said Noble. “We know very early what it means to be number one, so ranking logic signals to us that the classification is accurate, from one being the best to whatever is on page 48 of search, which nobody ever looks at.
“Part of what it’s doing is picking up signals from things that we’ve clicked on in the past, that a lot of other people have clicked on, things that are popular. So, an algorithm is, in essence, a decision tree: if these conditions are present, then this decision should be made. And the decision tree gets automated, so that it becomes like a sorting mechanism.”
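The decision-tree idea Noble describes can be illustrated with a minimal sketch. The signal names, weights, and data below are purely hypothetical assumptions for illustration, not any real search engine’s logic: simple if/then conditions turn signals (past clicks, general popularity) into a score, and sorting by that score automates the decisions into a ranking.

```python
# Illustrative sketch of an algorithm as an automated decision tree.
# All signals and weights are invented for this example.

def score(result):
    """Apply simple if/then conditions to produce a ranking score."""
    s = 0.0
    if result["clicked_before"]:  # condition: the user clicked this in the past
        s += 2.0
    if result["popular"]:         # condition: many other users clicked it
        s += 1.0
    s += result["click_rate"]     # continuous popularity signal, 0..1
    return s

def rank(results):
    """Automate the decisions: sort every result by its score, best first."""
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "a.example", "clicked_before": False, "popular": False, "click_rate": 0.9},
    {"url": "b.example", "clicked_before": True,  "popular": True,  "click_rate": 0.1},
    {"url": "c.example", "clicked_before": False, "popular": True,  "click_rate": 0.5},
]

ranking = [r["url"] for r in rank(results)]
print(ranking)  # b.example ranks first: past clicks and popularity outweigh raw click rate
```

The point of the sketch is Noble’s: once the conditions are fixed and automated, the sorting mechanism runs the same way on everything, whatever biases its signals carry.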
Professor Noble noted, “What we lose, with our hyper-reliance upon search technologies and social media, is that the criteria for surfacing what’s most important can be deeply, highly manipulated.”
“I am really interested in solutions,” she said. “It’s easy to talk about the problems, and it’s painful, also, to talk about the problems. But that pain and that struggle should lead us to thinking about alternatives. Those are the kind of things that I like to talk to other information professionals and researchers and librarians about. I think what I feel most hopeful about is that there’s this new cottage industry called ‘ethical A.I.,’ and I know that our work is profoundly tied to that. But on another level, I feel like these predictive technologies are so much more ubiquitous than they were 10 years ago.”
“Some questions cannot be answered instantly,” said Professor Noble. “Some issues we’re dealing with in society, we need time and we need discussion. How can we look for new logics and new metaphors and new ways to get a bigger picture? Maybe we can see, when we do that query, that ‘that’s just nothing but propaganda,’ and we can even see the sources of the disinformation farms; maybe we can see the financial backers. There’s a lot of ways that we can reimagine our information landscape. So, I do feel like there is some hope.”
To watch the episode, “Search Engine Breakdown,” with Professor Safiya U. Noble, visit the NOVA website.
Photo by John Davis