Algorithms of Oppression: Dr. Safiya Noble Discusses Racism in Google Search Results



CW: mentions of white supremacy and hate crimes

A few years ago, if you searched “three white teenagers” and “three black teenagers” on Google Images, you’d get wildly different results. The results for white teenagers looked straight out of High School Musical: brightly lit photos of smiling teens looking directly at the camera while holding miscellaneous pieces of sports equipment. Replace the word “white” with “black,” however, and the results were flooded with mugshots of black teenagers.

The difference in search results for black and white teenagers was one of many examples of how search engines reinforce racism that Dr. Safiya Noble offered last Thursday in her talk “Algorithms of Oppression,” delivered to a packed Vollum Lecture Hall as part of Black Celebration Month. Noble, an Associate Professor of Information Studies and African American Studies at UCLA, works primarily at the intersection of race, gender, and technology. She was introduced by her friend (and Reed College Archivist) Tracy Drake.

Noble began her talk by painting a picture of the early days of internet search. She described it fondly as a bit of a confused mess, not easily navigable and lacking clear structure. Greater organization was necessary, and as search engines became more advanced, people initially saw them as fairly benign: they were operated by “just math.” Over the past decade, however, technology systems have come under increasing scrutiny. People began to question algorithmic decision-making tools and to recognize that they carry biases that often result in harm to already-marginalized communities.

Noble pointed to several cases where Google’s algorithm has yielded disturbing results. For example, a search for the term “black girls” on Google returned a first page of results flooded with links to porn sites and other primarily sexual content, whereas a search for “white girls” delivered non-sexual content on the first page. A search for “three black teenagers” brought up mugshots, whereas “three white teenagers” showed benign stock photos. It became increasingly clear that these instances, which had individually been explained away as unhappy coincidences or unlikely glitches, together created a strong narrative of how search engines contribute to and encourage negative and harmful stereotypes about women and minorities, especially brown and black people. As these issues were exposed, Google took note.

The search results for black and white teenagers were modified so that they no longer appear so distinct from one another. Among the photos of white student athletes, an image of a white man guilty of a hate crime now appears, escorted by police. Under a search for “three black teenagers,” on the other hand, an upbeat photo of a group of friends smiling at the camera appears among the mugshots. Despite the controversy and the seemingly minimal effort to fix the problem, Google never issued an official apology for this particular search. At other times, though, Google has made statements.

Noble then described the kind of apology Google typically gives when people draw attention to how its algorithm portrays brown and black people: generally a few sentences apologizing for “offense this may have caused.” Noble argued that the phrasing of these apologies serves to shift blame from the company to the users of the service. It implies that the company is not responsible for what surfaces because the algorithm simply reflects the collective social biases of its user base.

The debate about whether hosting sites should be held accountable for the information that flows through them has no end in sight. The issue goes deeper with Google, however: users don’t think of it as a typical advertising company, but rather as a public resource. Noble argued that Google has fundamentally rewritten the way content is curated. Instead of experts and librarians serving as the primary curators, the power now resides with the algorithm.

Google’s best-known algorithm is PageRank, which scores a web page by the number and quality of links pointing to it: a link from a highly ranked page counts for more than one from an obscure page. The problem is that Google doesn’t exclusively use the textbook version of PageRank. Because it is in part an advertising company, it has shown a propensity to prioritize its own monetary interests, such as promoting content from its subsidiary YouTube over content from other video streaming platforms.
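For readers curious about the mechanics, here is a minimal sketch of the textbook PageRank power iteration, run on a hypothetical four-page web. This is the published academic formulation, not Google’s production ranking system, which layers many proprietary signals (including the commercial ones Noble describes) on top of it.

```python
# A minimal sketch of textbook PageRank (power iteration).
# Not Google's production ranker, which adds many proprietary signals.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with a uniform score

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:  # a dangling page spreads its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:  # otherwise, a page splits its score among its outlinks
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Hypothetical four-page web: "c" receives the most (and best) links,
# so it ends up with the highest score.
web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
print(pagerank(web))
```

Even in this toy version, a page’s score depends entirely on who links to it, which is exactly the property that lets densely interlinked communities, including hate groups, rise to the top for the phrases they saturate.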

The issue is that users treat Google’s ordering logic like a vetting service. “It was the first thing that came up when I searched” can be used to mean “it’s probably true” or “it’s been verified.” When believability depends on a site’s place in line, order is important. If the truth is out there, but it’s on page seven, would it even be believed?

At the beginning of his manifesto, Dylann Roof, the Charleston church shooter, wrote, “this [reading about the murder of Trayvon Martin] prompted me to type in the words ‘black on White crime’ into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages and pages of these brutal black on White murders.” Although Noble pointed to this as an extreme example, she stressed that what Roof should have found was crime statistics from a reliable organization; instead, because the phrase “black on white crime” is frequently used by white supremacist groups, their sites were likely among the first links that came up in his search.

During her talk, Noble also touched on how Google’s search algorithm could sway undecided voters who turn to Google when deciding how to cast their votes in an election. At another point, she picked apart the allure of big data, noting that predictive models built from historical data have the potential to project past biases and inequalities into the future. After her talk concluded, a reception was held in the library lobby.

After the reception, curious to see the search results for myself, I googled “three white teenagers.” The results were strange. They showed neither the cheerful teenagers from before the controversy nor the cover-up amalgamation that immediately followed the public outcry. Instead, Google Images was full of screenshots of what Google Images used to look like: the top results showed side-by-side comparisons of the images that used to appear for black and white teenagers. It seems good that these images, which expose how Google’s algorithm reinforced racism, are the first thing to pop up, but it also makes a darker point: Google wields an enormous amount of control over its own image, and it gets to tell its own story.

Luckily, there are other people telling the story of Google, too. Noble’s book Algorithms of Oppression dives deeper into the ways Google harms people of color and women, and it would be an insightful resource for anyone curious about the hidden biases within the technology we use every day.
