Algorithms of Oppression

Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.

Background
Noble earned an undergraduate degree in sociology from California State University, Fresno in the 1990s, then worked in advertising and marketing for fifteen years before going to the University of Illinois Urbana-Champaign for a Master of Library and Information Science degree in the early 2000s. The book's first inspiration came in 2011, when Noble Googled the phrase "black girls" and saw results for pornography on the first page. Noble's doctoral thesis, completed in 2012, was titled "Searching for Black girls: Old traditions in new media." At this time, Noble thought of the title "Algorithms of Oppression" for the eventual book. By then, changes to Google's algorithm had altered the most common results for a search of "black girls," though the underlying biases remained influential. Noble became an assistant professor at the University of California, Los Angeles in 2014. In 2017, she published an article on racist and sexist bias in search engines in The Chronicle of Higher Education. The book was published on February 20, 2018.

Overview
Algorithms of Oppression is based on over six years of academic research on Google search algorithms, examining search results from 2009 to 2015. The book addresses the relationship between search engines and discriminatory biases. Noble argues that search algorithms are racist and perpetuate societal problems because they reflect the negative biases that exist in society and in the people who create them. Noble dismantles the idea that search engines are inherently neutral by explaining how algorithms in search engines privilege whiteness, returning positive cues when keywords like “white” are searched as opposed to “asian,” “hispanic,” or “Black.” Her main example contrasts the search results for "Black girls" and "white girls" and the biases depicted in those results. These algorithms can carry negative biases against women of color and other marginalized populations, while also affecting Internet users in general by leading to "racial and gender profiling, misrepresentation, and even economic redlining." The book argues that algorithms perpetuate oppression and discriminate against People of Color, specifically women of color.

Noble takes a Black intersectional feminist approach to her work, studying how Google's algorithms affect people differently by race and gender. Intersectional feminism takes into account the diverse experiences of women of different races and sexualities when discussing their oppression in society, and how their distinct backgrounds affect their struggles. Additionally, Noble's argument addresses how racism infiltrates the Google algorithm itself, something that holds true across many coding systems, including facial recognition and medical care programs. While many new technological systems promote themselves as progressive and unbiased, Noble argues against this, saying that many technologies, including Google's algorithm, "reflect and reproduce existing inequities."

Chapter 1
In Chapter 1 of Algorithms of Oppression, Safiya Noble explores how Google search's autosuggestion feature is demoralizing. On September 18, 2011, a mother googled “black girls” attempting to find fun activities to show her stepdaughter and nieces. To her surprise, the results consisted of websites and images of porn. This result exemplifies the data failures specific to people of color and women, which Noble terms algorithmic oppression. Noble adds that, as a society, we must adopt a feminist lens with racial awareness to understand the “problematic positions about the benign instrumentality of technologies.”

Noble also discusses how Google can remove human curation from the first page of results to eliminate potential racial slurs or inappropriate imagery. Another example discussed in this text is a public dispute over the results that were returned when “Jew” was searched on Google. The results included a number of anti-Semitic pages, and Google claimed little responsibility for the way it presented these identities. Google instead encouraged people to search for “Jews” or “Jewish people” and claimed the actions of White supremacist groups were out of its control. Unless pages are unlawful, Google allows its algorithm to continue operating without removing them.

Noble reflects on AdWords, Google's advertising tool, and how it can add to the biases on Google. AdWords allows anyone to advertise on Google's search pages and is highly customizable. Google first ranks ads on relevance and then displays them on pages it believes are relevant to the search query taking place. An advertiser can also set a maximum amount of money to spend on advertising per day. The more an advertiser spends, the more likely its ad will appear near the top of the results. Therefore, an advertiser passionate about a controversial topic can make it the first result to appear in a Google search.

Chapter 2
In Chapter 2 of Algorithms of Oppression, Noble explains how Google has exacerbated racism and how the company continues to deny responsibility for it. Google puts the blame on those who have created the content, as well as on those who are actively seeking this information. Google's algorithm has maintained social inequalities and stereotypes for Black, Latina, and Asian women, due in part to Google's design and infrastructure, which normalizes whiteness and men. She explains that the Google algorithm categorizes information in ways that exacerbate stereotypes while also encouraging white hegemonic norms. Noble found that after searching for "black girls," the first results were common stereotypes of Black girls, or the categories that Google created based on its own idea of a Black girl. Google hides behind an algorithm that has been shown to perpetuate inequalities.

Chapter 3
In Chapter 3 of Algorithms of Oppression, Safiya Noble discusses how Google's search engine combines multiple sources to create threatening narratives about minorities. She presents a case study in which she searched “black on white crimes” on Google. Noble highlights that the sources and information found after the search pointed to conservative sources that skewed information. These sources displayed racist and anti-Black information from white supremacist sources. Ultimately, she argues that this readily available, false information fueled the actions of white supremacist Dylann Roof, who committed a massacre in 2015.

Chapter 4
In Chapter 4 of Algorithms of Oppression, Noble furthers her argument by discussing the way in which Google exerts oppressive control over identity. This chapter highlights multiple examples of women being shamed due to their activity in the porn industry, regardless of whether it was consensual. She critiques the internet's ability to influence one's future due to its permanent nature and compares U.S. privacy laws to those of the European Union, which provides citizens with “the right to forget or be forgotten.” When people use search engines such as Google, these breaches of privacy disproportionately affect women and people of color. Google claims that it safeguards user data in order to protect users from losing their information, but it fails to address what happens when users want their data deleted.

Chapter 5
In Chapter 5 of Algorithms of Oppression, Noble moves the discussion away from Google and onto other information sources deemed credible and neutral. Noble says that prominent libraries, including the Library of Congress, encourage whiteness, heteronormativity, patriarchy, and other societal standards as correct, treating alternatives as problematic. She illustrates this problem with a case involving Dartmouth College and the Library of Congress, in which the "student-led organization the Coalition for Immigration Reform, Equality (CoFired) and DREAMers" engaged in a two-year battle to change the Library's terminology from 'illegal aliens' to 'noncitizen' or 'unauthorised immigrants.' Noble then discusses the problems that ensue from misrepresentation and classification, which allows her to underscore the importance of contextualisation. Noble argues that it is not just Google but all digital search engines that reinforce societal structures and discriminatory biases, and in doing so she points out just how interconnected technology and society are.

Chapter 6
In Chapter 6 of Algorithms of Oppression, Safiya Noble discusses possible solutions for the problem of algorithmic bias. She first argues that public policies enacted by local and federal governments will reduce Google's “information monopoly” and regulate the ways in which search engines filter their results. She insists that governments and corporations bear the most responsibility to reform the systemic issues leading to algorithmic bias.

Simultaneously, Noble condemns the common neoliberal argument that algorithmic biases will disappear if more women and racial minorities enter the industry as software engineers. She calls this argument “complacent” because it places responsibility on individuals, who have less power than media companies, and indulges a mindset she calls “big-data optimism”: a failure to recognize that institutions themselves do not always solve, and sometimes perpetuate, inequalities. To illustrate this point, she uses the example of Kandis, a Black hairdresser whose business faces setbacks because the review site Yelp has used biased advertising practices and search strategies against her.

She closes the chapter by calling upon the Federal Communications Commission (FCC) and the Federal Trade Commission (FTC) to “regulate decency,” or to limit the amount of racist, homophobic, or prejudiced rhetoric on the Internet. She urges the public to shy away from “colorblind” ideologies toward race because they have historically erased the struggles faced by racial minorities. Lastly, she points out that big-data optimism leaves out discussion of the harms that big data can disproportionately enact upon minority communities.

Conclusion
In Algorithms of Oppression, Safiya Noble explores the social and political implications of the results of our Google searches and of our search patterns online. Noble challenges the idea of the internet as a fully democratic or post-racial environment. Each chapter examines a different layer of the algorithmic biases formed by search engines. By outlining crucial points and theories throughout, the book is not limited to academic readers, allowing Noble's writing to reach a wider and more inclusive audience.

Critical reception
Critical reception for Algorithms of Oppression has been largely positive. In the Los Angeles Review of Books, Emily Drabinski writes, "What emerges from these pages is the sense that Google’s algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores." In PopMatters, Hans Rollman writes that Algorithms of Oppression "demonstrate[s] that search engines, and in particular Google, are not simply imperfect machines, but systems designed by humans in ways that replicate the power structures of the western countries where they are built, complete with all the sexism and racism that are built into those structures." In Booklist, reviewer Lesley Williams states, "Noble’s study should prompt some soul-searching about our reliance on commercial search engines and about digital social equity."

In early February 2018, Algorithms of Oppression received press attention when the official Twitter account for the Institute of Electrical and Electronics Engineers expressed criticism of the book, saying that the results of a Google search suggested in its blurb did not match Noble's predictions. IEEE's outreach historian, Alexander Magoun, later revealed that he had not read the book, and issued an apology.