
Are algorithms inherently biased and racist?


Screen capture of an article titled "Forbes' 'Top-Earning TikTok-ers 2022' List Contains Zero Black Creators."

Screenshot taken from an article on bet.com.

When we see articles like this published, as sad, unfortunate, and infuriating as it is, many of us are not surprised.


And because we continuously see incidents like this occur, we are compelled to point to the research conducted by Ruha Benjamin in Race After Technology, and specifically to the term she coined: the New Jim Code.

The New Jim Code is "the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective and progressive than the discriminatory systems of a previous era" (pg. 6, Race After Technology).


An important aspect of understanding the New Jim Code comes from Safiya Umoja Noble's book, Algorithms of Oppression, in which she writes that "understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings" (pg. 1, Algorithms of Oppression).


Benjamin also argues that "bias enters through the backdoor of design optimization in which the humans who create the algorithms are hidden from view" (pg. 11, Race After Technology).

One example that Benjamin presents is that of the Shirley Cards, produced by Kodak from the 1950s to the 1990s.


Image of an example of a Shirley Card, produced by Kodak in the 1950s and used until the 1990s.

Screenshot taken from Google Images search results.

The Shirley Cards were part of the film exposure methods used to develop film photography, with an image of a White woman serving as the standard for the process. Because a White woman's image was treated as the norm in the development process, darker-skinned people were consistently underexposed in photographs. Benjamin states, "as one photographer put it, 'It turns out, film stock's failures to capture dark skin aren't a technical issue, they're a choice.' This also implies we can choose otherwise" (pg. 104, Race After Technology).

And although many photographers and parents of Black children began objecting to the fact that their children's faces were blurry in yearbook photos as compared to the White students, the "photographic industry did not fully take notice until companies that manufactured brown products like chocolate and wooden furniture began complaining that photographs did not depict their goods with enough subtlety" (pg. 105, Race After Technology).

As danah boyd and M.C. Elish of the Data & Society Research Institute put it, “the datasets and models used in these systems are not objective representations of reality. They are the culmination of particular tools, people, and power structures that foreground one way of seeing or judging over another" (pg. 11, Race After Technology).

"Glitches" in technology:


During the summer of 2020, TikTok seemed to be blocking hashtags related to George Floyd and Black Lives Matter protests; the company apologized, blaming a “technical glitch.”


However, in Algorithms of Oppression, Noble explains that "stories of 'glitches' found in systems do not suggest that the organizing logics of the web could be broken but, rather, that these are occasional one-off moments when something goes terribly wrong with near-perfect systems."


She notes that these data aberrations have come to light in various forms: "For example, in 2015, U.S. News and World Report reported that a 'glitch' in Google's algorithm led to several problems through auto-tagging and facial-recognition software that was apparently intended to help people search through images more successfully" (pg. 6, Algorithms of Oppression).

One example, and a major issue, occurred in 2016, when searching the keyword "gorilla" on Google presented images of Black people in the search results.

Screen capture taken from Google of an image of two Black teenagers displayed after searching the word 'gorilla' in Google's search bar.

Screenshot taken from Google Images search results.

The company quickly apologized and promised to fix the "glitch."

Another example involved Google Maps: during Obama's presidency, searching the word "N*****" presented a map to the White House.

Screen capture taken from Google of an image of the White House in Washington, DC, in Google Maps.

Screenshot taken from Google Images search results.


We must become aware that these incidents are not coincidental or mere "glitches" of technology. Rather, they are only a few examples illustrating how bias and straight-up racism are encoded into the algorithms that we use on a daily basis.

As Noble puts it, "... in other words, traditional misrepresentations in old media are made real once again online and situated in an authoritative mechanism that is trusted by the public – Google" (pg. 3, Algorithms of Oppression).


And as Noble points out, Google "is trusted by the public": many of us turn to Google to search for information, and many of us are satisfied with the first answer presented in the search results. Without digging further for more information, we may very well accept misinformation or simply have our existing beliefs confirmed.


We are able to make these connections because we know that these algorithms can be inherently racist, as they are created and designed by human beings. Furthermore, the algorithms are updated on the basis of our own behavior; that is, developers update algorithms based on the patterns they observe when people use online tools like Google to search.
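To make that feedback loop concrete, here is a minimal, hypothetical sketch in Python. Everything in it (the query, the candidate results, the click log) is invented for illustration, and real search engines are far more complex; but the basic dynamic is the same: a ranker trained only on past clicks pushes the majority's preferences to the top, and top placement then attracts even more clicks.

```python
# Toy sketch (not Google's actual system) of how a ranking
# algorithm trained on user clicks absorbs the biases already
# present in user behavior. All data below is invented.

from collections import defaultdict

# Hypothetical click log: (query, result clicked). If users
# disproportionately click a stereotyped result, that pattern
# becomes the training data.
click_log = [
    ("ceo", "photo_of_white_man"),
    ("ceo", "photo_of_white_man"),
    ("ceo", "photo_of_white_man"),
    ("ceo", "photo_of_black_woman"),
]

# "Training": count clicks per (query, result) pair.
scores = defaultdict(int)
for query, result in click_log:
    scores[(query, result)] += 1

def rank(query, candidates):
    """Order candidates by past click counts, highest first."""
    return sorted(candidates, key=lambda r: scores[(query, r)], reverse=True)

print(rank("ceo", ["photo_of_black_woman", "photo_of_white_man"]))
# -> ['photo_of_white_man', 'photo_of_black_woman']
# The majority's clicking pattern now decides what everyone sees
# first, and the top slot attracts more clicks, feeding the same
# skew back into the next round of training.
```

Notice that nothing in the sketch is explicitly "racist code": the skew comes entirely from the behavioral data the system learns from, which is exactly how societal bias gets laundered into a seemingly neutral ranking.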


Therefore, when it comes to search engines like Google, we can confidently state that these online tools reproduce the biases that persist in our society.


"Racism thus becomes doubled – magnified and buried under layers of digital denial" (pg. 12, Race After Technology).

Therefore, when we see articles such as "There Are No Black TikTok Creators on Forbes' Highest Earners List," as infuriating as it is, we cannot be surprised, given the research presented by scholars like Ruha Benjamin and Safiya Umoja Noble. Research such as theirs shows us that, just as in society at large, online tools, apps, and platforms place Black people, and people of color, at a disadvantage.


For more, check out our TikTok on this topic and be sure to share it with others to continue to keep this important conversation going.





written by:

Adriana Leos

Chief Creative Officer

vznayres
