
In this day and age, even algorithms are racist

Results from these formulas feed into a criminal justice system that's already heavily biased against minorities.

We often consider “big data” and the algorithms that govern it to be impartial, since a machine does all the work. Makes sense, right? Now, I’m not about to tell you that your MacBook Pro is racist, but the humans who wrote the algorithms that interpret data on it are. The algorithms are racially biased, which means the data showing up on your computer screen is skewed, and that skew perpetuates discriminatory beliefs.

In short: algorithms are a reflection of their human creators and users.

A few weeks ago, ProPublica released its findings on the algorithms used to calculate how likely American criminals are to reoffend. The non-profit found that the algorithms predict a higher likelihood of reoffending for black defendants than the actual data bears out.

These predictions have a real impact on people’s lives. The algorithms that spit out false positives for African American defendants are an integral part of the decision-making process for prison sentences and bonds.
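To make “false positive” concrete: it means a person the algorithm flags as high-risk who never actually reoffends. Here’s a minimal sketch of how that disparity can be measured, using invented records; the data below is hypothetical and has nothing to do with ProPublica’s actual dataset or methodology.

```python
# Minimal sketch: measuring the kind of disparity ProPublica described by
# comparing false positive rates across groups. Every record below is
# invented for illustration; this is NOT ProPublica's data.

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("black", True, False), ("black", True, True), ("black", True, False),
    ("black", False, False), ("white", True, True), ("white", False, False),
    ("white", False, False), ("white", True, False),
]

def false_positive_rate(records, group):
    """Share of a group's non-reoffenders who were still flagged high-risk."""
    non_reoffenders = [
        predicted for g, predicted, reoffended in records
        if g == group and not reoffended
    ]
    if not non_reoffenders:
        return 0.0
    # True counts as 1, so summing the flags gives the number of false positives.
    return sum(non_reoffenders) / len(non_reoffenders)

for group in ("black", "white"):
    print(f"{group}: false positive rate = {false_positive_rate(records, group):.2f}")
```

If the two rates come out very different, the scores are making more wrong “high-risk” calls about one group than the other, even when neither group actually reoffends.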

Results from these formulas feed into a criminal justice system that’s already heavily biased against minorities. The NAACP reports that African Americans make up almost 1 million of the total 2.3 million incarcerated population. According to the City University of New York’s Center on Media, Crime and Justice, if African Americans and Hispanics were incarcerated at the same rates as whites, today’s prison and jail populations would decline by close to 50 percent.

The link between African Americans and crime persists even in our internet searches. Harvard professor Latanya Sweeney found that Google’s search algorithms produce ads for arrest-record sites when someone searches for stereotypically Black names, even when there isn’t a significant number of actual arrest records under that name.

Take a moment to read that again.

This is a case of algorithms exposing the biases of their users. These ads appeared more often for certain names because users kept clicking on them, and the ad-serving system learned to show whatever gets clicked. In turn, the search results reinforced people’s prejudices about who criminals are. It’s a terrible cycle.
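To see why that cycle feeds itself, here’s a toy simulation of a click-driven ad server. The ad names, click probabilities, and greedy strategy are all invented for illustration; this is a sketch of the general feedback mechanism, not how Google’s ad system actually works.

```python
# Toy feedback loop: an ad server that favors whatever users click.
# The click probabilities below are invented to stand in for user prejudice.
import random

random.seed(0)

ads = ["arrest-record ad", "neutral ad"]
# Hypothetical: users click the arrest-record ad more often for this name.
click_prob = {"arrest-record ad": 0.30, "neutral ad": 0.10}

shows = {ad: 1 for ad in ads}    # times each ad was served (start at 1 show
clicks = {ad: 1 for ad in ads}   # and 1 click to avoid division by zero)

def pick_ad():
    # Greedy choice: serve the ad with the best observed click-through rate,
    # with a small chance of trying the other ad.
    if random.random() < 0.05:
        return random.choice(ads)
    return max(ads, key=lambda ad: clicks[ad] / shows[ad])

for _ in range(10_000):
    ad = pick_ad()
    shows[ad] += 1
    if random.random() < click_prob[ad]:
        clicks[ad] += 1

for ad in ads:
    share = shows[ad] / sum(shows.values())
    print(f"{ad}: served {share:.0%} of the time")
```

Because the server keeps serving whatever gets clicked, a modest bias in user clicks compounds into an overwhelming skew in what everyone sees.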

The list doesn’t end there. There are formulas that lead to African Americans receiving poorer credit terms, and others that result in predominantly Black neighborhoods being bypassed by large corporations like Amazon for Prime same-day delivery.

Nothing, not even data, is objective and unbiased.