Self-driving cars are more likely to hit people with darker skin

Once again, brown people are left out of technological advancements but are the most likely to be affected by them

Self-driving cars have been in the hot seat over a number of safety concerns. Alongside fears of cars veering off-road or mishandling sharp turns comes an even worse fear for people of color: being more likely to be hit because of the color of their skin.

A new study, “Predictive Inequity in Object Detection,” from the Georgia Institute of Technology suggests AI recognition systems in self-driving cars may have more difficulty detecting pedestrians with dark skin than those with light skin. Meaning: a person of color is more likely to be hit by a self-driving car than someone white.

Authors Benjamin Wilson, Judy Hoffman, and Jamie Morgenstern analyzed how effective image-detection systems were at identifying individuals from different racial and demographic groups. They did this by first looking at a large dataset of images containing pedestrians and then dividing the people into two groups using the Fitzpatrick scale, a system that classifies human skin tones from light to dark into six categories, Business Insider reported.
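
The article doesn’t reproduce the paper’s code, but the grouping step can be sketched in a few lines. Everything below is a hypothetical illustration: the PedestrianAnnotation class and its field names are invented, and the only detail taken from the study’s description is that each annotated pedestrian carries a Fitzpatrick type from 1 to 6, with types 1–3 forming the lighter-skinned group and types 4–6 the darker-skinned group.

```python
# Hypothetical sketch of the grouping step described above. Annotations are
# assumed to carry a Fitzpatrick skin-type label from 1 (lightest) to 6
# (darkest); types 1-3 form the light-skinned group, types 4-6 the dark.

from dataclasses import dataclass

@dataclass
class PedestrianAnnotation:          # invented for illustration
    image_id: str
    bbox: tuple                      # (x, y, width, height) of the pedestrian
    fitzpatrick: int                 # annotated skin type, 1 through 6

def split_by_skin_type(annotations):
    """Partition pedestrian annotations into light- and dark-skinned groups."""
    light = [a for a in annotations if a.fitzpatrick in (1, 2, 3)]
    dark = [a for a in annotations if a.fitzpatrick in (4, 5, 6)]
    return light, dark
```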

The researchers then analyzed how often eight image-detection models correctly detected the presence of people in the light-skinned group versus how often they correctly detected people in the dark-skinned group. On average, according to the report, the image-detection systems were 5% less accurate at detecting darker-skinned pedestrians.
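
Continuing the sketch above, here is one hedged way such a per-group comparison could be computed. The function names, the IoU matching threshold, and the simple hit-rate metric are all assumptions made for illustration; the paper’s actual accuracy measure is more involved, but the idea of scoring each group separately and comparing the results is the same.

```python
# Hypothetical scoring step: a ground-truth pedestrian counts as "detected"
# if some predicted box overlaps it enough (IoU >= threshold). Computing this
# separately for the light- and dark-skinned groups exposes any gap.

def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    overlap_w = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    overlap_h = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = overlap_w * overlap_h
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def detection_rate(ground_truth, predictions, threshold=0.5):
    """Fraction of annotated pedestrians matched by at least one prediction."""
    hits = sum(
        1 for gt in ground_truth
        if any(iou(gt.bbox, p) >= threshold
               for p in predictions.get(gt.image_id, []))
    )
    return hits / len(ground_truth) if ground_truth else 0.0

# Example (using the light/dark split from the earlier sketch, with `preds`
# mapping each image_id to one model's predicted boxes):
#   gap = detection_rate(light, preds) - detection_rate(dark, preds)
# Averaged over the eight models, the study reportedly found a gap of
# roughly five percentage points.
```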

“This behavior suggests that future errors made by autonomous vehicles may not be evenly distributed across different demographic groups,” the study read. “A natural question to ask is which pedestrians these systems detect with lower fidelity, and why they display this behavior.”

According to Vox, the disparity persisted even when the researchers controlled for variables like the time of day in the images. The findings suggest that the difference could result from a lack of images of dark-skinned pedestrians in the data used to train the systems.

“The main takeaway from our work is that vision systems that share common structures to the ones we tested should be looked at more closely,” Jamie Morgenstern, one of the authors of the study, told Vox in an interview.

The study, Vox reports, has not been peer-reviewed and did not use the same image-detection systems or image sets that vehicle manufacturers use. Still, it brings to light the issue of potential racial bias and suggests that companies developing driving technology be attentive to the methods they use to train vehicles to identify pedestrians.

In a statement on Twitter, Kate Crawford, co-director of the AI Now Institute, said: “In an ideal world, academics would be testing the actual models and training sets used by autonomous car manufacturers. But given those are never made available (a problem in itself), papers like these offer strong insights into very real risks.”

But racial algorithmic bias in recognition systems isn’t only an issue for the automobile world. Last year, Microsoft, IBM, and Amazon were called out for facial recognition technology that was biased against people with darker skin tones. Their systems failed to accurately identify people with darker skin, matching members of Congress to criminal mugshots and misidentifying the faces of other people of color, failures that pointed to a bias built into the technology’s creation.

The findings of this study shed light on a wider issue, the lack of inclusion of people of color in the development of technology, and point toward a solution: using more diverse images of pedestrians to train these systems. While the study did not directly examine the systems in current self-driving cars, it helps builders and developers figure out what needs to be done to avoid a future of biased self-driving cars. Not only do we need to start including more images of dark-skinned people in data sets, but we also need to make sure those people are accurately detected.