
Racial Bias In Technology

If you're on your phone frequently, you may have noticed that your biometric settings include the option to use your phone's facial recognition feature. While this is a useful feature, it comes with its own set of hiccups. For starters, your settings have to be adjusted depending on whether or not you wear glasses regularly. Second, facial recognition doesn't always work at night or in poorly lit areas, which prevents the scanner from picking up your face properly. A third issue with facial recognition is more troubling: studies have confirmed instances of racial bias.

Certain algorithms show large variances in accuracy across race, gender, and other demographics. A 2011 study, co-authored by one of the organizers of NIST's vendor tests, found that algorithms developed in China, Japan, and South Korea recognized East Asian faces far more readily than Caucasian ones. The reverse was true for algorithms developed in France, Germany, and the United States, which were significantly better at recognizing Caucasian facial characteristics. This suggests that the conditions in which an algorithm is created—particularly the racial makeup of its development team and its test photo databases—can influence the accuracy of its results.

Similarly, a 2012 study used a collection of mug shots from Pinellas County, Florida to test three commercial algorithms used by police in California, Maryland, Pennsylvania, and elsewhere. The study, co-authored by a senior FBI technologist, found that all three algorithms consistently performed 5 to 10 percent worse on African Americans than on Caucasians. One algorithm, which failed to identify the right person in 1 out of 10 encounters with Caucasian subjects, failed nearly twice as often when the photo was of an African American.

So why is this a problem?

Imagine police are investigating a theft that was caught on camera. Say that when they run a video still of the suspect's face against their facial recognition database, they get 10 potential matches, but none is a perfect match to the actual perpetrator. In that case, the closest match is treated as a lead, and police begin investigating that individual. Given the accuracy rates reflected in the most recent data, this scenario is statistically more likely to happen to an African American than to a white person.
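To make that concrete, here is a minimal back-of-the-envelope simulation in Python. It assumes the roughly 10 percent miss rate for Caucasian subjects and the roughly doubled rate for African American subjects reported in the 2012 study above, and it assumes, purely for illustration, that whenever the algorithm misses the true match, the innocent "closest match" becomes the lead. This is a sketch under those assumptions, not a model of any real police system.

```python
import random

# Hypothetical per-group miss rates, loosely based on the figures quoted
# from the 2012 study above (10% vs. "nearly twice as often").
MISS_RATE = {"Caucasian": 0.10, "African American": 0.20}
TRIALS = 100_000

def search_produces_false_lead(group: str) -> bool:
    """Simulate one database search: True if the algorithm misses the true
    match, so the top-ranked candidate (an innocent person) becomes the lead."""
    return random.random() < MISS_RATE[group]

for group in MISS_RATE:
    false_leads = sum(search_produces_false_lead(group) for _ in range(TRIALS))
    print(f"{group}: ~{false_leads / TRIALS:.0%} of searches point at the wrong person")
```

Run enough searches under those assumed rates and the gap compounds: roughly twice as many innocent African Americans end up under investigation as leads.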

Furthermore, these cases become difficult to prosecute, because a lawyer is essentially left arguing a "he said, she said" case against a computer, which is not a physical witness but can be treated as a credible source. What we need to keep in mind as legal advisors and lawyers is how the technology we incorporate into our business models affects the lives of our clients and customers in the long run.
