MIT researcher warned Amazon of bias in facial recognition software
An MIT researcher warned Amazon CEO Jeff Bezos last month that the company's controversial facial recognition software is most inaccurate on women and individuals of color.

In a June 25 letter, which was not reported on at the time, Joy Buolamwini, who also founded the Algorithmic Justice League to combat bias coded into algorithms, said the biased technology could exacerbate existing racial biases in policing.

Amazon sells its facial recognition software, known as "Rekognition," to law enforcement agencies across the country.

While the software showed extremely low rates of error for men with lighter skin, Buolamwini found in her testing that women of all skin colors and individuals of both genders with darker skin face much higher rates of inaccuracy.

Of all the male faces tested, only 0.14 percent were misidentified. Lighter-skinned faces of both genders were incorrectly identified around 2 percent of the time. Women, meanwhile, were misidentified over 16 percent of the time, and darker-skinned faces of both genders were misidentified over 13 percent of the time.

She also cautioned that the straightforward nature of her tests, which were performed on still images, makes them much more likely to produce accurate results than real-world conditions, where pictures may be blurry and taken in a range of different lighting.

“Given what we know, it is irresponsible to use these systems, and I support a moratorium on the police use of facial recognition,” Buolamwini told The Hill. “I also support the call for federal regulations around facial recognition technology as, unlike Canada, the US has no federal laws on biometric data.”

Buolamwini used the same methodology in her test of Amazon's facial recognition as she did in her previous analysis of IBM's and Microsoft's facial recognition software, both of which also showed higher rates of misidentifying dark-skinned women.

Her study on Amazon's Rekognition is a more comprehensive version of the one released by the ACLU on Thursday, which also found patterns of bias against people of color. The ACLU's study was conducted on 535 members of Congress, while Buolamwini's study was on a data set of more than 1,200 faces.

The ACLU, along with Amazon employees, Amazon shareholders and other groups, is calling for the company to end its facial recognition contracts with law enforcement and for a moratorium on government use of facial recognition technology.