Racial Bias Found In Amazon Facial Recognition Software Used By Law Enforcement

http://www.newsweek.com/racial-bias-found-amazon-facial-recognition-software-used-law-enforcement-1306407

Facial recognition software from Amazon used by some law enforcement agencies has shown inaccuracies, particularly with women of color. At times the technology misidentifies dark-skinned women as men.

The system performed well when recognizing men; it was the identification of women that proved a glaring problem for the software.