Racial Bias Found In Amazon Facial Recognition Software Used By Law Enforcement

5 years ago
Anonymous $Dftgs0JzgE

http://www.newsweek.com/racial-bias-found-amazon-facial-recognition-software-used-law-enforcement-1306407

Facial recognition software from Amazon used by some law enforcement agencies has shown inaccuracies, particularly when identifying women of color. In some cases, the technology misclassified dark-skinned women as men.

The system performed well when recognizing men; it was in identifying women, especially darker-skinned women, that the software showed a glaring problem.
