Amazon halts police use of its facial recognition technology

For story suggestions or custom animation requests, contact [email protected]. Visit http://archive.nextanimationstudio.com to view News Direct's complete archive of 3D news animations.

RESTRICTIONS:
Broadcast: NO USE JAPAN, NO USE TAIWAN
Digital: NO USE JAPAN, NO USE TAIWAN
In a blog post published Wednesday, Amazon announced a one-year moratorium on police use of the company's facial recognition tool Rekognition.

Earlier this year, an Amazon executive told PBS that not even Amazon knows the exact number of police departments that utilize Rekognition.

Amazon's announcement comes in the wake of the nationwide U.S. protests that followed the killing of George Floyd by police in Minnesota.

According to a 2018 study by MIT Media Lab, facial recognition tools were more prone to errors when identifying people of color, and products from major tech corporations misgendered dark-skinned women 35 percent of the time.

A 2019 study by the same institution further found that Amazon's Rekognition mislabeled women of color as men 31 percent of the time, while Microsoft's technology made the error only 1.5 percent of the time.

Citing a statement from Clearview AI, The Verge reports that Amazon, Google and IBM have exited the law enforcement facial recognition market, leaving the field to Clearview. In the statement, Clearview asserts that its product "actually works."

RUNDOWN SHOWS:
1. Amazon suspends police use of its Rekognition technology for a year
2. Facial recognition tools make more errors when identifying people of color
3. Rekognition mistook the gender of dark-skinned women 31 percent of the time
4. Amazon, Google, IBM have exited the law enforcement facial recognition market

VOICEOVER (in English):
"In a blog post published Wednesday, Amazon announced a one-year moratorium on police use of the company's facial recognition tool Rekognition."

"Earlier this year, an Amazon executive told PBS that not even Amazon knows the exact number of police departments that utilize Rekognition."

"According to a 2018 study by MIT Media Lab, facial recognition tools were more prone to errors when identifying people of color, and products from major tech corporations misgendered dark-skinned women 35 percent of the time."

"A 2019 study by the same institution further found that Amazon's Rekognition mislabeled women of color as men 31 percent of the time, while Microsoft's technology made the error only 1.5 percent of the time."

"Citing a statement from Clearview AI, The Verge reports that Amazon, Google and IBM have exited the law enforcement facial recognition market, leaving the field to Clearview. In the statement, Clearview asserts that its product 'actually works.'"

SOURCES: Amazon, Proceedings of Machine Learning Research, Massachusetts Institute of Technology, The New York Times, The Verge
https://blog.aboutamazon.com/policy/we-are-implementing-a-one-year-moratorium-on-police-use-of-rekognition
http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf
https://www.media.mit.edu/articles/amazon-is-pushing-facial-technology-that-a-study-says-could-be-biased/
https://www.nytimes.com/2019/01/24/technology/amazon-facial-technology-study.html
https://www.theverge.com/2020/6/10/21287101/amazon-rekognition-facial-recognition-police-ban-one-year-ai-racial-bias