Amazon Bans Police From Using Facial Recognition Technology


Amazon has announced that it is “implementing a one-year moratorium on police use of Rekognition,” the company’s facial recognition technology. The announcement does not say whether the moratorium applies to federal law enforcement agencies as well. It follows a statement IBM released only three days earlier, saying that IBM will no longer offer, develop, or research facial recognition technology.

No Clear Reason

IBM cited the potential abuse of privacy and human rights as the reason for its decision. Amazon did not really give a reason: it announced the decision in a brief statement with only a very short explanation.

The announcement says that the company has “advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested”.

The company does say that it will continue to give access to its software to organizations that search for missing children and combat human trafficking. What isn’t clear from the statement is whether the moratorium applies to the federal government too. It also doesn’t say what the company plans to do during or after the year: will it try to improve its technology, or will it follow IBM and quit the game altogether?

Amazon’s facial recognition tool ‘Rekognition’ has drawn a lot of criticism over the years. For one thing, researchers have found that the software isn’t always effective, especially when it has to identify people with darker skin. A study by Joy Buolamwini and Deborah Raji showed that the system worked almost perfectly when identifying someone with light skin, but struggled to identify the gender of darker-skinned people; darker-skinned women were often mistaken for men.

Amazon said that the findings were incorrect. Buolamwini responded that “Amazon’s approach thus far has been one of denial, deflection, and delay. We cannot rely on Amazon to police itself or provide unregulated and unproven technology to police or government agencies”. Other AI researchers backed Buolamwini and stated that Rekognition should not be used by law enforcement.

Amazon has also faced criticism for selling access to Rekognition to police departments. This worried activists and civil rights organizations, because there was no oversight of how the software was used; given that the technology has been shown to be unreliable, they argued, its use could easily lead to discrimination. Shareholders even asked Amazon to stop selling the technology to law enforcement, but the proposal was voted down.

Justice In Policing Act

It is likely that Amazon’s decision was a response to the Justice in Policing Act. This bill was introduced in Congress this week and, if passed, would place restrictions on how police can use facial recognition technology. The bill responds to many of the racial inequality issues currently being protested in the US. Given how much criticism Rekognition has faced, stepping back may be the right move for Amazon in the fight for equality.

Kristen Clarke, the president and executive director of the Lawyers’ Committee for Civil Rights Under Law, said that “[t]he Justice in Policing Act is historic and long overdue legislation that will put our country on a path to reform. This Act is responsive to many of the urgent demands being pressed for by our communities and by the people protesting for racial justice and equity across our nation”.

Cybersecurity analyst
David is a cybersecurity analyst and co-founder, with an interest in the “digital identity” phenomenon and special attention to the right to privacy and the protection of personal data.