George Floyd: Amazon bans police use of facial recognition tech
Technology giant Amazon has banned the police from using its controversial facial recognition software for a year.
It comes after civil rights advocates raised concerns about potential racial bias in surveillance technology.
This week IBM also said it would stop offering its facial recognition software for "mass surveillance or racial profiling".
The decisions follow growing pressure on firms to respond to the death in police custody of George Floyd.
Amazon said the suspension of law enforcement use of its Rekognition software was to give US lawmakers the opportunity to enact legislation to regulate how the technology is employed.
"We've advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge," Amazon said in a statement.
"We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested."
However, the company said that it would still allow organisations that work to combat human trafficking to use the technology.
Like other facial recognition products, Amazon's Rekognition uses artificial intelligence (AI) to rapidly compare a picture, taken for example on an officer's phone camera, against mugshots held in police databases that can contain hundreds of thousands of photos.
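To illustrate what such a search involves, the sketch below uses Amazon's boto3 SDK for Python to submit a probe photo to a Rekognition face collection and list the closest matches. The collection name, image file and similarity threshold are hypothetical examples, not details of any actual police deployment.

```python
# Illustrative sketch only: how a client could query an Amazon Rekognition
# face collection with boto3. The collection ID, image file and threshold
# below are hypothetical.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Load the probe photo (e.g. a picture taken on a phone camera).
with open("probe_photo.jpg", "rb") as f:
    probe_image = f.read()

# Search a pre-built collection of enrolled faces (e.g. mugshots)
# for the closest matches to the probe photo.
response = client.search_faces_by_image(
    CollectionId="example-mugshot-collection",  # hypothetical collection name
    Image={"Bytes": probe_image},
    FaceMatchThreshold=90,  # only return matches above 90% similarity
    MaxFaces=5,             # return at most five candidate matches
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Candidate {face['FaceId']} matched with "
          f"{match['Similarity']:.1f}% similarity")
```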
Facial recognition technology has been criticised for some time over potential bias, with studies showing that most algorithms are more likely to wrongly identify the faces of black people and other minorities than those of white people.
In the past Amazon has defended Rekognition against charges of bias, while continuing to offer it to law enforcement agencies.
The death in police custody of George Floyd, an African American man, has reignited those concerns, as police tactics and the use of technology in law enforcement come under intense scrutiny.
Earlier this week IBM said it would no longer offer its facial recognition technology because AI systems used in law enforcement need to be tested "for bias".
In a letter to Congress, IBM chief executive Arvind Krishna said the "fight against racism is as urgent as ever", and set out three areas where the company wanted to work with Congress: police reform, responsible use of technology, and broadening skills and educational opportunities.
Calls for legislation
In recent months Congress has been weighing possible legislation to regulate the technology, as lawmakers, companies and civil liberties activists have called for stronger rules governing surveillance software.
House Democrats introduced a police reform bill on Monday that would prohibit federal law enforcement use of real-time facial recognition, but some activists said the measure didn't go far enough.
The American Civil Liberties Union said facial recognition should be banned from use on police body camera footage altogether, and that federal funding should be restricted for local law enforcement agencies that do not impose the same limit.