Microsoft will not sell facial recognition tech to police without federal law
Microsoft said Thursday it will not sell facial recognition tools to police departments until there is a federal law governing the technology.
The company’s president, Brad Smith, made the announcement during a Washington Post Live event, noting that it’s in keeping with past Microsoft policy.
“We will not sell facial recognition technology to police departments in the United States until we have a national law in place, grounded in human rights that will govern this technology,” Smith said.
“The bottom line for us is to protect the human rights of people as this technology is deployed,” he added.
Microsoft president @BradSmi says the company does not sell facial recognition software to police depts. in the U.S. today and will not sell the tools to police until there is a national law in place “grounded in human rights.” #postlive pic.twitter.com/lwxBLjrtZL
— Washington Post Live (@postlive) June 11, 2020
Several federal bills governing the use of facial recognition by different groups, including the police, have been introduced, but as of now there are no laws on the books regulating the controversial technology.
Microsoft’s commitment comes amid concern from activists and civil rights groups that law enforcement may be using facial recognition tech to identify individuals participating in the anti-police brutality demonstrations that have erupted across the nation since the killing of George Floyd.
IBM on Monday announced it will no longer offer general purpose facial recognition or analysis software.
Amazon made a smaller move on Wednesday, placing a one-year moratorium on the sale of its facial recognition technology, Rekognition, to police.
That pledge has been slammed by privacy advocates, who point out the company still markets its technology to Immigration and Customs Enforcement and collaborates with police through its Ring doorbell subsidiary.
Even before the protests, facial recognition technology had been criticized as a tool for unwarranted surveillance, and multiple studies have found that it misidentifies women and people of color at higher rates than men and white people.