Critics Ask Amazon to Stop Selling Facial Recognition Technology to Police

May 23, 2018, 10:29 PM UTC

The ACLU, along with around three dozen other organizations, is calling on Amazon to stop selling facial recognition technology — called Rekognition — to police departments.

In a letter made public by the ACLU on Tuesday, the civil liberties organizations “demand that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country.”

The image detection and recognition technology, introduced in 2016, is one of many offerings from Amazon Web Services (AWS). Rekognition can recognize faces, scenes, and objects.

“Given an image, it will return a list of labels,” Jeff Barr, Amazon’s chief evangelist for AWS, said in 2016. “Given an image with one or more faces, it will return bounding boxes for each face, along with attributes.”
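Barr's description maps to Rekognition's DetectLabels and DetectFaces operations. Here is a minimal sketch using boto3, the AWS SDK for Python; the bucket and object names are hypothetical, and a real call requires AWS credentials:

```python
# Sketch of calling Rekognition's DetectLabels and DetectFaces via boto3.
# The bucket/key names used below are hypothetical examples.
try:
    import boto3  # AWS SDK for Python
except ImportError:  # keep the sketch importable without the SDK installed
    boto3 = None


def detect_labels(client, bucket, key, max_labels=10):
    """Given an image in S3, return a list of (label, confidence) pairs."""
    resp = client.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MaxLabels=max_labels,
    )
    return [(label["Name"], label["Confidence"]) for label in resp["Labels"]]


def detect_faces(client, bucket, key):
    """Given an image with one or more faces, return a bounding box per face."""
    resp = client.detect_faces(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        Attributes=["DEFAULT"],
    )
    return [face["BoundingBox"] for face in resp["FaceDetails"]]


if __name__ == "__main__" and boto3 is not None:
    rekognition = boto3.client("rekognition", region_name="us-west-2")
    print(detect_labels(rekognition, "example-bucket", "photo.jpg"))
```

Each bounding box comes back as fractional coordinates (`Left`, `Top`, `Width`, `Height`) relative to the image dimensions, alongside attribute data such as estimated age range and emotion when `Attributes=["ALL"]` is requested.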

Among its uses, Barr cited examples ranging from categorizing an online photo collection to security applications such as visual surveillance and employee face/badge comparisons. Surveillance by police departments is what worries civil liberties organizations.

“People should be free to walk down the street without being watched by the government,” the ACLU and the other groups said in their letter. “Facial recognition in American communities threatens this freedom.”

They continued: “In overpoliced communities of color, it could effectively eliminate it. The federal government could use this facial recognition technology to continuously track immigrants as they embark on new lives. Local police could use it to identify political protesters captured by officer body cameras. With Rekognition, Amazon delivers these dangerous surveillance powers directly to the government.”

According to the New York Times, Amazon is pitching the technology to police departments. Adopters include the Orlando Police Department in Florida and the Washington County Sheriff’s Office in Oregon.

The ACLU obtained documents from these two police departments that reveal how Amazon markets the technology, and how police departments use it.

The Washington County Sheriff’s Office has been using the technology for over a year and pays Amazon $6 to $12 a month for it, according to CNN. The office uploaded 300,000 images from its booking-photo database into Rekognition and uses them to identify unknown suspects or victims. Public information officer Deputy Jeff Talbot said that a match on the system does not constitute probable cause.

“The Sheriff’s Office does not use the technology for mass and/or real time surveillance,” Talbot told CNN. “In fact, state law and our policy prohibits it for such use.”

In Orlando, the system is being used along with “existing City resources to provide real-time detection and notification of persons-of-interests, further increasing public safety, and operational efficiency opportunities for the City of Orlando and other cities across the nation,” Orlando Police Chief John Mina said on Amazon Rekognition’s website.

“The purpose of a pilot program such as this, is to address any concerns that arise as the new technology is tested,” a spokesperson for the Orlando police told CNN. “Any use of the system will be in accordance with current and applicable law.”

Amazon told Fortune in a statement that the company “requires that customers comply with the law and be responsible when they use [Amazon Web Services].” Amazon also said that media companies used the technology during last weekend’s royal wedding to identify attendees.

“Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology,” Amazon said.

Still, many groups worry about governments abusing the technology.

“Amazon must act swiftly to stand up for civil rights and civil liberties, including those of its own customers, and take Rekognition off the table for governments,” the letter concludes.

Update: This article was updated from the original with a comment from Amazon.