
Amazon’s facial-recognition tool misidentified 28 lawmakers as people arrested for a crime, study finds

Packages move along a conveyor belt at the Amazon.com Inc. fulfillment center in Robbinsville, New Jersey, on June 7. (Bess Adler / Bloomberg)
By Tony Romm, Washington Post

Amazon’s facial recognition tools incorrectly identified Rep. John Lewis, D-Ga., and 27 other members of Congress as people arrested for a crime during a test commissioned by the American Civil Liberties Union of Northern California, the watchdog said Thursday.

The ACLU said its findings show that Amazon’s so-called Rekognition technology – already in use at law-enforcement agencies in Oregon and Orlando – is hampered by inaccuracies that disproportionately put people of color at risk and should prompt regulators to halt “law enforcement use of face surveillance.”

For its test, the ACLU of Northern California created a database of 25,000 publicly available arrest photos, though the civil liberties watchdog did not give details about where it obtained the images or the kinds of individuals in the photos. It then used Amazon’s Rekognition software to compare that database against photos of every member of the U.S. House and Senate.
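The ACLU has not published the code behind its test, but the workflow the article describes lines up with Amazon’s public Rekognition API: index a set of face photos into a collection, then search portrait images against it. The sketch below, using the boto3 Python client, is one plausible way to run such a comparison – the collection name, S3 buckets and file names are illustrative placeholders, not details from the ACLU’s test.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Build a face collection from publicly available arrest photos.
# The collection and bucket names are placeholders, not the ACLU's.
rekognition.create_collection(CollectionId="arrest-photo-collection")

arrest_photo_keys = ["00001.jpg", "00002.jpg"]  # ...up to 25,000 images
for key in arrest_photo_keys:
    rekognition.index_faces(
        CollectionId="arrest-photo-collection",
        Image={"S3Object": {"Bucket": "example-arrest-photos", "Name": key}},
        ExternalImageId=key,
    )

# Search one lawmaker's portrait against the collection. If FaceMatchThreshold
# is omitted, Rekognition applies its 80 percent default similarity cutoff.
response = rekognition.search_faces_by_image(
    CollectionId="arrest-photo-collection",
    Image={"S3Object": {"Bucket": "example-portraits", "Name": "lawmaker.jpg"}},
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], round(match["Similarity"], 1))
```

In a setup like this, a portrait that returns any match above the cutoff would register as the kind of false hit the test counted.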

Ultimately, Amazon’s technology flagged photos of 28 members of Congress as likely matches with the ACLU’s collection of mugshots. Among the misidentified lawmakers were Sen. Ed Markey, D-Mass., who has called for federal privacy legislation, and six members of the Congressional Black Caucus (CBC), including civil rights icon Lewis.

Two months earlier, the CBC wrote a letter to Amazon stressing that the lawmakers are “troubled by the profound negative unintended consequences this form of artificial intelligence could have for African Americans, undocumented immigrants, and protestors.” The CBC said the software was particularly risky because “communities of color are more heavily and aggressively policed than white communities,” meaning mistakes caused by faulty facial-recognition software could prove especially harmful.

On Thursday, Amazon questioned the ACLU’s methodology for its test, stressing that the threshold the watchdog set for what qualifies as a match – a “confidence,” or similarity rating, of 80 percent – was too low. “While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty,” an Amazon spokeswoman said.

But the ACLU of Northern California countered that 80 percent is the default setting on Amazon’s facial recognition tool. “Amazon should not be encouraging customers to use that confidence level for recognizing human faces,” said Jacob Snow, a technology lawyer at the organization.
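In practical terms, the dispute is over a single optional parameter: the default applies whenever a caller does not pass a threshold of their own, so stricter matching is opt-in. A generic sketch, again with illustrative names rather than anyone’s actual configuration:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Passing FaceMatchThreshold explicitly overrides the 80 percent default;
# a higher value returns only closer (and therefore fewer) matches.
response = rekognition.search_faces_by_image(
    CollectionId="arrest-photo-collection",
    Image={"S3Object": {"Bucket": "example-portraits", "Name": "lawmaker.jpg"}},
    FaceMatchThreshold=99,  # illustrative stricter value; the choice is the caller's
    MaxFaces=5,
)
print(len(response["FaceMatches"]), "matches at or above the stricter cutoff")
```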

Snow said the findings nonetheless affirm the ACLU’s worst fears: that facial-recognition technologies are too unsophisticated to be deployed by law enforcement, where misidentification isn’t just a privacy concern – it “could cost people their freedom or even their lives.”

The privacy watchdog called again for Congress to write broad new regulations governing the use of the technology, though lawmakers have long struggled to pass any federal privacy rules around facial recognition or other high-tech tools adopted by police, including location-tracking technologies.

Amazon’s facial-recognition technology has worried civil liberties activists since May, after the ACLU of Northern California obtained and released documents through an open-records request showing Rekognition in use by law enforcement agencies around the country. In Washington County, Oregon, for example, the sheriff’s office had built a database of 300,000 mug shots that officers could query for information about potential suspects, the Post previously reported.

In response, civil-liberties groups including the ACLU wrote Amazon that month, demanding that the e-commerce and cloud-computing giant “stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country.”

At the time, Amazon more broadly defended its technology. “When we find that AWS services are being abused by a customer, we suspend that customer’s right to use our services,” spokeswoman Nina Lindsey said. She pointed to other uses of the Rekognition technology, from identifying celebrities at the royal wedding to locating lost children at busy amusement parks.