Tens of thousands of Amazon.com employees once could listen to voice recordings of Alexa users, according to U.S. regulators.
Some 30,000 Amazon workers had access to audio clips picked up by the company’s voice-activated speakers, the Federal Trade Commission said this week in a complaint about children’s privacy that the company settled for $25 million. The FTC’s tally of employees with access to Alexa recordings covers the period between August 2018 and September 2019, and it’s not clear how many personnel have such access today.
Bloomberg reported in 2019 that thousands of Amazon workers reviewed Alexa voice recordings to improve the software, but the FTC tally demonstrates that access to the recordings went well beyond those employees. About half of the 30,000 had no business reason to access the data, according to the FTC, which called the practice an “overbroad grant of access” that violated Amazon’s security policies governing sensitive data. Many of those roughly 15,000 personnel “did not even work on Alexa-enabled products,” the FTC said.
The disclosure is the latest in a string of revelations that Amazon, despite pledging to be a good steward of customer information, has at times given employees overly broad access to personal data and failed to disclose the practice to users. Wired reported in 2021 that the company struggled over the years to adequately lock down data, including customer order histories, amassed during years of breakneck growth. A year earlier, Amazon said it had fired some employees who leaked customer email addresses and phone numbers to an unspecified third party.
Amazon has ramped up its privacy practices in recent years amid scrutiny from the media, privacy advocacy groups and the FTC. Alexa, most commonly found in the Echo line of smart speakers, is designed to record audio when it hears a “wake word” and then transmit user commands, such as requests to read a news report or fetch the weather, to Amazon’s servers. By default, the company stores that data indefinitely, though Amazon has added tools that let users auto-delete recordings and opt out of having their voices used to train Alexa’s systems.
Kristy Schmidt, an Amazon spokesperson, didn’t say how many employees can access Alexa recordings today. “We have strict policies and practices in place to limit access to voice recordings only to authorized employees,” she said in an emailed statement. “These employees use voice recordings to create new features, help our customer service teams troubleshoot issues for customers, and train and improve Alexa.” In a separate blog post, Amazon said it disagreed with the FTC’s claims and denied violating the law.
The complaint, filed by the Justice Department on behalf of the FTC, also says Amazon repeatedly failed to delete Alexa users’ data, even after customers asked it to. Until mid-2019, when a user requested that Alexa delete their voice recordings, Amazon kept a written transcript that could be used to train its speech-recognition software. The company didn’t disclose that to users, the FTC said.
Amazon also repeatedly failed to delete location data from Alexa app users who thought they had eliminated their files.
Amazon’s Schmidt said the company “can confirm we’ve deleted geolocation data that customers previously requested we delete, and can confirm that the deletion control is working properly for our customers.”
The FTC’s proposed order, which requires a judge’s sign-off, would prohibit Amazon from using data customers had deleted to improve its services, require the company to notify users of its retention and deletion policies and mandate the creation of a privacy program related to use of location data, among other provisions.