A user of Amazon's Alexa voice assistant in Germany gained access to more than a thousand recordings from another user because of "a human error" by the company. Reuters noted that Amazon failed to respond to the consumer who received the audio files, and that the links were deleted after the man had loaded them onto his home computer. "We have resolved the issue with the two customers involved and have taken steps to further improve our processes", an Amazon spokeswoman said. "As a precautionary measure we contacted the relevant authorities", the spokeswoman added.
An individual in Germany had requested copies of all of the data Amazon held on him, as permitted under the EU General Data Protection Regulation. The customer emailed Amazon.de customer service to inform them of the error and to ask who the files in question belonged to, but received no reply, and later found that the download link to the files was dead.
Then one day, over 1,700 recordings of you speaking to Alexa are sent to a totally random person - and you don't even know about it until a magazine gets in touch.
All of this data, and the voice recordings in particular, is supposed to be protected: unauthorised persons may not have access to it. "Using these files, it was fairly easy to identify the person involved and his female companion; weather queries, first names, and even someone's last name enabled us to quickly zero in on his circle of friends", according to the report. The victim reportedly said that Amazon had not reached out to notify him about the leak.
New: #Amazon sent around 1700 audio files and transcripts from a single user's Alexa recordings.
- Internet of Shit (@internetofshit) December 20, 2018
The victim was not told about the series of events that led to his data being leaked, though c't notes that Amazon "claimed that they had discovered the error themselves". The fact that Amazon is referring to this as an isolated case suggests the process may be largely automated, but human intervention was required, and that is where the mistake was made. Amazon said the virtual assistant misinterpreted speech as an order to send the conversation to a contact. It's not clear if the data mix-up is considered a breach under the GDPR.