To the horror of users, VRT NWS heard just about everything: personal information, bedroom conversations, domestic violence and more from Google Assistant users in Belgium and the Netherlands. If the audio recorded by Google were secure and protected, a Belgian news site would not have gained access to a trove of what many would consider extremely sensitive data. The recordings often captured conversations containing a great deal of private information.
"Rarely, devices that have the Google Assistant built in may experience what we call a 'false accept,'" Monsees added.
When questioned, Google said that it transcribes only about 0.2% of audio snippets and uses them to improve its voice recognition technology.
On Thursday, more than 1,000 voice recordings from Google Home speakers were leaked to a Belgian media outlet.
In response to the report, Google published a blog post defending its audio transcription practices.
"Language experts review and transcribe a small set of queries to help us better understand those languages," wrote Google product manager David Monsees.
Although user information is de-linked from the audio excerpts to anonymize them, the recordings themselves can make it easy to identify a speaker: subcontractors can trace an address or company name mentioned in a clip simply by looking it up on Google or Facebook. Worse, Assistant can be triggered inadvertently, either by an accidental button press or by a phrase that sounds like "OK Google," which can lead the device to record private conversations or even people engaged in sexual activity.
According to the VRT NWS report, most of the recordings subcontractors review were made deliberately by Google Home users. "We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again," a spokesman said.
"We hold ourselves to high standards of privacy and security in product development, and hold our partners to these same standards," Google said.
They're listening, but they aren't necessarily deleting: a few weeks ago, in a letter responding to a lawmaker's request for information, Amazon confirmed that it keeps transcripts and recordings picked up by its Alexa devices indefinitely, unless a user explicitly requests that they be deleted.
However, the report said, 153 of the recordings were "conversations that should never have been recorded" because the wake phrase "OK Google" was never given.
Although Google doesn't appear to have nefarious intentions behind this effort, the fact that other people can listen to what you say to your device remains a genuine privacy concern.