And since smart speakers are built for convenience, more often than not there are no additional security measures in place before they can receive those commands. Smartphones, tablets, Facebook Portal, and other devices that use MEMS microphones and have a voice assistant were also found to be susceptible to such laser-based attacks.
"It's possible to make microphones respond to light as if it were sound", Sugarawa explained.
This means an attacker could attempt to open smart doors, unlock smart cars, and access anything a Google Home, Amazon Echo, or Apple HomePod has access to. Because the microphones on voice assistants work by converting sound into an electrical signal, encoding that same signal as variations in a laser beam's intensity makes the microphone produce the response it would to the spoken command.
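To illustrate the principle (this is a minimal sketch, not the researchers' actual tooling), amplitude-modulating an audio waveform onto a laser driver's intensity might look like the following; the sample rate, the stand-in 1 kHz "command", and the bias and depth values are illustrative assumptions.

```python
import numpy as np

SAMPLE_RATE = 16_000  # Hz, a typical rate for voice audio (assumption)

def audio_to_laser_drive(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a normalized laser intensity
    signal in [0, 1] via simple amplitude modulation around a DC bias.

    A MEMS microphone hit by this modulated light responds much as it
    would to sound pressure, so the assistant decodes the original command.
    """
    bias = 0.5   # keep the laser always on; intensity can't go negative
    depth = 0.5  # modulation depth
    return np.clip(bias + depth * audio, 0.0, 1.0)

# Illustrative stand-in for a recorded voice command: a 1 kHz tone.
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
fake_command = 0.8 * np.sin(2 * np.pi * 1000 * t)

drive_signal = audio_to_laser_drive(fake_command)
print(drive_signal.min(), drive_signal.max())  # stays within [0, 1]
```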
Laser pointers could be used by hackers to silently "speak" to smart speakers from hundreds of metres away without a user ever knowing.
Daniel Genkin, one of the paper's co-authors and an assistant professor at the University of Michigan, told the Times that there is an easy fix for the time being: keep your voice-controlled assistant out of the line of sight from outside your home, "and don't give it access to anything you don't want someone else to access".

The vulnerability is not limited to smart speakers; it can be used against any computer that accepts voice commands through a microphone. People aren't running around jacking supercars with Radio Shack laser pointers, but the confluence of pervasive connectivity, accessible technology, and a relentless drive for convenience über alles has managed to drag a lazy and frankly stupid plot device into the real world.

The researchers were able to open a garage door by hitting a voice assistant with a laser beam, and to control a Google Home device on the fourth floor of a building from 230 feet away, firing from the top of a different building.
Interestingly, the researchers found that Google Home and Amazon Alexa smart speakers block purchasing from unrecognized voices, but they do allow previously unheard voices to execute commands like unlocking connected smart locks.
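As a purely hypothetical sketch of what extending that voice-match check to all sensitive actions could look like (the command names and function are invented for illustration, not taken from any vendor's API), a per-command authorization policy might resemble the following.

```python
# Hypothetical policy sketch: require a voice match for *all* sensitive
# commands, not only purchases. Names are illustrative assumptions.
SENSITIVE_COMMANDS = {"purchase", "unlock_door", "open_garage", "start_car"}

def authorize(command: str, voice_matches_enrolled_user: bool) -> bool:
    """Allow a command only if it is non-sensitive or the speaker's
    voice matches an enrolled user profile."""
    if command in SENSITIVE_COMMANDS:
        return voice_matches_enrolled_user
    return True

# A laser-injected command arrives with no recognizable voice:
print(authorize("unlock_door", voice_matches_enrolled_user=False))  # False
print(authorize("play_music", voice_matches_enrolled_user=False))   # True
```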
The technique requires the laser to actually hit the target device's microphone port, which becomes significantly harder as the distance increases, as the rough calculation below illustrates. The researchers notified Google, Amazon, Apple, Tesla, and Ford about the vulnerability.
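To get a feel for the aiming problem, here is a back-of-the-envelope sketch of the pointing accuracy needed to keep a beam centred on a microphone port; the ~5 mm port size is an illustrative assumption, not a figure from the paper.

```python
MIC_PORT_DIAMETER_M = 0.005  # ~5 mm microphone opening (assumption)

def pointing_tolerance_mrad(distance_m: float) -> float:
    """Angular accuracy needed to keep the beam centre on the mic port,
    using the small-angle approximation (angle ~ size / distance)."""
    return (MIC_PORT_DIAMETER_M / distance_m) * 1000  # radians -> milliradians

# 70 m is roughly the 230-foot demonstration distance reported above.
for d in (5, 25, 70):
    print(f"{d:>3} m -> must aim within ~{pointing_tolerance_mrad(d):.2f} mrad")
```

At 70 m the tolerance shrinks to well under a tenth of a milliradian, which is why the attack gets much harder to pull off as the range grows.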
Injecting voice commands into smart speakers from long range might not sound like a major threat, but devices from Google, Amazon, and Apple are shaping up to be the central hub for controlling gadgets in the smart home, including lights, smart locks, and garage doors.
"We are closely reviewing this research paper". Amazon did not respond to a request for comment at the time of publication.
Researchers have previously revealed other undetectable means of exploiting voice-command devices, but those attacks were more limited. Hacking phones and tablets from a distance with laser beams seems like the more dangerous side of the hack, the kind of attack you'd see in spy movies. Assuming a smart speaker is visible from a window, hackers could use Light Commands to unlock smart doors, garage doors, and car doors.