A group of researchers discovered they can hack Alexa and other voice assistants with a laser. The takeover is instantaneous and silent - a well-placed command to turn the device's volume down to zero would ensure that even its spoken responses could go unnoticed by its hapless owner.
So how did the researchers manage to control these devices with just a laser?
In one experiment, the researchers placed a Google Home device in a fourth-floor office, then hit it with a laser fired from a bell tower 75 meters (246 feet) away.
This week, a US-Japanese team published a research paper confirming an interesting and underestimated possibility: these devices will also accept "signal injection" commands sent to them as pulses of laser light over distances of a hundred metres or more.
Smart speakers can be hijacked this way from up to 350 feet (~107 m) away.
While accurately pointing a laser at a smart speaker's microphone is no small task, a determined attacker could do so through a window from nearly 110 meters away.
The attack, dubbed Light Commands, works because the diaphragm in a microphone converts sound into electrical signals. By modulating the intensity of the laser beam, the team could induce the same electrical signal that a chosen voice command would produce. An attacker could then use the digital assistant to make online purchases, unlock door locks and garage doors, or locate a connected vehicle such as a Tesla. Smart speakers proved the easiest devices to hijack from afar.
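To make the modulation step concrete, here is a minimal, hypothetical sketch of the idea (not the researchers' actual tooling): an audio waveform is mapped onto a laser driver's intensity level via simple amplitude modulation. The `bias` and `depth` values are illustrative assumptions, not figures from the paper.

```python
import numpy as np

def amplitude_modulate(audio, bias=0.6, depth=0.4):
    """Map an audio waveform onto a laser intensity level in [0, 1]:
    a constant DC bias keeps the laser lasing, and the normalized
    audio rides on top as intensity modulation."""
    audio = np.asarray(audio, dtype=float)
    peak = max(np.max(np.abs(audio)), 1e-12)   # avoid divide-by-zero on silence
    audio = audio / peak                        # normalize to [-1, 1]
    return bias + depth * audio                 # stays in [bias-depth, bias+depth]

# Example: a 1 kHz tone standing in for a recorded voice command
fs = 48_000                                     # sample rate in Hz
t = np.arange(fs) / fs                          # one second of samples
tone = np.sin(2 * np.pi * 1000 * t)
drive = amplitude_modulate(tone, bias=0.6, depth=0.4)
```

The key design point is the DC bias: the laser never switches fully off, so the microphone's diaphragm sees a continuously varying light intensity that mimics the pressure variations of the spoken command.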
However, it's not just smart speakers that are vulnerable to light commands: any voice assistant that listens through a similar microphone can be targeted, because the unauthorised voice commands arrive encoded in a beam of laser light rather than as sound.
"In the worst cases, this could mean risky access to homes, e-commerce accounts, credit cards, and even any connected medical devices the user has linked to their assistant". In 2017, researchers in China showed it was possible to give commands to smart devices at frequencies inaudible to the human ear, but a transmitter needs to be relatively close to the object for the method to work. "Protecting our users is paramount, and we're always looking at ways to improve the security of our devices".
Other factors also limited the extent of the researchers' hacks.
"This opens up an entirely new class of vulnerabilities", University of MI associate professor of electrical engineering and computer science Kevin Fu said. With the Echo Dot (3rd Gen) among the most sold items during last year's Amazon Black Friday sale, we can expect heavy discounts across Amazon hardware, as well as Sonos Black Friday deals, Black Friday wireless headphones deals, Black Friday speaker deals and much more.