Laser pointers are great for taunting cats and inflicting irritation. But they're also quite effective at hacking Alexa, Siri or Google Assistant, researchers say -- even from hundreds of feet away.
Microphones in smart devices translate sound into electrical signals, which communicate commands to the device. But as researchers have discovered, microphones will respond the same way to a focused light pointed directly at them. It's a surprising vulnerability that would allow an attacker to secretly take over many popular voice-controlled devices with nothing more than a $13.99 laser pointer and some solid aim.
The researchers -- Takeshi Sugawara at the University of Electro-Communications in Japan; and Kevin Fu, Daniel Genkin, Sara Rampazzi and Benjamin Cyr at the University of Michigan -- released their findings in a paper Monday.
"It's possible to make microphones respond to light as if it were sound," Sugawara told Wired. "This means that anything that acts on sound commands will act on light commands."
Inside each microphone is a small plate called a diaphragm that moves when sound hits it.
That movement can be replicated by aiming a focused laser or flashlight at the diaphragm, which responds to the light as it would to sound, converting the vibration into electrical signals, they said. The rest of the system then responds the way it would to a voice command.
Since many voice-command systems don't require authentication, an attacker wouldn't need a password or PIN to take over a device with a light command; they just need to be in the object's line of sight.
In the paper, the researchers detailed how they could easily commandeer smart speakers, tablets and phones without being in the same building, just by pointing a laser through a window. In one case, they took over a Google Home on the fourth floor of an office building from the top of a bell tower at the University of Michigan, more than 200 feet away. And they say the trick could theoretically be used to buy things online undetected, operate smart switches in homes and carry out endless other unsettling schemes.
"Once an attacker gains control over a voice assistant a number of other systems could be open to their manipulation," a breakdown of the study on the University of Michigan's website says. "In the worst cases, this could mean dangerous access to e-commerce accounts, credit cards, and even any connected medical devices the user has linked to their assistant."
Researchers spent seven months testing the trick on 17 voice-controlled devices enabled with Alexa, Siri, Facebook Portal and Google Assistant, including Google Home, Echo Dot, Fire Cube, Google Pixel, Samsung Galaxy, iPhone and iPad. They successfully mounted attacks using ordinary laser pointers, laser drivers, a telephoto lens and even a souped-up flashlight.
The researchers weren't sure exactly why these microphones respond to light as they do to sound; they didn't want to speculate and are leaving the physics for future study. They notified Google, Amazon, Apple, Tesla and Ford about the vulnerability.
Spokespeople for Google and Amazon said the companies are reviewing the research and its implications for the security of their products but said risk to consumers seems limited. An Amazon spokeswoman pointed out that customers could safeguard Alexa-enabled products with a PIN, or use the mute button to disconnect the microphone. (Amazon founder Jeff Bezos owns The Washington Post, which contributed to this article.)
Apple did not immediately respond to requests for comment.
Researchers have previously revealed other undetectable means of exploiting voice-command devices, but those methods were more limited. In 2016, researchers at the University of California at Berkeley showed it was possible to cloak commands in white noise, music or spoken text. In 2017, researchers in China showed it was possible to give commands to smart devices at frequencies inaudible to the human ear, but the transmitter had to be relatively close to the device for the method to work.
There are no known instances of someone using light commands to hack a device, researchers said, but eliminating the vulnerability would require a redesign for most microphones.
And simply covering the microphone with a piece of tape wouldn't solve the problem. The microphones on several digital assistants had dirt shields that didn't block the light commands, Fu said.
But there are limits to the stealth of a light-command attack, researchers found. Except for infrared lasers, the beams are visible to the naked eye and could easily be noticed by someone near the device. Voice-command devices also generally give audible responses, but an attacker could first lower the device's volume to continue operating it undetected.
For now, researchers say the only foolproof way to protect against light commands is to keep devices away from windows and out of sight of prying eyes -- and prying laser beams.
Information for this article was contributed by Taylor Telford of The Washington Post; and by Nicole Perlroth of The New York Times.
Business on 11/06/2019
Print Headline: Smart devices vulnerable to laser hacking