Smart Home Speakers Can Be Hacked By Laser Beams, Researchers Warn


If you are planning to buy smart home devices, you might want to reconsider your decision. Researchers at Japan’s University of Electro-Communications and the University of Michigan have revealed that they could successfully control smart home speakers using light commands delivered via a laser beam.

We know that a smart home speaker responds to verbal commands. However, the researchers claim that they were able to issue commands without uttering a word. They commanded various smart home speakers, including Google Home and Amazon’s Echo, from hundreds of feet away simply by shining a laser beam at them. All they required was a clear line of sight to the speaker, a suitable laser, and a few pieces of electronics worth around $400. Using light commands, the researchers successfully demonstrated a wide range of actions, including opening smart garage doors and placing online shopping orders.

How does it work?

In a paper, the researchers explain that they executed these commands simply by shining a light carrying an encoded command (like “Ok Google, play music”) onto the smart speaker’s microphone. The microphones built into smart speakers convert sound into electrical signals. But the researchers found that the MEMS microphones these devices currently use also react to light when it is aimed directly at them.

The sound of an otherwise audible command was encoded in the intensity of the light beam. When the light hit the diaphragm inside the microphone, it caused the diaphragm to vibrate much as it would have if a user had spoken that command aloud. Thus, by modulating the intensity of a light beam, attackers can trick microphones into producing electrical signals from entirely inaudible light commands.
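The modulation described above can be sketched in a few lines. The toy example below (an illustration of amplitude modulation in general, not the researchers’ actual tooling) maps an audio waveform onto a laser intensity level: since a laser cannot emit “negative” light, the audio rides on a DC bias, and the bias and modulation depth are chosen so the drive signal stays within physical limits. All names and parameter values here are assumptions for illustration.

```python
import numpy as np

def amplitude_modulate(audio, bias=0.5, depth=0.4):
    """Map an audio waveform (range -1..1) onto a normalized
    laser intensity (0..1) via amplitude modulation.

    A laser cannot emit negative light, so the audio rides on a
    DC bias: intensity = bias + depth * audio. Choosing
    bias - depth >= 0 keeps the drive signal physically valid.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    return bias + depth * audio

# A 1 kHz tone sampled at 48 kHz, standing in for a voice command
fs = 48_000
t = np.arange(fs // 100) / fs          # 10 ms of samples
tone = np.sin(2 * np.pi * 1000 * t)    # "audio" waveform in -1..1
drive = amplitude_modulate(tone)       # laser intensity in 0..1

assert 0.0 <= drive.min() and drive.max() <= 1.0
```

On the receiving side, the microphone’s light sensitivity effectively demodulates this signal: intensity variations move the diaphragm, which the device interprets as sound.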

Possible Threats

The researchers warn on their website: “By shining the laser through the window at microphones inside smart speakers, tablets, or phones, a faraway attacker can remotely send inaudible and potentially invisible commands which are then acted upon by Alexa, Portal, Google Assistant or Siri.”

They further note that because smart home speakers generally ship with no user-authentication features enabled by default, the potential for attack is heightened. Apple devices were among the few exceptions: the researchers had to work around those devices’ security settings before their light commands would execute.

Such an attack also carries the risk of the attacker breaking further into your network of integrated smart home devices. For instance, after gaining access to the voice assistant, an attacker could open smart locks by stealthily brute-forcing the user’s PIN.

The researchers have already tested light commands against several smart devices beyond smart home speakers. Their list of potentially vulnerable devices includes the Google Nest Cam IQ, Facebook’s Portal Mini, multiple Amazon Echo Dot and Echo Show devices, the iPhone XR, and the sixth-generation iPad.

Precautions to Take

The researchers note that this weakness cannot be completely fixed without redesigning the microphones built into smart home speakers. So, for now, they caution users to keep their smart home speakers somewhere they cannot be seen by anyone outside the home.