It turns out you don’t need to be on the same Wi-Fi network, or even within shouting distance, to hack a voice assistant.
Researchers from Japan and the University of Michigan have discovered that voice assistants and smart speakers such as Google Home, Amazon Alexa, Facebook Portal and Apple Siri can be hacked by using light.
The researchers were able to successfully demonstrate how to use light to inject malicious commands into several voice-controlled devices over long distances as well as through glass.
This vulnerability can lead to smart doors being unlocked, unwanted purchases being made and pretty much anything else that a smart speaker or voice assistant can do.
The attack, dubbed “light commands,” involves a perpetrator shining a modulated light source, such as a laser pointer, at a device’s microphone. The light’s intensity encodes inaudible commands that Alexa, Portal, Google Assistant or Siri will then act upon.
Wondering how a microphone can be triggered by light? The researchers explain that microphones convert sound into electrical signals. Inside each microphone is a small plate called the diaphragm; sound waves make the diaphragm vibrate, and that movement produces the electrical signals.
A laser shined at the microphone moves the diaphragm in the same way, producing electrical signals that represent an attacker’s commands.
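To make the mechanism concrete, here is a minimal sketch of the core idea: a voice command is just an audio waveform, and an attacker can map that waveform onto a laser’s brightness (simple amplitude modulation), so the diaphragm "hears" the command. The function name and parameters below are illustrative assumptions, not the researchers’ actual rig, which involves a laser driver and optics.

```python
import numpy as np

def audio_to_laser_drive(audio, bias=0.5, depth=0.4):
    """Map a normalized audio waveform (-1..1) to a laser intensity
    drive signal (0..1) via amplitude modulation.

    bias  -- the laser's resting brightness (illustrative value)
    depth -- how strongly the audio modulates that brightness
    """
    audio = np.clip(audio, -1.0, 1.0)
    # The laser brightness rides up and down with the audio signal,
    # so the microphone diaphragm moves as if it heard the command.
    return np.clip(bias + depth * audio, 0.0, 1.0)

# Example: a 1 kHz test tone, one second at 48 kHz
sample_rate = 48_000
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 1000 * t)
drive = audio_to_laser_drive(tone)
```

In practice the researchers drove a real laser with hardware, but the signal-processing step is this simple, which is part of why the attack is so cheap to mount.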
Though there have been no reports of this attack being exploited in the wild, it is fairly cheap to carry out: the researchers suggest a working setup can be built for under $400.
You can view the full research paper here, or watch an overview and demos below:
This article was originally published on our sister publication Security Sales & Integration‘s website.