It turns out you don’t need to be on a WiFi network or even within shouting distance to hack a voice assistant.
Researchers in Japan and at the University of Michigan have discovered that voice assistants and smart speakers such as Google Home, Amazon Alexa, Facebook Portal and Apple's Siri can be hacked using light.
The researchers were able to successfully demonstrate how to use light to inject malicious commands into several voice-controlled devices over long distances as well as through glass.
This vulnerability could allow an attacker to unlock smart door locks, make unwanted purchases and trigger pretty much anything else a smart speaker or voice assistant can do.
The attack, dubbed “light commands,” involves a perpetrator shining a light, such as a laser pointer, at a device’s microphone. The light carries inaudible commands that are then acted upon by Alexa, Portal, Google Assistant or Siri.
Wondering how a microphone can be triggered by light? The researchers explain that microphones convert sound into electrical signals. Inside each microphone is a small plate called the diaphragm; sound waves cause the diaphragm to move, and that movement produces the electrical signal.
A laser shined at a microphone moves the diaphragm in the same way, producing electrical signals that encode a potential attacker’s commands.
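The mechanism above can be sketched as a simple signal-processing model: modulate a laser’s intensity with an audio command, and the diaphragm’s response reproduces that command as an electrical signal. This is a minimal illustrative sketch under an assumed linear response model; the variable names, modulation scheme and constants are assumptions for illustration, not the researchers’ actual setup.

```python
import numpy as np

fs = 16_000                          # sample rate in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)       # 10 ms of signal
audio = np.sin(2 * np.pi * 440 * t)  # stand-in for a spoken command

# Amplitude-modulate the laser intensity with the audio command.
# A DC bias plus a modulation depth m < 1 keeps intensity non-negative.
I_dc, m = 1.0, 0.5
intensity = I_dc * (1 + m * audio)

# Assumed linear response: the diaphragm's displacement, and hence the
# microphone's electrical output, tracks the intensity fluctuations.
mic_signal = intensity - intensity.mean()

# The signal the assistant "hears" is a scaled copy of the injected command.
corr = np.corrcoef(mic_signal, audio)[0, 1]
print(round(corr, 3))  # close to 1.0: the command survives the light channel
```

The key point the sketch captures is that the microphone cannot distinguish pressure-driven diaphragm motion from light-driven motion, so the modulated beam is processed as ordinary speech.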
Though there have been no reports of this attack being exploited in the wild, it is fairly cheap to carry out: the researchers say a working setup can be built for under $400.
You can view the full research paper here, or watch an overview and demos below:
This article was originally published on our sister publication Security Sales & Integration‘s website.