University researchers have discovered a way to hack the microphones in voice assistants and other digital devices using laser attacks. Beyond IoT devices, medical devices, autonomous vehicles, industrial systems and even space systems are vulnerable.
How can Lasers Attack Voice Assistants?
Last year, university researchers discovered that they could attack the microphones in voice assistants with a laser beam. Since then, researchers from the Universities of Michigan and Florida in the US, together with the University of Electro-Communications in Japan, have continued to investigate the phenomenon. They are, however, still unable to fully explain the physics behind it.
Laser attacks, which use what researchers call “light commands”, exploit smart assistants’ micro-electro-mechanical systems (MEMS) microphones. These microphones work by converting sound, i.e. people’s voice commands, into electrical signals that are then translated into actions. However, the researchers found that the microphones react to laser light shone directly onto them in the same way as to sound.
“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” explained the researchers.
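The modulation the researchers describe is essentially amplitude modulation: the audio waveform of a voice command is superimposed on the laser's intensity. The sketch below illustrates the idea; the function and parameter names are hypothetical and do not come from the researchers' tooling.

```python
import numpy as np

def modulate_light_intensity(audio, bias=0.5, depth=0.4):
    """Map an audio signal onto a laser intensity signal (illustrative only).

    bias  -- assumed DC operating point of the laser driver (fraction of max power)
    depth -- assumed modulation depth; bias + depth should stay <= 1.0
    """
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio)) or 1.0
    normalized = audio / peak                # keep the signal in [-1, 1]
    intensity = bias + depth * normalized    # AM: intensity tracks the audio
    return np.clip(intensity, 0.0, 1.0)     # physical power cannot be negative

# Example: a 1 kHz tone sampled at 16 kHz standing in for a voice command
t = np.arange(0, 0.01, 1 / 16_000)
command = np.sin(2 * np.pi * 1_000 * t)
intensity = modulate_light_intensity(command)
```

Because the microphone's MEMS diaphragm responds to the modulated light as if it were sound pressure, the assistant recovers the original command from this intensity signal.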
Doesn’t Even Have to Be Nearby
Furthermore, the laser does not need to be near the device. The researchers found that they could send inaudible commands to the microphones from up to 110 meters away. The attack also worked when they shone the laser through a window from the same distance.
Nor does the laser need to be particularly powerful. The researchers used a simple laser pointer, available on Amazon for less than $20, to launch their test attacks, together with a cheap laser driver to power the laser and an audio amplifier.
With laser attacks, malicious actors could attack voice assistants without any physical access to the device and without any owner interaction. An attacker standing outside a house could cheaply and easily target a voice assistant visible through a window, then command it to unlock doors, make online purchases or remotely start a vehicle.
Vulnerable Systems and Devices
The researchers first tested sending inaudible commands to the microphones in various popular voice assistants, including Amazon Alexa, Google Assistant, Apple Siri and Facebook Portal. They then went on to test other devices that use voice commands, such as Google Nest Cam IQ security cameras, Amazon Echo smart speakers and the iPhone XR, Samsung Galaxy and Google Pixel 2 smartphones. All of these devices were found to be vulnerable.
Essentially, any digital device that uses MEMS microphones and does not require additional user authentication could be vulnerable to laser attacks. Hackers could shine a laser at Alexa-enabled devices and control them, sending commands to open smart-lock-protected home doors or to steal account information.
After the initial findings, the researchers broadened their work from MEMS microphones in digital devices to sensing systems more generally. They found that the sensing systems in medical devices, autonomous vehicles, industrial systems and even space systems are also susceptible to such attacks.
Steps to Protect Against Laser Attacks
The researchers have suggested steps that could be taken to protect against laser attacks. One is to implement further authentication on IoT and other digital devices, such as two-factor authentication. Another is to have the device ask a question, which the owner must answer, before it executes a command.
“An additional layer of authentication can be effective at somewhat mitigating the attack,” said the researchers. “Alternatively, in case the attacker cannot eavesdrop on the device’s response, having the device ask the user a simple randomized question before command execution can be an effective way at preventing the attacker from obtaining successful command execution.”
Another option would be to modify devices so that they must receive a command on multiple microphones before executing it. A further measure is to put a cover over the microphone so that laser light cannot be shone directly onto it.
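The multiple-microphone defence rests on a simple observation: a laser spot drives only the one microphone it hits, while genuine speech reaches every microphone at a similar level. A minimal sketch of such a consistency check, with hypothetical function names and thresholds, might look like this:

```python
def command_is_acoustic(mic_levels, max_ratio=4.0):
    """Accept a command only if all microphones heard it at comparable levels.

    mic_levels -- signal amplitude observed at each microphone (illustrative)
    max_ratio  -- assumed loudest/quietest ratio above which light injection
                  is suspected
    """
    loudest, quietest = max(mic_levels), min(mic_levels)
    if quietest <= 0:            # a silent mic during a "command" is suspicious
        return False
    return loudest / quietest <= max_ratio

# Real speech: every microphone picks up roughly the same energy
print(command_is_acoustic([0.8, 0.7, 0.9, 0.75]))   # True
# Laser attack: only the illuminated microphone registers a strong signal
print(command_is_acoustic([0.9, 0.01, 0.0, 0.02]))  # False
```

A real implementation would compare time-aligned waveforms rather than raw levels, but the underlying design choice is the same: require agreement across sensors that a single laser beam cannot illuminate simultaneously.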
Laser Attack Demonstration at Black Hat
Some of the researchers involved, namely Sara Rampazzi, Assistant Professor at the University of Florida, together with Benjamin Cyr, PhD student, and Daniel Genkin, Assistant Professor, both from the University of Michigan, will be demonstrating the laser attack at Black Hat Europe 2020.
Black Hat is an annual conference that provides attendees with the latest technical and research information from the information security industry. This year’s conference runs from the 7th to the 10th of December and will be entirely virtual.
At the conference, the researchers will show how: “
- Light Commands works by exploiting a physical vulnerability of MEMS microphones,
- It’s possible to remotely inject and execute unauthorized commands on Alexa, Portal, Google, and Siri voice assistants
- The ecosystem of devices connected to these voice assistants, such as smart-locks, home switches, and even cars, fail under common security vulnerabilities (e.g. PIN bruteforcing) that make the attack more dangerous”