Alexa, Google Assistant and Siri: a laser is enough to deceive them
A group of researchers managed to issue commands to a voice assistant using a laser beam, opening the door to a technique that is as flashy as it is effective.
The Alexa, Google Assistant and Siri voice assistants may be vulnerable to attacks that use lasers to issue silent, often invisible commands and trigger unwanted actions, such as unlocking doors, visiting websites, and locating and starting vehicles.
By directing a low-power laser beam at devices equipped with voice assistants, the researchers who discovered the flaw were able to issue commands from as far as 110 meters away.
Voice control systems do not normally require the user to authenticate each command, which allows the attack to be completed without knowing any password or PIN. Given the distances involved and the nature of the technique, these attacks can succeed even from a different building, or from outside the victim’s building, as long as the voice-controlled device sits near a window.
This type of attack exploits a vulnerability in microphones that use MEMS (micro-electro-mechanical systems) components: these components can respond to light by mistaking it for sound, so a laser beam whose intensity is modulated with an audio waveform produces the same electrical signal in the microphone that a real voice would.
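To make the mechanism concrete, here is a minimal sketch of how a voice command could be encoded as an intensity-modulated drive signal for a laser. It illustrates the general amplitude-modulation idea only; the function name and parameter values are assumptions, not the researchers’ actual tooling.

```python
# Illustrative sketch of amplitude modulation: the laser's optical power is
# made to track the audio waveform, so the MEMS diaphragm "hears" the light.
# Names and parameter values are assumptions, not the researchers' code.
import numpy as np

def am_drive_signal(audio: np.ndarray, bias: float = 0.5, depth: float = 0.4) -> np.ndarray:
    """Map an audio waveform to a normalized, always-positive laser drive level."""
    audio = audio / max(np.max(np.abs(audio)), 1e-9)  # normalize to [-1, 1]
    drive = bias + depth * audio                      # DC bias keeps optical power positive
    return np.clip(drive, 0.0, 1.0)

# Example: a 1 kHz test tone sampled at 48 kHz standing in for a spoken command.
fs = 48_000
t = np.arange(fs) / fs
drive = am_drive_signal(np.sin(2 * np.pi * 1_000 * t))
```

The DC bias matters: a laser cannot emit negative power, so the audio rides on top of a constant intensity level and only the variation around it carries the command.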
The researchers carried out their tests on Siri, Alexa, Google Assistant, Facebook Portal and a limited number of smartphones and tablets, but there is good reason to believe the problem affects all devices that use MEMS microphones.
The attack, called Light Commands, has a number of limitations. First of all, as you may have guessed, the attacker needs a “clean shot” at the target device, which must be within line of sight. In many cases, the laser beam must also be aimed at a specific point on the microphone.
And unless a laser outside the visible spectrum is used, the light can easily be spotted by anyone near the target device.
Finally, it should not be forgotten that all these voice assistants respond with audible feedback (a sentence, or even just a sound) whenever they receive a command to execute.
Beyond these limitations, the technique identified by the researchers is still significant: in addition to representing a new type of threat, it is replicable in real-world situations. Notably, the researchers admit they have not fully understood the physical reason why the technique works; a better understanding of the phenomenon could lead to even more effective attacks.
“Voice control systems often have no user authentication or, when it is present, it is not implemented correctly. We have shown how an attacker could use commands delivered via a light beam to perform actions such as unlocking the victim’s front door, opening the garage door, shopping online, and locating and starting vehicles if they are connected to the victim’s Google account.”
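To illustrate the kind of per-command check the researchers find missing, here is a small hypothetical sketch in which the assistant demands a spoken PIN before executing a sensitive action. The intent names and helper functions are invented for illustration; no real assistant exposes this exact API.

```python
# Hypothetical sketch of per-command authentication for sensitive intents.
# The intent names and helpers below are illustrative, not a real assistant API.
SENSITIVE_INTENTS = {"unlock_front_door", "open_garage", "start_vehicle", "purchase"}

def handle_command(intent: str, execute, ask_for_pin, verify_pin) -> bool:
    """Run an intent, demanding a spoken PIN first when the intent is sensitive."""
    if intent in SENSITIVE_INTENTS:
        if not verify_pin(ask_for_pin()):   # e.g. "Say your security code"
            return False                    # refuse the injected command
    execute(intent)
    return True

# Usage with stand-in callbacks:
ok = handle_command(
    "unlock_front_door",
    execute=lambda i: print(f"executing {i}"),
    ask_for_pin=lambda: "1234",
    verify_pin=lambda pin: pin == "1234",
)
```

A spoken PIN only raises the bar, of course: an attacker who overhears or guesses it could inject it by laser as well, which is why the researchers argue authentication must be designed in from the start.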
The researchers described in their publication several ways of performing the attack, with various kinds of instrumentation. A first setup consists of a simple laser pointer, a laser driver and an audio amplifier, plus an optional telephoto lens to focus the beam for long-range attacks.
The laser driver is a piece of laboratory equipment and requires some familiarity to configure and operate. A second setup used an infrared laser, invisible to the naked eye. A third configuration instead used a phosphor-based laser, eliminating the need to aim the beam at a specific point on the MEMS microphone.
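Assuming the drive signal from the earlier sketch, the playback side of such a setup could be as simple as sending that waveform out of an ordinary sound card into the amplifier that feeds the laser driver’s modulation input. This is a hedged illustration of the signal chain the article describes, not the researchers’ actual code.

```python
# Sketch of the playback side of the signal chain described above:
# sound card -> audio amplifier -> laser driver modulation input.
import numpy as np
import sounddevice as sd  # third-party package: pip install sounddevice

fs = 48_000
t = np.arange(2 * fs) / fs                            # two seconds of signal
command_audio = np.sin(2 * np.pi * 1_000 * t)         # stand-in for a recorded command
drive = np.clip(0.5 + 0.4 * command_audio, 0.0, 1.0)  # same AM mapping as before

sd.play(drive.astype(np.float32), fs)                 # amplifier scales this for the driver
sd.wait()                                             # block until playback finishes
```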
In one of the various attempts, the researchers managed to complete an attack from about 70 meters away and through a glass window, while in another test the telephoto lens was used to focus the laser and attack a device placed about 110 meters away: that was the maximum distance allowed by the test environment available to the researchers, which leaves open the possibility of conducting such attacks at even greater ranges.