
Researchers Hacked Siri, Alexa, and Google Home By Shining Lasers At Them

Posted on 19 November 2019 by Thegeek

One of the most interesting developments in mobile technology is the personal voice assistant. Google Home, Amazon’s Alexa, and Apple’s Siri all simplify everyday tasks on a simple voice command. Amid all the useful things they can do, it is hard to imagine that these personal voice assistants are vulnerable to hackers.

A recent tech report has revealed that attackers are targeting these digital voice assistants with laser beams. By injecting inaudible, invisible commands into a device, they can secretly trigger malicious actions such as opening doors, starting vehicles, or shopping online at the victim’s expense.


By shining low-powered laser beams at a voice-activated system, attackers can break in from a distance of approximately 360 feet. This is possible because no user authentication mechanism is built into these systems, which makes them an easy target. Even where an authentication step such as a PIN exists, cracking it is not a big deal for attackers, because the devices often place no ceiling on the number of guesses, leaving the PIN open to brute-forcing.
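To see why an unthrottled PIN offers so little protection, consider the minimal sketch below. The `try_pin` callback is hypothetical, standing in for whatever check a device performs; with no limit on attempts, the entire 4-digit space of 10,000 codes can be walked in moments.

```python
from itertools import product

def brute_force_pin(try_pin, digits=4):
    """Enumerate every numeric PIN until try_pin() accepts one.

    try_pin is a hypothetical callback standing in for a device's
    PIN check; with no attempt limit, nothing stops us from walking
    the entire 10**digits space.
    """
    for attempt, combo in enumerate(product("0123456789", repeat=digits), start=1):
        guess = "".join(combo)
        if try_pin(guess):
            return guess, attempt
    return None, 10 ** digits

# Toy demonstration against a fixed secret PIN.
secret = "7342"
pin, attempts = brute_force_pin(lambda guess: guess == secret)
print(f"Recovered PIN {pin} after {attempts} attempts")
```

Any rate limit or lockout after a handful of failed guesses would defeat this loop; the research quoted below notes that such limits were often missing or incorrectly implemented.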

Beyond the authentication weaknesses of digital voice assistants, the light-based hack can even be carried out from one building to another, and it exploits a vulnerability in the microphones themselves. The microphones in these devices are built around micro-electro-mechanical systems (MEMS).

These MEMS microphones react to light much as they do to sound. Although the research covered Google Home, Siri, and Alexa, it can be safely concluded that phones, tablets, and any other devices that rely on MEMS microphones may be prone to such light-command attacks.
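Conceptually, the attack amplitude-modulates the laser’s intensity with the target audio waveform, so the light arriving at the MEMS diaphragm varies the way sound pressure would. The sketch below illustrates that modulation; the sample rate, bias, and modulation-depth values are illustrative assumptions, not figures from the research, and a pure tone stands in for recorded speech.

```python
import numpy as np

# Parameters are illustrative assumptions, not taken from the paper.
SAMPLE_RATE = 48_000   # samples per second
DURATION = 0.01        # 10 ms of signal
I_BIAS = 200.0         # laser DC operating current (mA), assumed
I_PEAK = 150.0         # modulation depth (mA), assumed

t = np.arange(0, DURATION, 1 / SAMPLE_RATE)

# Stand-in for a recorded voice command: a 1 kHz tone in [-1, 1].
audio = np.sin(2 * np.pi * 1_000 * t)

# Amplitude modulation: the laser's intensity tracks the audio
# waveform, and the MEMS microphone responds to the varying light
# much as it would to varying sound pressure.
laser_current = I_BIAS + I_PEAK * audio

# Keep the drive current above the laser's lasing threshold so the
# beam never switches off entirely; clip as a simple safeguard.
laser_current = np.clip(laser_current, 10.0, None)

print(f"Drive current range: {laser_current.min():.1f} to {laser_current.max():.1f} mA")
```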

Read – Hackers Using WAV Audio Files To Inject Malware and Cryptominers

For all its effectiveness, the light-based technique also has limitations. The attacker must have a direct line of sight to the target, and in most cases the attack succeeds only when the light falls on a specific part of the microphone.

Infrared laser light makes the attack even harder to notice, since the beam is invisible to anyone near the device. The research shows that devices respond to light-injected commands just as they do to spoken ones, and that the attack works in semi-realistic environments. Experts expect these attacks to grow more severe in the future.

“We find that VC systems are often lacking user authentication mechanisms, or if the mechanisms are present, they are incorrectly implemented (e.g., allowing for PIN brute-forcing).

We show how an attacker can use light-injected voice commands to unlock the target’s smart-lock protected front door, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) if the vehicles are connected to the target’s Google account,” as written in the research report entitled “Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems.”
