Lasers can silently issue ‘voice commands’ to your smart speakers

The team has published a paper detailing the light flaw after seven months of experimentation. They were able to hijack smart speakers from … to … feet away by focusing lasers using a telephoto lens. In fact, the Google Home they tricked into opening a garage door was inside a room in another building. The laser modulation they beamed at its microphone port through the window is equivalent to the voice command "OK Google, open the garage door."

They explained that there's a small plate called a diaphragm inside devices' microphones that moves when hit by sound. A modulated laser can replicate that movement, which the microphone converts into electric signals the device can understand. Opening the garage door by taking over the Google Home was easy, they said, and they could just as easily have made online purchases, opened doors protected by smart locks and even remotely unlocked cars connected to voice-AI-powered devices using the same method.

The researchers have already notified Tesla, Ford, Amazon, Apple and Google about the issue, an important step toward getting the problem fixed, since simply covering microphones with tape wouldn't solve it. Most microphones, they said, would have to be redesigned. The team was able to hijack Google Home/Nest, Echo Plus/Show/Dot, Facebook Portal Mini, Fire TV Cube, Ecobee4, iPhone XR, iPad 6th-gen, Samsung Galaxy S9 and Google Pixel 2 devices using the technique. It was much easier to hijack smart speakers from afar, though: the method only worked on the mobile devices from a maximum distance of … to … feet.

This is far from the first digital assistant vulnerability security researchers have discovered. Researchers from China's Zhejiang University found that Siri, Alexa and other voice assistants can be manipulated with commands sent at ultrasonic frequencies. Meanwhile, a group from the University of California, Berkeley found that they could take over smart speakers by embedding commands that aren't audible to the human ear directly into recordings of music or spoken text.
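The mechanism described above — encoding a voice command in a laser's intensity so the microphone's diaphragm responds as if sound had hit it — is essentially amplitude modulation. The following is a minimal sketch of that idea, not the researchers' actual tooling; the 440 Hz tone standing in for a recorded voice command, and the bias and depth values, are illustrative assumptions.

```python
import math

SAMPLE_RATE = 44_100  # audio samples per second (illustrative)

def voice_sample(t):
    """Stand-in for one sample of the recorded voice command,
    here a 440 Hz tone scaled to [-1, 1]."""
    return math.sin(2 * math.pi * 440 * t)

def laser_intensity(t, bias=0.5, depth=0.4):
    """Amplitude modulation: a DC bias keeps the light intensity
    non-negative, and the audio waveform rides on top of it as
    the modulation envelope."""
    return bias + depth * voice_sample(t)

# What the diaphragm effectively "hears" is the intensity envelope
# with the constant bias removed — i.e. the original audio waveform:
recovered = [laser_intensity(n / SAMPLE_RATE) - 0.5 for n in range(5)]
```

Because the microphone only tracks the intensity envelope, anything that can drive that envelope — sound pressure or, as the researchers showed, modulated light — produces the same electrical signal downstream.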

