Researchers at the University of Michigan, working with researchers from Japan, published a paper on Monday describing a new threat to voice assistants: hackers can manipulate an Amazon Echo, Siri, or Google Home device with a laser.
Cybersecurity is a subject we run into almost every single day. As much as we enjoy the advantages of today’s technology, we know that those advantages come with risk. Researchers have now found that smart speakers can be tricked with light commands.
According to the researchers, it is possible to make a smart speaker interpret a light signal as a voice command. This works by focusing a laser on the diaphragm inside the speaker’s microphone, the component that picks up sound.
A microphone’s diaphragm vibrates in response to sound pressure and its variations, converting acoustic energy into an electrical signal. The researchers found that when a laser, or even a flashlight, is pointed at the diaphragm, it reacts to the light as if it were sound. They then put this theory to the test in several scenarios.
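The core idea of the attack is that the laser’s brightness is varied over time in the shape of an audio waveform, so the microphone’s output matches a spoken command. The sketch below is a toy illustration of that encoding step, not code from the paper; the function name, bias, and depth parameters are assumptions made for the example.

```python
import math

def amplitude_modulate(audio, bias=0.5, depth=0.5):
    """Map an audio waveform in [-1, 1] to a laser intensity in [0, 1].

    A DC bias keeps the laser switched on, and the audio signal
    modulates the brightness around that bias. Illustrative only.
    """
    return [bias + depth * s for s in audio]

# A 440 Hz test tone sampled at 16 kHz stands in for a spoken command.
sample_rate = 16_000
tone = [math.sin(2 * math.pi * 440 * n / sample_rate) for n in range(160)]

intensity = amplitude_modulate(tone, bias=0.5, depth=0.4)

# With these parameters, the drive signal stays in the laser's
# valid range [0, 1] (here, between 0.1 and 0.9).
assert all(0.0 <= i <= 1.0 for i in intensity)
```

A microphone aimed at by such a modulated beam produces roughly the same electrical signal it would produce for the original audio, which is why the assistant accepts the command.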
The researchers tested the leading devices on the market, from Google Home to Amazon Alexa, Siri, and Facebook’s Portal Mini. In one experiment, they opened a garage door by sending a laser command to a Google Home.
They also tried the light-command attack under different conditions. According to Wired, the researchers were able to register commands on a device from 164 feet away, using a 60-milliwatt laser.
To probe the limits of the attack, they repeated it with a 5-milliwatt laser, which failed against every device except the first-generation Echo Plus and the Google Home.
These experiments expose a real vulnerability in smart speakers. Serious threats include accessing online shopping accounts and opening doors that are protected only by a voice command.
One of the researchers, Kevin Fu, an associate professor of electrical engineering and computer science at the University of Michigan, said, “This opens up an entirely new class of vulnerabilities.” Sara Rampazzi, another researcher, said, “You can hijack voice commands. Now the question is just how powerful your voice is and what you’ve linked it to.”
However, this stunt is not easy to pull off. The laser has to be aimed at the diaphragm, so the hacker needs a direct line of sight to the device. If the device is placed away from windows, out of strangers’ view, the attack cannot be carried out. The attacker also needs the expertise to modulate the laser light at exactly the right frequencies to exploit the device.
Security measures like a PIN can be applied to shopping accounts and other sensitive features that people feel need extra protection. When questioned about the research, Google and Amazon told Wired that they are reviewing the paper, Apple declined to comment, and Facebook did not respond.
The researchers suggested that vendors could shield microphones from light, or require a voice command to be picked up by two microphones instead of the single microphone used today.
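The two-microphone idea rests on a simple asymmetry: a real voice reaches both microphones, while a laser spot excites only the one it hits. The toy check below sketches that mitigation under assumptions of my own; the function name, energy comparison, and threshold are illustrative, not taken from the paper.

```python
def command_plausible(mic_a, mic_b, threshold=0.1):
    """Accept a command only if both microphones registered it.

    Compares per-microphone signal energy and rejects the command
    when one channel is nearly silent, as it would be if a laser
    had excited only the other microphone. Illustrative only.
    """
    energy_a = sum(s * s for s in mic_a)
    energy_b = sum(s * s for s in mic_b)
    if energy_a == 0 and energy_b == 0:
        return False  # no signal at all
    # Ratio of the weaker channel's energy to the stronger one's.
    ratio = min(energy_a, energy_b) / max(energy_a, energy_b)
    return ratio >= threshold

# A voice reaches both mics with comparable energy; a laser hits one.
voice = command_plausible([0.2, -0.3, 0.25], [0.18, -0.28, 0.22])  # True
laser = command_plausible([0.2, -0.3, 0.25], [0.0, 0.0, 0.0])      # False
```

A real implementation would compare the channels per time window rather than over a whole recording, but the principle is the same: light is highly directional, and sound is not.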
One discovery always leads to another, and the same goes for the pros and cons of technology. For now, a new possible threat has come to light, and a solution will be found sooner or later.