Voice assistants such as Amazon's Alexa, Apple's Siri, and Google Assistant can be hacked by shining a laser at a device's microphones, according to an international team of researchers.
Dubbed Light Commands, the attack lets attackers inject invisible, inaudible commands into these voice assistants from a distance, according to a statement from researchers at the University of Electro-Communications in Tokyo and the University of Michigan.
By targeting the microphones with lasers, the researchers found they could make the microphones respond to light as they would to sound. Exploiting this effect, they were able to inject sound into a microphone simply by modulating the amplitude of the laser light.
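As a rough illustration of the principle (not the researchers' actual tooling), the sketch below maps an audio waveform onto a laser's intensity and then models the microphone recovering that waveform from the light. All names and constants here are hypothetical, chosen only to make the amplitude-modulation idea concrete:

```python
import math

SAMPLE_RATE = 44_100        # audio samples per second
LASER_BIAS = 0.5            # constant (DC) laser intensity, on a 0..1 scale
MODULATION_DEPTH = 0.4      # how strongly the audio sways the intensity

def audio_tone(freq_hz, duration_s):
    """Generate a test tone standing in for a spoken command."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

def modulate(audio):
    """Amplitude modulation: map audio samples (-1..1) onto
    laser intensity (0..1) around a constant bias level."""
    return [LASER_BIAS + MODULATION_DEPTH * s for s in audio]

def microphone_response(intensity):
    """Toy model of the effect: the mic's output tracks the light
    intensity, so removing the bias recovers the original audio."""
    return [(i - LASER_BIAS) / MODULATION_DEPTH for i in intensity]

tone = audio_tone(440, 0.01)                       # 10 ms "command"
recovered = microphone_response(modulate(tone))    # what the mic "hears"
```

In this simplified model, the recovered signal matches the injected tone exactly; in practice the attack's fidelity depends on the laser, optics, and the microphone's physical response to light.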
In their study, the authors used lasers to gain full control of voice assistants from distances of up to 361 feet.
They also wrote that user authentication on these devices is often lacking, allowing attackers to use light-injected voice commands to unlock a target's smartlock-protected front door, open garage doors, shop on e-commerce websites, or even locate, unlock, and start certain vehicles.
The researchers shared their findings with Google, Tesla, Ford, and Apple, along with the FDA and ICS-CERT, who agreed that the findings should be made public; a disclosure date of November 4 was mutually agreed upon.
Through the study's findings, the authorities aim to reduce the risk to America's critical infrastructure.