title: "Amazon Echo Security Loophole Exploited To Make Them Hack Themselves"
ShowToc: true
date: "2022-12-31"
author: "Rebecca Payne"


Researchers at Royal Holloway, University of London and the University of Catania in Sicily found that it was possible to get an Alexa speaker to perform any number of functions by playing commands through the speaker itself. Dubbed “Alexa vs Alexa,” the hack could be carried out with only a few seconds of proximity to a vulnerable Echo device.

The researchers were able to use voice commands to pair an Echo with a Bluetooth device, and provided that device stayed within range, attackers could use it to issue commands to the Echo. So long as the command included the wake-up word (Alexa or Echo), an exploited Echo could be made to buy products, control smart home devices and even unlock doors. The researchers even added a single “yes” that would automatically play after six seconds, just in case the Echo asked for verbal confirmation before continuing.

Another version of the attack used malicious skills or radio stations to “infect” the Echo and make it vulnerable to attackers’ voice commands. A third exploit enabled a skill that ran silently, with the attacker intercepting and replying to commands as if the victim were talking to Alexa. In these cases, attackers could use text-to-speech apps to stream voice commands to the Echo. The attack also relied on what the researchers call the “Full Volume Vulnerability,” which prevented the Echo from automatically lowering its volume once it heard the wake-up word, so self-issued commands stayed loud enough to be recognized.

But despite all the measures in place to prevent misuse, this is not the first time Echos and other voice assistants have been caught out like this. Past cases include workers being able to listen to user audio and approved apps eavesdropping on users in an attempt to phish for passwords.

So if you have one of the best Alexa devices in your home, you might want to follow the advice of the researchers who uncovered this particular set of problems and mute your device when it’s not in use. And don’t miss our own list of 5 ways to secure your Alexa device.
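To make the self-issue idea concrete: once an attacker’s own device is paired with the Echo over Bluetooth, any synthesized speech streamed from it is heard just like a voice in the room. The sketch below is purely illustrative and is not the researchers’ tooling; it assumes the open-source gTTS text-to-speech library and an already-paired Bluetooth link, and simply shows how a spoken command that begins with the wake-up word could be generated as an audio file.

```python
# Illustrative sketch only: synthesize a spoken Alexa command as an audio file.
# Assumes the third-party gTTS (Google Text-to-Speech) package is installed;
# streaming the resulting file over a paired Bluetooth connection is left out
# and is not taken from the researchers' code.
from gtts import gTTS

# The command must start with the wake-up word so the Echo reacts to it.
command = "Alexa, turn off the living room lights"

tts = gTTS(text=command, lang="en")
tts.save("self_issued_command.mp3")  # audio an attacker's paired device could play back
print("Wrote self_issued_command.mp3")
```

In practice the researchers’ point is that the audio source does not matter: whether it comes from a paired phone, a malicious skill or a radio stream, the Echo treats its own output as a legitimate command.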
