- March 8, 2022
- Posted by: administrator
- Category: AWS
Researchers have discovered how to remotely manipulate the Amazon Echo through its own speakers.
Researchers from the University of London and the University of Catania have discovered how to weaponize Amazon Echo devices to hack themselves.
The attack – dubbed “Alexa vs. Alexa” – leverages what the researchers called “a command self-issue vulnerability”: pre-recorded messages which, when played over a 3rd- or 4th-generation Echo speaker, cause the speaker to perform actions on itself.
How to Make Alexa Hack Itself
Smart speakers lie dormant during the day, waiting for a user to vocalize a particular activation phrase: e.g., “Hey, Google,” “Hey, Cortana” or, for the Amazon Echo, “Alexa” or simply “Echo.” Usually, of course, it’s the device’s owner who issues such commands.
However, the researchers found that “self-activation of the Echo device [also] happens when an audio file reproduced by the device itself contains a voice command.” And even if the device asks for a secondary confirmation before performing a particular action, “the adversary only has to always append a ‘yes’ approximately six seconds after the request to be sure that the command will be successful.”
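The self-issue mechanism above can be sketched as a simple payload timeline: the recording carries the wake word and command, then a blanket “yes” roughly six seconds later to defeat any confirmation prompt. The utterance texts, function name and offsets below are illustrative assumptions, not the researchers’ actual payloads:

```python
# Sketch of an "Alexa vs. Alexa" payload schedule. The ~6-second
# confirmation delay comes from the paper; everything else here
# (names, sample command) is an illustrative assumption.

CONFIRMATION_DELAY_S = 6.0  # approximate gap before the appended "yes"

def build_payload_schedule(command: str) -> list[tuple[float, str]]:
    """Return (offset_seconds, utterance) pairs for the malicious recording."""
    return [
        (0.0, f"Alexa, {command}"),            # wake word + self-issued command
        (CONFIRMATION_DELAY_S, "yes"),         # blanket confirmation ~6s later
    ]

schedule = build_payload_schedule("turn off the lights")
for offset, utterance in schedule:
    print(f"t+{offset:.0f}s: {utterance}")
```

Because the “yes” is appended unconditionally, the same recording works whether or not the device happens to ask for confirmation for that particular command.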
To get the device to play a maliciously crafted recording, an attacker would need a smartphone or laptop in Bluetooth-pairing range. Unlike internet-based attacks, this scenario requires proximity to the target device. This physical impediment is balanced by the fact that, as the researchers noted, “once paired, the Bluetooth device can connect and disconnect from Echo without any need to perform the pairing process again. Therefore, the actual attack may happen several days after the pairing.”
Alternatively, the report stated, attackers could use an internet radio station, beaming to the target Echo like a command-and-control server. This method “works remotely and can be used to control multiple devices at once,” but would require extra steps, including tricking the targeted user into downloading a malicious Alexa “skill” (app) to an Amazon device.
Using the Alexa vs. Alexa attack, attackers could tamper with applications downloaded to the device, make phone calls, place orders on Amazon, eavesdrop on users, control other connected appliances in a user’s home and more.
“This action can undermine physical safety of the user,” the report stated, “for example, when turning off the lights during the evening or at nighttime, turning on a smart microwave oven, setting the heating at a very high temperature or even unlocking the smart lock for the front door.”
In testing their attack, the authors were able to remotely turn off the lights in one of their own homes 93 percent of the time.
Smart Speakers Are Uniquely Vulnerable
Because they’re always listening for their wake word, and because they’re so often interconnected with other devices, smart speakers are prone to unique security vulnerabilities. The Echo series of devices, in particular, has been linked with a series of privacy risks, from microphones “hearing” what people text on nearby smartphones to audio recordings being stored indefinitely on company servers.
The physical proximity required for Bluetooth, or having to trick users into downloading malicious skills, limits but does not eliminate the potential for harm in the scenario the Alexa vs. Alexa report described, according to John Bambenek, principal threat hunter at Netenrich. Those living in dense cities are potentially at risk, and individuals “at most risk are those in domestic violence scenarios,” he wrote via email. For that reason, “simply accepting the risk isn’t acceptable.”
The research prompted Amazon to patch the command self-issue vulnerability, which is the benefit of having a robust threat-hunting culture.
“Most people aren’t evil,” wrote Bambenek. “It is hard to test new technology against criminal intent because even testers lack the criminal mindset (and that’s a good thing for society). As technology gets adopted, we find things we overlook and make it better.”
NOTE: This article is copyrighted by threatpost.com and is used for educational and informational purposes only.