Over the past few years we’ve seen voice assistant software baked into everything from smartphones and watches to smart speakers. They make it easy to control media playback, launch apps, check the weather, and even control smart home devices including locks, lights, and thermostats.

And it turns out they’re generally vulnerable to a relatively simple method of attack.

Researchers at China’s Zhejiang University have demonstrated a method that lets anyone control popular voice assistants by sending ultrasonic commands that are inaudible to humans, but which are interpreted as valid commands by the voice assistant software.

They call this method DolphinAttack, and demonstrate that it can be used to make your voice assistant perform the same tasks it would if it heard your voice. For example, you could ask Siri to dial a phone number or visit a website hosting malware. You could ask Alexa or Google Assistant to unlock the door to your house or dim the lights.

Of course, this attack only works if you’re close enough to transmit sound that can be picked up by the speaker. So it’s unlikely that anyone would be able to unlock your door if they weren’t already inside the house. But hijacking phones or smartwatches via their voice assistant software seems like a real possibility.

What makes DolphinAttack hard to stop is that it exploits the microphone hardware itself: voice commands are modulated onto an ultrasonic carrier, and nonlinearity in the microphone circuitry demodulates the signal, so sounds you can’t hear arrive at the voice assistant software looking like they’re within the normal range of a human voice. That could make it difficult for companies to block this kind of attack through a simple software update.
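To see roughly how that demodulation works, here’s a small sketch (not the researchers’ actual code, and with made-up parameter values) that amplitude-modulates a stand-in 400 Hz “voice” tone onto a 25 kHz ultrasonic carrier, models the microphone amplifier’s slight nonlinearity as y = x + a·x², and shows that the inaudible transmission comes out the other side as an audible-band signal:

```python
import math

FS = 96_000           # sample rate (Hz)
F_VOICE = 400.0       # stand-in "voice" tone (Hz), well within hearing range
F_CARRIER = 25_000.0  # ultrasonic carrier (Hz), above human hearing
DUR = 0.05            # signal length (seconds)

n = int(FS * DUR)

# Amplitude-modulate the voice tone onto the ultrasonic carrier.
# Everything actually transmitted sits around 25 kHz -- inaudible.
tx = [(1.0 + 0.8 * math.cos(2 * math.pi * F_VOICE * i / FS))
      * math.cos(2 * math.pi * F_CARRIER * i / FS) for i in range(n)]

# Model the microphone's slightly nonlinear amplifier as y = x + a*x^2.
# The x^2 term multiplies the carrier with itself, which shifts the
# envelope (the voice tone) back down to baseband.
a = 0.1
rx = [x + a * x * x for x in tx]

# Crude low-pass filter: a 1 ms moving average (96 samples) spans a whole
# number of carrier cycles, so the 25 kHz content cancels and only the
# recovered baseband survives -- roughly what reaches the speech recognizer.
win = 96
lp = [sum(rx[i:i + win]) / win for i in range(n - win)]
mean = sum(lp) / len(lp)
lp = [x - mean for x in lp]   # drop the DC offset the demodulation adds

# Measure how much 400 Hz energy survived, ignoring phase (the filter
# delays the signal) by correlating against both cos and sin.
c = sum(x * math.cos(2 * math.pi * F_VOICE * i / FS) for i, x in enumerate(lp))
s = sum(x * math.sin(2 * math.pi * F_VOICE * i / FS) for i, x in enumerate(lp))
norm = math.sqrt(sum(x * x for x in lp)) * math.sqrt(len(lp) / 2)
similarity = math.sqrt(c * c + s * s) / norm
print(round(similarity, 2))  # close to 1.0: the tone was recovered
```

The key point is that the recovery happens in the analog front end, before any software runs, which is why a firmware or app update can’t simply filter the attack out.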

The team says they’ve tested DolphinAttack with multiple popular speech recognition systems including Microsoft’s Cortana, Amazon’s Alexa, Apple’s Siri, Google Assistant, and Samsung S Voice.

via Fast Company

