Over the past few years we’ve seen voice assistant software baked into everything from smartphones and watches to smart speakers. They make it easy to control media playback, launch apps, check the weather, and even control smart home devices including locks, lights, and thermostats.
And it turns out they’re generally vulnerable to a relatively simple method of attack.
Researchers at China’s Zhejiang University have demonstrated a method that lets anyone control popular voice assistants by sending ultrasonic commands that are inaudible to humans, but which are interpreted as valid commands by the voice assistant software.
They call this method DolphinAttack, and demonstrate that it can be used to make your voice assistant perform the same tasks it would if it heard your voice. For example, an attacker could ask Siri to dial a phone number or visit a website laced with malware, or ask Alexa or Google Assistant to unlock the door to your house or dim the lights.
Of course, this attack only works if you’re close enough to transmit sound that can be picked up by the speaker. So it’s unlikely that anyone would be able to unlock your door if they weren’t already inside the house. But hijacking phones or smartwatches via their voice assistant software seems like a real possibility.
What makes DolphinAttack tricky to defend against is that it’s designed in a way that causes microphone hardware itself to demodulate the signal, so sounds that you can’t hear appear to be within the normal range of a human voice by the time they reach your gadget’s software. That could make it difficult for companies to block this kind of attack through a simple software update.
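To get a feel for why the hardware itself does the demodulating, here’s a rough sketch in Python. The carrier frequency, tone frequency, and quadratic nonlinearity term are illustrative assumptions, not values from the paper: a “voice” tone is amplitude-modulated onto an ultrasonic carrier, and a small squared term (a crude stand-in for a microphone’s nonlinear response) is enough to bring the tone back down into the audible band.

```python
import numpy as np

fs = 192_000   # sample rate, high enough to represent the carrier
fc = 30_000    # hypothetical ultrasonic carrier, above human hearing
fv = 1_000     # stand-in "voice" tone representing a spoken command

t = np.arange(0, 0.05, 1 / fs)
voice = np.sin(2 * np.pi * fv * t)            # baseband command signal
carrier = np.sin(2 * np.pi * fc * t)
transmitted = (1 + voice) * carrier           # amplitude-modulated, inaudible

# Model the microphone's nonlinearity with a small quadratic term:
# squaring (1 + voice) * carrier recreates a component at fv.
received = transmitted + 0.1 * transmitted ** 2

spectrum = np.abs(np.fft.rfft(received))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

low_band = (freqs > 100) & (freqs < 5_000)    # audible voice range
peak_hz = freqs[low_band][np.argmax(spectrum[low_band])]
print(round(peak_hz))                          # the hidden tone reappears near 1000 Hz
```

Because the tone re-emerges only after the nonlinear stage inside the microphone, filtering it out in software after the fact is hard; mitigations would likely need to happen in the analog front end.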
The team says they’ve tested DolphinAttack against multiple popular speech recognition systems, including Microsoft’s Cortana, Amazon’s Alexa, Apple’s Siri, Google Assistant, and Samsung S Voice.
via Fast Company