Voice assistants could be fooled by commands you can’t even hear

May 11, 2018 Abhimanyu Ghoshal


Many people already consider voice assistants too invasive to let into their homes, where they can listen in on conversations. But that's not the only thing to worry about: researchers from the University of California, Berkeley, warn that these assistants may also be vulnerable to attacks you'll never hear coming. In a new paper (PDF), Nicholas Carlini and David Wagner describe a method for imperceptibly modifying an audio file so that it delivers a secret command; the embedded instruction is inaudible to the human ear, so there's no easy way of telling when Alexa…
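At a high level, attacks of this kind work by optimizing a tiny perturbation to a waveform so that a speech-to-text model transcribes an attacker-chosen phrase, while the perturbation stays small enough to go unnoticed by a listener. The sketch below is a simplified illustration of that idea under CTC loss, not the paper's implementation: Carlini and Wagner attack Mozilla's DeepSpeech, whereas `ToySpeechModel`, the target token ids, the perturbation budget `eps`, and the step count here are all hypothetical placeholders.

```python
# Minimal sketch: optimize a small additive perturbation `delta` so that
# model(x + delta) transcribes an attacker-chosen phrase, while clamping
# |delta| below a rough inaudibility budget. Everything here is a toy
# stand-in for the real DeepSpeech attack described in the paper.

import torch
import torch.nn as nn

class ToySpeechModel(nn.Module):
    """Hypothetical stand-in for a real speech-to-text network."""
    def __init__(self, n_classes=29):  # letters + space + apostrophe + CTC blank
        super().__init__()
        self.conv = nn.Conv1d(1, 32, kernel_size=160, stride=80)  # crude framing
        self.rnn = nn.GRU(32, 64, batch_first=True)
        self.out = nn.Linear(64, n_classes)

    def forward(self, wav):                          # wav: (batch, samples)
        h = torch.relu(self.conv(wav.unsqueeze(1)))  # (batch, 32, frames)
        h, _ = self.rnn(h.transpose(1, 2))           # (batch, frames, 64)
        return self.out(h).log_softmax(-1)           # per-frame CTC log-probs

model = ToySpeechModel().eval()
ctc = nn.CTCLoss(blank=0)

x = torch.randn(1, 16000)                    # one second of "benign" audio (placeholder)
target = torch.tensor([[8, 5, 12, 12, 15]])  # attacker-chosen transcription ids
delta = torch.zeros_like(x, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-3)
eps = 0.01                                   # perturbation budget (inaudibility proxy)

for step in range(500):
    log_probs = model(x + delta)             # (batch, frames, classes)
    frame_len = torch.tensor([log_probs.size(1)])
    target_len = torch.tensor([target.size(1)])
    # CTCLoss expects (frames, batch, classes), so swap the first two dims.
    loss = ctc(log_probs.transpose(0, 1), target, frame_len, target_len)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)              # keep the change below the budget
```

The real attack measures distortion in decibels relative to the original audio rather than with a fixed clamp, which is part of why the result is effectively inaudible to people while still steering the transcription.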

This story continues at The Next Web
