Researchers in the U.S. and China have discovered ways to send hidden commands to digital assistants—including Apple’s Siri, Amazon’s Alexa, and Google’s Assistant—that could have massive security implications.
In laboratory settings, researchers have been able to activate and issue orders to the systems via means that are undetectable to the human ear, according to a new report in The New York Times. That could, conceivably, allow hackers to use the systems to unlock smart locks, access users’ bank accounts, or access all sorts of personal information.
A new paper out of the University of California, Berkeley said inaudible commands could be embedded into music or spoken text. While at present this is strictly an academic exercise, researchers at the university say it's foolish to assume hackers won't eventually discover the same methods.
“My assumption is that the malicious people already employ people to do what I do,” said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors.
Last year, researchers at Princeton University and China's Zhejiang University also found that voice-activated devices could be issued orders at ultrasonic frequencies inaudible to humans. The Chinese researchers called the technique DolphinAttack.
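The core idea behind an ultrasonic attack of this kind can be illustrated in a few lines of code. The sketch below is not the researchers' actual implementation, and every parameter in it (sample rate, carrier frequency, the 1 kHz tone standing in for a voice command) is an assumption chosen for illustration: an audio signal is amplitude-modulated onto a carrier above the ~20 kHz limit of human hearing, and a microphone's slight nonlinearity, approximated here by squaring the signal, shifts the command back down into the audible band where the assistant's speech recognizer can pick it up.

```python
import numpy as np

fs = 192_000            # sample rate high enough to represent ultrasound
t = np.arange(fs) / fs  # one second of samples

f_voice = 1_000         # stand-in for a voice command (1 kHz tone)
f_carrier = 30_000      # ultrasonic carrier, above human hearing

voice = np.sin(2 * np.pi * f_voice * t)
# Standard amplitude modulation: the audible "command" rides on the
# inaudible carrier, so a person in the room hears nothing.
transmitted = (1 + 0.5 * voice) * np.sin(2 * np.pi * f_carrier * t)

# A microphone's nonlinearity behaves roughly like a squaring term,
# which demodulates the signal back to baseband.
demodulated = transmitted ** 2

# Find the strongest audible-band frequency in the demodulated signal.
spectrum = np.abs(np.fft.rfft(demodulated))
freqs = np.fft.rfftfreq(len(demodulated), 1 / fs)
audible = (freqs > 20) & (freqs < 20_000)
peak_freq = freqs[audible][np.argmax(spectrum[audible])]
print(round(peak_freq))  # the original 1 kHz "command" reappears: 1000
```

The takeaway is that the attack needs no flaw in the assistant's software at all; it exploits the physics of the microphone itself, which is why the hardening steps the companies describe focus on filtering and verification rather than a simple patch.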
Amazon told The New York Times it has taken steps to ensure its speaker is secure. Google said its platform has features that mitigate such commands. And Apple noted an iPhone or iPad must be unlocked before Siri will open an app.
Still, there are several examples of companies taking advantage of weaknesses in the devices, from Burger King's Google Home commercial to South Park's stunt with Alexa.
And the number of devices in consumers' homes is on the rise. Digital assistants have been among the hottest gifts of the past two holiday seasons, and Amazon alone is expected to sell $10 billion worth of the devices by 2020.