Researchers at Ruhr University Bochum and the Max Planck Institute for Security and Privacy in Germany have identified more than 1,000 word sequences that incorrectly trigger voice assistants like Alexa, Google Home, and Siri.
The researchers found that dialogue from TV shows and other sources produces false triggers that activate the devices, raising privacy concerns.
Depending on pronunciation, the researchers found that Alexa will wake to the words "unacceptable" and "election," while Siri will respond to "a city," and Google Home to "OK, cool."
They note that when the devices wake, a portion of the conversation is recorded and transmitted to the manufacturer, where employees may transcribe and review the audio to help improve word recognition. As a result, each company's logs may contain fragments of potentially private conversations.
From Ars Technica
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA