For anyone with privacy concerns, let me feed your paranoia, especially if you have a digital assistant. New research shows that many everyday words and phrases in normal conversation can trigger your assistant, which means part of what you say may be sent to your device’s manufacturer.
If you mention you’ve received “a letter,” don’t be surprised if Alexa responds. Or Siri might answer if you yell, “Hey, Jerry!” And it doesn’t have to be you speaking. The same thing can happen when the TV is on. Researchers have found more than 1,000 word sequences that can activate these devices.
“The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans,” one researcher said. “Therefore, they are more likely to start up once too often rather than not at all.”
And when they start up, they automatically record a portion of what they’ve heard and transmit it to the manufacturer. The stated purpose is to improve the device’s word recognition, but the end result is that part of your conversation can end up in company records.
Although the final paper has yet to be published, the general findings support the suspicion that voice assistants can intrude on privacy, even when people don’t think their devices are listening.
For more examples and their implications, see “Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant” by Dan Goodin (https://arstechnica.com/information-technology/2020/07/uncovered-1000-phrases-that-incorrectly-trigger-alexa-siri-and-google-assistant/?).