You Should Mute Your Smart Speaker’s Mic More Often

Does your voice assistant often seem a little too eager to chime in? A recent study by Ruhr University Bochum and the Max Planck Institute for Security and Privacy found over 1,000 words and phrases that Alexa, Siri and Google Assistant frequently misidentified as activation commands (also known as “wake words”). Here are a few examples, via Ars Technica’s reporting on the study:

Alexa: “unacceptable,” “election” and “a letter”

Google Home: “OK, cool,” and “OK, who is reading”

Siri: “a city” and “hey jerry”

Microsoft Cortana: “Montana”

According to the study, these accidental activations are common and easy to trigger, which is a major privacy concern.

Alexa, what’s the problem?

Voice assistants are always “listening” for an activation command. While they’re not necessarily recording, they’re clearly on alert. Once the AI recognises a command, whether through a smart speaker or your phone’s mic, it records any subsequent audio it “hears” and sends it to a remote server, where algorithms work out what’s being asked. Sometimes that audio is saved and reviewed later by people working to refine a voice assistant’s speech recognition, and that’s where the privacy concerns come in: even if the captured audio never triggers anything server-side, it may still be recorded, stored and listened to by engineers checking whether a command was missed or misinterpreted.
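To make that flow concrete, here’s a minimal Python sketch of an always-listening pipeline. It’s an illustration of the general architecture described above, not any vendor’s actual code: the function names (`next_audio_frame`, `send_to_cloud`), the wake-word list and the deliberately fuzzy matching rule are all invented for this example.

```python
import random

# Words the on-device matcher treats as wake words (illustrative only).
WAKE_WORDS = {"alexa", "echo", "computer"}

def next_audio_frame() -> str:
    """Stand-in for a microphone capture; returns one 'heard' phrase."""
    return random.choice(["unacceptable", "a letter", "alexa", "turn it off", "hello"])

def is_wake_word(frame: str) -> bool:
    """On-device wake-word spotting. Matchers err on the side of triggering,
    which is how a near-miss like 'a letter' can wake 'Alexa'."""
    return frame in WAKE_WORDS or frame.startswith("a le")  # crude fuzziness

def send_to_cloud(clip: list[str]) -> None:
    """Stand-in for streaming captured audio to a remote server,
    where it may also be stored and reviewed by humans."""
    print("uploaded:", clip)

buffer: list[str] = []
recording = False
for _ in range(20):  # pretend this loop runs forever
    frame = next_audio_frame()
    if not recording:
        recording = is_wake_word(frame)  # false positives start here
    else:
        buffer.append(frame)
        if len(buffer) >= 3:  # toy end-of-utterance heuristic
            send_to_cloud(buffer)  # audio leaves your home network
            buffer.clear()
            recording = False
```

The point of the sketch: the decision to start recording happens on the device with a forgiving matcher, so near-misses can open the mic, and once the audio is uploaded, what happens to it is out of your hands. A hardware mute button breaks this loop before the first step.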

This isn’t speculation; we know this is how these “machine learning” systems actually work: humans manually help the machines learn. They’re not autonomous beings. The practice has already led to privacy breaches, public backlash and legal trouble. Google is regularly under fire over how user data feeds its advertising business, and Amazon has repeatedly leaked or mishandled its users’ video and audio recordings. Apple has the “best” data privacy policies overall, but even its contractors have been caught transcribing overheard audio.

The point is: if Alexa, Siri and Google Assistant are being activated accidentally, more of your personal interactions will be recorded and potentially accessed by outsiders, and who knows what happens to that data. Each of these companies lets users manage and delete audio after it’s recorded, but you should also take precautions to make sure your smart devices only listen when you want them to.

Tips for preventing mistaken voice assistant activations

  • Change the activation word/phrase where possible. Alexa lets you change the wake word to “Echo,” “Amazon” or “computer,” which I imagine would feel pretty Star Trek to say out loud. Google lets you choose between “OK, Google” and “Hey, Google,” but Siri only responds to “Hey Siri.”
  • Reduce your Google Home device’s activation sensitivity.
  • Mute a device’s microphone. Most have a physical “mute” button somewhere on the device.
  • Unplug your smart speakers and other smart home devices when not in use.
  • Delete recordings and update your Amazon, Apple and/or Google account security options so your audio is never saved or listened to.
  • Just don’t use smart speakers or voice assistants at all. (Sorry! But it’s the only surefire method.)

[Ars Technica]

