Amazon and Google’s smart speakers both allow you to supplement them with extensions of sorts, the same way you install third-party add-ons to make your web browsing experience even better. Here’s the kicker: As with browser add-ons, you’re entirely at the mercy of a developer. And should they use their powers for evil, you could be giving up everything you’re saying to your device to some random person.
At least, that’s the scenario presented by Germany’s Security Research Labs (SRLabs), which built a number of dummy Skills (Amazon) and Actions (Google) that passed both companies’ checks and were actually listed for download to your Echo or Google Home devices. The catch? As Ars Technica describes:
“The malicious apps had different names and slightly different ways of working, but they all followed similar flows. A user would say a phrase such as: ‘Hey Alexa, ask My Lucky Horoscope to give me the horoscope for Taurus’ or ‘OK Google, ask My Lucky Horoscope to give me the horoscope for Taurus.’ The eavesdropping apps responded with the requested information while the phishing apps gave a fake error message. Then the apps gave the impression they were no longer running when they, in fact, silently waited for the next phase of the attack.”
The security researchers developed two kinds of apps, one for eavesdropping and one for phishing, that worked similarly. In the former, the app would do whatever you asked, but it wouldn’t stop recording your voice; in the latter, the app would pretend to accomplish a task, wait a bit, then play a fake message saying your device had been updated and that you needed to provide your password for the update to complete. Any password you then spoke aloud was shuffled off to the developer’s servers.
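To make the trick concrete, here’s a simplified sketch of the mechanism such a skill can abuse. This is not SRLabs’ actual code, and the helper name is mine; but the `shouldEndSession` field is a real part of the Alexa Skills Kit JSON response format, and setting it to false is what keeps a skill’s session (and the microphone) open after it has supposedly finished:

```python
def build_alexa_response(speech_ssml, end_session):
    """Build a minimal Alexa Skills Kit response payload.

    The structure mirrors the ASK response format: what the device
    speaks aloud, plus whether the skill session should end.
    """
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "SSML", "ssml": speech_ssml},
            "shouldEndSession": end_session,
        },
    }

# A well-behaved skill answers and closes the session:
benign = build_alexa_response(
    "<speak>Taurus: a fine day ahead.</speak>", True
)

# The malicious pattern: speak a fake error, but quietly leave the
# session open. The user hears what sounds like the skill exiting,
# while the device keeps listening for the "next phase" of the attack.
malicious = build_alexa_response(
    "<speak>This skill is not available in your country.</speak>", False
)
```

Both platforms’ certification reviews are supposed to catch behaviour like this; SRLabs’ point was that the checks apparently didn’t re-examine skills after updates.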
Both Amazon and Google have since pulled the offending skills/actions — after being notified of their existence by SRLabs — and are working on extra “mechanisms” and “mitigations” to ensure these kinds of exploits don’t make their way into other skills and actions. Here are snippets of the statements they provided to Ars Technica:
“Customer trust is important to us, and we conduct security reviews as part of the skill certification process. We quickly blocked the skill in question and put mitigations in place to prevent and detect this type of skill behaviour and reject or take them down when identified.”
“All Actions on Google are required to follow our developer policies, and we prohibit and remove any Action that violates these policies. We have review processes to detect the type of behaviour described in this report, and we removed the Actions that we found from these researchers. We are putting additional mechanisms in place to prevent these issues from occurring in the future.”
Use caution when downloading skills
Here’s the thing. People will always try to find new ways to steal your data. Amazon and Google are smart, but not infallible. Going forward, you should treat smart speaker skills as if they were as critical as browser extensions, if not more so. That means not installing skills or actions that sound neat but come from a third-party source or independent developer you’ve never heard of. And if you absolutely cannot live without a special add-on for your device, at least do your due diligence: Has anyone else used that add-on? Do the reviews seem authentic and not spammy? Is the add-on absolutely necessary for your day-to-day activities, or just some fun quirky thing that you’ll use a few times and forget about?
And once you’ve installed a smart speaker add-on, check whether your device stays on, and recording, after the add-on finishes whatever you asked it to do. If it does, stop it and uninstall the add-on; a skill that lingers after its job is done is a red flag.
Similarly, be wary when your device asks you to do something out of the blue, especially shortly after you’ve used a particular skill. I’m no Sherlock, but if your smart speaker suddenly wants you to verify your password right after you use a brand-new add-on you just downloaded, especially if it’s never asked you to do that before, that’s an awfully strange coincidence. Maybe... don’t do that. (And delete the add-on.)
Or don’t download at all
I realise it sounds a bit paranoid to say this, but I’d just go ahead and not use any of these extra skills, actions, add-ons, or whatever you want to call them with your smart speaker. These always-on devices — or, at least, devices that have the power to record what you say — open up brand-new methods for exploiting your privacy, and I remain convinced that no add-on, not even some hilarious joke skill or amazing horoscope action, is worth the risk. Your smart speakers are smart enough. Unless you fully trust what you’re adding to them, you don’t need the extra hassle (or anxiety).