Microsoft Is Eavesdropping On Your Skype And Cortana Activity

A number of companies are starting to have reservations about using real people to “improve” their digital assistants by reviewing what you’ve said to your smart speaker or phone. I’m willing to bet Microsoft will soon do an about-face on this practice too, but right now, contractors might be listening to what you tell Skype Translator and Cortana.

According to Vice’s Motherboard, an unnamed Microsoft contractor was able to provide recordings — which tend to vary in length from 5–10 seconds, but aren’t limited to that — of people using Skype’s translation feature. To help Microsoft improve the feature’s capabilities, these contractors listen to what users have said and select from a list of possible translations or, in some cases, provide their own.

When asked about this setup, Microsoft representatives told Motherboard that the company makes these recordings available through a secure online portal, and that it takes steps — not described — to remove any associated information that could be used to identify a user after the fact. However, that doesn’t stop people from revealing information about themselves (like their address) when talking to a digital assistant like Cortana, and it doesn’t appear as if there’s any setup in place to prevent Microsoft’s contractors from analysing that kind of spoken data.

According to a statement Microsoft provided to Motherboard:

“Microsoft collects voice data to provide and improve voice-enabled services like search, voice commands, dictation or translation services. We strive to be transparent about our collection and use of voice data to ensure customers can make informed choices about when and how their voice data is used. Microsoft gets customers’ permission before collecting and using their voice data.”

“We also put in place several procedures designed to prioritise users’ privacy before sharing this data with our vendors, including de-identifying data, requiring non-disclosure agreements with vendors and their employees, and requiring that vendors meet the high privacy standards set out in European law. We continue to review the way we handle voice data to ensure we make options as clear as possible to customers and provide strong privacy protections.”

Can you stop Skype from sending what you say to Microsoft?

In a word, no. At least, at the time this article was published, I didn’t see any indication in Microsoft’s privacy FAQ for Skype Translator that you can stop the company from collecting voice data. The practice is spelled out somewhat clearly:

“When you use Skype’s translation features, Skype collects and uses your conversation to help improve Microsoft products and services. To help the translation and speech recognition technology learn and grow, sentences and automatic transcripts are analysed and any corrections are entered into our system, to build more performant services. To help protect your privacy, the conversations that are used for product improvement are indexed with alphanumeric identifiers that do not identify participants to the conversation.”

I say somewhat, as Microsoft doesn’t indicate in its FAQ that your speech is being analysed by real people. In fact, this description almost implies that it’s a fully mechanical process, which it is not — nor could it be, since a machine wouldn’t be able to pick the correct translation. The entire point is that a human being has to train the system to get better.

I also didn’t see any settings within the iOS Skype app that would let you opt out of this “improvement” process, though it’s possible Microsoft will change that going forward. It would be great to have an opt-out switch or, even better, an opt-in switch for allowing human analysis of your voice data.

What about Cortana?

As Vice’s report notes, Cortana commands are also fair game for contractors to listen to. However, you can opt out of this practice. To do so:

  1. Pull up the Settings app in Windows 10

  2. Click on Privacy

  3. Click on Speech on the left-hand sidebar

  4. Disable the “Online speech recognition” feature

The problem? Disabling this feature also hamstrings Cortana. You can still use the digital assistant to look up information, but you won’t be able to talk to it and have it respond to voice commands. (If you’d rather make the change with a script than click through Settings, there’s a rough sketch below.)
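For what it’s worth, the Settings toggle above appears to map to a per-user registry value. The key path and value name used here (Speech_OneCore\Settings\OnlineSpeechPrivacy and HasAccepted) are my assumption based on common privacy-hardening guides, not something Microsoft documents for Skype or Cortana, so treat this as a sketch rather than a supported method. A short Python script using the standard winreg module could flip it:

  # Rough sketch: turn off Windows 10 "Online speech recognition" for the
  # current user by writing the registry value the Settings toggle appears
  # to control. The key path and value name are assumptions, not an
  # official Microsoft API.
  import winreg

  # Assumed location of the per-user online speech recognition consent flag.
  SPEECH_PRIVACY_KEY = r"Software\Microsoft\Speech_OneCore\Settings\OnlineSpeechPrivacy"
  VALUE_NAME = "HasAccepted"  # 1 = feature on, 0 = off (assumed)

  def set_online_speech_recognition(enabled: bool) -> None:
      """Write the consent flag for the current user (HKCU)."""
      with winreg.CreateKeyEx(
          winreg.HKEY_CURRENT_USER,
          SPEECH_PRIVACY_KEY,
          0,
          winreg.KEY_SET_VALUE,
      ) as key:
          winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 1 if enabled else 0)

  if __name__ == "__main__":
      set_online_speech_recognition(False)
      print("Flag set to 0; check Settings > Privacy > Speech to confirm.")

If you try this, sign out and back in (or reboot) and confirm the toggle actually reads as off in Settings, since Microsoft could change or ignore this value in a future update.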

Your better bet might be to remind yourself to regularly review the Cortana voice data Microsoft is storing. To do that, visit your Microsoft Account page and click on the Privacy tab at the top. Scroll down to “Voice Activity” and click the “View and Clear Voice Activity” button. Look for the “Clear activity” link in the upper-right corner of your data list, and click that. Delete all the things.

I couldn’t get my data to clear, of course, but I hope you have better luck.

Also note that this still might not prevent a Microsoft contractor from listening to what you’ve told Cortana; it all depends on whether you delete the data before it’s used to “improve” Microsoft’s services. We have no idea how much time you have to delete your recordings before Microsoft puts them to use, or whether deleting them removes the one and only copy. It’s certainly possible that Microsoft simply makes a copy of what you’ve said, “anonymises” it, and uses that instead.

Ultimately, avoiding services that process your voice on a company’s servers is the only way to be sure nobody else can hear what you’ve said, but giving them up is the price of convenience in today’s digital world.

If you want a digital assistant or an app to figure out what you’re saying and act on that information, you’re going to have to give up a little privacy to benefit from it. At least, that’s the setup until more companies recognise that it’s important to give customers a choice about whether they want their speech potentially processed by another person.
