If you're on the internet, there's a good chance you're using a program or app that listens to audio recordings without your knowledge. Some of the biggest companies in the world have admitted in recent weeks that they've been hiring human contractors to monitor snippets of your queries and, in some cases, your conversations with other users.
While it initially seemed to be just Amazon Alexa, the issue is a lot more widespread across popular apps than previously thought. Here's a quick overview of who's listening in when you think it's just the two of you.
We can control so many devices through voice interactions with digital assistants like Alexa, Siri and Google Assistant. This can be incredibly convenient, but it can also be nerve-wracking, knowing that technology companies now possess recordings of your voice and interactions.
Back in April 2019, when we were still mostly naive about privacy invasions, Bloomberg reported Amazon had been using a team of humans to review questions you'd asked the Echo. The report explained the team, made up of a mix of contractors and full-time Amazon workers, listened to recordings and then "transcribed, annotated and then fed [them] back into the software."
While there are probably millions of queries fed into the Echo, the report found one employee listened to around 1,000 recordings during a single nine-hour shift. The workers admitted they'd heard mostly benign things such as a "woman singing badly off key in the shower, say, or a child screaming for help" but they'd also heard what sounded like sexual assaults. They were told by Amazon it wasn't their job to interfere.
"We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow," an Amazon spokesperson told Bloomberg.
"All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it."
After a Guardian story revealed Apple uses human contractors to review some of its Siri recordings, Apple made the decision to temporarily suspend the program. Google followed suit with its virtual assistant, and now Amazon has done the same for Alexa — except instead of ditching the human review it's just added the option to expressly opt out of it.
Apple was also revealed to be listening in to your queries via the OG virtual assistant, Siri. In a Guardian report, contractors, paid to listen to questions you ask Siri, heard "confidential medical information, drug deals and recordings of couples having sex."
Apple refers to this process as 'grading' and has admitted to The Verge that the recording reviews are designed to help "improve Siri and dictation", with only a "small portion" of recordings being selected.
Apple has since suspended the grading program while it undergoes a review.
Google, too, admitted it used humans to listen back to your questions. Belgian broadcaster VRT NWS first reported that 1,000 recordings from Google Assistant queries had been leaked, revealing the extent of the information included in them.
"We could clearly hear addresses and other sensitive information," VRT NWS explained in their report. "This made it easy for us to find the people involved and confront them with the audio recordings."
While Google had repeatedly denied listening in on people's questions, it has since admitted to the practice and defended it in a lengthy blog post.
"These language experts review and transcribe a small set of queries to help us better understand those languages," the blog read. "This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant."
Google has since paused its recordings review process.
Vice's Motherboard obtained leaked documents, screenshots and recordings showing Microsoft was no better when it came to listening to recordings.
"The fact that I can even share some of this with you shows how lax things are in terms of protecting user data," the leaker told Motherboard.
While it's less surprising that humans were reviewing recordings captured by Microsoft's virtual assistant Cortana, given Google, Amazon and Apple were doing the same, it's the revelations about Skype that seem most concerning.
Recordings from Skype's real-time voice translator, according to Motherboard's report, were also reviewed by humans. That means a conversation you're having with another person via the program could be listened to, rather than just a query to an AI-powered assistant.
"Microsoft collects voice data to provide and improve voice-enabled services like search, voice commands, dictation or translation services," Microsoft told Motherboard.
"We strive to be transparent about our collection and use of voice data to ensure customers can make informed choices about when and how their voice data is used. Microsoft gets customers' permission before collecting and using their voice data."
A number of companies are starting to have reservations about using real people to “improve” their digital assistants by reviewing what you’ve said to your smart speaker or phone. I’m willing to bet that Microsoft will also soon about-face on this practice, but right now, contractors might be listening to what you tell Skype Translator and Cortana.
By the time Facebook also admitted it had been listening, nobody was surprised. Bloomberg reported the social media monolith had a team of humans analysing audio transcriptions provided by the app.
The human workers were asked to check whether the AI transcription service had correctly interpreted the messages, but weren't provided with information as to where and how the audio clips and transcriptions had been obtained.
Facebook has also paused the recordings review process for the time being.
"Much like Apple and Google, we paused human review of audio more than a week ago," the company said.
In what has become a pattern over the past few months, Facebook is the latest of the tech giants to admit that outside contractors were given access to users' voice recordings.