Updated: Amazon Admits Alexa Recorded And Shared Conversation

Image: Amazon

A family in Oregon had a private conversation recorded and sent to a third party by their Amazon Alexa. Amazon has confirmed the incident, calling it extremely rare and saying it is under investigation.

The incident came to light when the victim, who gave her name only as Danielle, called in to radio station KIRO7. She said she had received a call from a friend in Seattle telling her that a recording of a conversation she'd had with her husband had been sent to one of her husband's colleagues from her contact list. They were able to play the recording back to her, confirming the breach was real.

She told the radio station: "I felt invaded. A total privacy invasion. Immediately I said, 'I'm never plugging that device in again, because I can't trust it.'"

Danielle, who has Alexa devices in almost every room of her home, contacted Amazon immediately. An engineer was able to confirm the incident through logs and thanked her for bringing it to their attention. The devices are now unplugged.

Amazon has not yet made a public statement about the incident, but it did respond to questions from the radio station with a boilerplate-style reply: "Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future."

This has been one of the great concerns about such devices: not necessarily that they record personal conversations for the specific purpose of spying, but that they malfunction and inadvertently open our lives to outside parties. Danielle was fortunate her data went to someone who alerted her to the incident.

Given this is the first report of an incident of this type, it's unclear whether we are dealing with a widespread bug or an isolated and, as Amazon puts it, extremely rare occurrence. I've not authorised Alexa to access my contacts, but I'm still concerned that it may be triggering without me saying "Alexa..." and capturing audio and, at the very least, sending it to Amazon without permission.

Is this the sort of incident that stops you from buying a home assistant device? Or, if you already have one, will you be turning it off?

Since we originally published this story, Amazon has issued the following statement:

Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right”. As unlikely as this string of events is, we are evaluating options to make this case even less likely.
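To make the chain of events Amazon describes concrete, here is a minimal sketch (not Amazon's actual code; all names and the substring-matching "recognition" are invented for illustration) of that four-step confirmation flow. It shows how each step only needs a loose match against background speech, so an unlikely sequence can still complete and a message gets sent:

```python
# Hypothetical sketch of the flow from Amazon's statement:
# wake word -> "send message" intent -> contact match -> "right?" confirmation.

def run_flow(utterances):
    """Walk the four-step flow; return the matched contact if a 'send'
    completes, or None if any step fails to match."""
    contacts = ["Alice", "Bob"]  # stand-in contact list
    steps = iter(utterances)

    def hears(expected):
        # Crude stand-in for speech recognition: substring match.
        return expected.lower() in next(steps, "").lower()

    if not hears("alexa"):            # 1. wake word falsely triggered
        return None
    if not hears("send message"):     # 2. conversation heard as intent
        return None
    heard = next(steps, "")           # 3. "To whom?" -> name guessed
    match = next((c for c in contacts if c.lower() in heard.lower()), None)
    if match is None:
        return None
    if hears("right"):                # 4. "<name>, right?" -> "right"
        return match                  # message sent to this contact
    return None

# Background chatter that happens to satisfy every step:
print(run_flow(["...alexa...", "...send message...", "...bob...", "...right..."]))
```

Each individual mishearing is plausible on its own; the sketch just makes it visible that four loose matches in a row are all it takes.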


Comments

    These devices 'malfunction'...
    I highly doubt that this is simply a case of an inadvertent leak of private data. These devices are just data collection and advertising devices masquerading as personal gadgets.

    Zero sympathy.
    When I tell people this sort of thing is bound to happen to them sooner or later, they look at me as if I'm mad (well, I assume it's because of that ;-) ). CA/Facebook has certainly helped. But this is likely to help more.

    This is just the beginning of a lot of future pain. Development of these concepts is a terrific idea. They should, however, not be developed further until guidelines are put in place to guard against issues such as the one noted in the article.

    I refuse to have any listening devices in my house until the country's laws catch up. So that's a no to TVs that follow verbal commands, and the same goes for Alexa and all those other systems. Although, in saying that, I do use Siri on my iPhone. I guess it comes down to whom you feel you can trust.
