AMAZON has admitted that employees sometimes eavesdrop on people as they interact with its Alexa digital assistant – but there's a way to stop them.

The e-commerce giant employs thousands of people around the world to listen in on customers talking to Alexa via Echo speakers, a recent Bloomberg report revealed.

Those employees listen to voice recordings captured in homes and offices, and then transcribe and annotate those conversations.

Amazon calls the process "supervised machine learning," and insists it's an "industry-standard practice where humans review an extremely small sample of requests to help Alexa understand the correct interpretation of a request and provide the appropriate response in the future."

The company, founded by Jeff Bezos, also insists it has "strict technical and operational safeguards" in place, in addition to a "zero-tolerance policy for the abuse of our system."

"Employees do not have direct access to information that can identify the person or account as part of this workflow," Amazon continued.

"While all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it, customers can delete their voice recordings associated with their account at any time.”

Recordings will also only be stored or sent to the cloud when an Alexa device is activated by a "wake word" – such as "Alexa" – or a button, Amazon says.
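
To picture how that gating is meant to work, here is a minimal, hypothetical sketch in Python – not Amazon's actual code – in which audio sits in a short on-device buffer and is only uploaded once a wake word is spotted:

```python
from collections import deque

# Hypothetical values for illustration -- not Amazon's real parameters.
WAKE_WORDS = {"alexa"}
LOCAL_BUFFER_FRAMES = 50  # roughly a second of audio kept on the device

def heard_wake_word(frame: str) -> bool:
    """Stand-in for the on-device keyword spotter; a real device runs a
    small acoustic model over raw audio, not text."""
    return any(word in frame.lower() for word in WAKE_WORDS)

def send_to_cloud(frames: list) -> None:
    """Placeholder for the upload that only happens after activation."""
    print("uploading", len(frames), "frame(s)")

def microphone_loop(frames):
    buffer = deque(maxlen=LOCAL_BUFFER_FRAMES)  # old audio falls off the end
    streaming = False
    for frame in frames:
        buffer.append(frame)
        if not streaming and heard_wake_word(frame):
            streaming = True              # wake word heard: open a cloud session
            send_to_cloud(list(buffer))   # include the buffered lead-in audio
        elif streaming:
            send_to_cloud([frame])        # keep streaming until the request ends

# Nothing said before "alexa" is uploaded; it just ages out of the buffer.
microphone_loop(["what's for dinner", "alexa", "what's the weather"])
```

In this model, anything said before the wake word simply ages out of the local buffer and never reaches Amazon's servers.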

But despite the company's assurances, Alexa users fearful of being listened in on by an Amazon-employed stranger can take steps to stop the eavesdropping.

'ALEXA, DELETE WHAT I SAID'

While Amazon argues that listening in on customers' requests is essential to improving its digital assistant, Alexa lets you listen back to everything you've said and delete your queries.

The company recently introduced a feature that lets you simply say, "Alexa, delete everything I said today," which will wipe your history for the day.

HOW TO DELETE PAST RECORDINGS FROM THE CLOUD

Follow the six steps below to delete past Alexa recordings that have been stored in the Amazon cloud.

  1. Log into your Amazon account.
  2. Go to the Alexa privacy settings page.
  3. Select the "Privacy Settings" tab.
  4. Under "View, hear, and delete your voice recordings," select "Review voice recordings."
  5. Where it says "Today," hit the drop-down menu and select "All History."
  6. Select "Delete all of my recordings."

STEPS TO STOP ALEXA RECORDING YOUR VOICE

You can also instruct Amazon to stop saving the recordings of your interactions with Alexa.

  1. Log into your Amazon account.
  2. Go to the Alexa privacy settings page.
  3. Select the "Privacy Settings" tab.
  4. Under "Review and manage smart home devices history," select "Manage Your Alexa Data."
  5. Under "Choose how long to save recordings," select "Don't save recordings," then hit "Continue."

TELL AMAZON NOT TO SHARE RECORDINGS WITH STRANGERS

There are five simple steps you can take to ensure none of the thousands of Amazon employees stationed across the world can eavesdrop on your conversations.

  1. Log into your Amazon account.
  2. Go to the Alexa privacy settings page.
  3. Select the "Privacy Settings" tab.
  4. Under "Manage how you help improve Alexa," select "Manage how you help improve Alexa."
  5. Under "Help improve Alexa," deselect "Use of voice recordings."

LEGAL WOES

A class-action lawsuit was filed against Amazon in May, accusing the company of secretly recording and storing the information of 200 million users.

Lead plaintiff David Terpening filed the complaint in California federal court, seeking to represent Alexa users nationwide.

Terpening claims that he and other consumers unknowingly submit personal information to Amazon when they use their Alexa devices, including their usage habits, location, and even nearby conversations.

Terpening says he purchased an Alexa device in 2016 and was not aware that it was recording and permanently storing his and others' conversations and personal information.

He alleges that he and other users relied on Amazon's representations that Alexa would only listen and respond to their commands after a "wake word" triggered the device.

However, hundreds of millions of Alexa devices are allegedly permanently recording and storing user information, including conversations with others nearby.

The lawsuit contends that thousands of words bearing "little familiarity" to the "wake words" can trigger Alexa to record.
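
That claim is at least mechanically plausible: keyword spotters score how similar a sound is to the wake word rather than demanding an exact match, so a threshold tuned to never miss a real "Alexa" will occasionally fire on near-misses. Here is a toy illustration using text similarity in place of acoustics – purely hypothetical, not Alexa's actual detector:

```python
from difflib import SequenceMatcher

# Toy stand-in for an acoustic wake-word score. Real detectors compare
# sounds rather than spellings, but the failure mode is the same.
def wake_score(heard: str, wake_word: str = "alexa") -> float:
    return SequenceMatcher(None, heard.lower(), wake_word).ratio()

# Hypothetical lenient cutoff, chosen so a real "Alexa" is never missed.
THRESHOLD = 0.65

for word in ["alexa", "alexis", "a lexus", "election", "unacceptable"]:
    score = wake_score(word)
    verdict = "WAKES" if score >= THRESHOLD else "ignored"
    print(f"{word!r}: {score:.2f} -> {verdict}")
```

With a cutoff that lenient, "alexis" (0.73) and "a lexus" (0.67) both wake the device even though neither is the wake word, while clearly dissimilar words like "election" stay ignored.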

Numerous similar lawsuits have been filed against the company since Bloomberg's report was released in 2019.
