Tens of millions of people around the world use Amazon’s ‘Echo’ line of home assistants; in fact, Amazon has sold over 100 million devices. According to recent headlines, if you own one of these speakers, your privacy could be invaded by Amazon’s employees. Is this really true?
A recent Bloomberg report revealed that Amazon employs hundreds, possibly thousands, of people to listen to audio recordings from Echo devices around the world. Known internally as the ‘Alexa Voice Review Process’, this work helps train Alexa: reviewers manually transcribe unclear statements sent from the devices and check whether the artificial intelligence is successfully detecting commands.
A recent Amazon job posting says: “Every day she [Alexa] listens to thousands of people talking to her about different topics and different languages, and she needs our help to make sense of it all. This is big data handling like you’ve never seen it. We’re creating, labelling, curating and analyzing vast quantities of speech on a daily basis.”
According to company insiders, the work is mundane. Occasionally, however, employees become privy to things that Amazon customers might not want shared. Amazon says it even has procedures in place for employees who accidentally hear things they might find distressing, such as illegal activity; its policy in these cases is not to interfere.
Amazon says that Alexa will only ever record speech after the wake word is said unless the device is triggered accidentally. Of all the times the wake word is said, only a very small minority are reviewed, so the chance of your speech, in particular, being reviewed is relatively low.
Amazon’s Response
An Amazon spokesman said in an email: “We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”
“We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.”
So no, you probably don’t need to worry about Amazon employees listening in on your conversations; the chance of any given request to Alexa being heard by an employee is very low. The workers do, however, have some information about the audio clip they are reviewing: the user’s account ID number, the user’s first name, and the device serial number. If an employee notes that an audio recording contains sensitive information, such as bank details, the file is permanently deleted.
The audio files are strictly confidential, and workers are not to share what they’ve heard. The exception is distressing recordings, which they are allowed to discuss in internal chatrooms as a way of relieving built-up stress.
The voice review process takes place in locations across the globe, including Boston, Costa Rica, India and Romania. Shifts last up to nine hours, with some workers reviewing 1,000 audio clips per shift.
How to Prevent Prying Eyes
If you are that worried about prying eyes, you can opt out of voice recordings. This is done by opening the Alexa app, selecting ‘Alexa Account’, then ‘Alexa Privacy’. From here you can select ‘Manage how your data improves Alexa’, and turn off ‘Help Develop New Features’. This should stop Amazon from reviewing audio from your devices, but it’s unclear whether ‘Help Develop New Features’ refers to the Voice Review Process. In fact, Amazon said in a statement that people who opt out of the new features program may still have their audio recordings analysed as part of the Voice Review Process.
What many users don’t realise about these devices is that some degree of human intervention is required to train them. Without someone to tell the artificial intelligence when it is right or wrong, it wouldn’t be able to learn or improve. This intervention is especially necessary when Alexa makes mistakes or hears unfamiliar words, like new slang, and it is why Amazon has recruited employees to check the audio.
Apple’s Siri also uses a voice review process, but Apple says that the audio files are not connected to any identifiable information, and they are deleted permanently after 6 months. With Google’s voice review process, the audio isn’t tied to any identifiable information and is also distorted.
The problem is that Amazon has not disclosed, in any marketing or privacy policy materials, that humans may be listening to recordings of Alexa conversations. An FAQ does say: “We use your requests to Alexa to train our speech recognition and natural language understanding systems”, but this doesn’t make clear that people are listening to the recordings, which is slightly misleading. If you ask Alexa, “Is someone listening to us?”, she responds: “I only listen when you say the wake word.” It sounds a bit like she’s avoiding the question.
This isn’t the first time concerns have been raised over the privacy of home assistants and Internet-of-Things devices. With the exponentially growing scale of the IoT, I sincerely doubt this will be the last.