Amazon Employees Listen to Your Alexa Conversations for Product Improvement Purposes
Amazon's Alexa digital assistant has been caught doing several wonky things, from sending recordings of private conversations to random people to mistakenly emailing audio recordings kept by the company to the wrong person. But if a new Bloomberg investigation is to be believed, a real-life stranger may also have been listening to your Alexa conversations.
According to the report, "Amazon.com Inc. employs thousands of people around the world to help improve the Alexa digital assistant powering its line of Echo speakers. The team listens to voice recordings captured in Echo owners’ homes and offices. The recordings are transcribed, annotated and then fed back into the software as part of an effort to eliminate gaps in Alexa's understanding of human speech and help it better respond to commands."
The report further states that the review team, a mix of contractors and full-time Amazon employees working in locations from Boston to Costa Rica, India and Romania, parses as many as 1,000 audio clips per shift, at times picking up things Echo device owners would rather keep private, or even downright upsetting material: a woman singing badly off key in the shower, a child screaming for help, and a sexual assault.
With voice-assisted AI establishing an ever more pervasive foothold in our daily lives in the form of Alexa, Google Assistant and Siri, the lack of transparency in their marketing and privacy policies rings alarm bells, especially at a time when the AI technology deployed by companies like Amazon, Facebook, Google, Microsoft and Apple remains a black box whose inner workings are seldom disclosed.
AI algorithms aren't just known for their thirst for big data, but also for their limitations across a variety of tasks, particularly in understanding human language. That both "New York City" and "NYC" refer to the same place is easy for us humans to comprehend, but unless the algorithm is fed this piece of meta-information, it cannot return the same result irrespective of whether the user asks, say, "What's the weather like in NYC?" or "Will it snow today in New York City?" The task only gets more complex when the algorithm needs to account for different accents, different dialects, and different languages.
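To make the "NYC" versus "New York City" point concrete, here is a minimal sketch of alias normalization, the kind of mapping a voice assistant might apply before a lookup so that both phrasings resolve to the same place. The alias table and function names are purely illustrative, not Alexa's actual implementation:

```python
# Illustrative alias table: spoken variants mapped to one canonical name.
ALIASES = {
    "nyc": "New York City",
    "new york city": "New York City",
    "la": "Los Angeles",
}

def normalize_location(text: str) -> str:
    """Map a spoken location string to its canonical name, if known."""
    key = text.strip().lower()
    return ALIASES.get(key, text.strip())

# Both queries now target the same location record.
print(normalize_location("NYC"))            # New York City
print(normalize_location("New York City"))  # New York City
```

In practice such tables are learned or curated at enormous scale, which is precisely where the human review work described in the report fits in.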
This is where data annotation comes in. Annotations provide additional information about the text, thereby improving the effectiveness of machine learning algorithms, be it for natural language processing, machine translation, or image, object and speech recognition.
By letting human reviewers listen to audio snippets and label the associated information (e.g. annotate "Taylor Swift" as the musical artist as opposed to something meaningless and random, as quoted in the Bloomberg story), this supervised learning approach ensures the system, in this case, Alexa, "can better understand your requests, and ensure the service works well for everyone."
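As a rough illustration of what such a labeled example might look like, here is a hypothetical annotated utterance in the style used for supervised training. The field names, intent and slot labels are assumptions for the sake of the example, not Amazon's internal schema:

```python
# A hypothetical annotated utterance: a human reviewer has tagged
# "taylor swift" as the artist, so the trained model can learn to
# treat it as a musical entity rather than random words.
annotation = {
    "transcript": "play taylor swift",
    "intent": "PlayMusic",
    "slots": [
        {"text": "taylor swift", "label": "ARTIST"},
    ],
}

def extract_slot(record: dict, label: str):
    """Return the first slot value tagged with the given label, or None."""
    for slot in record["slots"]:
        if slot["label"] == label:
            return slot["text"]
    return None

print(extract_slot(annotation, "ARTIST"))  # taylor swift
```

Thousands of such human-labeled examples, aggregated across accents and phrasings, are what allow the model to generalize beyond the exact utterances it has seen.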
While there is nothing inherently nefarious about recording snippets of conversations to improve a product, the practice raises larger questions about user consent, how long such recordings are stored, and whether there have been any cases of employee misuse.
Nor does Amazon Alexa's privacy policy explicitly state that humans may be listening to recordings, beyond mentioning in its FAQs that "Alexa uses your voice recordings and other information, including from third-party services, to answer your questions, fulfill your requests, and improve your experience and our services", and that "we use your requests to Alexa to train our speech recognition and natural language understanding systems."
By design, Alexa and other voice-activated digital assistants are programmed to record only after detecting the chosen wake word, which is intended to prevent such devices from eavesdropping on all conversations. In addition, Amazon provides in its privacy settings an option to "opt out of having voice recordings used to help develop new features."
But the bigger question is how many customers are aware this is occurring, or that by consenting to it, they are letting Amazon associate the audio recordings with their first name, their device's serial number, and a unique account number for God knows how long.
That Amazon Alexa records snatches of audio for product improvement purposes may not exactly be news, but the revelation, which comes close on the heels of news that Amazon has partnered with six health care providers to let users access some of their medical information through Alexa-supported devices, goes to show the price we, as consumers, have agreed to pay in the name of convenience.
Update on Apr. 24: Amazon employees tasked with listening to Alexa commands to improve the service have access to geographic data that can make it possible to find a customer's home address, according to a new report by Bloomberg. "Team members with access to Alexa users' geographic coordinates can easily type them into third-party mapping software and find home residences, according to the employees, who signed nondisclosure agreements barring them from speaking publicly about the program," it added.