Siri, Alexa, and Echo Reviewers Share Private Conversations
By now you have probably figured out that our new digital assistants, like Alexa and Siri, prove the old saying that "little pitchers have big ears."
And according to articles on theguardian.com and futurism.com, human contractors, often in other countries, are listening in not just to your requests to Alexa or Siri, but to any audio your digital assistant has recorded and flagged for review.
Reports of people having sex being recorded and played back later by reviewers working for Apple are just one example; drug deals are also among the questionable sounds and events that have been captured. Contractors who review these sound bites could even, according to one of them, trace a clip back to the person and address behind it.
Apple told The Guardian it regularly sends Siri activations to "secure facilities" where human reviewers listen to the clips, which are usually just a few seconds long and stripped of the Apple user's ID and name. Even so, the practice raises obvious privacy concerns.
"There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad," one anonymous contractor told The Guardian, later adding that "it's not like people are being encouraged to have consideration for people's privacy, or even consider it. If there were someone with nefarious intentions, it wouldn't be hard to identify [people on the recordings]."
And it's not just Apple: Amazon and Google also sample conversations to make sure their assistants are being triggered by the correct wake words and are answering questions properly.
Like Apple, neither company has clearly informed consumers that they may be listened to and recorded for quality assurance. In fact, it's been reported that some Amazon contractors share private conversations, mocking the people in them and passing the joke along to co-workers.
Maybe these tech giants should all focus on training their assistants to listen only when spoken to.
Read more at futurism.com