Apple admits Siri listens to people's conversations: Report

After Google and Amazon, Apple has now come under the scanner for eavesdropping on users' conversations. A whistleblower who works for Apple has told The Guardian, on the condition of anonymity, that contractors for the iPhone-maker regularly hear confidential Siri recordings, including recordings of drug deals and of couples making love, as part of their quality-control work, or “grading”. These contractors grade the recordings on the basis of several factors, such as “whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate”.

According to the whistleblower, accidental activations were the main reason private conversations ended up being sent to Apple. Apple’s AI-powered virtual assistant Siri is built into several Apple devices, and the whistleblower claims the Apple HomePod smart speaker and the Apple Watch are the most frequent sources of mistaken recordings. “The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on,” the whistleblower was quoted as saying. The whistleblower also said that the recordings were accompanied by user data showing location, contact details, and app data.

Apple does not explicitly disclose in its consumer-facing privacy documentation that “a small proportion of Siri recordings” are sent to contractors around the world. Apple says that the data “is used to help Siri and dictation … understand you better and recognise what you say”. “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple was quoted as saying. The company added that the recordings amount to less than 1 percent of daily Siri activations and are typically only a few seconds long.

Virtual assistants can be accidentally activated when they mistakenly hear their wake words; in Apple’s case, it is “hey Siri”. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. Further, a recording apparently carries no name or other information that could easily be linked to other recordings; in that sense, it is anonymous.

Interestingly, Google was mired in a similar controversy a few weeks earlier, when it was revealed that its AI assistant, the Google Assistant, also listens to users’ conversations. At the time, Google gave a similar explanation. “As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language. These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant,” Google had said.



from Latest Technology News https://ift.tt/2yiXqfZ
