The Limited Times


New Policy: Apple asks for permission to listen to Siri recordings

2019-08-28T17:46:20.589Z


For years, people have reviewed audio recordings made by voice assistants - and in the process heard some very private material. Apple in particular drew criticism after the practice was disclosed - and now wants to give users more control.



Apple wants to explicitly ask users of its voice assistant Siri for permission before employees later listen to recordings. The iPhone maker is thus the first provider of voice assistants to give its users this degree of control. The rule is due to take effect in autumn.

Apple had already announced this step earlier this month, after disclosure of the practice triggered massive criticism. The group also announced that it would no longer retain recordings of users' conversations with Siri by default.

For assistant software such as Amazon's Alexa, the Google Assistant and Siri, people have for years listened to fragments of recordings in order to improve the quality of speech recognition. They review, for example, cases in which the assistant did not understand a command, misrecognitions of the activation words, and how the software handles new languages and dialects.

Users were often unaware of the practice

The providers emphasize that the recordings are anonymized beforehand. Users, however, were largely unaware of the practice until the first media reports appeared a few months ago. Google and Apple suspended human review a few weeks ago, while Amazon gave users the option to refuse the use of their recordings to improve the service.

Attention turned to Siri after a report in the Guardian newspaper in late July. An employee of an Apple service provider said that very private details could sometimes be heard on the recordings. Siri also picks up fragments of conversations with medical or business content, possible criminal activity, or even users having sex, he said.

One of the announced changes is that the selected recordings will be listened to only by Apple employees and no longer by external service providers. Layoffs at an affected service provider in Ireland had already been reported last week.

Revelations particularly awkward for Apple

Apple came under pressure from the revelations because the company has for years positioned itself as a guardian of its users' privacy, in contrast to competitors such as Google or Facebook.

Originally, however, the company had noted only in a security document that "a small number of transcriptions" could be used to improve the service. Users first had to find that paper on their own - and they were not explicitly informed of this possibility when setting up Siri.

A particular problem is erroneous activations, in which the software believes it has heard the wake phrase "Hey Siri". As a result, sentences and conversations that were not addressed to the voice assistant can be recorded.

By listening to such recordings afterwards, employees are supposed to find out which words or sounds triggered the accidental activation so that the software can be adjusted accordingly. "Our team will work to delete any recording that was determined to be an unintentional trigger of Siri," Apple promised.

"Siri uses a random identifier - a long string of letters and numbers associated with a single device - to keep track of data while it is being processed, rather than tying it to your identity through your Apple ID or phone number," the company said, emphasizing the anonymity of the process for users.

According to earlier figures from Apple, less than one percent of recordings were reviewed by people, mostly in fragments just a few seconds long. According to the security paper for developers, recordings have been used to improve the service for up to two years, with a copy stored without personal information after six months.

Apple's announcement largely matches the demands of German privacy advocates, who call, among other things, for users to be explicitly asked for permission before recordings are reviewed by people.

Source: spiegel
