The Limited Times


"Your intelligent assistant spies on you." Do Siri, Google and Alexa snoop on us?

2023-06-01T10:43:38.556Z



We analyze what information the main assistants collect and what security measures they take, according to the companies, to protect privacy


Do Siri, Alexa or Google Assistant really spy on us? You have probably asked yourself this more than once, given that they are now built into all kinds of devices (televisions, phones, speakers...) that accompany us at home and at work, or that we carry with us everywhere. It doesn't help that, sometimes, after commenting on something in private, we later see ads related to what we said while browsing the Internet, or that news breaks that employees of one of the companies behind these systems have accessed the private conversations of thousands of users.

Even so, and despite acknowledging specific security failures, Amazon, Google and Apple reassure their users and bet on transparency, letting them know what data they collect and allowing them to manage at all times what is done with it. To find out exactly what each of the main assistants really hears, how they keep that information and how it is processed, we analyzed their terms and conditions and consulted the companies themselves.

On the device itself

If Apple is known for anything, it is for allowing the user to control what data is shared, with which apps and how it is handled in each case (both Apple's own and third parties'), providing settings for this purpose. And nothing is shared with advertisers. On paper, the Cupertino firm is the most restrictive with Siri and guarantees that, every time something is requested, the audio of the request does not leave the iPhone, iPad or HomePod unless the user decides to share it voluntarily.


Of course, there are small differences in what is done with the data depending on which application uses the assistant. For example, queries made to apps like Notes or Messages do not even send the information to Apple's servers; they do when the request involves searching the Internet or using the dictation function, although in that case everything is anonymous: none of the queries is associated with the user's ID. Instead, random identifiers made up of a long sequence of letters and numbers are used: with Safari and Spotlight this identifier changes every 15 minutes, and in Dictation the identifiers (along with all transcripts) are deleted when Siri is turned off and back on. One caveat is worth noting here: Apple does not guarantee the deletion of requests made more than six months ago, or of the, literally, "small sample of requests" that may have been analyzed, since these are no longer associated with the random identifier.
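The rotating identifier scheme described above can be sketched in a few lines. This is an illustrative model only, not Apple's implementation (the class name, the 16-byte token and the rotation logic are all assumptions): each request is tagged with a random identifier instead of an account, and that identifier is replaced on a fixed schedule so that long query histories cannot be linked together.

```python
import secrets
import time

ROTATION_SECONDS = 15 * 60  # change the identifier every 15 minutes, as described for Safari/Spotlight


class RotatingRequestId:
    """Illustrative sketch: tag requests with a random identifier rather
    than a user ID, and mint a fresh identifier once the current one is
    older than the rotation window."""

    def __init__(self, rotation_seconds=ROTATION_SECONDS, clock=time.monotonic):
        self.rotation_seconds = rotation_seconds
        self.clock = clock  # injectable for testing
        self._rotate()

    def _rotate(self):
        # A long random sequence of letters and numbers (32 hex characters).
        self.current_id = secrets.token_hex(16)
        self.issued_at = self.clock()

    def tag(self, query):
        # If the current identifier has expired, replace it before tagging.
        if self.clock() - self.issued_at >= self.rotation_seconds:
            self._rotate()
        return {"request_id": self.current_id, "query": query}
```

Within one window every query carries the same throwaway identifier; after rotation, nothing ties the new queries to the old ones.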

Contacting servers

Google Assistant, on the other hand, sends all queries to its servers and cannot be configured otherwise: it is an essential requirement for getting an answer. By default, however, none of these requests is saved, so it is impossible for anyone to access the recordings or identify who made them. But Google warns: if you choose to store them in your user account, you will be helping the system, first, to work better (specialized reviewers analyze the audio to check whether it was understood correctly, for example) and, second, to personalize the experience based on the information Google holds about each user and the queries they have made in the past.

If you have enabled this option and the assistant activates by mistake (something very common in any service of this type, as regular users know), it is enough to say "Hey Google, I wasn't talking to you" to delete that conversation from the activity log. It is also possible to review all interactions and delete them manually, schedule deletion to happen automatically every 3, 18 or 36 months, ask by voice for all conversations from the last week to disappear, and so on.
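The scheduled auto-deletion described above amounts to a retention window applied over the activity log. A minimal sketch under stated assumptions (the function name, the record layout and the 30-day month approximation are illustrative, not Google's implementation):

```python
from datetime import datetime, timedelta

# Retention options mirroring those the article mentions for
# Google Assistant activity: auto-delete after 3, 18 or 36 months.
RETENTION_CHOICES_MONTHS = (3, 18, 36)


def prune_activity(records, retention_months, now=None):
    """Keep only voice-activity records newer than the retention window.

    `records` is a list of (timestamp, text) tuples. Illustrative
    sketch only; months are approximated as 30 days.
    """
    if retention_months not in RETENTION_CHOICES_MONTHS:
        raise ValueError(f"retention must be one of {RETENTION_CHOICES_MONTHS}")
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=30 * retention_months)
    return [(ts, text) for ts, text in records if ts >= cutoff]
```

Run periodically, this keeps the log bounded: anything older than the chosen window simply drops out.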

Something similar happens with personal data. Although Google Assistant does not need access to it to work, if granted the permissions it could warn you when it detects traffic on your usual route to work (without you having to ask) or know when one of your contacts has a birthday. What the company does state emphatically is that it never sells audio recordings or any other personal information.

And Alexa?

Amazon uses customer data to personalize purchases, recommend playlists, books... and also to tailor Alexa to whoever is using it. Since its launch in Spain, almost five years ago, the multinational has been keen to stress that there are no risks to privacy and that users control what information it stores and what is done with it.

That said, whenever Alexa is used, the requests go to the cloud, where they are stored encrypted. At any time you can check what it has heard (and recorded), even play back a clip, and manage all the recordings: delete some of them, sort them by date, by who made them or by device... Likewise, it is possible to delete them all at once, schedule when this should happen, or choose not to save them at all. All of this can be done from the app or directly by voice.

But, without a doubt, the Amazon assistant feature that has raised the most security doubts is Drop In: something like an intercom that lets family and friends communicate with each other through their smart speakers. Could cybercriminals use this feature to spy? Amazon's answer is a resounding no. To use it you have to activate it manually and, in addition, authorize, contact by contact, who can call. And, of course, you have to accept the incoming call when it occurs.
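The safeguards Amazon describes for Drop In amount to a layered permission check: the feature must be on, the caller must be pre-authorized, and the user must accept. A minimal sketch of that gate with hypothetical names (this is not Amazon's API):

```python
class DropInDevice:
    """Illustrative model of the three gates the article describes for
    Drop In: manual activation, per-contact authorization, and explicit
    acceptance of each incoming call."""

    def __init__(self):
        self.drop_in_enabled = False    # off until activated manually
        self.authorized_contacts = set()

    def enable_drop_in(self):
        self.drop_in_enabled = True

    def authorize(self, contact):
        # Authorization is granted contact by contact, never globally.
        self.authorized_contacts.add(contact)

    def incoming_drop_in(self, contact, user_accepts):
        """Return True only if every gate passes."""
        if not self.drop_in_enabled:
            return False
        if contact not in self.authorized_contacts:
            return False
        return bool(user_accepts)
```

An attacker would have to defeat all three gates at once, which is the substance of Amazon's "resounding no".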

You can follow EL PAÍS Tecnología on Facebook and Twitter or sign up here to receive our weekly newsletter.

Source: elparis
