
Are Siri, Google and Alexa spying on us?

We analyze what information the leading virtual assistants collect and what security measures they take to protect privacy

Launch of the HomePod (an Apple speaker and digital assistant) in 2018. James D. Morgan (Getty Images)
Laura Pajuelo

Are Siri, Alexa and Google Assistant really spying on us? You have probably wondered more than once, given that they are now built into all kinds of devices (televisions, cell phones, speakers) that we keep at home, at work, or carry with us everywhere. Nor does it help that sometimes, after we mention something in private, ads related to what we said appear while we browse the internet, or that news breaks about employees at one of the companies behind these systems accessing the private conversations of thousands of users.

Despite acknowledging occasional security flaws, Amazon, Google and Apple assure their users that they are committed to transparency, letting them know what data is collected and allowing them to manage what is done with it at all times. To find out exactly what each of the top assistants actually listens to, how that information is stored, and how it is treated, we analyzed their terms and conditions and consulted the companies themselves.

On the device itself

If there is one thing Apple is known for, it is letting users control what data is shared, with which apps (both its own and third parties’) and how it is handled in each case, making those settings easy to adjust. And nothing is shared with advertisers. On paper, Apple is the most restrictive with Siri: it guarantees that, every time the assistant is asked for something, the audio of the request does not leave the iPhone, iPad or HomePod unless you voluntarily choose to share it.

However, there are small differences in what is done with the data depending on which application uses the assistant. Queries made in apps such as Notes or Messages never send information to Apple’s servers; requests that involve an internet search or the dictation feature do, although even then everything is anonymous: no query is associated with the user’s ID. Instead, random identifiers made up of a long sequence of letters and numbers are used. With Safari and Spotlight, this identifier changes every 15 minutes; in Dictation, identifiers are erased (along with all transcriptions) when Siri is deactivated and reactivated. That said, Apple does not guarantee the deletion of requests made more than six months ago, or of the “small sample of requests” that may have been reviewed, since these are no longer associated with the random identifier.

Contact with servers

Google Assistant, on the other hand, sends every query to its servers and cannot be configured otherwise: it is a prerequisite for getting a response. By default, however, none of these requests are saved, making it impossible for anyone to access the recordings or identify who made them. Google does point out that choosing to store them in your account helps the system work better (specialized reviewers analyze the audio to check whether it was understood correctly) and allows it to personalize the experience based on the information Google holds about each user and the queries they have made in the past.

If you have enabled this option and the assistant is activated by mistake (something very common in any service of this type, as regular users well know), saying “Hey Google, I wasn’t talking to you” is enough for it to delete that conversation from the activity log. It is also possible to review all interactions and delete them manually, schedule automatic deletion every 3, 18 or 36 months, or ask by voice for all conversations from the last week to be deleted.

Something similar applies to personal data. Google Assistant does not need access to it to function, but if granted permission it can alert you to traffic on your usual route to work (without you having to ask) or know when one of your contacts has a birthday. What the company does state emphatically is that it never sells audio recordings or any other personal information.

What about Alexa?

Amazon uses customer data to personalize purchases and recommend playlists, books and more, and it tailors Alexa to whoever is using it. Since the assistant’s launch in Spain almost five years ago, the multinational has been keen to stress that there are no privacy risks and that users control what information is stored and what is done with it.

That said, whenever Alexa is used, the requests go to the cloud and are stored there encrypted. You can check what it has heard (and recorded) at any time, play back any clip, and manage all the recordings: delete some, sort them by date, by who made them, by device, and so on. It is also possible to delete them all at once, schedule deletions, or choose not to save them at all, either from the app or by voice.

But one of the functions of Amazon’s assistant that has raised the most security concerns is undoubtedly Drop In, an intercom-like feature that lets family and friends communicate with each other through their smart speakers. Could cybercriminals use it to spy on users? Amazon’s answer is a resounding no. To use the feature, you must activate it manually and authorize, contact by contact, who can call. And, of course, you have to accept the incoming call when it comes in.
