Why the iris offers the most precious biometric data

In Spain, the authorities have prohibited Worldcoin from collecting this kind of information, opening a debate about whether protecting privacy is an individual or collective responsibility

In Madrid, people line up in front of a Worldcoin iris-scanning orb. Antoine Demaison (Reuters)

The Spanish Data Protection Agency (AEPD) made an unprecedented decision this past Wednesday. For the next three months, Worldcoin’s orbs will no longer be allowed to operate in the country. Since July 2023, these devices have scanned the irises of some 400,000 Spaniards in order to validate their accounts and reward them with a batch of cryptocurrencies (with a cash value of about $80).

The data collected to date by Worldcoin — a company linked to Sam Altman, the godfather of ChatGPT — is currently blocked. It cannot be processed or shared until an international investigation decides whether or not it’s legal for a private company to collect this type of data.

In Spain, it’s the first time that the AEPD has taken such precautionary measures. The director of the agency — Mar España — has highlighted the exceptional nature of the move: “We acted urgently because the situation required it. Our decision is justified so as to avoid potentially irreparable damage. Not taking action would have deprived people of the protection that they’re entitled to.”

Why the sudden rush to freeze the collection of high-resolution photographs of users’ irises? “Because a state of alarm has been generated in our society. I think that the queues that have formed in shopping centers [for the orbs] and the fact that there are cryptocurrencies involved have forced the AEPD to move quickly,” notes Borja Adsuara, a consultant and expert in digital law who has voiced concerns about Worldcoin’s orbs. “The problem isn’t whether they give you money for an image of your iris, but whether that data is being treated correctly.”

The value of biometric data

There are many types of personal data. The most commonly used in day-to-day procedures are names and surnames, addresses, and phone numbers. All of these can be used to identify a specific individual, but they share another characteristic: the interested party can modify them.

Other personal data, however, stays with us for life. This is so-called biometric data, which refers to the unique physiological, physical, or behavioral characteristics of each person. This type of information can be encrypted and often remains unchanged over time. We have the same DNA from the moment we’re born until we die. The same is true of fingerprints (unless we burn them). The face evolves over the years (we gain weight, we age, we lose hair), but there are algorithms capable of establishing unique patterns. For example, they can measure the distance between the eyes, or from the eyes to the nose or mouth. This allows people to be recognized — with a high degree of success — over time.
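To make the idea concrete, here is a minimal Python sketch of such a distance-based comparison. It is not any vendor’s actual algorithm: the four landmarks, their coordinates, and the 0.05 match threshold are invented for illustration, and real systems extract dozens of landmarks with trained detector models.

```python
import math

# Hypothetical 2D landmark coordinates (in pixels) for two captures
# of the same face. All values here are invented for illustration.
face_a = {"left_eye": (120, 140), "right_eye": (200, 142),
          "nose": (160, 190), "mouth": (160, 240)}
face_b = {"left_eye": (118, 138), "right_eye": (199, 141),
          "nose": (159, 188), "mouth": (161, 242)}

def feature_vector(face):
    # Dividing each distance by the eye gap makes the signature
    # independent of how far the camera was from the face.
    eye_gap = math.dist(face["left_eye"], face["right_eye"])
    return [
        math.dist(face["left_eye"], face["nose"]) / eye_gap,
        math.dist(face["right_eye"], face["nose"]) / eye_gap,
        math.dist(face["nose"], face["mouth"]) / eye_gap,
    ]

# Two captures of the same face should produce nearly identical vectors.
difference = math.dist(feature_vector(face_a), feature_vector(face_b))
print(difference < 0.05)  # True: below an illustrative match threshold
```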

The iris is — among the different types of biometric data — the one that most accurately identifies a person, according to David Arroyo, principal researcher of the Cybersecurity and Privacy Protection working group at the Spanish National Research Council (CSIC). He warns that “if an image of your iris is stolen — or, rather, the alphanumeric template with which that biometric trait is stored — your identity can be impersonated in many contexts. Iris-scanning is much more accurate than facial recognition. It’s not used as much, because the necessary sensor is more expensive and the adjustment of these systems is more complex.”
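Classic iris-recognition systems in the Daugman tradition encode the iris texture as a binary template (an “iris code”) and match two codes by the fraction of bits that differ, known as the Hamming distance. The Python sketch below shows only that comparison step; the 16-bit codes and the 0.32 threshold are toy assumptions, since real iris codes run to roughly 2,048 bits.

```python
# Toy comparison of binary iris templates ("iris codes").
# These 16-bit values are invented; deployed systems use far longer codes.
code_a = 0b1011001110001101  # enrollment capture
code_b = 0b1011001010001111  # same eye, slightly noisier capture
code_c = 0b1110011010110001  # a different eye

def hamming_fraction(x: int, y: int, bits: int = 16) -> float:
    """Fraction of bits that differ between two equal-length codes."""
    return bin(x ^ y).count("1") / bits

# Captures of the same eye differ in few bits, while different eyes
# differ in about half, so one threshold separates match from non-match.
print(hamming_fraction(code_a, code_b))  # 0.125 -> same identity
print(hamming_fraction(code_a, code_c))  # 0.5   -> different identity
```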

In Madrid, people wait to have their irises scanned at the Worldcoin booth. The images are taken by Orb operators, who are subcontractors. Pablo Monge

In addition to its value as a personal identifier, an iris analysis can provide lots of other information, both physiological and behavioral. “Through your gaze and how your pupil dilates, you can tell what someone likes, what scares them, what interests them. You can even see certain cognitive characteristics, such as whether they have Parkinson’s,” says Carissa Véliz, a professor of philosophy at the University of Oxford and author of the book Privacy is Power (2020).

Iris-scanning is usually limited to high security environments, as an additional means of identification to access certain facilities. “It allows for very robust authentication, but it entails many privacy problems, because the iris is something that’s directly and unequivocally linked to a specific person,” Arroyo points out.

Special treatment

The particularities of biometric data make its legal treatment stricter than that of other forms of data. “European legislation considers it to be a special category of data. Biometric data can be captured either when Spanish legislation expressly allows it in certain cases, or when there’s consent,” argues Ricard Martínez, director of Privacy and Digital Transformation at the University of Valencia. “Spanish regulations say that, in principle, you should be able to consent to the processing of health and biometric data. But that doesn’t mean everything is possible. You could have the consent of the affected person and still pursue an illegal or disproportionate activity, or violate a fundamental right. It’s more complicated than it seems.”

Proportionate use of this data is key. In 2021, the AEPD fined the Spanish supermarket chain Mercadona €3.5 million ($3.83 million) for using cameras with facial recognition systems in 48 of its stores. The company argued that it had installed the technology to detect people who were barred from entering its establishments. The agency ruled that the goal pursued — identifying convicted individuals — didn’t justify collecting facial patterns from every customer who entered the chain’s supermarkets.

Returning to the case of Worldcoin, the orbs scan the iris and convert that image into an alphanumeric code. That template is what identifies the user. “The problem isn’t that Worldcoin has collected this data from 400,000 people, but that they make all these databases and images available to other algorithms, and they don’t disclose exactly why they do this,” says Jorge García Herrero, a lawyer who specializes in data protection regulations.
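As a sketch of how such a template can anchor a one-account-per-person scheme, the hypothetical snippet below enrolls a new template only after comparing it against every stored one. This is not Worldcoin’s actual pipeline; it only illustrates why this kind of service ends up retaining templates to compare against, rather than discarding them after sign-up.

```python
# Hypothetical duplicate check over stored iris templates.
# NOT Worldcoin's real system; toy 16-bit codes, illustrative threshold.
enrolled: list[int] = []  # previously accepted templates

def hamming_fraction(x: int, y: int, bits: int = 16) -> float:
    return bin(x ^ y).count("1") / bits

def try_enroll(template: int, threshold: float = 0.32) -> bool:
    """Accept a new user only if no stored template is too similar."""
    for stored in enrolled:
        if hamming_fraction(template, stored) < threshold:
            return False  # likely the same iris: duplicate rejected
    enrolled.append(template)
    return True

print(try_enroll(0b1011001110001101))  # True: first enrollment accepted
print(try_enroll(0b1011001010001111))  # False: near-duplicate rejected
```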

An American soldier scans the iris of an Afghan man south of Kandahar. Chris Hondros (Getty Images)

The great danger of biometric data is that it can be used for illegitimate purposes. In China, for example, facial recognition systems are used to monitor and persecute members of the Uyghur ethnic group. There’s also a suspicion that, when the Taliban regained control of Afghanistan in 2021, they turned to biometric identification technologies — such as iris-scanning — to detect and repress those who had collaborated with the previous government. Biometrics are an unrivaled tool for repressing a population and, of course, biometric data can also be used to impersonate people.

What if I don’t care about privacy?

“I’m an ordinary citizen. Google already has all my data. I don’t think the eye contributes much,” a young man shrugged when interviewed by EL PAÍS two weeks ago, as he prepared to have his iris scanned at a mall in Madrid.

This is a recurring argument, but Carissa Véliz — from the University of Oxford — thinks it’s a false one. “We tend to think that when something is personal, it’s individual. But when you share your personal data, in reality, you’re also putting others in danger, as was seen in the case of Cambridge Analytica,” she explains, referencing the scandal in which the consultancy accessed the personal information of 50 million Facebook users during the 2016 U.S. presidential campaign to build profiles of American voters and target them with personalized election advertising.

“You may not care about your privacy, but I don’t see privacy as a right, but rather as an obligation, because you can put your entire environment at risk,” says David Arroyo, from the CSIC. “This type of data is then used to profile other people. From there, more sophisticated attacks are mounted, such as phishing or disinformation,” he emphasizes. Even if the right to erasure is later exercised and the collected biometric data is eventually deleted, it will already have been used to train the system. In other words, to make it more efficient.

What worries experts in the Worldcoin case is that it contributes to the normalization of a technology — iris-scanning — which is a double-edged sword. “If we let it establish itself as a legitimate form of verification, eventually, everyone will end up using it,” Véliz laments. “I’m very upset that the use of facial recognition to unlock phones has become normalized. I think it’s made people perceive that kind of technology as something natural. Let’s hope that the same thing doesn’t happen with iris-scanning.”
