
Why the iris is the most precious biometric data

2024-03-08T05:00:24.004Z

Highlights: Spanish Data Protection Agency prohibits Worldcoin from continuing to collect iris data. Worldcoin scanned irises of 400,000 Spaniards to validate their accounts and reward them with a batch of cryptocurrencies. The data collected to date by Worldcoin, a company linked to Sam Altman, the godfather of ChatGPT, is blocked until an international investigation decides whether or not it is legal for a private company to collect that type of data. The iris is, among the different biometric data, the one that most accurately identifies a person.


The decision of the Spanish authorities to prohibit Worldcoin from collecting this information opens the debate on whether taking care of privacy is an individual or collective responsibility


The Spanish Data Protection Agency (AEPD) has taken an unprecedented decision this Wednesday.

For the next three months, the Worldcoin orbs, which since July have scanned the irises of some 400,000 Spaniards to validate their accounts and reward them with a batch of cryptocurrencies (now worth about 80 euros), will no longer be able to operate.

The data collected to date by Worldcoin, a company linked to Sam Altman, the godfather of ChatGPT, is blocked, so it cannot be processed or shared until an international investigation decides whether or not it is legal for a private company to collect that type of data.

It is the first time that the AEPD has taken precautionary measures of this kind.

The director of the agency, Mar España, has highlighted its exceptional nature: “We have acted urgently because the situation required it.

Our decision is justified to avoid potentially irreparable damage.

Not taking it would have deprived people of the protection to which they are entitled.”

More information

The Data Protection Agency prohibits Worldcoin from continuing to collect iris data, which the company rewarded with cryptocurrencies

Why this sudden speed in halting the collection of high-resolution photographs of users' irises?

“Because a state of social alarm has been generated.

I think the queues that formed in shopping centers and the fact that there are cryptocurrencies involved forced the AEPD to move quickly," says Borja Adsuara, a consultant and expert in digital law, who worries that the focus is shifting away from what matters: "The problem is not whether they give you money for your iris, but whether that data is being treated correctly."

The value of biometric data

There are many types of personal data.

The most used in day-to-day procedures are name and surname, address or telephone number.

All of them can be used to identify a specific individual, but they share another characteristic: the interested party can modify them.

Other personal data, however, stay with us for life.

These are the so-called biometric data: those that refer to unique characteristics of each person, whether physiological, physical or behavioral.

This type of information can be encoded and often remains unchanged over time.

We have the same DNA from the moment we are born until we die.

The same thing happens with fingerprints (unless we burn them).

The face evolves over the years (we gain weight, we age, we lose hair), but there are algorithms capable of establishing unique patterns, for example by measuring the distance between the eyes, or from the eyes to the nose or mouth, that allow people to be recognized with a high success rate sustained over time.
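The idea described above can be sketched in a few lines. This is a toy illustration, not any vendor's actual algorithm: the landmark names, coordinates, and thresholds are all hypothetical, and real systems use many more measurements plus learned features.

```python
# Toy sketch: summarize a face as a vector of distances between a few
# landmarks (eyes, nose, mouth), then compare two faces by the distance
# between their vectors. All names and numbers here are illustrative.
import math

def feature_vector(landmarks: dict) -> list:
    """Pairwise distances between a handful of hypothetical landmarks."""
    pairs = [("left_eye", "right_eye"),
             ("left_eye", "nose"),
             ("right_eye", "nose"),
             ("nose", "mouth")]
    return [math.dist(landmarks[a], landmarks[b]) for a, b in pairs]

def similarity_distance(face_a: dict, face_b: dict) -> float:
    """Euclidean distance between the two feature vectors (lower = more alike)."""
    return math.dist(feature_vector(face_a), feature_vector(face_b))

face1 = {"left_eye": (30, 40), "right_eye": (70, 40),
         "nose": (50, 60), "mouth": (50, 80)}
# The same face with a slight shift: weight change or aging moves the
# points a little, but the pattern of distances stays nearly constant.
face2 = {"left_eye": (31, 41), "right_eye": (69, 40),
         "nose": (50, 61), "mouth": (50, 79)}
print(similarity_distance(face1, face2))  # small value: likely the same person
```

Because the vector captures relative geometry rather than raw pixels, it is stable against the gradual changes the article mentions, which is why such patterns can identify people over time.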

The iris is, among the different biometric data, the one that most accurately identifies a person, according to David Arroyo, principal researcher of the Cybersecurity and Privacy Protection group at the CSIC, who warns that "if your iris is stolen, or rather the alphanumeric template with which that biometric trait is stored, someone can impersonate your identity in many places.

Iris reading is much more accurate than facial recognition.

It is not used as much because the necessary sensor is more expensive and the adjustment of these systems is more complex.”

Queues to have an iris photographed and register with Worldcoin at a small booth in the Avenida de América transport interchange (Madrid). The images are taken by subcontracted employees using the Orb, a silver sphere. The company's only sign reads: "The world economy belongs to everyone." Pablo Monge

In addition to its value as a personal identifier, an iris analysis can provide much other information, both physiological and behavioral.

“Through your gaze and how your pupil dilates, you can tell what someone likes, what scares them, what interests them, and even certain cognitive characteristics, such as whether they have Parkinson's,” says Carissa Véliz, professor of philosophy at Oxford University and author of the book Privacy is Power.

Iris reading is usually limited to high security environments, as an additional means of identification to access certain facilities.

“It allows for very robust authentication, but it entails many privacy problems, because the iris is something that is directly and unequivocally linked to a specific person,” says Arroyo.

A special treatment

The particularities of biometric data make its legal treatment stricter than that of other personal data.

“European legislation considers them a special category of data.

They can be captured either when Spanish legislation expressly allows it for certain cases, or when there is consent,” argues Ricard Martínez, director of the Privacy and Digital Transformation chair at the University of Valencia.

“Spanish regulations say that, in principle, for health and biometric data you should be able to give consent.

But that doesn't mean everything is possible.

You could have the consent of the affected person and pursue an illegal or disproportionate activity, or violate a fundamental right.

It's more complicated than it seems."

Proportionate use of this data is key.

In 2021, the AEPD fined Mercadona 3.5 million euros (reduced to 2.5 million for voluntary payment) for using cameras with facial recognition systems in 48 of its stores.

The company argued that it installed this technology to detect people subject to restraining orders barring them from its establishments.

The agency resolved that the goal pursued, identifying convicted people, did not justify collecting facial patterns from all customers who entered the chain's supermarkets.

Returning to the case of Worldcoin, the orbs scan the iris and convert that image into an alphanumeric code.

That template is what identifies the user.
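The template idea can be sketched concretely. This is a minimal illustration of how classic iris-recognition systems work in general (encoding the iris texture as a fixed-length bit string and matching two codes by their Hamming distance), not a description of Worldcoin's actual pipeline; the codes and threshold below are toy values.

```python
# Minimal sketch (hypothetical, not Worldcoin's actual system): an iris
# image is reduced to a fixed-length bit-string template, and two
# templates are matched by the fraction of bits in which they differ.

def hamming_distance(code_a: str, code_b: str) -> float:
    """Fraction of differing bits between two equal-length bit strings."""
    if len(code_a) != len(code_b):
        raise ValueError("templates must have equal length")
    diffs = sum(a != b for a, b in zip(code_a, code_b))
    return diffs / len(code_a)

def same_person(code_a: str, code_b: str, threshold: float = 0.32) -> bool:
    """Accept a match when the codes differ in fewer than ~32% of bits,
    a commonly cited decision threshold for iris codes."""
    return hamming_distance(code_a, code_b) < threshold

enrolled  = "1011001110100101"   # toy 16-bit template stored at enrollment
probe_ok  = "1011001110100111"   # same eye, one bit of sensor noise
probe_bad = "0100110001011010"   # a different eye
print(same_person(enrolled, probe_ok))   # small distance: match
print(same_person(enrolled, probe_bad))  # large distance: no match
```

This also makes Arroyo's warning concrete: the stored template, not the eye itself, is what authenticates you, so a leaked template is enough to impersonate its owner wherever that template is accepted.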

“The problem is not that Worldcoin has collected this data from 400,000 people, but that they make all these databases and images available to other algorithms without saying exactly why,” says Jorge García Herrero, a lawyer specializing in data protection.

An American soldier scans the iris of an Afghan man south of Kandahar. Chris Hondros (Getty Images)

The great danger of biometric data is that it can be used for illegitimate purposes.

In China, for example, facial recognition systems are used to monitor and persecute Uyghurs.

There is a suspicion that when the Taliban regained control of Afghanistan in 2021, they turned to biometric identification technologies, such as iris scanning, to detect and repress collaborators with the former regime.

Biometrics are an unrivaled tool if you are looking to repress and, of course, biometric data can also be used to impersonate people.

What if I don't care about privacy?

“I am an ordinary citizen, Google already has all my data, I don't think the eye contributes much,” a young man who was preparing to have his iris scanned at the La Vaguada shopping center in Madrid told EL PAÍS two weeks ago.

It is a recurring argument.

Carissa Véliz, from the University of Oxford, considers it fallacious.

“We tend to think that when something is personal it is individual, but when you share your personal data you are in reality also putting others in danger, as was seen in the case of Cambridge Analytica,” she explains, in reference to the scandal in which that consultancy accessed the personal information of 50 million Facebook users to build profiles of American voters and target them with personalized election advertising.

“You may not care about your privacy, but I see it not just as a right but as an obligation, because you can put your entire environment at risk,” says David Arroyo, from the CSIC.

“This type of data is then used to characterize other people, and with them more sophisticated attacks are mounted, such as phishing or disinformation,” he points out.

Even if the right to erasure is later exercised and the biometric data collected is deleted, it will already have been used to train the tool, that is, to make it more efficient.

What worries experts in the Worldcoin case is that it contributes to the normalization of a technology, iris reading, that is double-edged.

“If we let it establish itself as a legitimate form of verification, eventually everyone will end up using it,” says Véliz.

“I have been very upset that the use of facial recognition to unlock phones is normalized.

I think it has made people perceive that technology as something natural.

Let's hope that the same thing does not happen with iris reading."



Source: EL PAÍS
