
Facial recognition lays us bare before strangers

2021-07-12T00:59:33.278Z


This technology can violate the law by revealing citizens' intimate information or by introducing racial bias


Facial recognition camera at the Mobile World Congress in Barcelona last June. Joan Cros / Corbis via Getty Images

Finding out by chance that you are adopted after uploading your photo to an image search engine and discovering who your biological parents are.

Learning your new partner's past, who their relatives are and where they work.

Or having someone discover the addiction you had years ago and kept hidden.

All of this is now possible through facial recognition.

As an example, simply by uploading a photo of themselves to the digital tool PimEyes, the author of this report was able to track their own face across the internet with different hairstyles, in disparate places, alone or with other people.

They also discovered other people with facial features very similar to their own.
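
The report does not describe how PimEyes works internally, but face search engines of this kind generally convert each detected face into a numeric embedding and then look for other images whose embeddings are close to it. The sketch below is a minimal, hypothetical illustration of that general technique with made-up data; the similarity threshold and the indexed images are assumptions, not a description of PimEyes itself.

```python
# Minimal sketch of face search by embedding similarity (hypothetical data throughout).
# Real systems would compute embeddings with a face recognition model; here they are random.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: np.ndarray, index: dict, threshold: float = 0.6) -> list:
    """Return indexed images whose embedding is close enough to the query face."""
    hits = [(url, cosine_similarity(query, emb)) for url, emb in index.items()]
    return sorted([h for h in hits if h[1] >= threshold], key=lambda h: h[1], reverse=True)

# Toy data: 128-dimensional embeddings, a common size for face recognition models.
rng = np.random.default_rng(0)
my_face = rng.normal(size=128)
index = {
    "site-a/photo_beach.jpg": my_face + rng.normal(scale=0.1, size=128),   # same person
    "site-b/old_haircut.jpg": my_face + rng.normal(scale=0.15, size=128),  # same person
    "site-c/stranger.jpg": rng.normal(size=128),                           # someone else
}
print(search(my_face, index))  # the two matching photos surface, the stranger does not
```

The comparison step itself is simple; what gives tools like this their reach is the index of faces collected from across the internet, which is precisely what brings the data protection rules discussed below into play.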

More information

  • Chinese facial recognition arrives at the gates of the European Union

The latest facial recognition technology is already in use, although it raises a range of legal questions.

This is because, depending on the biometric data collected, very intimate information about a person can be inferred, such as their ethnicity, genetic characteristics, illnesses or emotional state.

In fact, a person's image, insofar as it identifies or can identify them, constitutes personal data protected by the European General Data Protection Regulation and by national legislation. This is explained by Isabela Crespo, a senior lawyer at the law firm Gómez Acebo & Pombo, who points out that there is a general “prohibition” that prevents companies from using this information.

According to Article 9 of the European regulation, the processing “of personal data revealing ethnic or racial origin, political opinions, religious or philosophical convictions, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation” is prohibited.

Explicit consent

To set aside this prohibition, companies must invoke a justification based on the “public interest”, provided for in a rule that does not currently exist in Spanish law, or obtain the “explicit consent” of the data subject, the lawyer explains.

Even so, the Spanish Data Protection Agency is very restrictive when it comes to authorizing the processing of personal information based on the face.

For example, in Report 47/2021, on the use of facial recognition to verify the identity of bank customers, the regulator has just ruled that the principles of “necessity, proportionality and minimization” are not met.

In other words, it considers that there are less invasive means of identifying people than this technology.

As it happens, Atlético de Madrid has announced that it is working on implementing facial recognition systems at the Wanda Metropolitano for next season.

The measure is part of the club's technological commitment to its stadium, which has a capacity of more than 68,000 people.

The question hanging over the rojiblanco club is whether the processing of this biometric data is lawful, since the courts have already prevented Mercadona from using facial recognition to detect the presence of thieves in its stores.

More information

  • The keys to the controversy over the use of facial recognition in Mercadona supermarkets

“This Chamber cannot accept that the measure in question [facial recognition] protects the public interest; rather, it protects the private interests of the company in question, since it would violate the guarantees intended to protect the rights and freedoms of the data subjects, not only those who have been sanctioned and are subject to an access ban, but of everyone else who enters the aforementioned supermarket,” reasoned the Provincial Court of Barcelona in the order by which it denied Mercadona the use of this technology.

In the opinion of Alfonso Hurtado, a partner at the law firm Écija, the Wanda Metropolitano could argue that users explicitly consent to the processing of their biometric data by the mere fact of entering the stadium, provided the company informs them beforehand. The important thing is that this consent is not “vitiated” or, in other words, that it “does not discriminate against” the fan who “does not allow” their data to be processed.

However, on these issues “regulators are adopting protective positions” regarding citizens' rights, warns Hurtado. Paloma Bru, a partner at the firm Pinsent Masons, points in the same direction: she considers that “explicit consent can in no case be understood to be given by mere tacit acceptance, by the fact of entering the stadium”, since it requires a “specific, informed and unequivocal expression of will”. To ensure that consent is freely given, “alternative solutions to facial recognition, for example an easy-to-use password or ID card, should be offered to data subjects”, since if these involve too many characters or are complicated compared with facial recognition technology, “the choice would not be genuine.”

In any case, the lawyer explains that if citizens consider that their fundamental rights are being violated by identification through facial recognition, in addition to contacting the Spanish Data Protection Agency, they can “file a claim directly with the courts and request compensation, as well as the cessation of data processing”.

The erasure of women and Black people

Last year it emerged that Twitter's image-cropping algorithm gave preference to white people over Black people and to men over women. Those biases sit alongside the errors some facial recognition algorithms make when identifying African Americans and Asians, mistaking different people for the same person. Pablo Fernández, a lawyer at PwC, warns of the seriousness of this situation, since “an accuracy rate of 99% still poses the risk that an innocent person will be identified as guilty”. To avoid this, human “validation” of facial recognition results is essential.
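
Fernández's warning becomes concrete with a little arithmetic: even a small false-match rate produces many wrongful alerts when an entire crowd is scanned. The sketch below is a rough, hypothetical illustration only; it assumes a 1% false-positive rate (one possible reading of “99% accuracy”), a made-up watchlist of 50 banned individuals, and borrows the 68,000-person capacity of the Wanda Metropolitano cited above as the crowd size.

```python
# Rough, hypothetical illustration of the base-rate problem behind the "99% accuracy" figure.
# The rates and the watchlist size are assumptions, not figures from the article.
crowd_size = 68_000          # Wanda Metropolitano capacity mentioned in the article
watchlist_size = 50          # hypothetical number of banned individuals in the crowd
false_positive_rate = 0.01   # hypothetical: 1 in 100 innocent people wrongly matched
true_positive_rate = 0.99    # hypothetical: 99 in 100 banned individuals detected

innocent_people = crowd_size - watchlist_size
false_alarms = innocent_people * false_positive_rate
true_hits = watchlist_size * true_positive_rate

print(f"Expected false alarms: {false_alarms:.0f}")
print(f"Expected true hits:    {true_hits:.0f}")
print(f"Share of alerts that point at innocent people: "
      f"{false_alarms / (false_alarms + true_hits):.0%}")
```

Under these assumptions, more than nine out of every ten alerts would point at innocent people, which is why the human review Fernández calls for matters so much.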


Source: elparis
