
Experts' fears about the Spanish police's new facial recognition system: mass surveillance, loss of anonymity and data erasure

2023-01-10T22:49:27.086Z


The Interior Ministry has not dispelled doubts about the murkier aspects of the rollout of a tool set to revolutionize police work, raising questions about its transparency, proportionality and governance mechanisms


The National Police and the Civil Guard will begin this year to use an automatic facial recognition system in their investigations, as reported by EL PAÍS.

This is ABIS, a tool that uses artificial intelligence algorithms to determine in a few seconds whether an image contains a face for which records exist (in this case, records of people with a police file).

The technology itself is ready; all that remains is to complete the integration of the database with which it will begin to operate, according to sources from the Ministry of the Interior.

Once that process is complete, which is expected in the first half of 2023, it will be possible to start using the program, which has been developed by the French defence company Thales.

But its application raises several questions, especially related to transparency and the control mechanisms required of tools that work with biometric data.

The Spanish Data Protection Agency (AEPD) itself has been studying how the tool fits into the legislative framework since September, and must determine whether or not it violates any rights.


This newspaper has asked the Interior Ministry about the unclear points of ABIS identified with the help of various experts, among them doubts about its transparency, how the databases will be shared and managed, and how proportionate use of the system will be guaranteed.

The Ministry has refused to provide additional key information to dispel these doubts.

These are the issues that most concern the engineers, analysts and activists consulted.

1. The shadow of mass surveillance

Police use of automatic facial recognition technologies is sensitive because it allows individuals to be unequivocally identified.

The system is capable of detecting human faces in digitized images, whether they come from a mobile phone or from security cameras, and extracting a unique and unmistakable pattern of each person's features, just as happens with fingerprints or DNA.

There is an important difference, though: while the latter two require physical contact with the person concerned (to take their fingerprints or a saliva sample), with facial recognition everything can be done remotely.
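By way of illustration of the pipeline just described (this is not the ABIS software, only the general technique), here is a minimal sketch using the open-source face_recognition Python library, which detects the faces in any image and reduces each one to a 128-number template that can be compared against templates already on file; the file names are hypothetical.

```python
# Illustrative sketch only: uses the open-source face_recognition library,
# not ABIS. File names are hypothetical.
import face_recognition

# Load a probe image (e.g. a frame from a security camera or a phone photo)
probe_image = face_recognition.load_image_file("camera_frame.jpg")

# Detect every face in the image and encode each as a 128-dimensional
# vector: the "unique pattern" of features the article describes
probe_encodings = face_recognition.face_encodings(probe_image)

# Template of a person already on file, computed the same way beforehand
known_image = face_recognition.load_image_file("police_file_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Compare each detected face against the stored template; a small distance
# means the two faces likely belong to the same person
for encoding in probe_encodings:
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    match = face_recognition.compare_faces([known_encoding], encoding,
                                           tolerance=0.6)[0]
    print(f"distance={distance:.3f}, match={match}")
```

The point of the sketch is that no contact or cooperation is needed at any stage: any image in which the face is sharp enough can be turned into a comparable template.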

That makes it a perfect technology for mass surveillance.

Beijing has known this for years.

The streets of the big Chinese cities are full of cameras equipped with these systems.

The authorities can locate any citizen within minutes by searching for their face in real time.

There is no possible escape.

The EU prohibits using real-time facial recognition in public spaces.

The Interior Ministry categorically rules out using ABIS for surveillance work rather than for investigations.

However, the Ministry does not clarify how, or by whom, the use of the tool will be controlled.

“We need to know how many times the system is going to be used: only for particularly serious cases or for any investigation?

If its use becomes widespread, it can become a mass surveillance tool almost unintentionally," says Carmela Troncoso, a professor at the École Polytechnique Fédérale de Lausanne (Switzerland) and author of the secure tracing protocol used in covid contact-tracing apps.

2. Anonymity and algorithmic control

The expert also has doubts about the anonymity that Thales promises regarding the management of ABIS biometric data.

"The information stored is based on alphanumeric data that makes it impossible to identify the owner of their fingerprints," explains the French company.

The objective is that if someone steals that database, they cannot associate the faces stored there with their identities.

“This is a bit doubtful.

Just because the representations are alphanumeric does not mean that they cannot be reconstructed, because there is evidence to the contrary: images can be reconstructed from the models.

The question is what studies have been done to substantiate these claims," Troncoso adds.
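Troncoso's objection can be made concrete with a toy example: even if templates are stored only as numbers with no name attached, anyone holding the database can try to re-identify people by comparing those numbers against templates computed from publicly available photos. A minimal sketch, assuming the templates are ordinary embedding vectors and using made-up data rather than Thales's actual format:

```python
import numpy as np

# Hypothetical "anonymous" templates: one 128-dimensional vector per record,
# stored as numbers with no identity attached
leaked_templates = np.random.rand(1000, 128)

# Templates an attacker computes from labelled public photos
# (social media profiles, press photos, etc.)
public_names = ["person_a", "person_b", "person_c"]
public_templates = np.random.rand(3, 128)

def cosine_similarity(vector, matrix):
    """Cosine similarity between one vector and each row of a matrix."""
    return (matrix @ vector) / (np.linalg.norm(matrix, axis=1) *
                                np.linalg.norm(vector))

# For each public identity, find the closest stored record: with real face
# embeddings, a genuine match scores well above the rest, so the
# "anonymous" record is re-identified
for name, template in zip(public_names, public_templates):
    scores = cosine_similarity(template, leaked_templates)
    best = int(np.argmax(scores))
    print(f"{name}: closest record #{best}, similarity {scores[best]:.3f}")
```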

Regarding the performance of the algorithm developed by Thales, both the company and the Interior Ministry state that "it has passed the NIST vendor test", a reference to the Face Recognition Vendor Test run by the US National Institute of Standards and Technology, an independent, non-commercial body.

"That's like not saying anything," Troncoso replies.

“They have a scale, but they don't say whether the result is good, bad or mediocre.

What is the algorithm's score on that test?

Is it the right one for the proposed use case?

Who is going to take care of verifying that it continues to be so over time?”

Asked by this newspaper, Interior has not responded to these questions.
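For reference, benchmarks like NIST's Face Recognition Vendor Test report error rates rather than a pass/fail verdict: typically the false non-match rate (missed matches) at a fixed false match rate. A minimal sketch of how those two rates are computed from labelled comparison scores, using invented numbers rather than any ABIS result:

```python
import numpy as np

# Hypothetical similarity scores for pairs of images of the SAME person
# (genuine pairs) and of DIFFERENT people (impostor pairs)
genuine_scores = np.array([0.91, 0.88, 0.95, 0.72, 0.83, 0.90])
impostor_scores = np.array([0.15, 0.42, 0.05, 0.61, 0.33, 0.28])

threshold = 0.60  # decision threshold: scores above it count as a match

# False match rate: share of impostor pairs wrongly accepted as matches
fmr = float(np.mean(impostor_scores >= threshold))

# False non-match rate: share of genuine pairs wrongly rejected
fnmr = float(np.mean(genuine_scores < threshold))

print(f"FMR={fmr:.2f}, FNMR={fnmr:.2f} at threshold {threshold}")
```

Troncoso's questions are, in effect, about where on that trade-off ABIS sits for the intended use case, and who will keep checking it.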

An automatic facial recognition system identifies a group of people in China.

3. Getting into and out of the database

The other big concern surrounding the use of this technology is who it will be applied to.

According to what the Interior Ministry told EL PAÍS, the database against which the images to be examined will be compared contains some five million facial photographic records of detainees and suspects already on file with the National Police, the Civil Guard and other regional police forces.

For how long is a suspect registered in the database?

What happens when a suspect ceases to be one, for example after being acquitted in court?

Are they removed from the database or do they remain in it?

The Ministry has preferred not to answer these questions either.

“Cleaning up these databases is going to be a problem.

They need to be kept up to date by exchanging them with the databases of other institutions involved in the process,” explains Lorena Jaume-Palasí, an expert in ethics and legal philosophy applied to technology and an adviser to the Spanish Government and the European Parliament on issues related to artificial intelligence.

In other words, the Ministry of the Interior would have to coordinate and exchange information with the Ministry of Justice, something that is not the norm either in Spain or in neighbouring countries.

“There are at least two problems here: on the one hand, you do not have the infrastructure needed to deploy the system nationally and internationally, and on the other, you have an enforcement problem between institutions that do not cooperate,” adds the researcher.

Remaining in that database means you can be flagged as a suspect in any crime, because algorithms fail, and the officers who review the candidate matches ABIS returns in a search can also be wrong.
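The size of the database amplifies this risk: even a very small false match rate per comparison adds up when every search is run against millions of records. A back-of-the-envelope sketch with an entirely hypothetical error rate, not anything measured on ABIS:

```python
# Back-of-the-envelope: expected number of wrong candidates per search.
# The false match rate below is hypothetical, chosen only for illustration.
database_size = 5_000_000   # facial records cited by the Interior Ministry
false_match_rate = 1e-5     # hypothetical chance a non-match clears the threshold

expected_false_candidates = database_size * false_match_rate
print(f"Expected false candidates per search: {expected_false_candidates:.0f}")
# -> roughly 50 wrong faces that a human officer would then have to screen out
```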

4. Who the data is shared with

Interior intends to share the facial biometric data it stores with its European partners.

“The ABIS system in Spain can connect with EU databases, such as Eurodac, EU-Lisa or VIS,” Thales sources explain.

As the Ministry assured this newspaper, the databases managed by the police will be totally separated from the civilian ones (for example, those that contain ID photos).

"But if the ABIS system is integrated into EU-Lisa, it will do so with asylum seekers, who have not committed a crime and are not criminals," says Javier Sánchez Monedero, Beatriz Galindo researcher in Artificial Intelligence from the Department of Computer Science and Numerical Analysis of the University of Cordoba.

The recent reform of Eurodac, the European database of fingerprints to identify asylum seekers and irregular border crossers, eliminates the need for a court order so that the police can make a query.

If the facial records collected by the security forces are stored in that same database, it will be possible to run automated searches without going through a judge.

“It is important to understand the limits and the range of scenarios involved in managing this data.

Our previous experience is that once these systems get going, they just keep growing,” he adds.

5. Why collect biometric data?

The question that many experts ask is why we need this tool.

Has anyone studied whether the potential benefits of implementing this system will outweigh the problems it may generate?

Interior assures that ABIS will greatly facilitate the work of the police.

It will make it possible to quickly identify suspects from crime scene images who might not otherwise be located.

But implementing an automatic facial recognition system is more than that.

"You are not going to make the process more efficient, but rather you are going to change the process itself," sums up Jaume-Palasí.

“It will take good servers, backup copies, a lot of energy and retraining of the professionals who work with these systems in the police, among other things.

Furthermore, they are systems that cannot work well because the underlying idea, the methodology itself, is bad.

Identifying people based on any biometric category always involves failures; it is necessary to use other methods and evaluate other aspects.”

“There is no set of algorithms capable of encompassing the multidimensionality needed to include all the parameters necessary to identify someone,” continues the expert in digital ethics.

“The process of identifying a person from their biometric data is an already problematic process, based on eugenic ideas.

Skin color, for example, is a continuum; you cannot divide it into a fixed number of categories.

That, at a technical level, is a problem, because the systems need to delimit the variables.

And this rule applies to a large number of facial features, such as the opening of the eyes and mouth, the size and shape of the nose, etc.,” she explains.

The more sensitive information is accumulated, the more likely it is to be misused.

“We believe that it is not really necessary to collect biometric data, since it violates fundamental rights and constitutes a violation of privacy,” says Youssef M. Ouled, coordinator of AlgoRace, a group that investigates the consequences of the racist use of artificial intelligence.

“There is a huge data gap on these systems, which are not audited.

They bet on securitization; they go in the opposite direction of what civil society is demanding.

And they have to do with the State's obsession with having a lot of data about us without us knowing very well what they are going to do with it,” he adds.

“This technology has enormous potential to be something dangerous, and we don't know what its real potential is to be beneficial,” Troncoso concludes.

“We have no guarantee that it will not have bad consequences.

Its use is justified in the name of efficiency, but what, or how much, are we going to gain?

Do we have any kind of evidence about it?

We are talking about implementing a technology that applies to all Spaniards, from which you cannot opt out, and with which we do not know whether we gain or lose.”
