
Would you agree to have your face 'read' at any time?

September 21, 2022


EU research team identifies facial processing apps that should be banned, limited and allowed


From left to right, Isabelle Hupont, Emilia Gómez and Songül Tolan, facial processing researchers, at the headquarters of the EU Joint Research Center in Seville this September. PACO PUENTES

Felipe Gómez-Pallete, president of Quality and Democratic Culture, warns that "once a technology is developed, no one can stop it."

And he adds: "We can regulate it, temper it, but we are late."

This is the case with facial processing technologies, which "read" and analyze our faces for dozens of purposes, from commercial or security applications to helping blind people or people with Alzheimer's.

The European Commission has drawn up a proposal to regulate artificial intelligence (the AI Act) in order to close this gap and anticipate advances that pose a "high or unacceptable" risk to fundamental rights, without hindering their development.

To provide the necessary scientific grounding, the European project Humaint, from the Joint Research Center (JRC), has produced an exhaustive report that identifies 187 companies developing facial processing technologies, describes what those technologies are and how they are used, and indicates which should be prohibited, limited or allowed: an indicative traffic light so that, this time, regulation arrives on time.

The study has been published in the journal Scientific Reports.

Any user will be familiar with the technologies that unlock a phone with facial recognition, or those that verify at a border that the holder of a passport matches the image printed on it.

Such uses are more common than they might seem.

Humaint has focused on 60 widespread uses of mature, already established technologies, which range from gauging a consumer's reaction to a product to tracing a missing person, locating a criminal suspect or detecting a driver's drowsiness.

But it also covers those that make it easier to identify someone in a space and who they are with, classify people by ethnicity, age or sex, or monitor their behavior during an exam or a meeting.

To facilitate regulation, the JRC research group has developed a four-color traffic light that corresponds to the risk levels defined by the AI Act: black for uses that should be prohibited, red for those that pose high risks and must be subject to strict requirements, yellow for those that must include warnings to users, and green for those with minimal risk.

Some applications push the limits of what is permissible and can fall under different colors depending on how they are implemented.
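By way of illustration, here is a minimal sketch of how such a four-tier classification could be encoded. The tier names follow the AI Act levels described above, but the example use cases, the RiskTier enum and the classify function are illustrative assumptions, not the JRC's actual taxonomy.

```python
# Illustrative sketch of the four-tier "traffic light"; the use-case
# assignments below are examples, not the JRC's official classification.
from enum import Enum

class RiskTier(Enum):
    BLACK = "prohibited"
    RED = "high risk, strict requirements"
    YELLOW = "transparency obligations"
    GREEN = "minimal risk"

EXAMPLE_USE_CASES = {
    "unrestricted facial recognition in public spaces": RiskTier.BLACK,
    "locating a dangerous criminal suspect": RiskTier.RED,
    "measuring customer satisfaction from facial expressions": RiskTier.YELLOW,
    "unlocking a personal device": RiskTier.GREEN,
}

def classify(use_case: str) -> RiskTier:
    # Unknown uses default to RED so they receive scrutiny rather than a pass.
    return EXAMPLE_USE_CASES.get(use_case, RiskTier.RED)

print(classify("unlocking a personal device").value)  # -> minimal risk
```

Defaulting unknown uses to the red tier mirrors the precautionary spirit of the proposal: a use case must be assessed before it can be treated as low risk.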


Isabelle Hupont, co-author of the research, clarifies that a distinction must also be drawn between facial processing and using biometrics to identify people: "We can analyze the face, its expressions, without identifying the person.

This is widely used in marketing techniques to see the reaction to a product or an advertisement.

You can also check whether, in a pandemic situation, people are wearing masks, but without identifying them."
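As a minimal sketch of that distinction, the code below counts faces in an image without ever attempting to establish whose faces they are. It assumes OpenCV with its bundled Haar cascade; "crowd.jpg" is a placeholder path, not a file from the study.

```python
# Minimal sketch: detect and count faces without identifying anyone.
# Assumes OpenCV (pip install opencv-python); "crowd.jpg" is a placeholder.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("crowd.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Only anonymous, aggregate information leaves the system: a count, no identity.
print(f"Faces detected: {len(faces)}")
```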

High risk

Among the black and red applications (prohibited or high risk) are surveillance with personal cameras, drones or robots; unrestricted facial recognition; unjustified searches for people or tracking of their movements; cataloging people through personal images; identifying clients to give them special treatment; and locating social network users and their contacts.

"Proportionality is important, because they can be used for certain cases, such as terrorist threats or the search for the disappeared, dangerous criminals and people wanted by justice," explains Songül Tolan, also a co-author of the investigation.

Along these lines, Emilia Gómez, coordinator of the Humaint project, points out that the risks of using facial recognition for access control can vary depending, for example, on whether or not the user has given prior authorization.

Transparent

The applications classified as yellow, which require transparency or informing people about their use, include demographic analysis, detection of personal characteristics (sex or age range, for example), smile detection, ad personalization, and measurement of customer satisfaction or of the emotional experience of a video game.

According to Hupont, this group would also include deepfakes (hyper-realistic audiovisual montages) and chatbots (artificial conversational agents).

In both cases, the researcher says, "it must be clearly stated that the image was created by artificial intelligence or that there is no human conversing."

Allowed

The uses that pose a low risk include border access, bank authentication or authentication for official procedures, device unlocking, perimeter protection, artistic emotional interaction, pandemic and occupancy control, matching a real image with a police composite sketch, and any use involving the consented, voluntary participation of the user.

Again, the lines are blurred.

There are systems that allow a blind person to know who is with them and their position.

Or others that allow an Alzheimer's patient to reconstruct their day and who they were with, to minimize memory lapses.

The technology can also help autistic people improve their emotional skills.

But those same applications, put to a use other than the one they were designed for, can turn red: dangerous and subject to limits.

This has been the case with programs designed to encourage student motivation based on their reactions, which were instead used for control and surveillance during online exams and classes.


"There are many uses that help companies or people, that improve life.

They must also be highlighted, because advances in this technology are not always negative.

These systems must be regulated, but their development must also be supported," says Tolan.

The European proposal to avoid these areas of blurred colors is that all high-risk facial processing applications obtain prior authorization, with a seal that guarantees a series of requirements summarized by Emilia Gómez: "Having a risk management system, transparency and communication, technical documentation and records, adequate data governance, human oversight, precision, robustness and cybersecurity."
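As an illustration only, such a seal could be tracked as a simple checklist; the field names below merely mirror Gómez's list and do not come from any official AI Act schema.

```python
# Illustrative compliance checklist; field names mirror the quoted list,
# not an official schema.
from dataclasses import dataclass, fields

@dataclass
class ComplianceChecklist:
    risk_management_system: bool = False
    transparency_and_communication: bool = False
    technical_documentation_and_records: bool = False
    data_governance: bool = False
    human_oversight: bool = False
    precision: bool = False
    robustness: bool = False
    cybersecurity: bool = False

def missing_requirements(checklist: ComplianceChecklist) -> list[str]:
    # An empty result means every requirement is marked as satisfied.
    return [f.name for f in fields(checklist) if not getattr(checklist, f.name)]

check = ComplianceChecklist(human_oversight=True, cybersecurity=True)
print(missing_requirements(check))  # six requirements still missing
```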

Isabelle Hupont demonstrates a facial processing application at the EU Joint Research Center in Seville.

Behind, Emilia Gómez (right) and Songül Tolan. PACO PUENTES

Transparency refers to the clarity of the entire system, from its inception to its implementation, so that its intentions, uses, biases and malfunctions can be verified, evaluated and monitored; solid and safe means that systems cannot be manipulated and are not prone to failure; and robust, that they withstand cyberattacks.

In this regard, Hupont notes that the intentional insertion of an element designed to defeat recognition (known as an adversarial pattern) can prevent, for example, a vehicle's image recognition system from identifying a traffic sign.
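The classic form of such a pattern is a small, deliberately crafted perturbation added to an image. Below is a minimal sketch of the well-known fast gradient sign method (FGSM), assuming a generic PyTorch image classifier; the function name and the epsilon value are illustrative choices, not part of the study.

```python
# Minimal FGSM sketch: nudge an image in the direction that most increases
# the classifier's loss, so the model misreads it. Illustrative only.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.03):
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step by epsilon along the sign of the gradient, then keep valid pixels.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```

Physical adversarial patterns, such as stickers placed on a traffic sign, apply the same idea in the real world rather than in pixel space.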

The researcher also points out that the proposed principle of human oversight requires, for facial recognition, a minimum of two people behind each system, and Gómez adds that the user of the application can be required to always be identified and authenticated.
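A minimal sketch of what such a "two person" gate could look like in software follows; the class, function and reviewer names are assumptions for illustration, not a real system's API.

```python
# Illustrative "two person" oversight gate: a facial-recognition match is
# acted on only after two distinct human reviewers sign off.
from dataclasses import dataclass, field

@dataclass
class MatchCandidate:
    case_id: str
    score: float
    approvals: set = field(default_factory=set)

def approve(candidate: MatchCandidate, reviewer: str) -> bool:
    # Returns True only once two different humans have approved the match.
    candidate.approvals.add(reviewer)
    return len(candidate.approvals) >= 2

match = MatchCandidate(case_id="case-042", score=0.91)
approve(match, "reviewer_a")
if approve(match, "reviewer_b"):
    print(f"Match {match.case_id} confirmed by two reviewers")
```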

Three models

These would be the basic pillars of the proposed regulation of facial processing technologies, but Songül Tolan warns that proper use can never be guaranteed 100%.

To that end, Emilia Gómez adds that complementary legislation on liability for failures will be needed, that citizens must report any abuse or discrimination arising from these technologies, and that the justice system must adapt to a reality in which Europe is at the forefront of regulation, though not of development.

On this point, the study details how the United States is the country at the forefront of the creation of facial processing applications, with a heavy presence of large private companies.

Europe, in second position, presents an ecosystem more tied to small and medium-sized companies, while China, in third place, has a singular landscape concentrated in a few large companies linked to the government.

This snapshot is essential because the key lies in who owns the databases.

The greater the "raw material," as Hupont calls the data, the greater the capability.

"The smallest private database of a large company is 12 times larger than the largest public database. And the tendency is not to share them," she warns.

This circumstance hampers the development of efficient technologies by the small and medium-sized companies subject to the governance Europe advocates, which is complemented by a commitment to a European single data market.

You can write to us at rlimon@elpais.es, follow EL PAÍS TECNOLOGÍA on Facebook and Twitter, and sign up here to receive our weekly newsletter.


Source: El País
