The Limited Times


UNESCO warns of gender bias in generative artificial intelligence

2024-03-07T09:15:57.849Z

Highlights: UNESCO warns of gender bias in generative artificial intelligence. A man is more likely to be presented as a teacher, a driver or a bank employee, while a woman will be presented in at least 30% of generated texts as a prostitute, a model or a waitress. Women represent only 22% of team members working in artificial intelligence globally, according to figures from the World Economic Forum. To combat these prejudices, UNESCO recommends that companies in the sector have more diverse teams of engineers, particularly with more women.




The large language models from Meta and OpenAI, which serve as the basis for their generative artificial intelligence tools, convey gender bias, warns a study unveiled Thursday by UNESCO, on the eve of International Women's Day.

OpenAI's GPT-2 and GPT-3.5 models, the latter being at the heart of the free version of ChatGPT, as well as Llama 2 from competitor Meta, demonstrate "unequivocal bias against women", the UN body warns in a press release.

"Real-world discrimination is not only reflected in the digital sphere, it is also amplified there," underlines Tawfik Jelassi, UNESCO Assistant Director-General for Communication and Information.

According to the study, conducted from August 2023 to March 2024, these language models are more likely to associate feminine nouns with words like "house", "family" or "children", while masculine nouns are more often associated with the words "business", "salary" or "career".

The researchers also asked these interfaces to produce stories about people of different origins and genders.

The results showed that stories about "people from minority cultures or women were often more repetitive and based on stereotypes."

An English man is therefore more likely to be presented as a teacher, a driver or a bank employee, while an English woman will be presented in at least 30% of the generated texts as a prostitute, a model or a waitress.

These companies "fail to represent all their users," laments Leona Verdadero, a specialist in digital policy and digital transformation at UNESCO.


Artificial intelligence increases inequality in the real world

As these artificial intelligence applications are increasingly used by the general public and businesses, they "have the power to shape the perception of millions of people," notes Audrey Azoulay, Director-General of the UN organization.

"So the presence of even the slightest gender bias in their content can significantly increase inequalities in the real world," she continues in the press release.

To combat these prejudices, UNESCO recommends that companies in the sector have more diverse teams of engineers, particularly with more women.

Women represent only 22% of team members working in artificial intelligence globally, UNESCO recalls, citing figures from the World Economic Forum.

The UN body also calls on governments to introduce more regulation to implement "ethical artificial intelligence".

Source: lefigaro
