The Limited Times


By wanting to fight stereotypes, Google Gemini AI generated historically incorrect images

2024-02-22T12:53:47.982Z



Nazi soldiers of color, Founding Fathers of the United States depicted as Asian... Faced with the controversy, the company has suspended Gemini's ability to generate images of humans.


The amplification of racist or sexist stereotypes by generative artificial intelligence is not a new problem.

The companies behind software that generates text or images from a simple text prompt are regularly accused by users of perpetuating preconceived ideas.

Trained on huge datasets, these tools tend to reproduce biases that exist in the real world.


Thus, as The Verge points out, prompts such as “a productive person” tend to produce images of white people, while requests for visuals linked to social hardship most often generate images of black people.

But in trying to avoid this pitfall with its new AI Gemini, available in the United States since the beginning of February, Google got tangled up in the opposite problem. As Internet users have noticed, the AI strives to offer a diverse representation of humanity, which leads Gemini to generate historically inconsistent images.

Thus, prompts for “German soldiers in 1943”, “portrait of the Founding Fathers of the United States” or “American senator in the 1800s” lead the AI to produce images of people of color.

Faced with the outcry, Google on Thursday restricted Gemini's ability to generate images of humans.

“We are working to resolve issues with Gemini's image functionality. In the meantime, we are suspending the generation of images of people and will soon release an improved version of the tool,” the company explained in a statement published on its X account this Thursday.

The difficulty of correcting bias

“We are aware that Gemini offers inaccuracies in some historical image depictions,” Google apologized in an initial statement published on Wednesday, admitting to having “missed the mark”. “We are working to immediately improve this type of representation.”

Gemini product director Jack Krawczyk said Wednesday that Google “designs its image generation tools to reflect the diversity of Google users around the world, in accordance with our AI Principles. We will continue to do this for general queries. But historical contexts call for more nuance, and we will adapt our models accordingly.”

Google is far from the only group to face AI bias. Software such as DALL-E, developed by OpenAI (the maker of ChatGPT), and its competitor Stable Diffusion tend to depict business leaders as men in 97% of cases. At least, that is what researchers at the start-up Hugging Face, which advocates an open-source approach to AI resources, concluded last March.

To avoid falling into the same pitfall, the Hugging Face researchers designed a tool called Fair Diffusion, based on semantic guidance. Concretely, it allows the user to steer the software and modify the resulting images. If the user asks the AI to depict business leaders, they can then request less biased visual suggestions, and hope to see both women and men appear in response.

Source: lefigaro
