The Limited Times


Invisible discrimination: How sexist artificial intelligence can be

2024-02-29T08:33:22.954Z

Highlights: Invisible discrimination: How sexist artificial intelligence can be. In 2023, women still earned 18 percent less per hour than men. When not only people but also technology discriminates systematically, the discrimination remains invisible. The Anti-Discrimination Agency is calling for new laws to combat sexism in AI. Discrimination is particularly difficult to prove because, according to the Anti-Discrimination Agency, the General Equal Treatment Act (AGG) has gaps when it comes to AI. In the past, even Apple has faced allegations of sexist algorithms in connection with the Apple Card credit card.



As of: February 29, 2024, 9:24 a.m.

By: Cefina Gomez


When not only people but also technology discriminates systematically, the discrimination remains invisible.

The anti-discrimination agency is calling for new laws.

Munich - Equality between women and men is still moving at a snail's pace.

This is also shown by a study by the European Academy for Women in Politics and Business Berlin e. V.: 60 percent of the female politicians surveyed under the age of 45 stated that they had already been exposed to sexualized discrimination.

By comparison, only three percent of men reported similar experiences.

The study makes it clear that there is still a lot of room for improvement.

However, coming to terms with and fighting against sexist discrimination will become increasingly difficult if discrimination comes not only from humans but also from automated processes through AI.

Sexism is also noticeable in salaries.

In 2023, women still earned 18 percent less per hour than men.

How AI can be trained for sexism © Imago/Christian Ohde

Sexism in AI: When women are disadvantaged in the lending and application process

Banishing discrimination and sexism from our society represents a huge challenge. But there is particularly little room for action when discriminatory structures remain hidden from those affected.

A report on protection against discrimination through algorithmic decision-making systems by the Federal Anti-Discrimination Agency makes clear what problems AI brings with it.

The independent Federal Commissioner for Anti-Discrimination, Ferda Ataman, warns: “AI makes many things easier – unfortunately also discrimination.”

Since many companies today rely on AI-supported algorithms, especially in application processes or when granting loans, women are systematically disadvantaged.

“Here, probability statements are made based on general group characteristics,” says Ataman.

This apparent objectivity, however, contributes to the reproduction of prejudices and stereotypes.

However, applicants should also be careful when using AI.

Anyone who has their application for a new job written by ChatGPT could face legal consequences.

In the past, even Apple has had to face allegations of sexist algorithms

According to a report by Spiegel, Apple came under criticism in 2019 for sexist algorithms used in the Apple Card credit card. Entrepreneur and best-selling author David Heinemeier Hansson complained about this in a tweet.


So-called deepfakes, generated by generative artificial intelligence, are currently being used online for misogynistic purposes, for example by circulating fake pornographic images.

China was the first country to pass laws to curb deepfakes.

AI and algorithms can also reinforce racism and social inequality

According to the Anti-Discrimination Agency, the so-called "child benefit affair" in the Netherlands also exposed institutional racism embedded in algorithmic systems.

Over a period from 2014 to 2019, the Dutch government incorrectly asked around 20,000 parents, most of whom had dual citizenship, to repay child benefit.

The “robo-debt scandal” in Australia was similarly problematic.

Prejudicial thought patterns were automated on a large scale, resulting in supposed tax debts being claimed in a short period of time without taking seasonal employment or student activities into account.

This even drove some young people to suicide, as the FAZ reported.

Deep-seated racism is also evident in technology, as research from the MIT Media Lab makes clear.

Researcher Joy Buolamwini evaluated data sets of 1,279 faces and found that facial recognition software works most reliably for white men. Error rates were highest for darker-skinned women.

AI regulation: Anti-discrimination agency calls for the Equal Treatment Act to be adjusted for AI

The decisions made by algorithms depend on the data sets that are entered into the system.

If the people responsible for that data are predominantly men, one-sided information is fed into the system.
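The point about training data can be illustrated with a minimal sketch (all records and figures below are invented for illustration, not drawn from the report): a naive "hiring score" model that simply learns historical acceptance rates per group will reproduce any bias in those records exactly.

```python
# Minimal sketch with invented data: a naive model that learns
# historical acceptance rates per group. If past hiring decisions
# were biased, the learned "score" reproduces that bias one-to-one.
from collections import defaultdict

# Hypothetical historical hiring records: (gender, was_hired)
history = [
    ("m", True), ("m", True), ("m", True), ("m", False),
    ("f", True), ("f", False), ("f", False), ("f", False),
]

def train(records):
    hired = defaultdict(int)
    total = defaultdict(int)
    for gender, was_hired in records:
        total[gender] += 1
        hired[gender] += was_hired  # True counts as 1, False as 0
    # The "model" is just the acceptance rate per group,
    # learned directly from the biased history.
    return {g: hired[g] / total[g] for g in total}

model = train(history)
# model == {'m': 0.75, 'f': 0.25}: the historical bias becomes the prediction
```

Real systems use far more features, but the mechanism is the same: the model has no notion of fairness, only of patterns in the data it was given.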

According to an analysis by the management consultancy McKinsey & Company, the tech industry is largely dominated by men.

In Germany, the proportion of women with bachelor's degrees in STEM subjects (in German, "MINT": mathematics, computer science, natural sciences and technology) is only 22 percent.

It is particularly difficult to prove discrimination because, according to the Anti-Discrimination Agency, the General Equal Treatment Act (AGG) has gaps when it comes to AI.

Ataman is therefore calling for lawmakers to limit this invisible discrimination and adopt AI regulations.

Increasing automation through artificial intelligence offers many opportunities, but also some risks.

(cg)

Source: merkur

