This artificial intelligence has it in for me

2021-02-10T01:07:08.532Z


Algorithms reproduce the prejudices of their creators and of society.



The time has come when artificial intelligence makes decisions that affect our lives: whether we are chosen for a job, granted a mortgage, or accepted at a university; whether we can cross a border; whether the police should monitor us before we commit a crime; the price of a health insurance policy.

The algorithm becomes a judge that does not allow for appeal.

Why didn't my credit card limit go up?

Because the computer won't allow it, the bank clerk, still flesh and blood, will say.

And may I know why the computer won't allow it?

Don't try to figure out how it works, will be the answer.

A great debate around artificial intelligence is whether it is free from prejudice, as one would expect, since a program should respond to objective data, or whether, on the contrary, machines reproduce our biases, because they learn them from us.

The conclusion of those who have studied it is that, even when they learn on their own, AI systems absorb the biases of the society they analyze.

Historical biases of gender, class, ethnicity, and xenophobia.

This is not speculation: it has been studied.

As in the Gender Shades report, which in 2018 found that facial recognition programs failed more often with women and ethnic minorities.

IBM's Watson program was wrong up to 35% of the time for black women, but only 1% for white men.

The company created a team to correct Watson;

it reduced errors, but has not been able to eradicate them.

A more delicate matter: artificial intelligence applied to defense or public order.

Drones are already used in wars that not only strike a target but choose it.

Police are already beginning to profile suspects using AI, trying to anticipate crimes that no one has yet committed.

We can improve the programs, free them from our deep-rooted misgivings towards what is different, but there is a basic problem: the data the machine learns from in order to make decisions.

If it is historical crime data in the US, for example, it carries centuries of racist bias, and not only there.

If it is creditworthiness data from anywhere, you will see the effect of centuries of patriarchy.

Catherine D'Ignazio, professor at MIT and author of the book Data Feminism, is clear on this.

"Data will never be neutral because they are never 'raw' data. They are produced by human beings who come from specific places, have their own identities and particular histories, and work in specific institutions."

The fundamental question is whether artificial intelligence has to respond to how we are or, better, how we want to be.

Other articles by the author


  • If the network goes out, talk to each other

  • Against the culture of zasca

  • Resist the algorithm, get out of the bubble

  • Yes, I accept (whatever)

Source: elparis

