
Siri, Alexa, Sophia: why are virtual assistants and humanoids named after women?

2022-06-26T10:56:09.167Z


Experts warn of the dangers of feminizing and even humanizing robots. The multinationals argue that they use female voices and bodies so that society will accept their products.


Many hyper-realistic humanoids that burst onto the scene turn out to be a silicone or metal frame with a small nose and white complexion.

And usually with breasts.

That is, a gynoid: a female-looking robot.

Just go through the list.

For example, Sophia, inspired by Audrey Hepburn and the first robot to be granted citizenship (by Saudi Arabia), has been described as the perfect caregiver for children and the elderly.

Ai-Da is the first robot artist.

Erica, the first to star in a movie.

Samantha, the first sexual humanoid with artificial intelligence.

Siri (Apple), Alexa (Amazon) and Cortana (Microsoft) do not have corporeality, but they are female.

Although, for about a year now, after a barrage of criticism, these virtual assistants have also been available with a neutral or male voice.

Tech companies have repeatedly launched their chatbots, virtual assistants, and robots with female names, voices, and bodies.

Is there a scientific explanation behind this decision, or is it that, while the world advances in rights and equality, the progress made in the real world has not yet permeated the digital one?

Scientist Karl Fredric MacDorman, an expert in human-computer interaction who worked with Hiroshi Ishiguro, the father of the robot Erica, published a report in 2010 concluding that both men and women preferred female voices in their virtual assistants.

Since then, various companies have based themselves on this study to argue that the female gender attributed to their robots increases the use and sale of their products.

“Women have traditionally been in a nurturing and caring position, responding to questions and requests, and laboratories use this metaphor because they think that this way people will use their systems more,” he explains from the University of Indianapolis.

But for the scientist, what science says is one thing and what leads these companies to predetermine the femininity of their machines is another: “The decision that Siri, for example, would be a woman was made long before my report was made public.

I don't think some companies' decisions have been based on science.

There is probably something else: the reality is that there are many men in the laboratory.”

Specifically, only 26% of jobs in the field of data and artificial intelligence are held by women, according to data published by the 2020 World Economic Forum.

In line with the MacDorman report, a study translated as The Most Human Bot: Female Gender Increases Bots' Perception of Humanity and AI Acceptance concludes that injecting these lifeless bits of metal and elastomer with a dose of femininity makes them more human and therefore more acceptable.

According to the aforementioned study, positive human qualities are associated more with women than with men: they are imagined to be kind and warm.

According to Sylvie Borau, coordinator of the study and professor of ethical marketing, the biases of real life are reproduced in robotics, but the underlying reason “is not only that people see women as servants or simple tools, but also the desire to humanize these machines.

We prefer to interact with entities and people that we perceive as more trustworthy.”

A great ethical dilemma then emerges: humanizing the machine with feminine characteristics risks objectifying women.

In a scene from Mad Men, a series set in a 1950s ad agency, one secretary tells another, a newcomer: “[Bosses] may act like they want a secretary, but most of the time they're looking for something between a mother and a waitress.”

Eleonore Fournier-Tombs, a data and technology analyst at the United Nations Institute, explains over the phone that there are two types of bias in technology: those that are reproduced because artificial intelligence feeds on texts that are already stereotyped, and are not filtered, and those that are deliberately coded in.

For example, until 2019, when someone hurled sexist slurs at Siri, she would reply, "I'd blush if I could."

According to Fournier-Tombs, “there is an overrepresentation of young men in programming, and all these stereotypes and jokes get codified.”

That submissive response from Siri gave its title to a 2019 UNESCO report against the "programmed submission" of the female voice assistants operated by Apple, Amazon, and Microsoft.

The complaint pushed programmers to eradicate docile responses to sexism.

But the stereotype continues in their names, in the voice that sounds by default or out of habit, and in the answers that artificial intelligence systems track in a digital sea full of stereotypes.

Coming home and giving orders: “Alexa, turn off the light,” “Tell me what time it is,” “Shut up.”

People learn behaviors and social norms through their interaction with the environment, and according to Harvard sociologist Calvin Lai, “the gender associations that people unconsciously adopt depend on the number of times they are exposed to them.”

It is not trivial, then, that a boy or girl hears domestic orders addressed to a woman, or sees that it is a gynoid, not a male robot, who cares for their grandfather.

That same robot, Sophia, which has been on the cover of magazines like Elle and Cosmopolitan, stars in an interview with Will Smith broadcast on YouTube, for whom the meeting is more than just a conversation: it's a date, and he tries to kiss her.

There are two schools among those trying to break stereotypes in artificial intelligence.

For Borau it is necessary to humanize the machines and, therefore, the neutral gender "is not the solution that consumers prefer".

Her wager is random gender assignment: “You interact with a chatbot online and you are randomly assigned a female or male one, just like in real life.”

In the opposite position, the director of the UNESCO Division for Equality, Saniye Gülser Corat, lamented in an interview in 2020 with EL PAÍS: “Many large companies seek to make their voices even more human, natural and emotional.

I don't think it's positive.

Only in some cases, like caring for the elderly, can it be valuable because a human voice makes them feel better.

On those occasions it is important to indicate that it is a machine”.

“Robots and artificial intelligence are closed in on themselves, they don't evolve with society,” Fournier-Tombs denounces.

“I don't know why they need a particular gender, especially when we know that continuing with these systems can be risky for women.”

At the end of her meeting with Will Smith, Sophia looks into his eyes and concludes: “There is no reason to assign human motives to something that is not human. Dogs are our companions, for example.”


Source: El País

