
Zoom video conferencing tool: civil rights activists warn of emotion scanners

2022-05-13T16:01:12.044Z


Smile for the camera: The developers of the Zoom chat software are apparently planning to read emotions from the faces of conference participants. Critics fear abuse and violations of the law.



Zoom app on a smartphone and a notebook (symbolic image)

Photo: Thiago Prudencio / ZUMA Wire / IMAGO

Is the colleague still listening properly?

Do the students find the lecture boring?

Is the price too high for the customer?

Civil rights activists are appalled by the plans of the US company Zoom to develop an emotion meter for online video conferences.

With the help of artificial intelligence (AI), the provider of the video telephony tool apparently wants to analyze the body language of conference participants and read their mood from it.

A few weeks ago, the magazine "Protocol" reported on Zoom's plans.

In an open letter to Zoom, almost 30 organizations, including the Electronic Privacy Information Center (EPIC) and the American Civil Liberties Union (ACLU), are now speaking out against plans to analyze emotions in video conferences.

According to the letter, the attempt to read moods rests on the misconception that AI can capture human emotions.

The move would violate privacy and human rights.

"Zoom needs to halt plans to continue developing this feature."

Critics fear penalties for wrong feelings

The civil rights activists criticize that facial recognition has been proven to be "misleading, flawed and racist".

Such tools assume that all people use the same facial expressions, voice patterns, and body language.

But that is not the case.

With this basic assumption, the software is effectively programmed to discriminate against certain ethnic groups and people with disabilities.

In fact, studies show that facial recognition fails more often for Black people and people of Asian descent, and even after years of research it still exhibits racial bias.

The human rights organizations also criticize that the software could be misused.

Companies or universities may penalize workers or students for "wrong feelings," the letter says.

Zoom had not responded to a SPIEGEL request for comment by Friday afternoon.

AI puts pressure on salespeople

Software-based conversation analysis is nothing new.

Zoom already offers a tool to optimize online dialogues.

In a tool called Zoom IQ, an AI checks after the fact whether, among other things, a speaker used too many filler words, showed too little patience, or spent too much time on slides.

The more closely a salesperson follows the given rules, the more points are awarded.
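The article does not reveal how Zoom IQ actually computes its scores. Purely as an illustration of this kind of rule-based call scoring, here is a minimal Python sketch; the metric names, thresholds, and one-point-per-rule weighting are invented assumptions, not Zoom's method.

```python
from dataclasses import dataclass

# Hypothetical call metrics; the real inputs and thresholds are not public.
@dataclass
class CallMetrics:
    filler_words_per_minute: float    # "um", "uh", etc.
    longest_monologue_seconds: float  # proxy for letting the customer talk
    slide_time_fraction: float        # share of the call spent on slides

# Invented target rules: each satisfied rule earns one point.
RULES = [
    ("few filler words",   lambda m: m.filler_words_per_minute <= 3.0),
    ("lets customer talk", lambda m: m.longest_monologue_seconds <= 120.0),
    ("limited slide time", lambda m: m.slide_time_fraction <= 0.4),
]

def score_call(metrics: CallMetrics) -> int:
    """Return the number of rules the salesperson satisfied on this call."""
    return sum(1 for _, check in RULES if check(metrics))

if __name__ == "__main__":
    call = CallMetrics(filler_words_per_minute=2.5,
                       longest_monologue_seconds=90.0,
                       slide_time_fraction=0.5)
    print(score_call(call))  # -> 2 of 3 rules satisfied
```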

The civil rights groups already see a line crossed with this software.

Such analysis tools, they argue, put salespeople under considerable pressure.

Zoom markets the IQ software on the grounds that teams can be evaluated by performance.

A blog post on the subject includes a list of employees who are measured and evaluated according to their conversational skills.

Executives should be able to use this data "to make management decisions about their sales teams," writes Zoom.

Companies like Uniphore have already gone a step further.

AI is already being used there to read emotions from the camera image.

The target group includes call centers that monitor their employees with the software and use the data to train them further.

The Indian developers measure how carefully someone listens, whether they are satisfied, angry or surprised.

An emotion score indicates, as a percentage, how well the conversation is going.
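Neither the article nor Uniphore discloses how such a percentage is calculated. As a rough, hypothetical sketch only: one simple pattern is to aggregate per-frame emotion-classifier probabilities with hand-chosen valence weights. The labels, weights, and averaging scheme below are assumptions for illustration, not the vendor's method.

```python
# Hypothetical aggregation of per-frame emotion-classifier outputs into a
# single "how well is the conversation going" percentage. The emotion labels,
# valence weights, and simple weighted mean are illustrative assumptions.

# Each frame: probabilities for a few emotion labels, summing to roughly 1.
frames = [
    {"satisfied": 0.7, "surprised": 0.2, "angry": 0.1},
    {"satisfied": 0.5, "surprised": 0.3, "angry": 0.2},
    {"satisfied": 0.8, "surprised": 0.1, "angry": 0.1},
]

# Invented valence weights: positive emotions push the score up, negative down.
WEIGHTS = {"satisfied": 1.0, "surprised": 0.5, "angry": 0.0}

def emotion_score(frames: list[dict[str, float]]) -> float:
    """Average weighted valence across frames, scaled to a 0-100 percentage."""
    per_frame = [
        sum(WEIGHTS[label] * prob for label, prob in frame.items())
        for frame in frames
    ]
    return 100.0 * sum(per_frame) / len(per_frame)

print(f"{emotion_score(frames):.0f}%")  # prints "77%" for the example frames above
```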

Source: SPIEGEL
