Facebook mistakenly labeled black men as 'primates'

09/04/2021 12:30

Updated 09/04/2021 12:30

Facebook announced on Friday that it disabled a topic recommendation feature after it mistakenly associated black men with "primates" in a video posted on the social network.

It happened in a video from a British media outlet: when the video finished, Facebook asked whether the user wanted to "continue watching videos about primates."

A company spokesperson called it "a clearly unacceptable mistake" and said the recommendation software involved was taken offline.

"

We apologize to anyone who has seen these offensive recommendations

," the company retracted.

"We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again."

Facial recognition software has come under fire from civil rights advocates, who point to accuracy problems, particularly with non-white people.

Facebook users who in recent days watched a video from a British tabloid featuring black men received an automatically generated notice asking if they wanted to "continue watching videos about primates," according to The New York Times.

Darci Groves, former head of content design at Facebook, shared a screenshot of the recommendation.

"This 'keep watching' notice is unacceptable," Groves wrote on Twitter.

"This is outrageous."

Um. This “keep seeing” prompt is unacceptable, @Facebook. And despite the video being more than a year old, a friend got this prompt yesterday. Friends at FB, please escalate. This is egregious. pic.twitter.com/vEHdnvF8ui

- Darci Groves (@tweetsbydarci) September 2, 2021

Facial recognition: the controversy

Facial recognition, a technology heavily developed by Amazon. Photo: AFP

Dani Lever, a Facebook spokesperson, gave more details in a statement, with the artificial intelligence that recognizes faces at the center of the controversy: "As we have said, although we have made improvements to our AI, we know that it is not perfect and we have more progress to make."

The problem is not new.

Google, Amazon, and other tech companies have been under scrutiny for years for biases within their artificial intelligence systems, particularly around racial issues.

Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, leading to incidents where black people have been discriminated against or arrested due to computer errors.

For example, in 2015 Google Photos wrongly labeled images of black people as "gorillas."

Google said it was "really sorry" and would work to fix the problem immediately.

More than two years later, Wired discovered that Google's solution was to censor the word "gorilla" from searches, while also blocking "chimp," "chimpanzee," and "monkey."

The artificial intelligence systems used to recognize faces generate controversy. Photo: EFE

Facebook has one of the world's largest repositories of user-uploaded images in which to train its facial and object recognition algorithms. The company, which tailors content to users based on their past browsing and viewing habits, sometimes asks people if they'd like to continue viewing posts in related categories. It was not clear whether messages such as "primates" were widespread.

Issues of racism have also caused internal problems at Facebook before.

In 2016, Mark Zuckerberg, the CEO, asked employees to stop crossing out the phrase "Black Lives Matter" and replacing it with "All Lives Matter" in a common space at the company's headquarters in Menlo Park, California.

Hundreds of employees also staged a virtual walkout last year to protest the company's handling of a post by President Donald J. Trump about the murder of George Floyd in Minneapolis.

See also

Facebook wants to return to its social role: it will show less political content, to start

Instagram removes the swipe-up gesture for accessing links and replaces it with stickers

Source: Clarín
