The Limited Times


Moderation: Meta Oversight Board Criticizes Facebook and Instagram Privileges for Public Figures

2022-12-06T12:51:08.153Z




Meta's Oversight Board on Tuesday criticized the social media giant's platforms for giving preferential treatment to problematic content posted by politicians, executives, celebrities and other public figures.

"

The council is concerned about the way Meta has put its economic interests ahead of content moderation

," said the entity described as independent, but funded by the company.

In its report, it calls for a "significant overhaul" of the double-review program known as "cross-check" to make it more transparent, more responsive and fairer.

Currently, when posts or images potentially violating Facebook or Instagram policies are flagged, they are promptly removed if they are deemed high-risk and come from ordinary users.

But if their author is on the "whitelist", the content remains online while it is examined more closely, a process that usually takes several days and sometimes several months.

This two-speed, "unfair" system has therefore "offered additional protections to the expression of certain users, selected in part on the basis of Meta's economic interests", the report states.

Because of "cross-check", "content identified as contrary to Meta's rules remains visible on Facebook and Instagram while it spreads virally and could cause damage", the board admonishes.

It recommends accelerating the secondary reviews of content from public figures who may be posting important human rights messages, and removing high-risk content while the internal review is pending.

It also asks the company to publish the criteria for inclusion in the program, and to publicly label, on the platforms, the accounts of the users it covers.

A faster response promised

"

We created the double-check system to avoid potentially overly harsh decisions (when we take action on content or accounts that don't actually violate our policies) and double-check cases where there might be a higher risk. high error rate or when the potential impact of an error is particularly severe

," said Nick Clegg, Meta's head of international affairs, in a statement.

The group has also committed to giving a fuller response to the board's recommendations within 90 days.


The board is made up of 20 international members: journalists, lawyers, human rights defenders and former political leaders.

It was created in 2020 at the proposal of CEO Mark Zuckerberg and is tasked with evaluating the Californian group's content moderation policies.

Members launched a review of "cross-check" in October 2021, following revelations from whistleblower Frances Haugen, who leaked internal documents to the press and accused the company of putting "profits before the safety" of its users.

Meta's responses to the inquiry were "insufficient at times", and "its own understanding of the practical implications of the program leaves much to be desired", the board said.

Source: lefigaro
