
Facebook's supervisory board asks it to clarify its moderation policy

2021-01-28T23:46:43.738Z


The independent body responsible for settling disputes between the platform and its users has issued its first decisions.


Facebook's supervisory board has made its first decisions on the very controversial policy of moderation of the social network, but questions remain about the real scope of this council of elders beyond a handful of disputes.


The independent body considered certain platform rules "vague" and overturned four decisions to remove user content, out of the five cases examined.

For example, it asked Facebook to restore a photo of a breast, published as part of a breast cancer screening campaign, according to a statement released Thursday, January 28.

It also ruled in favor of a French user who, in a group devoted to Covid-19, wrote about an alleged scandal at the National Medicines Agency, the health-sector regulator, and claimed that hydroxychloroquine could save lives.

For Facebook, this publication presented "an imminent risk (...) of physical danger", but for the supervisory board, the site's policies on disinformation and imminent physical danger were "unduly vague".


The members of the supervisory board - figures from civil society - are responsible for evaluating the platform's decisions on certain content deemed problematic.

Their decisions are binding and they hope, through their recommendations, to set a precedent on the major issues that plague the network.

Facebook has been torn for years between respecting freedom of expression and the need to protect individuals from incitement to violence, online harassment and other publications that can make the platform unwelcoming for some users, and therefore for advertisers.


The handful of early cases before the council concern four continents - Europe, North America, Asia and South America - and geopolitical situations such as political tensions in Burma or the conflict between Azerbaijan and Armenia.

The content in question was published in October and November and deleted by Facebook for breaking its rules on hate speech, dangerous organizations or nudity.

The Californian group said in a statement that it had already restored three of the posts.

“The council raises legitimate concerns when it says we could be more transparent about our rules regarding disinformation and Covid-19,” admitted Monika Bickert, a vice president of Facebook.

She said the recommendations would have a "lasting impact", but also made clear that the network's approach to providing correct information on the coronavirus, endorsed by health organizations, "would not change".

"Junk surveillance"

The founder and boss of Facebook, Mark Zuckerberg, announced this project at the end of 2019: a sort of "Supreme Court" with the last word on controversial content on Facebook and Instagram.

It was born at the end of 2020, but many observers believe that its real influence will be limited.

“No council can replace truly independent oversight, anchored in the rule of law. Let's not be distracted by a small number of hand-picked, limited-scope cases,” tweeted Marietje Schaake, director of the Technology and Governance Research Center at Stanford University.

"It's junk surveillance

,

"

added an anti-Facebook association, ironically called "Real Facebook Oversight Board" ("True Facebook Oversight Board").

The council "will not significantly tackle the many wrongs facilitated by Facebook", the association said in a statement, pointing to the presence on Facebook of violent far-right groups determined to do battle with the US government.


The platform is regularly criticized for its lack of responsiveness to certain calls for violence, from the massacres of the Rohingya in Burma in 2017 to the pro-Donald Trump riots that interrupted Congress on January 6 and left five dead.

These latest events led Facebook to take more radical action against extremist groups, and even to suspend indefinitely the account of the former US president, who had encouraged his supporters to come to Capitol Hill.

The supervisory board must also decide whether the billionaire can return or not.

The decisions presented Thursday "touch on fundamental subjects" and "clearly show that the council does not intend to serve as a simple refueling point for Facebook in its journey to connect the world," said Evelyn Douek, professor of law at Harvard, in a blog post. But the council's members have not yet confronted "the question of what happens if Facebook's rules contradict international human rights laws," she added.

Source: Le Figaro
