
Facebook and Instagram: who are the "wise men" of the board tasked with keeping a close watch on the networks?

2021-02-12T07:52:09.989Z


LE PARISIEN WEEK-END. When Facebook or Instagram deletes a post, users can appeal to a supervisory board.


Thousands of kilometers apart, each behind their own screen, five of the 20 members of Facebook and Instagram's supervisory board are currently debating the fate of Donald Trump's accounts on the two social networks.

Was Facebook right to cut off the former president's access the day after his supporters stormed the Capitol on January 6?

The “supreme court” of the Californian group, as it is nicknamed, has been working on the issue since January 29.

It has 90 days to decide.

It was Facebook's own management that asked the wise men to review this controversial choice.

A heavy responsibility, but one that does not daunt Helle Thorning-Schmidt, co-chair of the board and former Prime Minister of Denmark: "Examining the most difficult decisions is what we were created for," she says.

This explosive case is also a golden opportunity to introduce this legal curiosity to Facebook's 2.8 billion users and Instagram's one billion.

Tasked with reviewing moderation decisions taken on these social networks, the body, created at Facebook's initiative in the autumn of 2020, was set up in a way that guarantees its independence.

To that end, the company put $130 million into an endowment fund, financing the court and its administration for at least six years.

Former Danish Prime Minister Helle Thorning-Schmidt is one of the four co-chairs of Facebook's supervisory board.

Oversight Board  

"We have no obligation to Facebook, no one can tell us what to decide," insists Helle Thorning-Schmidt.

The board, which made its first verdicts public at the end of January, in fact overruled the social network in four of the five cases concerned.

Once decisions are announced, Facebook and Instagram have seven days to comply, and thirty days to respond to the board's recommendations for improving content moderation, the vast majority of which is carried out automatically by algorithms, supported by 15,000 human moderators.

Mark Zuckerberg cornered by "fake news"

This astonishing private supreme court is the fruit of long deliberation. CEO Mark Zuckerberg announced it in November 2018, at a time when his social network was increasingly blamed for the spread of "fake news".

"I have come to believe that Facebook should not alone make so many important decisions about freedom of expression and security," wrote the founder on his site.

After consulting numerous experts on human rights and freedom of expression, and holding a public consultation, the group published the charter defining how the board operates in September 2019 and appointed four co-chairs.

Helle Thorning-Schmidt is one of the first hires, after several direct discussions with Mark Zuckerberg.

With a background in politics and humanitarian work, she did not hesitate to accept this novel mission.

Facebook's “supreme court” was created on the initiative of CEO Mark Zuckerberg.

Reuters / Erin Scott  

“Content moderation is perhaps one of the most important issues of our time,” she tells us.

Three other co-chairs support her in the task: former US federal judge Michael McConnell; Jamal Greene, professor of constitutional law at Columbia University; and Colombian lawyer Catalina Botero-Marino.


It fell to the four of them to freely recruit the sixteen other members of this freedom-of-expression "dream team", which includes many law professors, some journalists, NGO officials and civil rights activists, such as Tawakkol Karman, 41, the Yemeni winner of the 2011 Nobel Peace Prize. "We ensured diversity of gender, ethnicity and age," says Helle Thorning-Schmidt, "and selected people who are not afraid to speak their minds."

20,000 requests

The Cameroonian Julie Owono, 34, director of the NGO Internet Without Borders, is one of those chosen.

She hopes to bring a different perspective.

"The debates around moderation of content focus on Europe and the United States, while Africa, Latin America or Asia are very important markets for these platforms," ​​she insists. .

The position, remunerated "according to the standards in force", we were told without further detail, is part-time, at around fifteen hours per week.

"We all also have a day job, as they say in English," jokes Julie Owono.

With the "judges" scattered around the world, from Australia to the United States via Great Britain, Taiwan, Pakistan and Indonesia, discussions take place over Zoom, in English, when the time difference permits.

"I often have meetings at 5 or 6 in the morning, to accommodate my colleagues," says the director of Internet Without Borders.

Julie Owono, director of the French NGO Internet Without Borders, is one of the 20 members of the board.

The Supervisory Board, which will be enlarged to 40 members this year, relies on its own administration, with around 40 employees, spread between San Francisco, Washington and London.

Upstream, these staff filter the many requests submitted by users, presenting the institution with a preselection of cases likely to have a significant impact on users and to serve as precedents for Facebook in similar situations.

Of the first 20,000 requests received, ten were presented to the board, which retained six, unveiled in early December.

Each case is then assigned to a panel of five members, including one from the relevant geographic area.

To deliberate, the panel can ask Facebook for additional information about the offending post, consult outside experts and draw on the public comments that anyone can submit for each case on the board's website.

READ ALSO > Facebook will create 1,000 jobs to moderate "dangerous content"


"We received a hundred out of the first six cases", specifies the director of the administration of the council, the Briton Thomas Hugues, who expects their number to increase with the notoriety of the institution.

Drawing on the principles of international law and Facebook's community standards, the panel deliberates and finally submits its conclusions to a majority vote of the full supervisory board.

Facebook overruled for the first time

The first five decisions were unveiled on January 28.

Four of the Californian group's decisions were overturned; one was upheld.

"It's historic! For the first time, content moderation decisions were taken outside the company," insists Helle Thorning-Schmidt.

This inaugural series illustrates the spectrum of difficulties.

"No case called for a straightforward answer," admitted co-chairman of the board Michael McConnell at the press conference call accompanying the announcement, "the language and cultural background of the author had to be taken into account. of the deleted publication.

We have tried to do our best.

"

READ ALSO > End of life, nudity, violence: why Facebook is blocking ... or not


Facebook had thus deleted the post of a user from Myanmar (formerly Burma) who declared that "something was wrong with Muslims".

Hate speech, according to the social network.

But a more careful translation of the Burmese text changed its meaning.

The sentence was not in fact aimed at Muslims in general, but at those who remained indifferent to the treatment of Uyghur Muslims even as they were indignant about the Charlie Hebdo cartoons.

The board requested that the post be republished.

It also reinstated photos of breasts posted on Instagram, which violated the network's nudity standards but were part of a breast-cancer awareness campaign, as well as a Facebook post featuring a quote from Joseph Goebbels.

Hitler's propaganda minister does indeed appear on Facebook's list of "dangerous individuals", but the message in question was not a defense of Nazism.

It compared Donald Trump's presidency to the Third Reich.

Outrageous, perhaps, but allowed.

Debate on Professor Didier Raoult's remedy

Finally, the post of a French user arguing that the National Medicines Safety Agency should have authorized hydroxychloroquine - the supposed Covid-19 remedy promoted by the Marseille professor Didier Raoult - was put back online.

Facebook had deleted it, deeming the comment "disinformation" and judging that it posed a "risk of endangering others".

Asked by the company to review the case, the wise men did not see it that way.

For them, since the drug is not available in France without a prescription, there was no endangerment.

Facebook could therefore have intervened less heavy-handedly, for example by adding contextual information.

READ ALSO > Facebook ready to work with France to fight against hate content


On the other hand, the judges upheld the deletion of a message calling Azerbaijanis "taziks", a pejorative term in Russian, which the algorithm had classified as "hate speech".

This time, the board members struggled to reach agreement, with a minority judging the remarks "offensive, but not dehumanizing", as noted in the report published on the court's website.

"The final declaration is a compromise, not everyone finds their exact point of view", admitted Helle Thorning-Schmidt at a press conference.

“The disagreements show the difficulty of these issues and the diversity of views within the board,” added Michael McConnell.

"Some members have a more American conception of freedom of expression. I studied law in France, where we are also aware of the limits of this principle," Julie Owono had cautioned us a few days earlier.

Wise men plead for more human moderators

The "supreme court" of Facebook and Instagram accompanied its verdicts with nine recommendations aimed at improving moderation methods, in particular by making them more transparent.

The board thus asked that the list of people and organizations the company classifies as dangerous be made public, and that every user be informed of the reasons their post was deleted.

Facebook will also have to define what it qualifies as “disinformation”.

READ ALSO > Facebook will compensate its traumatized moderators up to 48 million euros


"It's not easy," admits Helle Thorning-Schmidt, "but they have to try. We also recommended more human moderators. That would prevent some of the absurd decisions taken by artificial intelligence, such as removing the breast-cancer awareness photos."

Facebook is not obliged to follow these recommendations, but it must publish a response.

Some already point to the limits of this system, through which Mark Zuckerberg's company - like a veritable state! - has created its own justice, under rules it defined itself.

"In an ideal world, there would be an independent council, like ours, but organized as a multinational institution, like the UN," admits Helle Thorning-Schmidt.

Nothing of the kind is on the horizon.

"This advice is the second best option", slices the former Prime Minister.

Source: Le Parisien
