The Limited Times


US presidential election: how Wikipedia protects itself from disinformation

2020-11-03T23:00:08.631Z


The collaborative encyclopedia has put in place specific measures to avoid becoming an ideological battleground.


Limiting the spread of fake news for the US presidential election is a major issue for platforms.

In recent months, most social networks have put in place measures to limit the virality of content that could undermine the legitimacy of the poll.

But for the collaborative encyclopedia Wikipedia too, the stake is particularly important.

This website is indeed a source of information for many Internet users, but also for several search engines including Google.

Wikipedia's participatory operation means that its users can edit any articles they want to improve.

In the context of an election as polarized as the one taking place in the United States, whose legitimacy the Republican candidate Donald Trump has repeatedly attacked, the risk is high that the site's pages will turn into an ideological battlefield.

To avoid this, Wikipedia administrators have taken several steps.

First of all, the pages relating to the 2020 presidential election have been protected against uncontrolled edits.

Until the inauguration of the president-elect, only accounts created more than 30 days ago that have already made more than 500 edits to various articles will be able to edit them.

The objective: to prevent malicious actors from massively editing articles to stir up trouble in the election.
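The eligibility rule described above can be sketched as a simple check. This is a hypothetical illustration only: Wikipedia's actual "extended confirmed" protection is enforced inside MediaWiki, and the function below is not its real API.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch of the rule described in the article: an account
# may edit the protected election pages only if it is more than 30 days
# old and has made more than 500 edits.
def can_edit_protected_page(account_created: datetime,
                            edit_count: int,
                            now: Optional[datetime] = None) -> bool:
    now = now or datetime.utcnow()
    old_enough = now - account_created > timedelta(days=30)
    experienced = edit_count > 500
    return old_enough and experienced
```
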

Wikipedia argues that this risk is real but limited, believing that its model makes it less susceptible to fake news than social networks.

Unlike Facebook, Twitter and YouTube, whose moderation policies are enforced by employees and algorithms, Wikipedia relies on open debate to resolve publishing conflicts.

This slow, supervised procedure makes the site less vulnerable to disinformation campaigns, the encyclopedia says.

“We are obviously vulnerable to disinformation. But our free, ad-free model and our principles of neutrality, transparency and source reliability allow us to greatly limit false information on our site,” argues Ryan Merkley, chief of staff of the Wikimedia Foundation.

At least three reliable sources

Still, the result of the US presidential election will have to be established with great certainty before it is listed on Wikipedia.

Usually, information on the site can come from multiple sources, such as press articles or academic reports.

But in the case of the US election, it is the Associated Press news agency that acts as arbiter.

No winner will therefore be declared on Wikipedia before the AP has called the race.

To increase the reliability of this crucial information, some members of the encyclopedia are even considering waiting for matching calls from at least two other outlets, among Reuters, CNN, Fox News and the New York Times.
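That "at least three reliable sources" idea can be sketched as a simple consensus check. This is a hypothetical illustration, not an actual Wikipedia mechanism: the AP's call is published only once at least two of the other named outlets have called the race for the same candidate.

```python
# Hypothetical sketch of the multi-source rule described above.
OTHER_OUTLETS = {"Reuters", "CNN", "Fox News", "New York Times"}

def result_can_be_published(calls):
    """`calls` maps an outlet name to the candidate it has declared winner."""
    ap_call = calls.get("Associated Press")
    if ap_call is None:
        # The AP acts as arbiter: no AP call, no publication.
        return False
    confirmations = [o for o in OTHER_OUTLETS if calls.get(o) == ap_call]
    return len(confirmations) >= 2
```
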

These rules are the result of an extensive debate that has taken place over the past few months between Wikipedia editors.

An important step, since the risk of disinformation comes as much from malicious external actors as from the zeal of certain overeager “Wikipedians” who may want to update an article as quickly as possible.

In the future, these editors will be supported by machines.

Indeed, while Wikipedia often emphasizes its community spirit and its principle of collaborative moderation, its size is such that it can benefit from automating part of this work.

In its announcement about the US elections, the site describes several algorithm projects to assist editors in their task.

These tools will help them more easily spot unsourced edits, edit wars, malicious accounts, and inconsistencies between different articles.
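As one toy example of what such tooling might look like, the sketch below flags edits that add substantial prose without any citation markup. This is a hypothetical heuristic for illustration, not the Wikimedia Foundation's actual algorithm; the `<ref>` and `{{cite` patterns are standard wikitext citation syntax.

```python
import re

def flags_for_edit(added_text: str) -> list:
    """Toy heuristic, loosely in the spirit of the tools described above:
    flag edits that add more than 20 words without any citation markup."""
    flags = []
    has_citation = bool(re.search(r"<ref[ >]|\{\{cite", added_text, re.I))
    adds_prose = len(added_text.split()) > 20
    if adds_prose and not has_citation:
        flags.append("unsourced-addition")
    return flags
```

A real system would of course combine many more signals (account history, revert patterns, cross-article consistency), but the shape is the same: score each edit and surface the suspicious ones to human editors.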

Source: lefigaro

