
"Social networks must be responsible for the content they host!"


FIGAROVOX / TRIBUNE - According to Dany Cohen, agrégé of the law faculties, it is time for digital platforms to bear the same legal responsibilities as traditional media.

Dany Cohen is an agrégé of the law faculties and a university professor at Sciences Po.

The atrocious act perpetrated against Samuel Paty raises, among other questions, that of social networks.

Without them, the poisonous slanders spread by a pupil's parent would not have had the echo that alerted and galvanized a fanatic.

The horrors disseminated by these networks (from racism to bloody acts, sometimes on video) can no longer be counted.

With each excess, we talk of legislating - and sometimes we try - but we leave the essential in the shadows: it is our own law that has created a tailor-made regime, designed to shield Facebook, Twitter and the other networks from the consequences of their actions.

Created by a law of June 21, 2004 (transposing a European directive), the status of "host" shields these privileged actors from virtually any liability, whatever the content they convey.


The principle, however, in French law as in so many other countries, is that everyone answers for the consequences of their actions - in a word, is liable.

But thanks to the 2004 law, every social network has two sides: on the one hand, the individuals who frequent it and who are, like all of us, free but responsible for what they say (and therefore punishable); on the other, the network itself, which disseminates these words and images but assumes no responsibility.

Over time, this system has only been amended at the margins: the networks remain exempt from all liability, whatever the images or comments they spread, and are only at fault if, after having been notified of "manifestly unlawful" content, they fail to withdraw it within a reasonable time.

The argument used by the networks is simple: we are not the authors of the messages.

Their situation is, however, the same as that of any press publisher (print or broadcast): a newspaper that publishes an op-ed, or a radio station that hosts a guest, is not the author of the message either; they are nonetheless jointly liable, along with the author, because the message reaches the public only through the distribution they provide.

The media have never disputed their responsibility and no one claims that our press law hinders the essential freedom of expression.


Social networks add that, unlike other media, they invite no one (access is free) and are mere pipes passively conveying content to which they remain strangers: they do not choose.

These Pontius Pilates are in reality Tartuffes.

Try posting on Facebook the image of a glimpse of nipple or buttock: painters, sculptors and photographers have suffered this censorship.

Each network therefore makes choices, just like a press organ, but unlike the latter, it enjoys immunity.

It matters little that what the network lets through and what it blocks is decided by algorithms: those are written by humans.

These networks also employ moderators who obey them.

Would the flood of exchanges make it impossible to examine them all?

Simple and striking because seemingly common-sense, the argument is nevertheless flawed: the algorithms that block or let content flow are sophisticated technological filters belonging to some of the richest companies in the world.


It is therefore disconcerting that the public authorities contemplate only with trembling applying the common rule to the networks: you are liable for what you broadcast, because it is through you that the remark or the image spreads its harm.

This would not treat social networks worse than other media, but only put them in the same boat.

This is called equality before the law.

An exemption
ab initio
- that is to say an anomaly, a break with our principles - should be created only if there is an overriding reason of public interest to grant a privilege.

Where is it here?

The few measures envisaged by the government (forcing networks to remove illegal content more quickly, on pain of heavier fines) are mere sticking plasters, because they preserve the principle of initial immunity.

Rejected by the Constitutional Council, these measures have many drawbacks, including that of giving a few dominant private companies the right to decide what may be said or shown and what may not ... thus replacing the late Index of the Catholic Church with the conceptions of a Facebook (for example), ultra-puritan about anything that might seem sexual and very lax toward barbarism and hatred.

In a democracy, however, the law confers this power on the judge;

for 139 years, French judges have performed this task to perfection, preserving a broad freedom of expression while sparing us ignominy.


My proposal raises three objections:

1) Would it not be less effective than the current system (possibly strengthened), which encourages the network to remove content promptly in order to escape all liability?

No, since it could easily be provided that promptly removing content reduces the network's liability, so the virtuous incentive would remain ... an incentive which, for the time being, is not so clear: during lockdown, three French anti-racist organizations reported some 1,200 hateful tweets to Twitter over a few weeks.

After a few days, only 17% of them had been deleted.

Even if that rate were doubled, a majority of the horrible content would remain.

Subjecting all media to the same rule would, by contrast, leave the networks free to broadcast what they want ... on condition of assuming responsibility for it.

2) Would we not then witness a flood of lawsuits?

No doubt, but there is nothing abnormal in that: these giants base their business on the exchange of messages and images, on the buzz and the reactions that result from it.

This risk is therefore inherent in their activity.

No one can predict whether this would affect their business model, but is that concern really decisive in a public debate that puts the fundamental values of our society at stake?


3) Will the judges be able to deal quickly with this type of dispute?

The objection is only half admissible: there has long existed an "hour-to-hour" summary procedure (référé d'heure à heure), by which a judge rules the same day, for example to authorize or suspend the broadcast of a television program.

French justice is certainly very short of human resources, but on this point it is legitimate and of exceptional quality;

its reasoned decisions are handed down at the end of a public hearing.

Recruiting a few dozen additional judges to deal with these essential disputes would mark the political will to reconcile order and freedoms.

Conversely, to set up as arbiters of the permitted and the forbidden firms whose only legitimacy is their hyperpower would amount to abandoning an essential part of democratic debate to private interests.

Source: lefigaro, 2020-10-26