
Legal, but harmful: why networks allow harassment, but not nipples


It's time to stop applying half-measures. If we break the obscenity rules and post a nipple, our account is suspended or our content taken down at lightning speed; but if we harass someone to the point of making their life unbearable, they pull out the free-speech card

A social network user reads the news on a platform. Unsplash

It has always been said that Law is where those of us who did not get the marks for Medicine or Fine Arts end up: a rote, boring degree, but one with "many career paths".

I started studying Law before Spain joined the European Union (EU), in the midst of a grinding crisis very similar to the one we are experiencing now.

It is no secret to anyone that I am as old as the hills.

At that time, there was no room for dreams.

Parents did not consider you special or encourage you to enjoy your work.

How could it occur to anyone that one of the Lord's punishments for original sin might be fun?

You did what you had to do, without great expectations from life.

I was bored stiff, and I did not prepare for the State Lawyers' examination or any other civil-service exam.

However, when I finished, I found the practice of law exciting, and Law itself an endless combination of social, technical and psychological factors.

Regulating the social realities, expectations and needs of citizens in a time of extreme fluidity is as exciting as it is complicated.

We deal with old problems, but on such a scale that they become new.

The rules we had for regulating the relationships of those same humans are useless, and the collisions with classic rights a tangle that is difficult to unravel.

Especially if corporate and economic interests appear in the appropriate offices.

Disinformation, legal but harmful content, is a problem we have been dealing with since Brexit and the election of Donald Trump as president of the United States.

When the phenomenon became uncontrollable, heads turned towards lawyers and politicians, demanding a solution for speech that, though protected by freedom of expression, is clearly harmful.

Something so intuitive has turned out to be very difficult to regulate for various reasons.

We start from diffuse definitions of what fake news is, of what misinformation is, of what is merely inconvenient, and of which expressions, while neither convenient nor polite, are protected by freedom of expression.

The manuals did not say how to protect society from the amplification from which the foolish, the wicked and the indolent benefit.

The UK is seeking a definition of "legal but harmful": self-harm, bullying or eating disorders qualify, but they have not gone further

The Anglo-Saxons, who thanks to the very flexibility of their system are always capable of finding effective and imaginative solutions to complex problems, have opened the can of worms of defining "legal but harmful speech", and they have run into the same problem as everyone else.

The British Online Safety Bill (a proposed online safety law which, if approved, would be the world's strictest regulation of the surveillance of online content) is bogged down trying to find a definition of what "legal but harmful" speech is.

They are clear that posts about self-harm, bullying or eating disorders would fall under this definition, but they haven't gone any further.

It is so complicated that the EU has not even tried.


The Digital Services Act (DSA), approved this July, establishes the obligation of large platforms and search engines to prevent any "abuse".

To do so, they will have to carry out a risk analysis of such abuse occurring, implement measures to mitigate that risk, and then have everything audited by an independent third party.

The risks are so generic and so difficult to assess (electoral misinformation or manipulation, cyber-violence against women, harm to minors) that almost any analysis and evaluation will do.

What European legislators have been careful about is defining what "illegal content" is: it will be whatever each member state says it is, and content can only be removed where it is illegal under national law, not in other countries.

In the US, they have chosen to protect minors, which is always simpler and more rewarding.

Senators Richard Blumenthal (Democrat of Connecticut) and Marsha Blackburn (Republican of Tennessee) introduced a bipartisan bill inspired by some of the DSA's obligations, such as the creation of algorithmic-transparency requirements, although these would apply only to minors, leaving adult users unprotected.

The debate is why there are no public places not subject to the interests of companies that are outside the jurisdiction of their judges

All this complication serves to avoid declaring service providers and search engines, such as Facebook, TikTok, Twitter or Google, responsible for the content they publish.

These operators take refuge in the immunity granted to them by Section 230 and by the European regulations that copied it, which allow anything to be published without any filter or moderation.

At the dawn of the commercial internet, back in the 1990s, the principle was established of the librarian or bookseller who cannot be expected to know the content of everything he sells.

If a book is illegal, the book is removed, but the bookstore is not closed.

If a newspaper publishes illegal content, however, the newspaper is responsible, except for opinion columns like this one, for which it is not liable (and quite rightly so).

But we are no longer in the internet of the 1990s. Providers are companies with outrageous profits that are teaching cars to drive themselves and are perfectly capable of reading the entire library.

They already do so by applying algorithms that offer us not the content we have chosen, but content aligned with their economic interests and those of their clients, who are not us.

They already perform editorial work, from which it is impossible to escape, in the absolute tranquility of not being responsible for anything.

In fact, that editorial work is at the root of the problem.

Many think that revoking or eliminating these providers' immunity would mean that, to protect themselves from global claims, they would take a defensive stance and censor left and right.

It's time to stop applying half-measures.

Our relationship with these companies is contractual.

If we break the obscenity rules and post a nipple, or make the company Christmas video with a copyrighted song, our account is suspended or our content taken down at lightning speed; but if we harass someone to the point of making their life unbearable, they pull out the free-speech card.

Because it costs them nothing.

The debate is not whether making social networks and search engines responsible for the content they already publish is an attack on freedom of expression, or whether it is technically possible. It is. It is already happening when they prioritize garbage over other content.

The debate is why there is no competition so that citizens can debate in public places not subject to the interests of the shareholders of companies that are outside the jurisdiction of their judges.

It is not a question, as the DSA says, of "carefully weighing" whether timid and technical measures imply a restriction on freedom of expression, but of tackling the problem at its root.

It is about applying the environmental principle that the polluter pays.


Source: EL PAÍS

All tech articles on 2022-08-03
