
Social networks: seven ideas to tame them

January 17, 2021


Anyone who wants to take the power over expressions of opinion away from the social networks must also say who should get it instead. Politicians? The police? The community? Seven possible strategies against online extremism.



Demonstrators at the Capitol: planned openly and uninhibitedly on the Internet

Photo: Manuel Balce Ceneta / AP

Jack Dorsey sounded contrite: "I do not celebrate or feel pride in our having to ban @realDonaldTrump from Twitter," wrote the Twitter boss on his platform on Thursday.

"I think being kicked out is ultimately a failure on our part to foster healthy debate."

In this case, he argued, the decision was right; ultimately it was a matter of not endangering public safety.

But it also demonstrated something that he himself considers "dangerous": "the power an individual or corporation has over a part of the global public conversation."

After the storming of the Capitol in Washington, several politicians reacted less thoughtfully, responding reflexively with vague demands for stricter regulation of social networks.

One example is the Dutch MEP Kim van Sparrentak of the European Greens group.

She said: "We have to take the power over our freedom of expression out of the hands of private corporations and return it to democratic institutions." As if that were the easiest thing in the world.

As if governments were automatically the better guardians of freedom of expression.

Who should be allowed to determine how a statement on the Internet is to be judged, and who legitimizes them to do so?

Who can have the power to mute a president and thus withhold from millions of people what their head of state is saying?

Who could make such decisions at a speed appropriate to the technology?

And who gets to draw the boundaries, and where, for content that is available more or less globally?

Experts have racked their brains over such questions for years.

Every answer so far collides with reality at one point or another, which is why some get the impression that nothing is happening.

In addition to the experiments and constant adjustments to the networks themselves, there are several other concepts and alternatives for dealing with expressions of opinion, digital mobs and their cheerleaders:

1. Consistently prosecute criminal offenses on alternative social networks

On alternative social networks and forums such as Parler, Gab, TheDonald and some Telegram channels, the storming of the Capitol was planned particularly openly and uninhibitedly in advance.

Anyone who wants to fight right-wing extremism must investigate here.

Even though these sites have only a fraction of the users of Facebook, YouTube and Twitter, they are no longer a small niche.

Several million users are registered on sites like Gab and Parler.

There are German-language channels on the QAnon conspiracy ideology with tens of thousands of members.

Nevertheless, the German investigative authorities are concentrating their fight against hate speech on the major social networks such as Facebook, Twitter and YouTube.

The Network Enforcement Act (NetzDG), which is intended to combat hate speech and right-wing extremism online, only regulates the large platforms.

"The uninhibited nature of the posts on the alternative platforms leads to the movement's supporters becoming radicalized and violence being normalized," says network expert Miro Dittrich about the importance of alternative networks for violence spilling over onto the streets.

"What happens in digital spaces is still not taken seriously by the security authorities and seen as a real problem," criticizes Dittrich, who has been observing radicalization on the Internet for years.

The case of the QAnon movement shows how right-wing extremist conspiracy myths first spread on smaller sites, then circulate on established platforms and finally lead to street violence.

"Especially the wrong theses about a falsification of the election were posted again and again first in obscure Internet forums by QAnon supporters and then gratefully taken up by the Trump environment on larger platforms," ​​said Dittrich.

This interplay between alternative social networks and some conservative politicians and media should not be underestimated, warns Dittrich: "The right-wing extremists can rely on right-wing media and sections of the Republicans to take up their theses and legitimize them," he says with regard to outlets such as Fox News or the television network One America News Network (OANN) and some Republican politicians who openly repeat QAnon theses.

2. Pushback from civil society

In addition to prosecution, state regulation and deletions by the companies, there is another strategy in the fight against online hate speech: with the counter-speech approach, volunteers like Katja Schrickel try to argue against inflammatory content on social networks.

When she and the other volunteer members of the "I am here" association notice that racist content and false claims are threatening to prevail in comment sections on Facebook, they try to counter them with facts and arguments.

"We want to prevent commentators from becoming more and more radical without being contradicted and the mood in the comment columns only changing in one direction," says Schrickel, who has been involved in the association that started as a Facebook group for almost four years.

The concept of counter-speech has its limits, of course, says the 59-year-old, but the fight against online hate speech should not be left to the corporations alone.

"I am against the fact that we want to shape the public debate solely by the fact that the corporations delete accounts."

In addition to "I am here", there are other civil society organizations involved in counter-speech.

Supporting them would be a relatively straightforward measure to at least somewhat detoxify the mood on the Internet.

3. Uniform rules, uniform enforcement

Facebook employs around 15,000 so-called content moderators, who check videos, images and comments on behalf of the company.

Their job is to determine whether or not content violates the company's rules.

YouTube and Twitter also employ such digital firefighters.

But the rules according to which the moderators work are neither transparent nor consistently enforced.

Human rights activists have also criticized the tech companies for years because their moderators fail to delete even calls for extreme violence.

In countries such as Sri Lanka, India and Myanmar, numerous calls for violence circulated online, while serious crimes took place on the streets.

4. Regulation that provides opportunities for complaints and transparency

It is still only a proposal from the EU Commission, but the Digital Services Act in its current form would have a noticeable impact on social networks.

Among other things, it is meant to ensure that reporting illegal content to the operators of the networks becomes simpler and that people who are sanctioned have a right to object.

A platform such as Facebook would have to carry out an annual assessment of the "significant systemic risks" that could arise from posts and advertisements as well as from its moderation and recommendation systems.

The companies would also be obliged to allow independent studies of their risk management.

In short: more external oversight and stricter transparency requirements are intended to force the large social networks to act more decisively than before against attempted abuse of their platforms, or to prevent it in the first place.

Systematic violations would carry heavy fines.

All of this takes up long-standing demands from digital experts.

The Digital Services Act even includes an anti-troll clause: anyone who "regularly" reports content as illegal even though it is not should first be warned and then temporarily excluded from the reporting system.

What may look like a detail shows that the Commission has also dealt with the tactics of troublemakers who try to overwhelm the networks' moderation systems and get them to block innocent users.
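
To make the mechanism concrete, here is a minimal sketch in Python of how such a rule could work. It is only an illustration: the thresholds, names and data structures are invented, and the Digital Services Act itself prescribes no specific implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of the DSA's anti-troll rule: users whose reports
# are repeatedly rejected as unfounded are first warned, then temporarily
# excluded from the reporting system. Thresholds and names are invented.
WARN_AFTER = 5      # rejected reports that trigger a warning
SUSPEND_AFTER = 10  # rejected reports that trigger a temporary exclusion

@dataclass
class Reporter:
    user_id: str
    rejected_reports: int = 0
    warned: bool = False
    suspended: bool = False

def register_rejected_report(reporter: Reporter) -> str:
    """Called whenever moderators reject one of this user's reports as unfounded."""
    reporter.rejected_reports += 1
    if reporter.rejected_reports >= SUSPEND_AFTER:
        reporter.suspended = True
        return "temporarily excluded from the reporting system"
    if reporter.rejected_reports >= WARN_AFTER and not reporter.warned:
        reporter.warned = True
        return "warning issued"
    return "no action"
```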

5. Oversight by independent platform councils

External oversight is another approach to creating more control and transparency.

The media scholar Bernhard Pörksen brought so-called platform councils into play in 2018.

By this he means "an institution to be newly founded as a point of contact, referee and correction mechanism for wrong decisions," as he writes in his book "Die große Gereiztheit" ("The Great Irritability").

"Platform operators, journalists, publishers, scientists and representatives of various social groups would come together in such a platform council." Such councils should "make the now dangerously normal and strangely natural-seeming opacity of the preliminary editorial decisions made by platform operators accessible to general analysis and public criticism."

In short: more information about the processes involved in moderating content would help users decide whether a platform is the right one for them.

Facebook's Oversight Board, launched in October 2020, is roughly comparable to this concept, even if it is primarily a complaints body for selected cases of deletions and bans on Facebook and Instagram.

6. A public service social network

For some, the answer to rampant hatred and agitation does not lie in reforming the Silicon Valley platforms at all, but in completely different networks.

Every now and then, experts call for a public-service social network that would at least be free from the economic constraints of private companies, organized and controlled like the German public broadcasters ARD and ZDF, for example.

The basic assumption is that algorithms that do not weight content according to its potential for conflict, and thus for clicks, together with various oversight bodies, would create a more civil environment.
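
A minimal sketch of that assumption, with invented post data and an invented engagement formula (no real platform's ranking works exactly like this): an engagement-driven feed surfaces the older but conflict-heavy post first, while a purely chronological feed simply shows the newest post first.

```python
from datetime import datetime, timezone

# Illustrative only: two invented posts and an invented engagement formula.
# A conflict-heavy post (many angry reactions and replies) wins under
# engagement ranking, while a purely chronological feed ignores all of that.
posts = [
    {"id": 1, "posted": datetime(2021, 1, 15, tzinfo=timezone.utc),
     "angry_reactions": 900, "replies": 400},
    {"id": 2, "posted": datetime(2021, 1, 16, tzinfo=timezone.utc),
     "angry_reactions": 3, "replies": 10},
]

def engagement_score(post: dict) -> float:
    # Outrage drives clicks: reactions and replies are weighted positively.
    return 2.0 * post["angry_reactions"] + post["replies"]

commercial_feed = sorted(posts, key=engagement_score, reverse=True)
chronological_feed = sorted(posts, key=lambda p: p["posted"], reverse=True)

print([p["id"] for p in commercial_feed])     # [1, 2]: the older, angrier post first
print([p["id"] for p in chronological_feed])  # [2, 1]: simply the newest first
```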

7. Voluntary self-moderation

Further alternatives can already be found in the so-called Fediverse.

Many smaller communities are connected to one another via a common technical protocol.

However, they are maintained independently of one another and moderated by volunteers from the respective community.

That makes it easier to enforce civil behavior, at least as long as the individual groups remain a manageable size.

In addition, there always remains the last resort of "defederating": the Gab network, for example, which is used by many on the far right, tried in 2019 to gain a foothold in the Fediverse with its own presence.

But the other communities managed to develop a strategy against Gab: They decided to isolate Gab and not allow any exchange.

The "defederating" consisted in cutting the technically possible exchange between Gab and all other mini-networks.

Gab users could then not cause trouble across the entire Fediverse, but only in the few areas that still allowed it.

Gab quickly gave up and left the federated, decentralized network of networks again.
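
Technically, such "defederating" boils down to a server refusing to exchange content with blocked domains. Here is a minimal sketch with hypothetical names; real Fediverse servers such as Mastodon implement this as admin-configurable domain blocks on top of the shared ActivityPub protocol.

```python
from urllib.parse import urlparse

# Sketch of "defederating": a server simply refuses to exchange activities
# with blocked domains. Function and variable names are hypothetical.
BLOCKED_DOMAINS = {"gab.com"}  # domains this instance no longer federates with

def should_federate(actor_uri: str) -> bool:
    """Accept or drop an incoming activity based on the sending server's domain."""
    domain = urlparse(actor_uri).hostname or ""
    return not any(domain == d or domain.endswith("." + d) for d in BLOCKED_DOMAINS)

print(should_federate("https://example.social/users/alice"))  # True: exchange allowed
print(should_federate("https://gab.com/users/someone"))       # False: dropped
```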

Source: Der Spiegel