EU wants to scan chats in the fight against child abuse

2022-05-11T17:01:23.081Z


WhatsApp, Telegram and other messengers are to be obliged to search chats for images of abuse. Civil rights activists and experts are alarmed, and even investigators are not entirely happy.


Chat apps: Brussels plans a “world standard” against abuse recordings (Photo: stnazkul / Getty Images)

The two EU Commissioners Dubravka Suica (Democracy and Demography) and Ylva Johansson (Home Affairs) stood in front of a blue video wall showing obscured pictures of children and presented a summary of the horror.

One in five children becomes a victim of sexual abuse, Suica lamented, and suffers "often from the traumatic experience for a lifetime".

Johansson, for her part, said that 85 million images showing sexual violence against children were found on the internet last year alone.

In 90 percent of the cases, the material is stored "on servers in the EU," which is why Brussels is now particularly called upon to develop "a world standard" against illegal recordings.

The core of the plan presented on Wednesday is a comprehensive monitoring system controlled from Brussels.

A new “EU Center against Child Abuse” is to be empowered to force online services to scan all communication from all of their users for prohibited content.

If they find depictions of abuse, the providers must inform the central office, which in turn alerts the responsible national law enforcement authorities after its own examination.

"We protect you" is the message to the children, said Johansson, to the perpetrators it is: "We'll get you".

However, critics fear that the new authority could become a kind of Big Brother office and that the EU is using the emotionally charged issue of sexualized violence to restrict privacy.

Some of them gathered in Berlin on Wednesday for a demonstration against the plan they called “chat control”.

Which online services and apps could be monitored?

According to the EU Commission's 135-page draft, numerous companies could be required to screen their users' communication: in addition to e-mail providers, operators of services in which chat is only an add-on function could also be forced to monitor it.

This includes online games, image-sharing applications such as Instagram, and video hosting platforms.

The most controversial point, however, is likely to be that the measure can oblige messenger operators such as WhatsApp, Telegram, Signal or Threema to scan their users' private messages.

If companies find abusive material, they must notify the competent authority and a new EU center that is to be set up and located in The Hague, as close as possible to Europol.

The EU center should also offer suitable tools for the search, especially for those providers who cannot afford to develop their own.

To put it another way: providers that do not want to monitor their user base with their own methods are expected to use government-developed software.

Providers can be obliged, for example, if they have a branch in the EU or "a significant number of users" there.

In other words, the EU Commission intends to leave hardly anyone out.

Services to which none of this applies, but which are suspected of being used to disseminate illegal content, could be blocked by internet providers in the EU.

This could hit smaller services whose operators remain anonymous or evade the authorities' reach; they would no longer be able to operate in Europe.

Why could the plans weaken the encryption of all users?

Critics fear that the EU plans amount to an attack on a basic technology that today protects the privacy of millions of users, particularly in messenger apps: so-called end-to-end encryption.

Although the term appears only once in the EU draft and encryption is hardly mentioned elsewhere, it is simply not clear how the surveillance the EU wants can be reconciled with the technical reality of this encryption.

The EU wants service providers to search not only for known abuse material, but also for images and videos of child abuse that are not yet known.

Known material, i.e. material that has already been reported, is stored as a so-called hash value in a database.

Providers already use such databases to check whether users are sending known illegal content via their services.

A hash is like a digital fingerprint of a file: an image with the same content should always produce the same hash value.
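
To make the comparison concrete, here is a minimal Python sketch of hash-based matching. The database entry is a placeholder; real systems such as Microsoft's PhotoDNA use perceptual hashes that also match re-encoded or slightly altered copies, whereas a plain cryptographic hash only matches byte-identical files.

```python
import hashlib

# Hypothetical database of hashes of known, already-reported material.
# The entry below is a placeholder (the SHA-256 of an empty file).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(path: str) -> str:
    """Compute the SHA-256 'digital fingerprint' of a file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_material(path: str) -> bool:
    """Check a file's fingerprint against the database of known material."""
    return file_hash(path) in KNOWN_HASHES
```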

Unknown material, by contrast, would first have to be identified as such, which requires software based on so-called artificial intelligence.

Such software must be able to distinguish between permitted nude photos and illegal ones.

Neither approach, however, works within encrypted communication.

The comparison of a chat message with a hash value database and the examination with recognition software must therefore take place on the user's device, before it is sent or after it is received.

This bypasses the end-to-end encryption of WhatsApp, Signal or Threema, for example, which is explicitly intended to prevent anyone other than the sender and recipient from viewing the communication.
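
Schematically, client-side scanning would slot into a messenger as in the following Python sketch: the content is inspected in plain text before encryption is applied. All names and the toy cipher are illustrative assumptions, not any real messenger's API.

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # hypothetical database of reported hashes

def matches_known_hashes(data: bytes) -> bool:
    """Hash comparison against the database of known material."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

def classifier_flags(data: bytes) -> bool:
    """Stand-in for an AI classifier meant to detect unknown material."""
    return False

def report_to_provider(data: bytes) -> None:
    """Stand-in for the reporting channel to the provider and EU center."""
    print(f"flagged {len(data)} bytes for human review")

def e2e_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR placeholder; real apps use protocols like Signal's."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def send_message(data: bytes, recipient_key: bytes) -> bytes:
    # The scan runs on the plaintext BEFORE encryption; this is the
    # step that bypasses end-to-end encryption.
    if matches_known_hashes(data) or classifier_flags(data):
        report_to_provider(data)
    # Only afterwards is the message encrypted, so the transport still
    # looks fully end-to-end encrypted from the outside.
    return e2e_encrypt(data, recipient_key)
```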

Experts speak of a back door: it is technically impossible to build such a monitoring mechanism in a way that weakens only the privacy of individual suspects.

The data security of all users is always weakened, because either a company builds a mechanism that can bypass the encryption in principle, or it does not.

It is an attempt to square the circle that has fueled disputes between politicians and security authorities on the one hand and technology experts on the other for decades: the Commission would like service providers to "avoid undermining the security and confidentiality of user communications".

They should design the search in such a way that misuse by their employees or third parties is ruled out.

Nonetheless, the confidentiality of millions of innocent people's communications would be undermined, and no one can forever prevent the abuse of such a surveillance structure once it is in place.

Under what conditions should communication be screened?

EU Commissioner Johansson affirmed that numerous precautions should be taken to protect the privacy of EU citizens.

For example, national authorities and data protection officers would have to give their permission before each scan.

To filter communication, the new EU agency may only use software that "least endangers the privacy of citizens," Johansson assured.

"To find the needle in the haystack, you have to use the right magnet" - and that's exactly what you intend to do.

In fact, the draft law names a hurdle before a provider like WhatsApp would be obliged to scan the chats of its users: First, the service providers would have to carry out a risk assessment and, if necessary, introduce measures to protect minors.

If the competent authority of a Member State decides that these measures are not sufficient, it can apply for a "detection order" from the court or another independent authority.

The prerequisite for a "detection order", i.e. an official, in principle time-limited order to search through communication content, is firstly a "significant risk" that a service will be used to disseminate depictions of abuse.

Secondly, the reasons for the order must outweigh the negative consequences for all those affected; "a fair balance between the fundamental rights of those involved should be ensured".

Elsewhere, however, the draft states that even a newly launched service could be ordered to scan communications if comparable services have been used in the past to disseminate criminal content.

This could mean that WhatsApp could also be obliged to scan if another messenger has previously attracted attention as a platform for child abusers.

EU investigators could then point to criminal proceedings in which such use of a comparable messenger had been proven.

Could EU officials accidentally read legal chat messages?

In addition, the draft envisages various measures against grooming, i.e. adults contacting minors with the intention of abusing them.

Providers of communication services whose risk assessment concludes that grooming is possible should introduce age verification to identify underage users.

App store providers such as Google and Apple are supposed to ensure that minors cannot download these apps in the first place.

And a detection order can also include searching text messages for instances of grooming.

The latter means that EU citizens would have to expect each of their private messages to be automatically checked before being sent; this includes lawyers, journalists and others bound to professional secrecy.

The Commission tries to offer reassurance: this scanning is "often the only possible way" to detect grooming, according to the draft.

Yet the technology does not "understand" the content of a communication; it only looks for known patterns that indicate possible grooming.

Such technologies are now extremely accurate, the draft claims, "however, human supervision is still necessary".

This means that employees of tech companies or, in a second step, EU officials could end up viewing private, supposedly encrypted messages of EU citizens even though their content is not illegal at all.

What the Commission does not mention is that the analysis of metadata can be at least as effective.

Without searching the content of the communication, it is possible to tell from timestamps and similar data alone whether an adult is repeatedly trying to contact a person who is probably a minor and does not belong to their immediate circle of contacts.
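
As a rough sketch of such a metadata-only analysis, the following Python function flags repeated adult-to-minor contact attempts outside the existing contact list using timestamps alone; all field names and thresholds are invented for illustration and do not come from the draft.

```python
from dataclasses import dataclass

@dataclass
class ContactAttempt:
    """Metadata only; no message content is inspected."""
    sender_is_adult: bool
    recipient_likely_minor: bool
    recipient_in_contacts: bool
    timestamp: float  # Unix time in seconds

def looks_like_grooming_pattern(attempts: list[ContactAttempt],
                                min_attempts: int = 5,
                                window: float = 7 * 24 * 3600) -> bool:
    """Flag many adult-to-minor contact attempts outside the contact
    list within one week; thresholds are illustrative assumptions."""
    times = sorted(
        a.timestamp for a in attempts
        if a.sender_is_adult
        and a.recipient_likely_minor
        and not a.recipient_in_contacts
    )
    # Check whether min_attempts of these fall within the time window.
    return any(
        times[i + min_attempts - 1] - times[i] <= window
        for i in range(len(times) - min_attempts + 1)
    )
```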

Nor does the Commission admit that, with billions of chats, even "highly accurate" technology still produces a huge number of false positives in absolute terms.
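
A back-of-the-envelope calculation shows the scale of the problem; the message volume and error rate below are purely illustrative assumptions, not figures from the draft.

```python
# Even a classifier that errs only once per thousand messages produces
# millions of false flags per day at the scale of EU-wide messaging.
messages_per_day = 10_000_000_000   # assumed daily message volume
false_positive_rate = 0.001         # i.e. "99.9 percent accurate"

false_flags = messages_per_day * false_positive_rate
print(f"{false_flags:,.0f} wrongly flagged messages per day")
# -> 10,000,000 wrongly flagged messages per day
```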

What do providers and experts say about this?

No wonder civil rights activists, IT experts and politicians are appalled.

The MEP Patrick Breyer of the Pirate Party, for example, speaks of a "giant step towards a surveillance state based on the Chinese model".

The Green politicians Konstantin von Notz, deputy parliamentary group leader, and Tobias Bacherle, chairman of the digital committee and member of the foreign affairs committee, have “massive doubts that this is compatible with applicable European and German fundamental rights and with ECJ case law”.

Cryptography professor Matthew D. Green wrote on Twitter that the draft describes "the most sophisticated mass surveillance machinery ever deployed outside of China and the Soviet Union."

The technology that could correctly identify illegal content and grooming does not exist, he argued, so there will be errors.

And once the error-prone technology is in use, other states will want to use it too.

Alexandra Koch-Skiba, head of the complaints office of the eco Association of the Internet Industry, said: "In our view, the draft has the potential to create a free pass for state surveillance.

This is ineffective and illegal.

Instead, sustainable child and youth protection would require more personnel for investigations and comprehensive criminal prosecution."

Will Cathcart, the head of WhatsApp, sharply criticized a leaked version of the EU plans on Tuesday evening.

"Strong encryption protects the privacy and security of billions of people. Weakening that is a mistake," tweeted Cathcart.

The EU proposal, he added, is a "terrible idea". (SPIEGEL has published a detailed interview with the WhatsApp boss on encryption and data protection.)

Martin Blatter, head of the Swiss crypto messenger Threema, told SPIEGEL that the plan "would perhaps do credit to a totalitarian regime, but has no place in a democracy".

According to Blatter, it is also technically impossible to reliably detect illegal grooming attempts and images of abuse using an algorithm alone.

"The Commission's plans could therefore lead to a mass criminalization of innocent citizens."

Will the law help arrest more child abusers?

German prosecutors are also criticizing the EU plans behind closed doors.

Several investigators welcome the fact that the EU wants to pass uniform new rules against child abuse and to impose obligations on various internet services.

However, the current plans would not necessarily result in more pedophiles being arrested, according to several long-time investigators.

One of the reasons: although the law would lead to more reports of abuse images, these do not automatically translate into more investigative successes.

Prosecutors already have enough data points today.

The problem is to process all the cases with the available resources and to find the particularly dangerous perpetrators.

“We already have a haystack problem today.

The mass of new reports expected under the EU plans threatens to paralyze our criminal prosecution," says one investigator.

The plan is therefore commendable, but its current form is counterproductive, according to law enforcement officials.

In addition, there are other ways to track down more perpetrators.

"If it's just a matter of having more cases and catching more offenders, then you don't need such an intrusion into fundamental rights," says another veteran child abuse investigator.

What's next?

The Commission's draft has yet to be approved by both the EU Member States and Parliament.

The particularly controversial sections could simply be waved through, or they could be amended or blocked.

The German federal government, for example, has expressly committed itself in its coalition agreement to a right to encryption and to end-to-end encryption.

Source: spiegel
