
Fighting Child Abuse: Who Is Copying What Apple Is Showing?

2021-08-09T12:35:40.080Z


Photos and chats on iPhones and Macs will in future be automatically analyzed to find pictures of abused children. Apple is giving politicians leverage, including against Facebook, Google and Microsoft.


First, some advertising from Apple: "What happens on your iPhone stays on your iPhone." That is what a giant poster in Las Vegas proclaimed in 2019.

It was an allusion, on the one hand, to the city itself, famous for the phrase "What happens in Vegas stays in Vegas", and on the other to companies like Google and Facebook, whose business models are built on advertising driven by user data.

In retrospect, the sentence sounds a little cynical when you think of the "Pegasus Project" revelations about the surveillance software of the same name from the Israeli company NSO. Anyone it catches has absolutely everything that happens on their iPhone exposed. But that concerns perhaps a few thousand of the billion-plus iPhones in use.

Last week, however, Apple announced new functions for the fall, including for the iPhone. In a first step, they will affect all US users who install iOS 15 - tens of millions of people. Apple wants to scan their photos for known images showing the sexual abuse of children before those photos are uploaded to iCloud. As a precaution, the company also wants to check children's iMessages for "sexually explicit photos" sent to or by them. Both checks are supposed to happen only locally on the devices, but if the systems trigger a match, others may find out. My colleagues and I have described the first details here - more should follow soon.
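The local matching Apple describes can be sketched roughly like this. To be clear, this is a simplified illustration, not Apple's implementation: the real system uses a perceptual "NeuralHash" and cryptographic blinding so that neither side learns about non-matches, while this sketch uses a plain SHA-256, a hypothetical hash database and an assumed review threshold.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse imagery. In Apple's real
# system the entries come from child-safety organizations and matching uses
# a perceptual "NeuralHash"; plain SHA-256 here is a stand-in for clarity.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

# Apple says human review only happens after a threshold of matches is
# crossed; the exact number used here is an assumption for illustration.
REVIEW_THRESHOLD = 3

def hash_file(path: Path) -> str:
    """Hash a photo's bytes (exact matching only, unlike a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_upload(photos: list[Path]) -> bool:
    """Return True once an account's matches reach the review threshold."""
    matches = sum(1 for p in photos if hash_file(p) in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```

The key design point the article alludes to survives even in this toy version: the comparison runs on the device, against a fixed list of known images, and a single match is not supposed to be enough to expose anyone.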

The planned innovations actually require a small adjustment to the advertising.

In future, the sentence would have to read: "What happens on your iPhone may be found out by Apple employees, the police or your parents."

I quietly doubt whether Apple customers would flock to such an advertisement.

Even if images of abuse are distributed millions of times via popular services, and by no means only on the darknet: only a tiny fraction of users commit such terrible crimes.

But governments and regulators would like posters like this.

On the one hand, they have long been calling for the tech industry to be more active in the fight against child abuse.

On the other hand, security authorities have been demanding access to encrypted communication and encrypted devices for years, actually decades, in order to be able to monitor terrorists and criminals.

Apple is now taking a step towards them all.

The Cupertino-based company is at pains to emphasize that, first, this is solely about finding images of child abuse, and thereby identifying perpetrators and protecting victims.

Second, there are safeguards meant to ensure that no one else ever sees innocent people's pictures and iMessages.

But once the infrastructure is in place, two things will happen: certain governments will argue that the same technology could be used to search for other content - and they will demand that Apple do so.

What kind of content that might be, I leave for now to your personal dystopian imagination.

This Monday, the company published an FAQ addressing exactly this scenario: Apple will refuse such demands, it says, and the process is designed in such a way that searching for other content is prevented.

But it should not be forgotten that Apple makes compromises with governments, as in China, when business is at stake.

In addition, the political pressure on other providers to follow Apple's lead will now grow. Cloud storage, e-mails and unencrypted chats have long been scanned in this way, not least by Google, Facebook and Microsoft. Now end-to-end encrypted chat apps are in focus too. I am sure that politicians and security authorities will soon approach WhatsApp, for example, and ask when the Facebook subsidiary plans to catch up with Apple. WhatsApp boss Will Cathcart has already said that he will not go along.

Apple is now also showing how something like this can be implemented at the operating system level instead of at the application level.

The choice of operating systems for mobile devices is much smaller than that of chat apps.

If Google and Microsoft decided to rebuild Android and Windows respectively, as Apple is doing with iOS 15 and macOS Monterey, billions of people would be left choosing between systems that contain a new kind of surveillance infrastructure and alternative systems that may lack many of the applications they are familiar with.

In other words, Apple's decision will have repercussions, not just for Apple customers.

External links: three tips from other media

  • "When a hacker calls the CDU" (three-minute read)


    Nico Ernst spoke to Lilith Wittmann for "heise online": the security researcher found and reported a serious vulnerability in the CDU-connect app - and the party responded by filing a criminal complaint against her.

  • "What if Facial Recognition Technology Were in Everyone's Hands?" (English, six-minute read)


    I am of the opinion that Clearview AI's facial recognition is already a danger.

    Philosophy professor Firmin DeBrabander thinks ahead in this readable piece on Slate.com: What would change if not only investigative authorities but we all could fall back on such a technology?

  • "Drone Delivery Is Bullshit" (English, three-minute read)


    Amazon's delivery drones have crashed: the entire development department has been closed. Cory Doctorow writes that this was foreseeable - and he has a theory as to why tech companies are so fond of promising the impossible.

I wish you a pleasant week!

Patrick Beuth

Source: spiegel
