iPhones in the Apple Store: "When this type of content is received, the photo will be blurred"
Photo: LOREN ELLIOTT / REUTERS
Apple has made a bold move to combat the spread of images depicting child sexual abuse.
The company announced that it will introduce corresponding new functions with the operating system versions iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, planned for autumn.
Initially, this only applies to US customers.
On the one hand, it will be possible in the future for parents to receive a warning message if their child receives or sends nude photos via iMessage. The nudity in the pictures is detected by software on the device itself; Apple does not learn of it. "When this type of content is received, the photo will be blurred and the child will be warned," the company said.
On the other hand, Apple wants to compare the photos on iPhones, iPads and Macs with a list of known abuse material before they are uploaded to the online storage service iCloud.
For this purpose, a file with so-called "hashes" of already known content - a kind of digital fingerprint of the image - is to be loaded onto the devices.
Special matching procedures can identify a copy of a photo, but the original image cannot be reconstructed from the hash.
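The matching principle described here can be sketched in a few lines of Python. Note that this is a simplified illustration: Apple's actual system uses a perceptual hash ("NeuralHash") designed to survive resizing and re-encoding, whereas the cryptographic SHA-256 used below only matches exact byte-for-byte copies, and all image bytes and list entries are made-up placeholders.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Digital 'fingerprint' of an image: the original picture cannot
    be reconstructed from this value."""
    return hashlib.sha256(image_bytes).hexdigest()

# File of known hashes shipped to the device (hypothetical entries).
known_hashes = {image_hash(b"known-abuse-image-bytes")}

def matches_known_material(image_bytes: bytes) -> bool:
    """Compare a photo's fingerprint against the on-device hash list."""
    return image_hash(image_bytes) in known_hashes

print(matches_known_material(b"known-abuse-image-bytes"))  # True: exact copy matches
print(matches_known_material(b"harmless-holiday-photo"))   # False: no match
```

The lookup only reveals whether a fingerprint is on the list; the hash itself discloses nothing about the image's content.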
Alert only after an undisclosed number of hits
If there is a match, suspicious images are given a certificate that, exceptionally, allows Apple to open them after they are uploaded to iCloud and subject them to an examination. However, the system only sounds the alarm once a certain number of hits is reached; Apple has not yet said how many are required. If abusive material is actually discovered during the inspection, Apple reports it to the American non-governmental organization NCMEC (National Center for Missing & Exploited Children), which in turn can call in the authorities.
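A minimal sketch of this threshold rule, under the assumption of a plain counter (the real system reportedly relies on cryptographic threshold techniques rather than a simple count, and the actual threshold value is not public):

```python
# Hypothetical threshold check: a single match does not trigger a report;
# only when the number of matched images reaches the (undisclosed)
# threshold is the account flagged for human review.

THRESHOLD = 30  # placeholder value; Apple has not published the real number

def should_flag_for_review(match_count: int, threshold: int = THRESHOLD) -> bool:
    """Return True only once enough matches have accumulated."""
    return match_count >= threshold

print(should_flag_for_review(1))    # False: isolated hits stay below the bar
print(should_flag_for_review(30))   # True: threshold reached, review begins
```

The point of the threshold is to make a false alarm from a handful of coincidental matches extremely unlikely before any human ever looks at an account.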
While the function is initially activated only for Apple customers with US accounts, the file with the hashes is an integral part of the operating system.
It will be loaded onto all iPhones on which this system version is installed.
The list on the devices is to be updated with new releases of the operating systems.
Before the function can be introduced internationally, the legal requirements must first be clarified.
Users on whose devices known child sexual abuse material is found through the comparison will not be informed of this.
However, their accounts will be blocked.
This type of comparison is also used by other tech companies
The comparison via hashes is also used, for example, by online platforms to discover such content while it is being uploaded and to prevent it from being published.
Well-known companies that rely on comparable processes include Microsoft, Facebook and Google.
According to the industry, these processes work practically flawlessly for photos but do not yet work reliably for videos.
Critics of the end-to-end encryption of private communication in chat services and on smartphones, which Apple also uses, often cite the fight against child sexual abuse as an argument when demanding back doors for authorities.
Apple's announced system is an attempt to solve the problem in a different way.
The company has repeatedly resisted demands from US security agencies to break the encryption of its devices for investigations.
Other content could also be identified in this way
Matthew Green, cryptography expert at the US University of Johns Hopkins, criticized the decision nonetheless.
He specifically sees the danger that authoritarian governments could impose rules on how to search for other content.
Greg Nojeim of the Center for Democracy and Technology in Washington said that "Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship."
This makes users "vulnerable to abuse and spying not only in the United States, but around the world."
The computer scientist Henning Tillmann wrote on Twitter: »Pandora's box is being opened. To be more precise: end-to-end encryption could in principle be undermined by generally performing checks at the device level.«
Apple responded to critics that it does not have direct access to the images and has taken measures to protect privacy and security.
pbe / dpa / AFP