08/06/2021 15:26
Clarín.com
Technology
Apple announced Thursday that later this year it will launch software that analyzes users' stored photos in search of sexually explicit images of children, and then reports those instances to the relevant authorities.
As part of the new child-safety measures, the company also announced a feature that will analyze photos sent to or received from children in the Messages app to determine whether they are explicit.
Apple is also developing features for its Siri digital voice assistant to intervene when users search for abusive material.
The Cupertino, California-based tech giant previewed the three new features Thursday and said they will be available later in 2021.
If Apple detects sexually explicit photos of children in a user's account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement agencies.
The new Apple software will analyze the photos sent and received on iPhones.
Photo: AFP.
The Electronic Frontier Foundation (EFF) said that with these new tools Apple would be acting inconsistently with its much-touted privacy features.
"It is impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children," the EFF said in a post on its website.
"As a consequence, even a well-intentioned effort to build such a system will break key promises of message encryption and open the door to other types of abuse."
Other researchers were also concerned.
"Regardless of Apple's long-term plans, the company sent a very clear signal," Matthew Green, a professor of cryptography at Johns Hopkins University, wrote on Twitter.
"In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content," he added.
How the new system works
The new cryptographic technology for iOS and iPadOS is designed to limit the distribution of child sexual abuse material on services such as iCloud, and uses on-device machine learning to warn of sexually explicit content in the Messages app.
The company presented the child-protection news as a set of new tools, designed with child-safety experts, that seek to protect children from predators who use digital services to reach them.
The company explained that this method does not scan images in the cloud; instead, it performs an on-device comparison against known images provided by child-safety organizations before the photos are uploaded to iCloud.
What is compared is not the image itself but its 'hash', a kind of digital fingerprint.
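The fingerprint comparison can be sketched in a few lines. This is only an illustrative toy: Apple's system uses a perceptual hash ("NeuralHash") rather than the cryptographic hash used here, and the blocklist entries and image bytes below are hypothetical placeholders.

```python
import hashlib

# Hypothetical blocklist of known fingerprints (in the real system these
# hashes are supplied by child-safety organizations such as NCMEC).
known_hashes = {
    hashlib.sha256(b"known-abusive-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Derive a fixed-size fingerprint from the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_blocklist(image_bytes: bytes) -> bool:
    """Compare the image's fingerprint, never the image itself."""
    return fingerprint(image_bytes) in known_hashes

print(matches_blocklist(b"known-abusive-image-bytes"))  # True
print(matches_blocklist(b"holiday-photo-bytes"))        # False
```

Note that a cryptographic hash, unlike a perceptual one, changes completely if a single pixel changes; the sketch only shows the set-membership idea.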
A cryptographic technique called "private set intersection" determines whether there is a match without revealing the result, and that encrypted result is attached to the image once it is uploaded to iCloud.
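The private-set-intersection idea can be illustrated with a classic Diffie-Hellman-style toy: each side blinds its fingerprints with a secret exponent, so matches can be found without either side exchanging hashes in the clear. The parameters and items below are made up, and this simplified variant (unlike Apple's actual protocol) lets the client see the match result.

```python
import hashlib
import random
from math import gcd

P = 2**127 - 1  # a Mersenne prime; a real protocol would use elliptic curves
G = 3

def h2g(item: bytes) -> int:
    """Hash an item to a group element: g^H(item) mod P."""
    e = int.from_bytes(hashlib.sha256(item).digest(), "big") % (P - 1)
    return pow(G, e, P)

def rand_key() -> int:
    """Random exponent invertible mod P-1, so blinding can be undone."""
    while True:
        k = random.randrange(2, P - 1)
        if gcd(k, P - 1) == 1:
            return k

# Server side: blind the blocklist with the server's secret key b.
b = rand_key()
blocklist = [b"known-image-hash-1", b"known-image-hash-2"]
server_set = {pow(h2g(x), b, P) for x in blocklist}

# Client side: blind the query with the client's secret key a.
a = rand_key()
query = b"known-image-hash-2"
blinded = pow(h2g(query), a, P)

# Server raises the client's blinded query to b and returns it.
double_blinded = pow(blinded, b, P)

# Client removes its own blinding: (g^(e*a*b))^(1/a) = g^(e*b).
a_inv = pow(a, -1, P - 1)
unblinded = pow(double_blinded, a_inv, P)

print(unblinded in server_set)  # True: the query is in the blocklist
```

Neither side ever sees the other's raw hashes; only blinded group elements cross the wire.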
A second technique, "threshold secret sharing", ensures that the match results can only be read once an account crosses a threshold of known matches; only then does Apple receive an alert so its human teams can review the material.
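The threshold mechanism is based on Shamir's classic secret-sharing scheme: a secret (here standing in for the key that would unlock the match reports) is split so that it can only be reconstructed once enough shares, i.e. enough matches, have accumulated. A minimal sketch with toy parameters:

```python
import random

P = 2**61 - 1  # prime field (toy parameter)

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the polynomial at x = 0."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, -1, P)) % P
    return total

key = 123456789
shares = make_shares(key, threshold=3, n=5)
print(reconstruct(shares[:3]) == key)  # True: 3 shares suffice
print(reconstruct(shares[:2]) == key)  # almost surely False: 2 reveal nothing
```

Below the threshold, the shares carry no information about the secret, which is the property that keeps individual matches unreadable.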
If the match is confirmed, the user's account is deactivated and a report is sent to the relevant organizations and to law enforcement.
SL