Apple will scan all photos uploaded by its users to iCloud and stored on their devices, looking for child pornography images, which it will report to the relevant authorities.
The company announced the move itself, saying it will use a method based on encrypted codes that should not jeopardize users' privacy.
The system, as CNN reports, will transform photos into sets of symbols that are compared against a database provided by child-protection organizations such as the National Center for Missing and Exploited Children, and it can recognize an image even after small changes such as cropping or applying filters. Additional software, the company explains on its website, performs further checks, bringing the probability of error below one in a trillion. Any match triggers an alert that brings in human reviewers, who verify whether the photos are actually part of the database. If they are, the user's account is closed and a report is sent to the authorities. "The method," Apple assures in a post on its website, "was designed with the user's privacy in mind."
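Apple has not published the details of its hashing algorithm, so the sketch below is only a toy illustration of the general idea the article describes: a perceptual hash turns an image into a short bit string, and lightly edited copies of the same image produce hashes that stay within a small Hamming distance of a known entry, while unrelated images do not. All function names, the hash scheme, and the threshold here are illustrative assumptions, not Apple's actual system.

```python
# Toy perceptual-hash matching sketch (NOT Apple's proprietary algorithm).
# An "average hash" sets one bit per pixel: 1 if the pixel is brighter
# than the image's mean. Similar images yield nearly identical bit strings.

def average_hash(pixels):
    """Hash a flat list of grayscale pixel values into a list of bits."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def matches(candidate, database_hashes, threshold=2):
    """True if the candidate hash is within `threshold` bits of any
    known hash. The threshold value here is purely hypothetical."""
    return any(hamming_distance(candidate, h) <= threshold
               for h in database_hashes)

# Tiny 8-pixel "images": an original, a slightly brightened copy, and
# an unrelated image with the bright/dark pattern inverted.
original  = [10, 200, 30, 220, 15, 210, 25, 230]
filtered  = [12, 205, 28, 225, 14, 215, 27, 235]
unrelated = [200, 10, 220, 30, 210, 15, 230, 25]

db = [average_hash(original)]
print(matches(average_hash(filtered), db))   # True: edited copy still matches
print(matches(average_hash(unrelated), db))  # False: different image
```

This also illustrates why the article mentions cropping and filters: such edits shift pixel values slightly but leave the overall brightness pattern, and therefore the hash, largely intact.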
The initiative has already drawn criticism from privacy-rights groups, as CNN also reports. The new feature will arrive in one of the next software updates.