Apple To Scan, Report iCloud Photo Uploads For Child Sex Abuse Imagery

Apple Inc announced that it will implement a system on iPhones in the United States to detect child sexual abuse material (CSAM) in photos before they are uploaded to iCloud.


On Thursday, Apple Inc announced that it will implement a system on iPhones in the United States that scans photos for child sexual abuse material (CSAM) before they are uploaded to iCloud, checking them against a database of already known CSAM. If a match is found, Apple said, a human reviewer would assess the content and report the user to law enforcement.

Apple explained at a news conference that the new service will "turn on-device photos into an unreadable set of hashes, or complex numbers." Those numbers are then matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). The company says the system will also catch edited but visually similar versions of known images.
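To make the idea of hash matching concrete, here is a minimal sketch in Python. It uses a generic "average hash," not Apple's actual NeuralHash algorithm, and omits the cryptographic private set intersection Apple described; the function names, the 8x8 input size, and the distance tolerance are all illustrative assumptions. The point it demonstrates is why hash matching can tolerate light edits: visually similar images produce hashes that differ in only a few bits.

```python
# Toy illustration of perceptual-hash matching. Apple's real system relies on
# its NeuralHash algorithm plus cryptographic matching on-device; this sketch
# only shows the general idea that similar images yield nearby hashes.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash a small grayscale image (e.g. 8x8, values 0-255) into a bitstring:
    each bit records whether a pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_database(image_hash: int, known_hashes: set[int],
                           max_distance: int = 4) -> bool:
    # Allowing a small Hamming distance is what lets a system of this kind
    # catch edited but visually similar versions of a known image.
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in known_hashes)
```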

However, there are privacy concerns that the technology could be extended to scan users' phones for other prohibited content. Experts also worry that authoritarian governments could use the system to spy on their citizens.

System Accuracy Level

Furthermore, new versions of iOS and iPadOS, expected to be released later in 2021, will include new cryptography applications to help limit the spread of CSAM online while protecting user privacy, according to the company.

Apple also claimed the technology is extremely accurate, ensuring less than a one-in-one-trillion chance per year of incorrectly flagging a given account.
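A claim of that order becomes plausible when an account is flagged only after several independent matches rather than a single one. The sketch below illustrates the arithmetic with entirely hypothetical numbers (the per-image false-match probability, the photo count, and the match threshold are assumptions, not Apple's published parameters): the probability of an account accumulating many false matches is a binomial tail, which shrinks very fast as the threshold grows.

```python
from math import comb

def false_flag_probability(n: int, p: float, t: int) -> float:
    """P(at least t of n photos false-match) assuming independent matches.
    Sums the binomial tail term by term, stopping once terms are negligible."""
    term = comb(n, t) * p**t * (1 - p)**(n - t)  # term for k = t
    total = term
    for k in range(t, n):
        # Ratio of consecutive binomial terms: avoids huge intermediate values.
        term *= (n - k) / (k + 1) * p / (1 - p)
        total += term
        if term < total * 1e-18:
            break
    return total

# Hypothetical example: 10,000 photos uploaded per year, a one-in-a-million
# per-image false match rate, and a threshold of 5 matches before flagging.
print(false_flag_probability(10_000, 1e-6, 5))  # ~8e-13, below one in a trillion
```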

Matthew Green, a security researcher at Johns Hopkins University, cautioned that regardless of the iPhone maker's long-term intentions, it has sent a very clear signal: in Apple's very influential opinion, it is safe to build systems that scan users' phones for prohibited content. Whether "they turn out to be right or wrong on that point barely matters; it will break the dam, and governments will demand it from everyone."


John Clark, president and chief executive of the NCMEC, said in a statement that the reality is that child protection and privacy can coexist. The organization praised Apple Inc and said it looks forward to collaborating to make the world "a safer place for children."
