Apple unveils plans to scan US iPhones for child sex abuse images

by Jeremy

Apple will begin scanning its US customers’ devices for known child sexual abuse material (CSAM) later this year but already faces resistance from privacy and security advocates.


The CSAM detection tool is one of three new child-safety measures being introduced by Apple, alongside using machine learning to monitor children's communications for signs of nudity or other sexually explicit content, and updating Search and Siri to intervene when users make CSAM-related queries.

In its announcement, Apple said the new detection tool would enable the company to report instances of CSAM to the National Center for Missing and Exploited Children (NCMEC), which works in collaboration with law enforcement across the US. Apple said that instead of scanning images in the cloud, the system would perform on-device matching against a database of known CSAM image hashes provided by NCMEC and other child safety organizations, and it would transform this database into an “unreadable set of hashes” to be securely stored on users’ devices.
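In rough terms, the on-device step amounts to hashing each photo and testing membership in the provisioned hash set. The sketch below is a heavy simplification for illustration only: Apple's system uses a perceptual hash (NeuralHash) so that visually similar images map to the same value, whereas the plain SHA-256 stand-in here would only catch exact byte-for-byte duplicates, and the example hash set is hypothetical rather than the blinded database Apple ships to devices.

```python
# Conceptual sketch only: a plain SHA-256 stands in for Apple's perceptual
# hash (NeuralHash), so this version would only match exact duplicates.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Placeholder for a perceptual image hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes, known_hashes: set) -> bool:
    """On-device check of one image against the provisioned hash database."""
    return image_hash(image_bytes) in known_hashes

# Hypothetical usage: in Apple's design the hash set is derived from the
# NCMEC-provided database and distributed to devices in blinded form,
# not built locally as it is here.
known_hashes = {image_hash(b"example-known-image")}
print(matches_known_csam(b"example-known-image", known_hashes))  # True
```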

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” said the company. “This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.”
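Private set intersection lets two parties learn which items their sets have in common without exposing the rest of either set. The toy Diffie-Hellman-style sketch below shows the core blinding idea only; it is not Apple's protocol, and unlike Apple's design (where the device does not learn the match outcome and Apple can only read results past a threshold of matches) this simple version reveals the intersection count to the querying party. All set contents and parameters are made up for the example.

```python
# Toy Diffie-Hellman-style private set intersection, for illustration only.
import hashlib
import secrets

# Demo-sized prime modulus; a real deployment would use a proper
# prime-order group with hashing into that group.
P = 2**127 - 1

def h2g(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(items, secret):
    """Raise each hashed item to a party's secret exponent."""
    return [pow(h2g(x), secret, P) for x in items]

# Each party holds a private set and a random secret exponent.
device_set = [b"photo-1", b"photo-2", b"photo-3"]
server_set = [b"photo-2", b"photo-9"]
a = secrets.randbelow(P - 3) + 2
b = secrets.randbelow(P - 3) + 2

# Device sends H(x)^a; the server re-blinds these to H(x)^(a*b)
# and sends back its own blinded values H(y)^b.
device_blinded = blind(device_set, a)
double_blinded = {pow(v, b, P) for v in device_blinded}
server_blinded = blind(server_set, b)

# Device raises the server's values to a, giving H(y)^(a*b);
# equal double-blinded values indicate items present in both sets.
matches = sum(1 for v in server_blinded if pow(v, a, P) in double_blinded)
print(f"{matches} item(s) in common")  # prints: 1 item(s) in common
```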
