Published on: Monday, August 9, 2021

Apple intends to install software on American iPhones to scan for child abuse imagery, drawing applause from child protection groups but raising concern among security researchers that the system could be misused, including by governments looking to surveil citizens. Apple detailed its proposed system, known as "neuralMatch," to some US academics.

The tool, called "neuralMatch," is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children will be notified.
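To make the reported workflow concrete, here is a minimal sketch of fingerprint matching against a database of known images. It is illustrative only: the hash list, the function names, and the use of a plain SHA-256 digest (standing in for Apple's perceptual "NeuralHash"-style fingerprinting) are assumptions, not Apple's implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known abuse imagery.
# In the reported system, perceptual hashes are supplied by child-safety
# organizations; a plain SHA-256 digest stands in here for simplicity.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest of the raw image bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def scan_before_upload(image_path: Path) -> bool:
    """Flag an image if its fingerprint matches a known entry."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS

def handle_upload(image_path: Path) -> None:
    if scan_before_upload(image_path):
        # In the workflow described above, a human reviewer confirms the match
        # before the account is disabled and NCMEC is notified.
        print(f"{image_path.name}: flagged for human review")
    else:
        print(f"{image_path.name}: uploaded")
```

Note that a real perceptual hash tolerates resizing and recompression, which a byte-level digest like the one above does not; the sketch only shows the match-then-review flow.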

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, a move that has also alarmed privacy advocates.

Tech companies including Microsoft, Google and Facebook have for years shared digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

In a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company's guarantee of "end-to-end encryption." Scanning messages for sexually explicit content on phones or computers, the group argued, breaks that security.