In a move that is both intriguing and worrying, Apple plans to scan US iPhones for child abuse imagery, bringing to the device itself the kind of content detection that platforms have so far applied only on their own servers.
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.
Apple detailed its proposed system — known as “neuralMatch” — to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicised more widely as soon as this week, they said.
The automated system would proactively alert a team of human reviewers when it believes illegal imagery has been detected; the reviewers would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.
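Apple has not published neuralMatch's internals, but detection systems of this kind typically compare a perceptual hash of each photo against a database of hashes of known abuse imagery, escalating near-matches to human reviewers rather than reporting users automatically. The sketch below illustrates that general shape only; the hash scheme, names, and match threshold are illustrative assumptions, not Apple's actual design.

```python
# Hypothetical sketch of an on-device hash-matching pipeline of the kind
# described above. Apple has not published neuralMatch's design: the hash
# scheme, names, and threshold below are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewCase:
    photo_id: str
    distance: int  # Hamming distance to the closest known hash

def hamming(a: int, b: int) -> int:
    """Count differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def scan_photo(photo_id: str, photo_hash: int,
               known_hashes: set, max_distance: int = 4) -> Optional[ReviewCase]:
    """Escalate a photo to human review if its perceptual hash lies
    within `max_distance` bits of any hash of known illegal imagery."""
    best = min((hamming(photo_hash, k) for k in known_hashes), default=None)
    if best is not None and best <= max_distance:
        return ReviewCase(photo_id, best)  # flagged for reviewers, not police
    return None                            # no match: nothing is reported
```

The notable property, echoed in the reporting, is the buffer in the middle: an algorithmic match creates a case for human reviewers, and only verified material would reach law enforcement.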
Apple declined to comment.
Along with the iCloud photo-scanning feature, the company plans to launch a new tool within the Messages app that would warn children and their parents when sexually explicit photos are received or sent. Apple also announced its intention to expand guidance in Siri and Search to protect children from “unsafe situations.”
News of these updates was first reported by the Financial Times, which wrote that the detection feature would “continuously scan photos that are stored on a U.S. user’s iPhone,” with law enforcement alerted to potentially harmful material. The announcement caught some privacy experts by surprise given the route Apple took in 2016, when it refused an FBI request to unlock the San Bernardino shooter’s iPhone.
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images engineered to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
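Green’s concern is easiest to see with a toy example. The snippet below uses a deliberately simplified 8×8 block-mean hash, a stand-in that is nothing like whatever neuralMatch actually computes, to show how an attacker could nudge an innocuous image until its hash matches that of a flagged one.

```python
# Toy demonstration of the collision attack Green describes. The hash here
# is a deliberately simple 8x8 block-mean threshold hash -- an illustrative
# stand-in, not neuralMatch, whose design is not public.
import numpy as np

def toy_hash(img: np.ndarray) -> np.ndarray:
    """64-bit perceptual hash: split the image into an 8x8 grid of blocks
    and record whether each block's mean brightness exceeds 128."""
    h, w = img.shape
    blocks = img.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3))
    return (blocks > 128).flatten()

def force_collision(cover: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Blend each block of `cover` just far enough toward white or black
    that its hash bit matches `target`, leaving other blocks untouched."""
    out = cover.astype(float)  # astype copies, so `cover` is unmodified
    h, w = cover.shape
    bh, bw = h // 8, w // 8
    for i in range(8):
        for j in range(8):
            block = out[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            m = block.mean()
            if target[i * 8 + j] and m <= 128:
                a = (130.0 - m) / (255.0 - m)  # smallest blend toward white
                block[:] = block * (1 - a) + 255.0 * a
            elif not target[i * 8 + j] and m > 128:
                a = (m - 126.0) / m            # smallest blend toward black
                block[:] = block * (1 - a)
    return np.rint(out).astype(np.uint8)

rng = np.random.default_rng(0)
flagged = rng.integers(0, 256, (64, 64), dtype=np.uint8)    # stand-in for a blocklisted image
innocuous = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in for a harmless photo

crafted = force_collision(innocuous, toy_hash(flagged))
assert np.array_equal(toy_hash(crafted), toy_hash(flagged))  # hashes now collide
print("mean per-pixel change:",
      np.abs(crafted.astype(int) - innocuous.astype(int)).mean())
```

A robust production hash would be far harder to invert than this toy, but the attacks Green references work on the same principle: perceptual hashes must tolerate small changes to survive resizing and recompression, and that very tolerance gives an attacker room to steer an image’s hash.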