Apple Is Working on Problematic iOS Tool to Scan for Child Abuse Photos on iPhones

This tech might help crack down on child pornography, but it can also be misused.

Photo: Victoria Song/Gizmodo

Apple is purportedly poised to announce a new tool that will help identify child abuse in photos on a user’s iPhone. The tool would supposedly use a “neural matching function” called NeuralHash to detect whether images on a user’s device match known child sexual abuse material (CSAM) fingerprints. While it appears that Apple has taken user privacy into consideration, there are concerns that the tech may open the door to unintended misuse, particularly when it comes to surveillance.

The news comes via well-known security expert Matthew Green, an associate professor at the Johns Hopkins Information Security Institute. Green is a credible source who’s written extensively about Apple’s privacy methods over the years. Notably, he’s worked with Apple in the past to patch a security flaw in iMessage. Apple has since confirmed to TechCrunch that the tech will be rolling out later this year with iOS 15 and macOS Monterey, and the company has published technical details of how it works, saying the system was reviewed by cryptography experts.

“I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea,” Green tweeted in a thread late last night. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.”
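
Apple hasn’t published NeuralHash in a form you can poke at, but the mechanism Green describes, a perceptual hash compared against known fingerprints with a reporting threshold, can be sketched with a much simpler stand-in. The difference hash, fingerprint value, match distance, and threshold below are illustrative assumptions, not Apple’s actual design:

```python
# A rough stand-in for the mechanism Green describes, NOT Apple's NeuralHash:
# hash every photo with a simple perceptual hash, compare it against a set of
# known fingerprints, and only report once enough photos match.
# The fingerprint value, distance, and threshold below are invented.

from pathlib import Path

from PIL import Image  # pip install Pillow


def dhash(path: Path, size: int = 8) -> int:
    """Difference hash: a tiny perceptual hash that tolerates resizing."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")


KNOWN_FINGERPRINTS = {0x1F2E3D4C5B6A7988}  # hypothetical fingerprint database
MATCH_DISTANCE = 10    # how close two hashes must be to count as a match
REPORT_THRESHOLD = 30  # "report them ... if too many appear"


def library_would_be_reported(photo_dir: Path) -> bool:
    """Return True if enough photos in the folder match known fingerprints."""
    matches = sum(
        1
        for photo in sorted(photo_dir.glob("*.jpg"))
        if any(
            hamming(dhash(photo), fingerprint) <= MATCH_DISTANCE
            for fingerprint in KNOWN_FINGERPRINTS
        )
    )
    return matches >= REPORT_THRESHOLD
```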

The crux of the issue is that while various tech companies, including Apple, have added end-to-end encryption to their services and products, various governments have pushed back against it. End-to-end encryption is a win for consumer privacy, but the argument is that it also makes it harder for law enforcement to crack down on illegal content like child pornography. According to Green, a “compromise” is to run these scanning technologies “client-side,” meaning on your phone, before images are sent and encrypted in the cloud. Green also claims that Apple’s version wouldn’t initially be used on encrypted images, just your iPhone’s photo library, and only if you have iCloud Backup enabled. In other words, it would only scan photos that are already on Apple’s servers. However, Green also questions why Apple would go to the effort of designing this type of system if it didn’t have eventual plans to use it for end-to-end encrypted content.
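
To make the client-side distinction concrete, here’s a rough sketch of where such a check would sit in an upload pipeline under that description: the scan happens on the device, before encryption, and only for photos bound for iCloud. Every name and helper here is hypothetical, not Apple’s API:

```python
# A minimal sketch of the "client-side" compromise: the check runs on the
# device, before the photo is encrypted and uploaded, and only for photos
# that are actually headed to iCloud. All names here are hypothetical.

from dataclasses import dataclass


@dataclass
class Photo:
    data: bytes
    queued_for_icloud: bool


def matches_known_fingerprint(data: bytes) -> bool:
    """Placeholder for a perceptual-hash lookup like the one sketched above."""
    return False


def encrypt_for_upload(data: bytes) -> bytes:
    """Stand-in for whatever encryption happens before upload."""
    return bytes(reversed(data))


def prepare_uploads(photos: list[Photo]) -> tuple[list[bytes], int]:
    """Scan on-device, then encrypt; return the ciphertexts and match count."""
    flagged = 0
    outgoing = []
    for photo in photos:
        if not photo.queued_for_icloud:
            continue  # photos that never leave the device aren't scanned
        if matches_known_fingerprint(photo.data):
            flagged += 1  # noted before encryption, while plaintext is available
        outgoing.append(encrypt_for_upload(photo.data))
    # A purely server-side design would instead need plaintext after upload,
    # which is exactly what end-to-end encryption prevents.
    return outgoing, flagged
```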

No one wants to go to bat for child pornography, but Green points out that this tech, while nobly intended, has far-reaching consequences and can potentially be misused. For instance, CSAM fingerprints are purposefully a little fuzzy. That’s because if they were too exacting, you could just crop, resize, or otherwise edit an image to evade detection. However, it also means bad actors could make harmless images “match” problematic ones, and that the same mechanism could be pointed at other content. An authoritarian government could, for example, flag political campaign posters in order to identify and suppress activists.
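
A toy comparison of made-up 64-bit perceptual hashes shows why that fuzziness cuts both ways: a distance tolerance lets a resized copy still match, but an unrelated image crafted to land inside the same tolerance matches too. All of the hash values below are invented for illustration:

```python
# A toy illustration of why hash fuzziness cuts both ways, using made-up
# 64-bit perceptual hashes and a Hamming-distance tolerance.

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


FINGERPRINT  = 0xA5A5_F0F0_3C3C_9999  # hash of a known image
RESIZED_COPY = 0xA5A5_F0F0_3C3C_9898  # same image after resizing: a few bits flip
UNRELATED    = 0x1234_5678_9ABC_DEF0  # a genuinely different photo
CRAFTED      = 0xA5A5_F0F0_3C2C_9999  # benign image engineered to sit near the fingerprint

TOLERANCE = 10

for name, h in [("resized copy", RESIZED_COPY),
                ("unrelated photo", UNRELATED),
                ("crafted benign image", CRAFTED)]:
    print(name, hamming(FINGERPRINT, h) <= TOLERANCE)
# resized copy          True   (the fuzziness doing its intended job)
# unrelated photo       False
# crafted benign image  True   (the misuse scenario Green warns about)
```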

The other concern is that Apple is setting a precedent, and once that door is open, it’s that much harder to close it.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Green writes. “That’s the message they’re sending to governments, competing services, China, you.”

Update, 08/05/2021, 3:45 pm: Since initial publication, Apple has confirmed to TechCrunch that this tech will roll out. This article has been updated to include that information.