
Apple Hits Pause on Controversial CSAM Detection Feature

After backlash from security experts, Apple now says it will "take additional time" to improve the feature before public launch.

Photo: Mladen Antonov/AFP (Getty Images)

Early last month, Apple announced it would introduce a new set of tools to help detect known child sexual abuse material (CSAM) in photos stored on iPhones. The feature was criticized by security experts as a violation of user privacy, and what followed was a public relations nightmare for Apple. Now, in a rare move, Apple said today that it’s going to take a step back to further refine the feature before public release.

In a statement sent to Gizmodo, Apple said:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.


Initially, the CSAM features were set to roll out with iOS 15 later this fall. However, the backlash from security experts and privacy groups was fierce, with thousands signing an open letter asking Apple to rethink the feature. Internally, Apple employees were also reported to have raised concerns.


While critics agreed that child sexual abuse material is a serious problem, the fear was that Apple had essentially built a "backdoor" into users' iPhones that could be easily abused to scan for other material. Foreign governments, for instance, could potentially repurpose a tool intended for noble purposes as a means of surveillance and censorship. There were also concerns that innocent photos of children in bathtubs might be wrongly flagged, and that the tools could be used as a workaround for encrypted communications.


Apple initially doubled down, releasing lengthy FAQs and hosting several briefings with reporters to clarify how the feature worked and what the company intended. It also tried to allay fears by promising that it wouldn't allow governments to abuse its CSAM tools as a surveillance weapon. But despite its best efforts, which included trotting out software chief Craig Federighi for a Wall Street Journal interview, many remained confused about how the CSAM feature worked and the risks it posed to individual privacy.

As of now, Apple has offered few clues about when it plans to roll out the feature or what its revision process will look like.


This story is developing...