Apple Delays the Launch of Its CSAM Detection Features as It Makes Changes (wired.com)

Apple sent this statement to media today — alas, not yours truly — and has now posted it at the top of its Child Safety webpage:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Brian Barrett and Lily Hay Newman, Wired:

It’s unclear at this point what specific changes Apple could make to satisfy its critics. Green and Pfefferkorn both suggest that the company could limit its scanning to shared iCloud albums rather than involving its customers’ devices. And Stamos says the NeuralHash issues reinforce the importance of incorporating the research community more fully from the start, especially for an untested technology.

Others remain steadfast that the company should make its pause permanent. “Apple’s plan to conduct on-device scanning of photos and messages is the most dangerous proposal from any tech company in modern history,” says Evan Greer, deputy director of digital rights nonprofit Fight for the Future. “It’s encouraging that the backlash has forced Apple to delay this reckless and dangerous surveillance plan, but the reality is that there is no safe way to do what they are proposing. They need to abandon this plan entirely.”

I doubt these features will be scrapped entirely, especially if, as Green hopes, Apple is on a path toward end-to-end encryption for iCloud storage. If you think Apple lacks the backbone to resist political pressure to expand the CSAM matching database, you certainly cannot hope for wholly encrypted iCloud storage with no way of detecting abuse at all.

I am curious about the company’s next steps, though. This has been a contentious proposal, one that I have covered extensively and, in the process, found myself moving from concerned to cautiously optimistic. I still think Apple bungled this announcement; aside from the notice added today, its Child Safety page still reads as though these are finished products that will ship in this form. This was a big public push that even media-trained executives struggled to explain clearly, and one that relied too heavily on trust in Apple at a time when tech companies face increased public skepticism. I look forward to a solution that can alleviate many researchers’ concerns, but I suspect that, as with the App Store, trust has been burned. Only Apple can rebuild it.