Apple Walks a Privacy Tightrope to Spot Child Abuse in iCloud wired.com

Andy Greenberg, Wired:

But critics like Johns Hopkins University cryptographer Matt Green suspect more complex motives in Apple’s approach. He points out that the great technical lengths Apple has gone to to check images on a user’s device, despite that process’s privacy protections, only really make sense in cases where the images are encrypted before they leave a user’s phone or computer and server-side detection becomes impossible. And he fears that this means Apple will extend the detection system to photos on users’ devices that aren’t ever uploaded to iCloud — a kind of on-device image scanning that would represent a new form of invasion into users’ offline storage.

Or, in a more optimistic scenario for privacy advocates, he speculates Apple may be planning to add end-to-end encryption for iCloud, and has created its new CSAM detection system as a way to appease child safety advocates and law enforcement while encrypting its cloud storage such that it can’t otherwise access users’ photos. “What Apple is doing here is a technology demonstration,” Green says. “It’s not something they need to scan unencrypted iCloud photos. It’s something you need if the photos you’re scanning are going to be encrypted in the future.”

Greenberg’s article contains the best reporting I have seen on these announcements. A fair summary of how Apple’s new approach differs from other CSAM detection efforts is that the matching happens on the device, before the file is ever uploaded. If, as Green speculates, it is also to be used to scan local files that are not destined for iCloud, that is clearly troublesome. But if it exists to enable end-to-end iCloud encryption and is never applied to purely local files, it seems like an overall privacy benefit.
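To make that distinction concrete, here is a deliberately simplified sketch of what “matching on the device before upload” means structurally. This is not Apple’s implementation: the real system uses the NeuralHash perceptual hash, a blinded hash database from child safety organizations, and a threshold-based cryptographic voucher scheme, none of which appear here. Every name and function below is a hypothetical stand-in.

```swift
import Foundation
import CryptoKit

// Stand-in for a perceptual hash. Apple's actual system uses NeuralHash,
// a learned perceptual hash, not a cryptographic digest of the file bytes.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical hash database shipped to the device. In Apple's described
// design this is a blinded set of known-CSAM hashes, unreadable on-device.
let knownHashes: Set<String> = ["<known hash 1>", "<known hash 2>"]

// Placeholder for the actual network upload.
func send(_ imageData: Data, matchedKnownHash: Bool) {
    print("uploading \(imageData.count) bytes, matched: \(matchedKnownHash)")
}

// The structural point: the check runs on the device, before the photo
// leaves it, so the server never needs to inspect a decrypted copy.
func uploadToICloud(_ imageData: Data) {
    let matched = knownHashes.contains(imageHash(imageData))
    // In Apple's described design the result is wrapped in a cryptographic
    // "safety voucher" that only becomes readable to Apple after a threshold
    // of matches; here it is reduced to a plain flag for illustration.
    send(imageData, matchedKnownHash: matched)
}
```

The reason this structure matters is that the server-side half of the check never requires a readable photo, which is what would make it compatible with end-to-end encrypted iCloud storage in Green’s more optimistic scenario.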

Following that line of speculation further, I wonder why Apple would create so much confusion in its communication of this change. Why drop this news at the beginning of August, disconnected from any other product or service launch? Why not announce it alongside end-to-end iCloud encryption, perhaps later this year? Perhaps it is because these features have garnered the approval of the NCMEC, an organization that has loudly protested encryption, both in statements to Greenberg and internally at Apple. If that is the case, Apple may demonstrate that it can control the spread of CSAM with its products and services while improving user privacy, if you trust it.