Apple Issues Statement Ahead of Heat Initiative Launch, Which Is Pressuring for CSAM Detection in iCloud wired.com

Lily Hay Newman, reporting earlier this week in Wired:

[…] This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company “detect, report, and remove” child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company. 

Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company’s response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

The campaign website contains some horrific stories from news sources, and it is explicitly targeting Apple. Sarah Gardner, its CEO, was previously Vice President of External Affairs at Thorn, a non-profit that builds Safer,1 a CSAM detection product for platform operators, and has partnerships across the tech industry.

That Apple got out in front of this is unusual for the company, but not surprising. CSAM is an obviously awful subject; even the mere act of writing about it and its consequences is upsetting to me. What is a little surprising is that Apple gave Wired a copy of the email (PDF) Gardner sent — apparently to Tim Cook — and the response from Apple’s Erik Neuenschwander. In that letter, Neuenschwander notes that scanning tools can be repurposed on demand for wider surveillance, something Apple earlier denied it would comply with but which nevertheless remains a concern; he also notes the risk of false positives.

Here is where I got confused; Neuenschwander:

Scanning of personal data in the cloud is regularly used by companies to monetize the information of their users. While some companies have justified those practices, we’ve chosen a very different path — one that prioritizes the security and privacy of our users. Scanning every user’s privately stored iCloud content would in our estimation pose serious unintended consequences for our users. Threats to user data are undeniably growing — globally the total number of data breaches more than tripled between 2013 and 2021, exposing 1.1 billion personal records in 2021 alone. As threats become increasingly sophisticated, we are committed to providing our users with the best data security in the world, and we constantly identify and mitigate emerging threats to users’ personal data, on device and in the cloud. Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit.

I quoted the entire paragraph so you can be sure I am not taking anything out of context.

The first two sentences appear to reference ad tech companies’ exploitation of users’ data, but it is unclear to me which businesses and which information they are gesturing at. Google, for example, says it does not use Gmail, Google Drive, or Google Photos data for advertising. If this is simply a general statement that any data in third-party hands can be monetized, it feels like a non sequitur in this context.

The next sentences reference data breaches. This is relevant only in the sense that Apple encrypts users’ files in iCloud, and cloud-based CSAM scanning technologies require unencrypted files in order to function as designed. The quoted paragraph appears to be Apple’s justification for encrypting user data before it is in the company’s hands, and the subsequent paragraphs in the letter seem like a defence against using scanning mechanisms more generally.

I can understand the arguments raised by Gardner in her letter to Cook and on the campaign’s website, but the proposed approach remains necessarily risky for all users. It is not too far removed from proposals against encryption as a whole, which require all users to forgo significant privacy and security protections against worldwide threats. One of the stories on Heat Initiative’s website concerns a man who abused his then-fiancée’s daughter in photo and video recordings, some of which were stored in his personal iCloud account. It is not clear to me how this case and others like it would have been discovered by Apple even if it had proceeded with its proposed local CSAM detection system, as it would only alert on media already known to reporting authorities like NCMEC.
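To make that limitation concrete, here is a minimal sketch of the matching model, not Apple’s actual implementation: it assumes a hypothetical loadKnownFingerprints() source for a vetted hash list, and it substitutes a plain SHA-256 of the file bytes for the perceptual NeuralHash Apple described. The structural point is that detection is a lookup against a fixed set of known hashes, so newly created material can never produce a match.

```swift
import Foundation
import CryptoKit

// Hypothetical source for a vetted list of fingerprints of *already known*
// material, analogous to the hash databases maintained by groups like NCMEC.
// Empty here; a real system would ship a blinded, encrypted database.
func loadKnownFingerprints() -> Set<String> {
    []
}

// Stand-in fingerprint: a plain SHA-256 of the file bytes. Apple's proposal
// used a perceptual hash (NeuralHash), which is not reproduced here.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

let knownFingerprints = loadKnownFingerprints()

// Detection is only a set-membership test: an image matches if and only if
// its fingerprint is already in the database, so novel imagery that has
// never been catalogued produces no match and no alert.
func wouldAlert(on imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```

Under that model, newly produced recordings like those described on the campaign’s website would not have been surfaced by the system Apple abandoned.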

More generally, there should be a clearer separation between what is personal and what is public, and there should be more assertive expectations for each. I do not think anything I store in iCloud should be the property of anyone but myself; it should be treated as an extension of my local storage. That is obviously not a defence of anybody creating or storing these heinous materials; I simply think my own files are private regardless of where they are stored. Public iCloud photo albums are visible to the world and should be subject to greater scrutiny. Apple could do at least one thing differently: it is surprising to me that shared public iCloud albums do not have any button to report misuse.2


  1. The website is currently down, but the Internet Archive has a recent mirror. ↥︎

  2. Also that they still feature the old iCloud logo. ↥︎