Since 2022, lawmakers in the European Union have been trying to pass legislation requiring digital service providers to scan for and report child sexual abuse material (CSAM) as it passes through their services.
Giacomo Zandonini, Apostolis Fotiadis, and Luděk Stavinoha, Balkan Insight, with a good summary in September 2023:
Welcomed by some child welfare organisations, the regulation has nevertheless been met with alarm from privacy advocates and tech specialists who say it will unleash a massive new surveillance system and threaten the use of end-to-end encryption, currently the ultimate way to secure digital communications from prying eyes.
[…]
The proposed regulation is excessively “influenced by companies pretending to be NGOs but acting more like tech companies”, said Arda Gerkens, former director of Europe’s oldest hotline for reporting online CSAM.
This is going to require a little back-and-forth, and I will pick up the story with quotations from Matthew Green’s introductory remarks to a panel before the European Internet Services Providers Association in March 2023:
The only serious proposal that has attempted to address this technical challenge was devised — and then subsequently abandoned — by Apple in 2021. That proposal aimed only at detecting known content using a perceptual hash function. The company proposed to use advanced cryptography to “split” the evaluation of hash comparisons between the user’s device and Apple’s servers: this ensured that the device never received a readable copy of the hash database.
[…]
The Commission’s Impact Assessment deems the Apple approach to be a success, and does not grapple with this failure. I assure you that this is not how it is viewed within the technical community, and likely not within Apple itself. One of the most capable technology firms in the world threw all their knowledge against this problem, and were embarrassed by a group of hackers: essentially before the ink was dry on their proposal.
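Green’s description is compact, so a sketch may help. What follows is my own illustrative average-hash matcher, not Apple’s NeuralHash, and it leaves out the cryptographic “split” Green mentions — the private set intersection that kept the hash database unreadable on the device. It shows only the core idea: fuzzy matching of media against a set of known hashes.

```python
# A minimal sketch of perceptual-hash matching against known content.
# The average-hash function and the distance threshold are illustrative
# choices of mine; Apple's design used its NeuralHash function and split
# the comparison cryptographically between device and server.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Collapse an image to a 64-bit fingerprint of its coarse structure."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)  # 1 if pixel brighter than average
    return bits

def is_known(candidate: int, known: set[int], max_distance: int = 5) -> bool:
    """Match if the Hamming distance to any known hash is small enough."""
    return any(bin(candidate ^ h).count("1") <= max_distance for h in known)
```

The fuzziness that makes a perceptual hash robust to resizing and recompression is also what attackers exploit: researchers produced collisions against Apple’s hash function shortly after its details became public, which is the embarrassment Green describes.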
Daniel Boffey, the Guardian, in May 2023:
Now leaked internal EU legal advice, which was presented to diplomats from the bloc’s member states on 27 April and has been seen by the Guardian, raises significant doubts about the lawfulness of the regulation unveiled by the European Commission in May last year.
The European Parliament in a November 2023 press release:
In the adopted text, MEPs excluded end-to-end encryption from the scope of the detection orders to guarantee that all users’ communications are secure and confidential. Providers would be able to choose which technologies to use as long as they comply with the strong safeguards foreseen in the law, and subject to an independent, public audit of these technologies.
Joseph Menn, Washington Post, in March 2024, reporting on a European court ruling:
While some American officials continue to attack strong encryption as an enabler of child abuse and other crimes, a key European court has upheld it as fundamental to the basic right to privacy.
[…]
The court praised end-to-end encryption generally, noting that it “appears to help citizens and businesses to defend themselves against abuses of information technologies, such as hacking, identity and personal data theft, fraud and the improper disclosure of confidential information.”
This ruling is not directly about the proposed CSAM measures, but it sets a precedent for European regulators to follow.
Natasha Lomas, TechCrunch, this week:
The most recent Council proposal, which was put forward in May under the Belgian presidency, includes a requirement that “providers of interpersonal communications services” (aka messaging apps) install and operate what the draft text describes as “technologies for upload moderation”, per a text published by Netzpolitik.
Article 10a, which contains the upload moderation plan, states that these technologies would be expected “to detect, prior to transmission, the dissemination of known child sexual abuse material or of new child sexual abuse material.”
Meredith Whittaker, CEO of Signal, issued a PDF statement criticizing the proposal:
Instead of accepting this fundamental mathematical reality, some European countries continue to play rhetorical games. They’ve come back to the table with the same idea under a new label. Instead of using the previous term “client-side scanning,” they’ve rebranded and are now calling it “upload moderation.” Some are claiming that “upload moderation” does not undermine encryption because it happens before your message or video is encrypted. This is untrue.
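The ordering Whittaker objects to is simple enough to sketch. The function names below are hypothetical stand-ins of mine, not anything from the draft regulation; the point is the sequence: content is inspected while still readable, and encryption is applied only afterward.

```python
import hashlib

# Stand-in for a centrally supplied database of flagged content. Real
# proposals contemplate perceptual hashes or classifiers for "new"
# material, but the ordering below is what matters.
FLAGGED_HASHES: set[bytes] = set()

def upload_moderation(plaintext: bytes) -> bool:
    """Runs on-device, on readable content, before any encryption."""
    return hashlib.sha256(plaintext).digest() in FLAGGED_HASHES

def send_message(plaintext: bytes, encrypt) -> bytes:
    if upload_moderation(plaintext):
        pass  # under the proposal, a report would leave the device here
    return encrypt(plaintext)  # end-to-end encryption happens last
```

The encryption routine itself is untouched, which is the narrow sense in which officials can claim the proposal is not “breaking encryption”; but detection logic answering to a third party has already read the message, which is the sense in which it breaks the protections encryption exists to provide.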
Patrick Breyer, of Germany’s Pirate Party:
Only Germany, Luxembourg, the Netherlands, Austria and Poland are relatively clear that they will not support the proposal, but this is not sufficient for a “blocking minority”.
Ella Jakubowska on X:
The exact quote from [Věra Jourová] the Commissioner for Values & Transparency: “the Commission proposed the method or the rule that even encrypted messaging can be broken for the sake of better protecting children”
Věra Jourová on X, some time later:
Let me clarify one thing about our draft law to detect online child sexual abuse #CSAM.
Our proposal is not breaking encryption. Our proposal preserves privacy and any measures taken need to be in line with EU privacy laws.
Matthew Green on X:
Coming back to the initial question: does installing surveillance software on every phone “break encryption”? The scientist in me squirms at the question. But if we rephrase as “does this proposal undermine and break the *protections offered by encryption*”: absolutely yes.
Maïthé Chini, the Brussels Times:
It was known that the qualified majority required to approve the proposal would be very small, particularly following the harsh criticism of privacy experts on Wednesday and Thursday.
[…]
“[On Thursday morning], it soon became clear that the required qualified majority would just not be met. The Presidency therefore decided to withdraw the item from today’s agenda, and to continue the consultations in a serene atmosphere,” a Belgian EU Presidency source told The Brussels Times.
That is a truncated history of this piece of legislation: regulators want platform operators to detect and report CSAM; platforms and experts say that will conflict with security and privacy promises, even if media is scanned prior to encryption. This proposal may be specific to the E.U., but you can find similar plans to curtail or invalidate end-to-end encryption around the world:
I selected English-speaking regions because English is the only language I can read, but I am sure there are more regions facing threats of their own.
We are not served by pretending this threat is limited to any specific geography. The benefits of end-to-end encryption are being threatened globally. The E.U.’s attempt may have been pushed aside for now, but another will rise somewhere else, and then another. It is up to civil rights organizations everywhere to continue arguing for the necessary privacy and security protections offered by end-to-end encryption.