Natasha Lomas, TechCrunch:
The European Union has formally presented its proposal to move from a situation in which some tech platforms voluntarily scan for child sexual abuse material (CSAM) to something more systematic — publishing draft legislation that will create a framework which could obligate digital services to use automated technologies to detect and report existing or new CSAM, and also identify and report grooming activity targeting kids on their platforms.
Lomas reports this is an attempt to unify a splintered set of policies that currently vary by country within the EU. But, as written, it appears to require that providers be able to locally scan the contents of messages, and even to detect possible coercion of minors, if ordered to do so.
The proposal may appear superficially to take a balanced and proportionate approach. In particular, providers can be forced to scan their platform or service only if required to do so by a judicial authority, and are subject to a series of safeguards. According to Contexte, many of these safeguards were introduced only in the last few days, which shows that pressure from the EDRi network and our supporters has had a positive effect.
However, several provisions indicate that these protections are mainly cosmetic, and that we may in fact be facing the worst-case scenario for private digital communications. For example, providers of services and platforms must take measures to mitigate the risk of abuse being facilitated by their platform. But they will still be liable to be issued with a detection order forcing them to introduce additional measures unless their risk assessment has demonstrated that no risk of abuse remains at all.
Even German child protection advocates are worried that this proposal is overbroad. It is one to keep an eye on for its potentially far-reaching consequences.