Assessing the Goals and Consequences of the Proposed EARN IT Act

cyberlaw.stanford.edu

Alfred Ng, CNet:

Google, Facebook, Microsoft, Twitter, Snap and Roblox have agreed to adopt 11 voluntary principles to prevent online child sexual exploitation, government officials said Thursday. But the effort also hints at the potential to undercut encryption, an essential element of online security.

[…]

The federal government has argued that it doesn’t want to end encryption that protects the average person, and instead wants “lawful access.” The concept would mean creating a technical opening, or backdoor, that only law enforcement could use in investigations — something cryptography experts have long argued is impossible.

Tech companies like Apple, Facebook, Google and Microsoft agree with those experts and have refused to create backdoors to their encryption protocols. They’ve warned that if they’re forced to create such openings, it would essentially weaken security for everyone by creating an unlock tool that could fall into the wrong hands.

Additional reporting from Ng:

Depending on who you ask, the EARN IT Act could either destroy the fundamental values of an open internet or protect children from being sexually exploited online. The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act, which requires tech companies to meet safety requirements for children online before obtaining immunity from lawsuits, will have its first public hearing on Wednesday.

Unluckily for me, I have a severe allergy to strained backronyms and have broken out in hives. Please send help.

A bipartisan group of US lawmakers introduced the bill Thursday, saying that the legislation would enforce standards to protect children from sexual exploitation online. The announcement came at the same time the Justice Department hosted a press event to argue that end-to-end encryption protects online predators.

While few would question the importance of ensuring child safety, technology experts warn that the bill is really just the government’s latest attempt to uproot both free speech and security protections online.

A copy of the current draft of the Act can be found on the Senate website (PDF).

Elliot Harmon of the Electronic Frontier Foundation:

The EARN IT Act would create a “National Commission on Online Child Sexual Exploitation Prevention” tasked with developing “best practices” for owners of Internet platforms to “prevent, reduce, and respond” to child exploitation online. But far from mere recommendations, those “best practices” would essentially become legal requirements: if a platform failed to adhere to them, it would lose essential legal protections for free speech.

[…]

As we mentioned when we wrote about the prior version of EARN IT, Section 230 does not exempt online intermediaries from liability for a violation of federal criminal law. If a platform knowingly distributes child exploitation imagery, then the Department of Justice can and must enforce the law. What’s more, if an Internet company finds sexual abuse material on its platform, the law requires it to provide that information to the National Center for Missing and Exploited Children and to cooperate with law enforcement investigations.

Riana Pfefferkorn of the Center for Internet and Society:

The bill would, in effect, allow unaccountable commissioners to set best practices making it illegal for online service providers (for chat, email, cloud storage, etc.) to provide end-to-end encryption — something it is currently 100% legal for them to do under existing federal law, specifically CALEA. That is, the bill would make providers liable under one law for exercising their legal rights under a different law. Why isn’t this conflict with CALEA acknowledged anywhere in the bill? (We saw the exact same problem with the ill-fated Burr/Feinstein attempt to indirectly ban smartphone encryption.)

In a tangentially related report, Vice created a dataset of five hundred iPhone search warrants to give some context to this discussion.

Joseph Cox:

One of the top level findings of Motherboard’s dataset is that many law enforcement agencies and officials cannot reliably access data stored on iPhones. Whether that’s due to a device having too strong a passcode, the phone being damaged, an unlocking capability not being available at that specific point in time, or a particular agency not having access to advanced forensic technology itself, Motherboard found many cases where investigators were not able to extract data from iPhones, at least according to the search warrants.

But in some cases officials were able to obtain data from a variety of devices, including some of the latest models of iPhones offered at the time. Multiple federal agencies and local police departments have access to tools from companies such as Grayshift and Cellebrite, which can, depending on a variety of factors, unlock and obtain data from iPhones.

[…]

Most of all, the records compiled by Motherboard show that the capability to unlock iPhones is a fluid issue, with an ebb and flow of law enforcement sometimes being able to access devices and others not. The data solidifies that some law enforcement officials do have trouble accessing data stored on iPhones. But ultimately, our findings lead experts to circle back to the fundamental policy question: should law enforcement have guaranteed access to iPhones, with the trade-offs in iPhone security that come with that?

This piece focuses on the iPhone because it has a consistent and known security policy, but this question applies similarly to every device and mode of communication.

I don’t think anyone would doubt the inherent good in creating laws to ensure the safety of children and assist in the capture and prosecution of those who abuse them. I entirely support the idea of an encryption standard that preserves the security and privacy of legal activities, yet still allows law enforcement to surveil and capture those who abuse its protections to commit serious crimes. Nothing like that currently exists, however, and it is unlikely to exist for the foreseeable future. We should not choose to become less safe because of the limitations of math, nor should we punish technologists for being unable to comply with impossible requests.

Update: Lauren Feiner, CNBC:

Senators disputed the tech industry’s claims that a bipartisan bill targeting tech’s long-standing legal shield would prohibit encryption by necessity.

“This bill says nothing about encryption,” Sen. Richard Blumenthal, D-Conn., said at a hearing Wednesday to discuss the legislation. Blumenthal introduced the EARN IT Act last week with Senate Judiciary Committee Chairman Lindsey Graham, R-S.C., ranking member Dianne Feinstein, D-Calif., and Sen. Josh Hawley, R-Mo.

Issie Lapowsky, Protocol:

[…] The EARN IT Act still opens up the possibility that an administration interested in weakening encryption — as the last several have been — could make Section 230 immunity dependent upon building a backdoor for law enforcement. If that weren’t at least part of the goal of the bill, Mayer said, its authors could easily write in language to allay those concerns. But they haven’t.

It’s worth asking why that is the case.