Twice now, the U.S. Department of Justice has pushed Apple to help decrypt iPhones involved in high-profile crimes. Twice, Apple has pushed back. And twice, the popular press has framed these cases in terms that do not help general-audience readers understand why Apple is refusing demands to cooperate, instead using language that implicitly supports those who believe our rights should be compromised to a lowest common denominator.

The Department of Justice first began this campaign in the aftermath of the December 2015 mass shooting in San Bernardino, California. Two individuals murdered fourteen people in a workplace terrorist attack motivated by extremist views. The perpetrators were killed. One had an iPhone and, while Apple was able to provide investigators with a copy of the data stored in iCloud, it could not help with the phone’s unknown passcode. The Department of Justice attempted to use the All Writs Act to compel the company to disable any passcode-guessing countermeasures that might be enabled, and Apple refused on the grounds that doing so would universally undermine its products’ security and set a precedent against encryption — more on that later. The two parties fought and nearly ended up in court before the FBI enlisted a third-party vendor to crack the passcode. Ultimately, nothing of investigative value was found on the phone.
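Those guessing countermeasures are what make a short passcode viable at all. A back-of-the-envelope sketch shows why disabling them matters (the per-guess timing below is an illustrative assumption for this sketch, not an Apple specification):

```python
# Back-of-the-envelope sketch: why passcode-guessing countermeasures matter.
# The per-guess cost is an illustrative assumption, not an Apple figure.
PER_GUESS_SECONDS = 0.08  # assumed hardware key-derivation cost per attempt


def worst_case_hours(digits: int) -> float:
    """Hours to exhaustively try every numeric passcode of a given length."""
    return (10 ** digits) * PER_GUESS_SECONDS / 3600


# With no escalating delays and no erase-after-repeated-failures limit,
# a short numeric passcode falls quickly to exhaustive guessing.
print(f"4 digits: {worst_case_hours(4):.2f} hours")
print(f"6 digits: {worst_case_hours(6):.1f} hours")
```

Escalating delays between failed attempts and the optional erase-after-repeated-failures setting are what turn those hours into an effectively impossible attack, which is why a demand to switch them off is, in practice, a demand to defeat the device’s encryption.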

It has been over four years since that case began, and officials did not, in that time, again attempt to compel Apple to weaken the security of its products. That is, despite nearly seven thousand devices apparently being inaccessible in the first eleven months of 2017 alone, the Department of Justice made no further requests for unlocking assistance from Apple.

Until recently, that is, when a case of horrible déjà vu struck. In December 2019, one person motivated by extremist views murdered three people in a terrorist attack at his workplace. The perpetrator had two iPhones, one of which he shot before being killed by police. Apple has provided investigators with the data they were able to access, but is not assisting with the decryption of the iPhones in question.

Which is how we arrive at today’s announcement from U.S. Attorney General William Barr that he wants more “substantive assistance” from Apple in decrypting the two phones used by the perpetrator in this most recent attack — and, more specifically, Katie Benner’s report for the New York Times:

Mr. Barr’s appeal was an escalation of an ongoing fight between the Justice Department and Apple pitting personal privacy against public safety.

This is only three paragraphs in, and it is already setting up the idea that personal privacy and public safety are two opposing ends of a gradient. That’s simply not true. A society with less personal privacy does not inherently have better public safety; Russia and Saudi Arabia have respectable HDI scores, brutal censorship and surveillance, and higher murder rates than similarly advanced countries that lack an authoritarian anti-privacy stance.

More worrisome, however, is how easily the issue of encryption is minimized as being merely about personal privacy, when it is far more versatile, powerful, and useful than that. The widespread availability of data encryption is one reason many companies today are comfortable with employees working remotely, since company secrets can’t be read by anyone unauthorized. Encryption helps journalists get better information from sources who must remain anonymous. Encryption is why I haven’t had a printed bank statement in ten years, and how you know you’re giving your health care information to your insurance provider and not to an impostor. Encryption helps marginalized people bypass unjust and oppressive laws where they may travel or live. It makes commerce work better. It prevents messages from being read by an abusive ex-partner. It gives you confidence that you can store your work and personal life on a single device.

The U.S. Department of Justice is trying to compel Apple to weaken the encryption of every iOS device. That will set a precedent that those who implement encryption technologies ought to loosen them upon request. And that means that everything we gain from it is forever undermined.

Public safety would not be improved if encryption were weakened — it would be gutted.

Knowing all that helps one see why a summary like this is wildly inaccurate:

Apple has given investigators materials from the iCloud account of the gunman, Second Lt. Mohammed Saeed Alshamrani, a member of the Saudi air force training with the American military, who killed three sailors and wounded eight others on Dec. 6. But the company has refused to help the F.B.I. open the phones themselves, which would undermine its claims that its phones are secure.

Apple is not declining to decrypt these iPhones for marketing reasons. If anything, the public reaction to its stance will be highly negative, as it was in 2015. They are refusing Barr’s request because it means, in effect, that we must forgo all of the benefits we have gained from encryption.

Apple says as much in a statement they gave Scott Lucas of BuzzFeed:

We are continuing to work with the FBI, and our engineering teams recently had a call to provide additional technical assistance. Apple has great respect for the Bureau’s work, and we will work tirelessly to help them investigate this tragic attack on our nation.

We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers. Today, law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users’ data.

This is the same thing experts keep telling lawmakers, who persist in believing that it’s a matter of hard work and willpower rather than a limitation of reality — and then cast their lack of understanding in profoundly offensive terms:

“Companies shouldn’t be allowed to shield criminals and terrorists from lawful efforts to solve crimes and protect our citizens,” Senator Tom Cotton, Republican of Arkansas, said in a statement. “Apple has a notorious history of siding with terrorists over law enforcement. I hope in this case they’ll change course and actually work with the F.B.I.”

Setting aside how stupid and disliked Cotton has proved himself to be, it’s revealing that his best argument is to claim that Apple sides with terrorists. He really hasn’t got a clue.

Back to the Times report:

The San Bernardino dispute was resolved when the F.B.I. found a private company to bypass the iPhone’s encryption. Tensions between the two sides, however, remained; and Apple worked to ensure that neither the government nor private contractors could open its phones.

This is one of those instances where a reporter is so close to getting it, but ends up missing the mark and landing in dangerous territory. Apple fixed a bunch of iOS security problems; these changes simultaneously prevent investigators and criminals from gaining access to devices because both are unauthorized, as far as the security infrastructure is concerned. Any breach of that may help law enforcement, but it will also help people trying to break into, for example, the President’s iPhone. Weakening security for one weakens it for everyone.

Apple did not “ensure” that it locked law enforcement out of its products. It fixed bugs.

Apple did not respond to a request for comment. But it will not back down from its unequivocal support of encryption that is impossible to crack, people close to the company said.

This Times piece was published before Apple responded at length to reporters — as linked above — but its position has been admirably consistent. Much in the same way that it’s impossible to draw a line between security holes for good people and security holes for bad people, it’s also hard not to see this ending with encryption compromised everywhere. If the Department of Justice thinks it should be breached for this device, why not the thousands of apparently inaccessible devices in storage lockers? If encryption should not apply to devices belonging to the dead, why not devices belonging to the living? If investigators can get into encrypted devices at rest, why wouldn’t they feel entitled to decrypt communications in transit? If the relatively stable and reputable law enforcement of the United States can gain access, what about officers in other countries? Why not other branches of the justice system, or even officials more broadly? Are there any countries that you would exclude from access?

There is a point at which I expect many people will start to push back against this ever-expanding list of those allowed access to encrypted communications. From a purely technical perspective, it doesn’t matter where you stop: perhaps you don’t think a particular corrupt regime should be allowed to decrypt communications and devices on demand, or you object to other branches of a government having access, or you think this policy should apply only to devices with dead owners. It simply doesn’t matter. Once you allow one point of access, you allow them all. Code doesn’t care whether a back door was accessed by an investigator with a warrant, a stalker, a corrupt official, or a thief.

This story is not a case of a stubborn tech company feeling it is above the law. It is about an opportunistic Department of Justice making an impossible demand: that devices allow access to the authorized user, law enforcement agencies, and nobody else. The department did not press this demand in the years between high-profile terrorist attacks, so it isn’t about principle. It’s about taking advantage of a situation it knows will be a hard public relations battle for Apple — in large part because the public at large doesn’t understand the unfeasibility of what is being asked. Articles like this one do nothing to explain that, and only help to push the government’s dangerous position.

After the past few years of all “big tech companies” being lumped into the same pile of public distrust, I fear the government might win this time. As a result, we, the public, will lose our electronic privacy, security, and safety.