Private Exploit Marketplaces May Have Broad Security Benefits

Hey, remember that iPhone 5C that the U.S. government barely tried to crack before demanding Apple give them a back door, only to find a way in, just one day before a related court hearing was to begin? It turns out that the company they paid to crack it was not one of the usual suspects like Cellebrite or Grayshift.

Ellen Nakashima and Reed Albergotti, Washington Post:

The iPhone used by a terrorist in the San Bernardino shooting was unlocked by a small Australian hacking firm in 2016, ending a momentous standoff between the U.S. government and the tech titan Apple.

Azimuth Security, a publicity-shy company that says it sells its cyber wares only to democratic governments, secretly crafted the solution the FBI used to gain access to the device, according to several people familiar with the matter. The iPhone was used by one of two shooters whose December 2015 attack left more than a dozen people dead.

[…]

Apple has a tense relationship with security research firms. Wilder said the company believes researchers should disclose all vulnerabilities to Apple so that the company can more quickly fix them. Doing so would help preserve its reputation as having secure devices.

What a bizarre turn of phrase. Disclosure would help Apple “preserve its reputation as having secure devices” because it really would improve the security of its devices for all users, in much the same way that telling the fire department about a nearby fire would help a building’s reputation as a fire-free zone.

Thanks to this report, we now know some of the backstory of how the 5C came to be cracked without Apple’s intervention, and Nakashima and Albergotti confirm why the FBI was so eager to take Apple to court for this specific case:

Months of effort to find a way to unlock the phone were unsuccessful. But Justice Department and FBI leaders, including Director James B. Comey, believed Apple could help and should be legally compelled to try. And Justice Department officials felt this case — in which a dead terrorist’s phone might have clues to prevent another attack — provided the most compelling grounds to date to win a favorable court precedent.

It was not “months of effort”; according to a Department of Justice report, the FBI spent only a few hours actively trying to crack the device. But if it was not perfectly clear before, it is now: this was the model case for seeking a law enforcement back door in encryption because it involved a terrorist. The next time the FBI raised this demand, it was after another terrorist attack. In both cases, the iPhones were cracked without Apple’s intervention.

Most of all, this report adds one more data point to the debate over the ethics of the zero-day market. If Azimuth had reported the vulnerabilities it exploited in cracking this iPhone, including a critical one reportedly found well before this terrorist attack occurred, Apple could have patched them and improved the security of its devices. However, if third parties had been unable to find an adequate exploit, a court might have compelled Apple to write a version of iOS that would give law enforcement an easier time breaking into this iPhone. Once that precedent is set, it cannot be un-set.

Katie Moussouris of Luta Security on Twitter:

Selling exploits to law enforcement removes their plausible cause to petition courts to order Apple & others to self-sabotage security of all customers.

Azimuth’s exploit sale saved us all from a mandated back door then, & the court precedent that would force backdoors elsewhere.

I’m midway through Kim Zetter’s excellent “Countdown to Zero Day”. One of its chapters is dedicated to exactly this question in the context of Stuxnet: how much responsibility do security researchers have to report critical security problems to vendors? A related question some vendors, like Microsoft, may face is what obligation they have to patch vulnerabilities that friendly governments may be actively exploiting in their intelligence efforts. That is something Google’s Project Zero wrestled with recently.

In the case of this iPhone, it seems like the private exploit marketplace helped avoid a difficult trial that may have, in effect, resulted in weakened encryption. But it is a marketplace that creates clear risks: platform vendors cannot patch software they do not know is vulnerable; there is little control over the ultimate recipient of a purchased exploit, despite what companies like Azimuth say about their due diligence; and these marketplaces operate with little oversight.

It does seem plausible that this market provides some security benefit to us all. So long as bug bounty programs continue to pay well and there are true white hat researchers, vulnerabilities will continue to be found, responsibly disclosed, and patched. If this market averts the mandatory back doors or other deliberate weakening of encryption that the governments of at least seven countries are demanding, it may be to our benefit.

I do not like that idea, but I like the apparent alternatives (anything requiring deliberate flaws in encryption) a whole lot less. In a better world, these exploits would be reported immediately to platform vendors. But the lid for this particular Pandora’s box has long been lost.