Signal CEO Moxie Marlinspike Got His Hands on a Cellebrite Device signal.org

Moxie Marlinspike of Signal recently got his hands on a Cellebrite device and analyzed it from a security practices perspective. From what I understand, this is the first time an investigation into these devices has been made public, and it is not good for Cellebrite.

I wanted to focus on a specific claim in this piece:

Also of interest, the installer for Physical Analyzer contains two bundled MSI installer packages named AppleApplicationsSupport64.msi and AppleMobileDeviceSupport64.msi. These two MSI packages are digitally signed by Apple and appear to have been extracted from the Windows installer for iTunes version 12.9.0.167.

[…]

It seems unlikely to us that Apple has granted Cellebrite a license to redistribute and incorporate Apple DLLs in its own product, so this might present a legal risk for Cellebrite and its users.

I reached out to Cellebrite earlier today with questions about this claim and have not heard back. I will update this post if I get a response. While I wait, I think it is noteworthy that Apple has used Cellebrite devices to copy iPhone data in its stores. I do not know if Apple remains a Cellebrite customer, though I have asked.

Marlinspike also hints at theoretical retaliatory measures:

Given the number of opportunities present, we found that it’s possible to execute arbitrary code on a Cellebrite machine simply by including a specially formatted but otherwise innocuous file in any app on a device that is subsequently plugged into Cellebrite and scanned. There are virtually no limits on the code that can be executed.

[…]

In completely unrelated news, upcoming versions of Signal will be periodically fetching files to place in app storage. These files are never used for anything inside Signal and never interact with Signal software or data, but they look nice, and aesthetics are important in software. […]

This is a cute idea, but I am concerned about the repercussions of actually carrying it out, which, for legal reasons, I imagine is unlikely. Law enforcement and intelligence agencies around the world are already itching to weaken encryption. Do we want a messaging service taunting them for a cheap laugh? And if the purpose is to weaken evidence extracted by Cellebrite devices by introducing chaff into the system, I am again not sure that is desirable for maintaining the legality and ready availability of strong encryption.

Marlinspike leads his post by listing the many countries with dubious human rights records that Cellebrite counts as customers. This is the part of private exploit marketplaces that does not sit right with me. I am grateful that they exist because they give law enforcement a way, with a warrant, to crack devices that does not require eliminating encryption or deliberately adding a back door. But I do not like that private companies have different standards of accountability than state-run organizations.

To be very clear: I am not arguing that I would prefer that police forces begin collecting zero-days, nor that intelligence agencies have a good record of following the law, nor that operating only within democratic countries would guarantee lawful and just use. But it is concerning to see companies like Cellebrite selling their services in jurisdictions where they are being used to further policies ranging from the oppressive to the authoritarian. If we are to have a private marketplace for vulnerabilities — and I do not realistically see it going anywhere — there must be greater incentives for ethical sales and responsible disclosure.