Data Security on Mobile Devices (securephones.io)

It is baffling to me that, in 2021, I still do not know the security practices of the devices and cloud services I use more frequently than ever.

This became particularly worrisome last year when I began working my day job from my personal computer. I have several things in my favour: it is an iMac, not a portable computer, so there is dramatically less risk of unauthorized physical access; I keep an encrypted Time Machine backup and an encrypted Backblaze remote backup; I use pretty good passwords. But what about my phone and iCloud, for example? I do not use either for much work stuff, but I inevitably have some communications and two-factor authentication apps on my iPhone, and I use iCloud for backups.

Over the holidays, I immersed myself in an early copy of a new report by Johns Hopkins University students Maximilian Zinkus and Tushar Jois, and associate professor Matthew Green, as I tried to find answers to what should be simple questions. The researchers’ conclusions in the now-published report were eye-opening to me:

  • Limited benefit of encryption for powered-on devices. We observed that a surprising amount of sensitive data maintained by built-in applications is protected using a weak “available after first unlock” (AFU) protection class, which does not evict decryption keys from memory when the phone is locked. The impact is that the vast majority of sensitive user data from Apple’s built-in applications can be accessed from a phone that is captured and logically exploited while it is in a powered-on (but locked) state.

[…]

  • Limitations of “end-to-end encrypted” cloud services. Several Apple cloud services advertise “end-to-end” encryption in which only the user (with knowledge of a password or passcode) can access cloud-stored data. We find that the end-to-end confidentiality of some encrypted services is undermined when used in tandem with the iCloud backup service. More critically, we observe that Apple’s documentation and user settings blur the distinction between “encrypted” (such that Apple has access) and “end-to-end encrypted” in a manner that makes it difficult to understand which data is available to Apple. Finally, we observe a fundamental weakness in the system: Apple can easily cause user data to be re-provisioned to a new (and possibly compromised) [Hardware Security Module] simply by presenting a single dialog on a user’s phone. We discuss techniques for mitigating this vulnerability.
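
To make the first point concrete: the “available after first unlock” behaviour maps onto iOS’s Data Protection classes, which developers select per file. Here is a minimal sketch using the documented `Data.WritingOptions` values; the file names and payload are hypothetical, and this illustrates the classes the researchers describe, not Apple’s internal code:

```swift
import Foundation

// Illustrative only: the two Data Protection classes at issue.
// Runs on iOS; file names and contents are hypothetical.
let documents = FileManager.default.urls(for: .documentDirectory,
                                         in: .userDomainMask)[0]
let secrets = Data("sensitive notes".utf8)

do {
    // Complete protection: the file's decryption key is evicted shortly
    // after the device locks, so the data is unreadable until the next
    // unlock.
    try secrets.write(to: documents.appendingPathComponent("locked.dat"),
                      options: .completeFileProtection)

    // "Available after first unlock" (AFU): once the user unlocks the
    // device a single time after boot, the key stays in memory even
    // while the phone is locked. A captured, powered-on phone can still
    // yield this file — the weakness the report highlights in Apple's
    // built-in apps.
    try secrets.write(to: documents.appendingPathComponent("afu.dat"),
                      options: .completeFileProtectionUntilFirstUserAuthentication)
} catch {
    print("write failed: \(error)")
}
```

The stronger class exists and is one API call away; the researchers’ finding is that a surprising amount of data in Apple’s own apps sits in the weaker one.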

The muddy distinction between “encryption”, “end-to-end encryption”, and “true end-to-end encryption in a way that Apple cannot reverse” remains a source of consternation, especially as Apple’s own documentation is anything but precise.
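
The difference is easier to see in code than in documentation. In the first model the provider holds the key, so “encrypted” data is still readable (and disclosable) by the provider; in the second, the key is derived from a secret only the user knows. This is a toy sketch, not Apple’s implementation: the names are made up, and HKDF stands in for the deliberately slow password KDF a real system would use.

```swift
import Foundation
import CryptoKit

let backup = Data("message history, photos, notes".utf8)

do {
    // Model 1, "encrypted": the provider generates and stores the key.
    // It can decrypt the data, and therefore can hand it over.
    let providerKey = SymmetricKey(size: .bits256) // held server-side
    let providerReadable = try AES.GCM.seal(backup, using: providerKey)

    // Model 2, "end-to-end encrypted": the key is derived from a secret
    // only the user knows, so the provider stores ciphertext it cannot
    // read. (A real design would use a slow KDF such as PBKDF2 or
    // scrypt; HKDF keeps this sketch short.)
    let passcode = Data("user-passcode".utf8)
    let userKey = HKDF<SHA256>.deriveKey(
        inputKeyMaterial: SymmetricKey(data: passcode),
        salt: Data("per-account salt".utf8),
        info: Data("backup-key".utf8),
        outputByteCount: 32
    )
    let userOnly = try AES.GCM.seal(backup, using: userKey)
    _ = (providerReadable, userOnly)
} catch {
    print("encryption failed: \(error)")
}
```

As I read the report, the iCloud Backup finding is that the second model quietly degrades into the first: data from services marketed as end-to-end encrypted can still surface through a backup Apple can read.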

Lily Hay Newman, Wired:

“It just really shocked me, because I came into this project thinking that these phones are really protecting user data well,” says Johns Hopkins cryptographer Matthew Green, who oversaw the research. “Now I’ve come out of the project thinking almost nothing is protected as much as it could be. So why do we need a backdoor for law enforcement when the protections that these phones actually offer are so bad?”

If there is one conclusion of this report that is damning precisely because of its silver lining, it is that it calls bullshit on law enforcement’s insistence that smartphone encryption creates some sort of investigative black hole. Maybe Apple has successfully created encryption that is strong enough for personal and business use with a strictly controlled opening for legitimate legal use, by sticking itself in the middle of that chain. That is how I am reading between the lines of the statement an unidentified spokesperson provided to Wired:

The researchers shared their findings with the Android and iOS teams ahead of publication. An Apple spokesperson told WIRED that the company’s security work is focused on protecting users from hackers, thieves, and criminals looking to steal personal information. The types of attacks the researchers are looking at are very costly to develop, the spokesperson pointed out; they require physical access to the target device and only work until Apple patches the vulnerabilities they exploit. Apple also stressed that its goal with iOS is to balance security and convenience.

There are security problems with iOS devices and iCloud services that Apple can and should fix, but I bet there are many that it will not, because it is perhaps unwise to be a company that explicitly tries to block any subpoena from having an effect. If that is the case, Apple ought to say so. It should be plain to users what their security options are, and Apple ought to be more honest in its marketing and documentation of these features.

If Apple is appointing itself guardian of its users’ data (in iCloud form, including the defaulted-to-on iCloud Backups of iPhones, iPads, and Apple Watches) that also means it can respond to law enforcement requests at any level and from any agency. Depending on how much you trust your local police and national intelligence services, perhaps that does not seem like a great idea to you. More worrying, it leaves Apple open to becoming party to corrupt regimes’ human rights abuses if it is responsive to data requests for activists’ accounts, or if it complies with device search requests from border patrol.

Maybe there are only bad options, and this is the one that strikes the least bad balance between individual security and mass security. But the compromises seem real and profound, and they are, officially, undocumented.