Day: 14 January 2020

In 2016, ABC News’ David Muir interviewed Tim Cook about why Apple was fighting the FBI’s order to create a modified version of iOS that would allow the forced unlocking of the iPhone used by one of the San Bernardino shooting perpetrators. Memorably, Cook called the development of any backdoor the “software equivalent of cancer”. He also described what the FBI was asking for: a version of iOS without the setting that erases the phone’s data after ten failed passcode attempts, and with the ability for the FBI to try an unlimited number of passcodes as fast as a computer could enter them. Now the FBI seems to be asking for something similar: once again, it wants Apple’s help decrypting iPhones for law enforcement.
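
For a sense of scale, here is a rough back-of-envelope calculation of what unlimited, computer-speed attempts would mean. Apple’s iOS security documentation has described the passcode key derivation as calibrated to take roughly 80 milliseconds per attempt; the Swift sketch below assumes that rate and purely numeric passcodes, so treat it as illustrative rather than a precise model.

    import Foundation

    // Worst-case brute-force time if the erase-after-ten-failures setting
    // and the escalating delays were removed, leaving only the ~80 ms key
    // derivation described in Apple's security documentation (an assumed
    // figure for this sketch).
    let secondsPerAttempt = 0.08

    for digits in [4, 6] {
        let combinations = pow(10.0, Double(digits))
        let hours = combinations * secondsPerAttempt / 3600
        print("\(digits)-digit passcode: up to \((hours * 10).rounded() / 10) hours")
    }
    // Prints roughly 0.2 hours (about 13 minutes) for four digits,
    // and about 22 hours for six.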

At no point — then or now — has Cook or anyone at Apple publicly confirmed how such a backdoor might be installed, or whether it’s even possible. Presumably, it would use the iOS update mechanism, but how could permission be granted if the passcode to the iPhone isn’t known? After all, you must enter your iPhone’s passcode to install a software update. When you plug an iPhone into a computer, you must enter the passcode on the phone to enable a trusted data connection. But I thought there might be a way around all of this with one of iOS’ recovery modes.

Coincidentally, I have an almost perfect environment in which to test this. I recently had to install a clean copy of MacOS Catalina on my MacBook Air[1] and had not yet connected my iPhone to that laptop, so I had something that could simulate a stranger’s computer performing an update. And, happily, Apple released a new seed of iOS 13.3.1 today, so I had something to update to.

In the interest of testing this, I risked spending all evening restoring my iPhone’s data and followed Apple’s directions to enter recovery mode.[2] I was able to update my iPhone to a newer version of iOS from a local .ipsw package without once entering my passcode.

  1. I downloaded the software update package from Apple’s developer website. Presumably, this means that any software update signed by Apple could be used instead.

  2. I connected my iPhone to my MacBook Air and forced a reboot, which cleared the temporary Face ID passcode authorization from the phone’s memory. Then I restarted it once more, this time into recovery mode.

  3. MacOS prompted me to update or restore the phone. I picked “Cancel” to close the dialog box, then option-clicked on the “Update” button in Finder so I could select the software update package instead of using one from Apple’s server. It began and completed installation, then prompted for my passcode twice before it switched to an “Attempting Data Recovery” screen. After this process completed, my iPhone booted normally.
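
Finder’s option-click is a GUI affordance, but the same kind of local-package install can presumably be scripted. Here’s a minimal Swift sketch that shells out to idevicerestore, the open-source restore tool from the libimobiledevice project; the binary path and package filename are placeholders, and I haven’t verified that its update path behaves identically to Finder’s.

    import Foundation

    // Sketch only: feed a local, Apple-signed .ipsw to a device sitting in
    // recovery mode, via the open-source idevicerestore tool (libimobiledevice
    // project). The path and filename are placeholders; my understanding is
    // that, without its erase flag, the tool attempts an update that
    // preserves user data.
    let restore = Process()
    restore.executableURL = URL(fileURLWithPath: "/usr/local/bin/idevicerestore")
    restore.arguments = ["iPhone_13.3.1_beta.ipsw"]  // placeholder package name

    do {
        try restore.run()
        restore.waitUntilExit()
        print(restore.terminationStatus == 0 ? "Update completed" : "idevicerestore failed")
    } catch {
        print("Could not launch idevicerestore: \(error)")
    }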

To be clear, my iPhone still prompted for its passcode when the update had finished its installation process. This did not magically unlock my iPhone. It also doesn’t prove that passcode preferences could be changed without first entering the existing valid passcode.

But it did prove the existence of one channel through which an iPhone could be forced to update to a compromised version of iOS, one whose implications would be catastrophic for iPhones today and into the future, and for encrypted data in its entirety. It is possible; it is terrible.

Update: I’ve had a few people ask questions about what this proves, or express doubt that this would enable an iPhone to be unlocked. To be perfectly clear, a compromised software package with the workarounds the FBI has asked for would have to be signed with Apple’s key for it to be installed. The passcode would still have to be cracked for user data to be extracted from the phone. But if Apple were legally compelled to comply with the FBI’s request in San Bernardino, this proves that a software update package containing the workarounds can be installed on an iPhone without having to enter a passcode.


  1. Long story short, my MacBook Air contains a battery and an SSD from two different third-party vendors. The battery is one year old and comes from an organization well-known for their advocacy of right-to-repair legislation, and its capacity has already been reduced by over a third. I’ve been trying to get a replacement, even though it’s just out of warranty, and had to perform a series of tests to verify the age and wear on the battery. While trying to do these tests, the third-party SSD — from a different company that’s similarly well-known for their stance on repairing electronics — also started to fail. I replaced the third-party SSD with the original one that came with the MacBook Air, wiped the drive, and did a clean install of MacOS Catalina on it.

    I have two takeaways. First, I am receiving a free replacement battery, even though the one-year warranty has lapsed. I haven’t been so lucky with the SSD. I am admittedly a month and a bit outside of the manufacturer’s three-year warranty, but it is fairly disappointing that I have all sorts of SSDs and spinning rust drives that have outlived that drive.

    The second takeaway is that, even though I share some principles and sympathy with right-to-repair advocates, I would be much more convinced about its merits if they shipped higher quality products that lasted longer. It’s entirely anecdotal and probably bad luck, in part, if not in full. But this experience underscores that — in addition to environmental and ethical reasons for device repair rather than replacement — the biggest advocates are businesses that sell parts. ↥︎

  2. This documentation is ridiculous at times:

    On a Mac with macOS Catalina 10.15, open Finder. On a Mac with macOS Mojave 10.14 or earlier, or on a PC, open iTunes. If iTunes is already open, close it.

    There’s no way to read this that makes sense. “If you’re using Mojave, open iTunes. If iTunes is now open, close it.” is the first and most literal way to read it, but is clearly wrong. “If you’re using Mojave, open iTunes. If you had iTunes open first, close it, then reopen it.” is the second way to read it, but it also seems silly. ↥︎

Seb Joseph, Digiday:

Apple’s iOS 13 update, released in September, includes regular reminders when apps are sucking up a user’s location data. The pop-up gives a user a chance to choose from the following options: allowing data collection at all times, or only when the app is open — or only one time. Four months in, ad tech sources are reporting the result that some observers had predicted: There’s less location data coming from apps.

Right now opt-in rates to share data with apps when they’re not in use are often below 50%, said Benoit Grouchko, who runs the ad tech business Teemo that creates software for apps to collect location data. Three years ago those opt-in rates were closer to 100%, he said. Higher opt-in rates prevailed when people weren’t aware that they even had a choice. Once installed on a phone, many apps would automatically start sharing a person’s location data.

Apple did not dither around trying to strike some balance between letting advertisers keep collecting location data at will and nominally protecting user privacy. It didn’t even block background location access. It just changed iOS so that users must deliberately allow background access, and the system now reminds users when apps actually use that access. That’s all. Yet these simple changes have made it difficult for companies you’ve never heard of to monetize information you didn’t know you were sharing.
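
To make the mechanism concrete, this is roughly what the app side looks like in Core Location. The wrapper class is my own illustration; the relevant iOS 13 behaviour is that an “Always” grant is only provisional, and the follow-up reminder prompts are shown by the system on its own schedule, outside the app’s control.

    import CoreLocation

    // Illustrative only: how an app asks for location access on iOS 13.
    // Matching usage-description strings must exist in Info.plist
    // (NSLocationWhenInUseUsageDescription and
    // NSLocationAlwaysAndWhenInUseUsageDescription) or no prompt appears.
    final class LocationRequester: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func start() {
            manager.delegate = self
            // On iOS 13, this no longer offers "Always" up front. The user
            // sees "Allow Once", "Allow While Using App", or "Don't Allow";
            // an Always grant is provisional, and iOS later shows its own
            // reminder prompt, including a map of what the app collected.
            manager.requestAlwaysAuthorization()
        }

        func locationManager(_ manager: CLLocationManager,
                             didChangeAuthorization status: CLAuthorizationStatus) {
            // .authorizedWhenInUse, .authorizedAlways, .denied, .restricted
            print("Authorization changed: \(status.rawValue)")
        }
    }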

Apple, not coincidentally and unlike some of its competitors, is not a company making its money off personalized advertising.

Justin Schuh of Google:

After initial dialogue with the web community, we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete. Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome. Our intention is to do this within two years. But we cannot get there alone, and that’s why we need the ecosystem to engage on these proposals. We plan to start the first origin trials by the end of this year, starting with conversion measurement and following with personalization.

Google’s Privacy Sandbox plans still require the cooperation and support of the web’s standards bodies, which is why the company is pretending to be hindered from making privacy-supportive changes unilaterally. It probably is, ultimately, a privacy-friendly move, albeit one undercut by suspicions that it will further entrench Google’s business.

That wouldn’t be true if the world’s most popular browser were not owned by a personalized advertising company. C’est la vie.