Last week, Apple released a white paper detailing the security of the entire iOS stack: the hardware, the firmware, and the software layers on top of them. While I feel it’s a very approachable document even for the moderately technically inclined, it’s a lot of information.
Happily, a number of writers have begun distilling that information into much more digestible summaries. Here’s TechCrunch’s Greg Kumparak explaining iMessage’s security:
So if Apple never has your private key, how do messages arrive at all of your devices in a readable form? How do your private key(s) get from one device to the other?
Simple answer: they don’t. You’ve actually got one set of keys for each device you add to iCloud, and each iMessage is encrypted independently for each device. So if you have two devices — say, an iPad and an iPhone — each message sent to you is actually encrypted (AES-128) and stored on Apple’s servers twice. Once for each device. When you pull down a message, it’s specifically encrypted for the device you’re on.
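The fan-out Kumparak describes can be sketched in a few lines. To be clear, this is a toy illustration, not Apple’s implementation: `toy_encrypt` is a stand-in for real AES-128, and the device names and key handling are invented for the example.

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy stand-in for AES-128: XOR against a hash-derived keystream.

    Illustrative only -- do not use this as real cryptography.
    """
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Hypothetical recipient with two registered devices, each with its own key.
device_keys = {
    "iPhone": secrets.token_bytes(16),
    "iPad": secrets.token_bytes(16),
}

message = b"See you at 6?"

# The server holds one independently encrypted copy per device --
# twice here: once for the iPhone, once for the iPad.
stored_copies = {
    name: toy_encrypt(key, message) for name, key in device_keys.items()
}

# Each device can decrypt only its own copy (the XOR cipher is symmetric).
assert toy_encrypt(device_keys["iPad"], stored_copies["iPad"]) == message
assert toy_encrypt(device_keys["iPad"], stored_copies["iPhone"]) != message
```

The point of the structure: the server never needs a single key that unlocks everything, because no such key exists.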
TechCrunch again, but Darrell Etherington this time writing about Touch ID:
The document also includes previously revealed technical data around the Touch ID scanner itself, which takes an 88-by-88-pixel, 500-ppi raster scan of the finger being applied, which is then transmitted to the Secure Enclave, vectorized for the purposes of being analyzed and compared to fingerprints stored in memory, and then discarded. This info, it’s worth recalling, is never transmitted to Apple’s servers, nor is it stored in iCloud or the iTunes backup of a device.
Here’s Rich Mogull from TidBITS, on iCloud Keychain:
When passwords are added or changed, Apple syncs only the individual keychain items to other devices that need the update, one at a time. In other words, each keychain item is sent only to each device that needs it, the item is encrypted so only that device can read it, and only one item at a time passes through iCloud.
To read it, an attacker would need to compromise both the key of the receiving device and your iCloud password. Or re-architect the entire process without the user knowing. Even a malicious Apple employee would need to compromise the fundamental architecture of iCloud in multiple locations to access your keychain items surreptitiously.
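Mogull’s description amounts to a per-item, per-device loop. Here is a minimal sketch using the same sort of toy cipher as above; the item names, device list, and `needs` table are all invented for illustration, and the real system wraps items with per-device keys rather than anything this simple.

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for real per-device encryption -- illustration only.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Hypothetical devices, each holding its own key.
device_keys = {
    "MacBook": secrets.token_bytes(16),
    "iPhone": secrets.token_bytes(16),
}

# Hypothetical record of which devices need which keychain item.
needs = {
    "example.com password": ["MacBook", "iPhone"],
    "hotel-wifi password": ["iPhone"],
}

def sync_item(item_name: str, secret: bytes):
    """Send one changed item at a time, encrypted separately for each
    device that needs it -- never the whole keychain, never in the clear."""
    for device in needs[item_name]:
        yield device, toy_encrypt(device_keys[device], secret)

# Only the iPhone receives the Wi-Fi password, readable only with its key.
updates = dict(sync_item("hotel-wifi password", b"hunter2"))
assert list(updates) == ["iPhone"]
assert toy_encrypt(device_keys["iPhone"], updates["iPhone"]) == b"hunter2"
```

Even in this toy version, intercepting the ciphertext in transit gets an attacker nothing without the receiving device’s key, which matches Mogull’s point about needing to compromise multiple layers.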
While I think it’s ill-advised to blindly trust any company, the extent and depth of Apple’s security architecture for iOS seems extremely robust. There’s a reason jailbreaks take so long to create, and why the holes they exploit are closed as quickly as possible.
Put it this way: Apple doesn’t have a reason to be lax on privacy or security. They earn money by selling physical products and software to customers, not by selling personal information.