Why can’t Apple decrypt your iPhone?

Last week I wrote about Apple’s new default encryption policy for iOS 8. Since that piece was intended for general audiences I mostly avoided technical detail. But since some folks (and apparently the Washington Post!) are still wondering about the nitty-gritty details of Apple’s design, I thought it might be helpful to sum up what we know and noodle about what we don’t.

To get started, it’s worth pointing out that disk encryption is hardly new with iOS 8. In fact, Apple’s operating system has enabled some form of encryption since before iOS 7. What’s happened in the latest update is that Apple has decided to protect much more of the interesting data on the device under the user’s passcode. This includes photos and text messages — things that were not previously passcode-protected, and which police very much want access to.*

Excerpt from Apple iOS Security Guide, 9/2014.

So to a large extent the ‘new’ feature Apple is touting in iOS 8 is simply that they’re encrypting more data. But it’s also worth pointing out that newer iOS devices — those with an “A7 or later A-series processor” — also add substantial hardware protections to thwart device cracking.

In the rest of this post I’m going to talk about how these protections may work and how Apple can realistically claim not to possess a back door.

One caveat: I should probably point out that Apple isn’t known for showing up at parties and bragging about their technology — so while a fair amount of this is based on published information provided by Apple, some of it is speculation. I’ll try to be clear where one ends and the other begins.

Password-based encryption 101

Normal password-based file encryption systems take in a password from a user, then apply a key derivation function (KDF) that converts a password (and some salt) into an encryption key. This approach doesn’t require any specialized hardware, so it can be securely implemented purely in software provided that (1) the software is honest and well-written, and (2) the chosen password is strong, i.e., hard to guess.
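To make this concrete, here is a minimal sketch in Python (illustrative only, and nothing resembling Apple's actual code) of deriving an encryption key from a password and a salt:

    import os
    import hashlib

    def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
        # Stretch the password and salt into a 256-bit key using PBKDF2-HMAC-SHA256.
        return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt,
                                   iterations, dklen=32)

    salt = os.urandom(16)   # random salt, stored alongside the ciphertext
    key = derive_key("correct horse battery staple", salt)

The key is only as good as the password that went into it, which is exactly the problem described next.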

The problem here is that nobody ever chooses strong passwords. In fact, since most passwords are terrible, it’s usually possible for an attacker to break the encryption by working through a ‘dictionary‘ of likely passwords and testing to see if any decrypt the data. To make this really efficient, password crackers often use special-purpose hardware that takes advantage of parallelization (using FPGAs or GPUs) to massively speed up the process.
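In code terms, a dictionary attack is just a loop over candidate passwords. Here is a hedged Python sketch, where check_key is a stand-in for whatever lets the attacker verify a guess (say, attempting to decrypt a known block of data):

    import hashlib

    def dictionary_attack(candidates, salt, iterations, check_key):
        # Derive a key for each candidate password and test whether it decrypts the data.
        for candidate in candidates:
            key = hashlib.pbkdf2_hmac("sha256", candidate.encode("utf-8"), salt,
                                      iterations, dklen=32)
            if check_key(key):
                return candidate
        return None

Each trip through the loop is cheap, and nothing stops an attacker from running many copies of it in parallel on dedicated hardware.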

Thus a common defense against cracking is to use a ‘slow’ key derivation function like PBKDF2 or scrypt. Each of these algorithms is designed to be deliberately resource-intensive, which does slow down normal login attempts — but hits crackers much harder. Unfortunately, modern cracking rigs can defeat a KDF like PBKDF2 by simply throwing more hardware at the problem. Memory-hard KDFs like scrypt are designed to resist exactly this kind of parallelism, but that is not the direction Apple has gone.
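In practice the iteration count for a KDF like PBKDF2 is chosen by benchmarking: keep raising it until a single derivation takes roughly the target amount of time on the hardware in question. A rough sketch (the 80ms target echoes the figure Apple cites for on-device derivation, discussed below; the rest is illustrative):

    import time
    import hashlib

    def calibrate_iterations(target_seconds: float = 0.08, start: int = 10_000) -> int:
        # Double the iteration count until one PBKDF2 run takes at least the target time.
        iterations = start
        while True:
            t0 = time.perf_counter()
            hashlib.pbkdf2_hmac("sha256", b"benchmark password", b"fixed salt value",
                                iterations, dklen=32)
            if time.perf_counter() - t0 >= target_seconds:
                return iterations
            iterations *= 2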

How Apple’s encryption works

Apple doesn’t use scrypt. Their approach is to add a 256-bit device-unique secret key called a UID to the mix, and to store that key in hardware where it’s hard to extract from the phone. Apple claims that it does not record these keys nor can it access them. On recent devices (with A7 chips), this key and the mixing process are protected within a cryptographic co-processor called the Secure Enclave.

The Apple Key Derivation function ‘tangles’ the password with the UID key by running both through PBKDF2-AES — with an iteration count tuned to require about 80ms on the device itself.** The result is the ‘passcode key’. That key is then used as an anchor to secure much of the data on the phone.

Overview of Apple key derivation and encryption (iOS Security Guide, p.10)
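Conceptually, and only conceptually (Apple's real construction runs the hardware AES engine, keyed with the UID, inside the derivation itself, which portable code can't reproduce), the tangling amounts to mixing a device-bound secret into the KDF so the result is useless without the device. A hedged sketch, with HMAC standing in for the UID-keyed hardware operation:

    import hmac
    import hashlib

    def passcode_key(passcode: str, uid: bytes, salt: bytes, iterations: int) -> bytes:
        # Bind the passcode to the device-unique UID. (HMAC-SHA256 here is a stand-in
        # for the UID-keyed AES operation Apple describes; it is not Apple's algorithm.)
        tangled = hmac.new(uid, passcode.encode("utf-8"), hashlib.sha256).digest()
        # Stretch the tangled value, with the iteration count tuned to ~80ms on-device.
        return hashlib.pbkdf2_hmac("sha256", tangled, salt, iterations, dklen=32)

Because the UID never leaves the hardware, an attacker who copies the encrypted storage off the phone has nothing useful to feed into this function.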

Since only the device itself knows the UID — and the UID can’t be removed from the Secure Enclave — this means all password cracking attempts have to run on the device itself. That rules out the use of FPGAs or ASICs to crack passwords. Of course Apple could write custom firmware that attempts to crack the keys on the device, but even in the best case such cracking could be pretty time consuming, thanks to the 80ms PBKDF2 timing.

(Apple pegs such cracking attempts at 5 1/2 years for a random 6-character password consisting of lowercase letters and numbers. PINs will obviously take much less time, sometimes as little as half an hour. Choose a good passphrase!)
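These figures are easy to sanity-check with back-of-the-envelope arithmetic (Python as a calculator; the 345-year figure assumes the Secure Enclave's 5-second delay discussed in note ** below):

    # 6-character passcode drawn from lowercase letters and digits, ~80ms per guess.
    guesses = 36 ** 6                          # 2,176,782,336 possible passcodes
    seconds = guesses * 0.080                  # ~174 million seconds of on-device work
    print(seconds / (3600 * 24 * 365))         # ~5.5 years to search the whole space

    # With an additional 5-second per-attempt delay enforced by the Secure Enclave,
    # the same exhaustive search stretches to roughly 345 years.
    print(guesses * 5 / (3600 * 24 * 365))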

So one view of Apple’s process is that it depends on the user picking a strong password. A different view is that it also depends on the attacker’s inability to obtain the UID. Let’s explore this a bit more.

Securing the Secure Enclave

The Secure Enclave is designed to prevent exfiltration of the UID key. On earlier Apple devices this key lived in the application processor itself. Secure Enclave provides an extra level of protection that holds even if the software on the application processor is compromised — e.g., jailbroken.

One worrying thing about this approach is that, according to Apple’s documentation, Apple controls the signing keys that sign the Secure Enclave firmware. So using these keys, they might be able to write a special “UID extracting” firmware update that would undo the protections described above, and potentially allow crackers to run their attacks on specialized hardware.

Which leads to the following question: how does Apple avoid holding a backdoor signing key that allows them to extract the UID from the Secure Enclave?

It seems to me that there are a few possible ways forward here.

  1. No software can extract the UID. Apple’s documentation even claims that this is the case; that software can only see the output of encrypting something with UID, not the UID itself. The problem with this explanation is that it isn’t really clear that this guarantee covers malicious Secure Enclave firmware written and signed by Apple.

Update 10/4: Comex and others (who have forgotten more about iPhone internals than I’ve ever known) confirm that #1 is the right answer. The UID appears to be connected to the AES circuitry by a dedicated path, so software can set it as a key, but never extract it. Moreover this appears to be the same for both the Secure Enclave and older pre-A7 chips. So ignore options 2-4 below.

  2. Apple does have the ability to extract UIDs. But they don’t consider this a backdoor, even though access to the UID should dramatically decrease the time required to crack the password. In that case, your only defense is a strong password.
  3. Apple doesn’t allow firmware updates to the Secure Enclave firmware, period. This would be awkward and limiting, but it would let them keep their customer promise re: being unable to assist law enforcement in unlocking phones.
  4. Apple has built a nuclear option. In other words, the Secure Enclave allows firmware updates — but before doing so, the Secure Enclave will first destroy intermediate keys. Firmware updates are still possible, but if/when a firmware update is requested, you lose access to all data currently on the device.

 

All of these are possible answers. In general, it seems reasonable to hope that the answer is #1. But unfortunately this level of detail isn’t present in the Apple documentation, so for the moment we just have to cross our fingers.
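To see why #1 is such a satisfying answer, it helps to think of the UID not as data but as a capability: software can ask the hardware to perform operations keyed with the UID, but there is simply no instruction that reads the key back out. A toy model in Python (HMAC standing in for the hardware AES engine; obviously nothing like the real silicon, where there is no read path at all):

    import os
    import hmac
    import hashlib

    class ToySecureEnclave:
        def __init__(self):
            # 'Fused' at manufacture. In real hardware the key sits on a dedicated
            # path into the AES engine and software has no way to read it out.
            self._uid = os.urandom(32)

        def wrap(self, data: bytes) -> bytes:
            # Callers only ever see the output of a UID-keyed operation.
            return hmac.new(self._uid, data, hashlib.sha256).digest()

    enclave = ToySecureEnclave()
    tag = enclave.wrap(b"passcode-derived material")   # usable output; the UID itself stays put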

Addendum: how did Apple’s “old” backdoor work?

One wrinkle in this story is that allegedly Apple has been helping law enforcement agencies unlock iPhones for a while. This is probably why so many folks are baffled by the new policy. If Apple could crack a phone last year, why can’t they do it today?

But the most likely explanation is the simplest one: Apple was never really ‘cracking’ anything. Rather, they simply had a custom boot image that allowed them to bypass the ‘passcode lock’ screen on a phone. This would be purely a UI hack and it wouldn’t grant Apple access to any of the passcode-encrypted data on the device. However, since earlier versions of iOS didn’t encrypt all of the phone’s interesting data using the passcode, the unencrypted data would be accessible upon boot.

No way to be sure this is the case, but it seems like the most likely explanation.

Notes:

* Previous versions of iOS also encrypted these records, but the encryption key was not derived from the user’s passcode. This meant that (provided one could bypass the actual passcode entry phase, something Apple probably does have the ability to do via a custom boot image), the device could decrypt this data without any need to crack a password.

** As David Schuetz notes in this excellent and detailed piece, on phones with Secure Enclave there is also a 5 second delay enforced by the co-processor. I didn’t (and still don’t) want to emphasize this, since I do think this delay is primarily enforced by Apple-controlled software and hence Apple can disable it if they want to. The PBKDF2 iteration count is much harder to override.

30 thoughts on “Why can’t Apple decrypt your iPhone?”

  1. Just the detail I craved, Dr. Green! As an avid reader of your blog thank you for keeping everything so current.

  2. Re hypothesis #3: the wording of the Secure Enclave description does mention a “personalized software update separate from the application processor”, so I believe that the “immutable firmware” option can be safely ruled out.

  3. Re hypothesis #4: It seems to me that it would be possible to retain all data on the device in this scenario (given of course that the end user has entered the passcode and the old encryption key is stored in RAM). I can think of two ways to accomplish this: (a) after upgrading the Secure Enclave firmware, decrypt all data on the phone using the old key and re-encrypt it using the new key (derived from the same passphrase but a new UID), or (b) use a two-stage approach where the derived encryption key is only used to protect a secondary key (in which case you can do (a) without re-encrypting anything other than the secondary key).

  4. One little note: If the cracking is done on the device itself, it would take 5½ years for a 6-character lowercase/digit password because of the 80ms PBKDF2 timing (https://www.apple.com/privacy/docs/iOS_Security_Guide_Sept_2014.pdf).
    But on newer devices (A7 and up / iPhone 5S and up) the new 'Secure Enclave' enforces a 5-second delay between tries. This should change the 5.5 years to 345 (or half a day instead of 15 minutes for a 4-digit password), unless the Secure Enclave could be bypassed when trying passwords (which doesn't seem to be the case?).

  5. This is a disappointing blog post, because you are giving some credibility to a system that has none. All Apple / Google need to do is upload hotcode to sniff your key to compromise the system. Further, they have to do just that if forced to do so with a court order. There is no security here.

  6. Great post! In the future, please avoid posting screenshots of documents. There are a number of situations where they break down – for instance, screen readers for the blind (particularly if there is no alt text). Viewing these images on mobile devices with small screens is also problematic, particularly if the user clicks a link inside an app like facebook or hacker news.

  7. I don't see the problem of “cracking” this encryption scheme. Here's a scenario:
    Mr. FBI man goes to the coffee shop the target/citizen was recently in and hands the shop owner or manager a National Security Letter the agent wrote there on the spot to take… I mean procure the surveillance video of the target entering their password on their iPhone, and thereby defeats the 'unbreakable' encryption that Apple 'has no access to'.
    If the video is not clear enough, the Gestapo… err, I mean FBI agent proceeds to the next destination of his victim… err, I mean the citizen.

  8. Good point. The 'standard' delay (“iPhone is disabled / try again in 1 minute”) is definitely handled by iOS itself. But the 5-second delay is handled by the 'Secure Enclave':

    Quote from the 'security guide' pdf:
    ————–
    On a device with an A7 or later A-series processor, the key operations are performed by the Secure Enclave, which also enforces a 5-second delay between repeated failed unlocking requests. This provides a governor against brute-force attacks in addition to safeguards enforced by iOS.
    ————–

    So it depends on whether Apple is able to patch the Secure Enclave's firmware in a way that makes it possible to bypass this delay. Any thoughts on this? (I don't know)

  9. “Normal password-based file encryption systems take in a password from a user, then apply a key derivation function (KDF) that converts a password (and some salt) into an encryption key. This approach doesn't require any specialized hardware, so it can be securely implemented purely in software provided that (1) the software is honest and well-written, and (2) the chosen password is strong, i.e., hard to guess.” I'm not sure you meant this the way it was stated; the remainder of the article makes it clear that cryptographic offloading remediates the risk of key exposure, yet here the language insinuates a software-only security approach. Might be worth an edit to clarify (those who don't read through and understand will be misled into thinking you can secure a system in software only, which isn't possible here).

  10. This whole argument rests on the assumption that Apple does not have access to the “hardware key” a priori. Apple says that *they* don't have the key, which, of course, means exactly that if Apple is not lying. THEY do not have it, but someone else may have it, maybe someone who Apple works with. No one besides Apple knows. Thus, as always, it is about trust. In this case about trusting Apple. I don't.

  11. (1) backups
    (2) extract the UID through physical examination of the device

    (1) backups.
    Backups can be restored to new devices when the original device is lost or destroyed. As far as I can tell this means that any encryption applied to backups cannot depend on a device-specific key. Thus to bypass the UID security, all a gov actor needs to do is secretly subpoena the iCloud backup.

    (2) Just because firmware installed on the device can't extract the UID doesn't mean it can't be extracted by physically hooking up probes directly to the Secure Enclave. This requires taking apart the phone and having specialized hardware… This might make it secure from the common street thug, but certainly not from gov actors or organized crime.

  12. Yes, it does require that Apple isn't keeping a list somewhere of UIDs for every phone, and that they couldn't reverse-engineer that number by knowing the starting seed for the random number generator and counting the number of devices manufactured… I think these are solvable problems, but proving that their manufacturing process precludes them from generating such a list is a hard thing to do.

  13. How long would it take to crack the four digit swiped PIN code 7852 that most people use?

    80ms.

    How long until a million people, previously using 7852 but now changed to something else that they can't remember, lock themselves out of their phones?

    Three days.

  14. Has anyone done any research into the possibility of extracting the UID using specialized hardware? A google or two didn't turn up anything obvious.

  15. Apple probably doesn’t have access to those devices anymore; they are not lying. By encrypting everything with the Enclave they have outsourced data management/extraction/encryption to big brother; since data is protected quite well now, allowing Apple to access it would be high risk. It had to move up the chain.

    Notice the bit about hardware manufacturing and fusing keys into hardware; that process leaves a list of keys and the processors they are assigned to, it has to.

    Software can’t extract the keys, because the internet is very insecure these days. Take a closer look at iPhone chargers and how they are built (cables and chargers have their own serial numbers… really, what for?). With the usage of smart power meters, iPhones can be easily accessible via chargers over power lines. The Chinese already invaded us some time ago with cheap chargers that were stealing our passwords from BBs; I’m sure they didn’t come up with this concept by themselves. Apple is cutting edge when it comes to security, that is why their stock and popularity are growing super quickly, but all data/devices are accessible by those who have access.
    Don’t forget old methods of access via simple UDIDs: http://www.forbes.com/sites/parmyolson/2012/09/04/fbi-agents-laptop-hacked-to-grab-12-million-apple-ids-anonymous-claims/

    Let’s see who will say first it is conspiracy 🙂

  16. Google…. please… Google has been useless for about 2 years now. Google is a happy place now without any negative comments, not to mention any real data/articles. It's like cable TV: 300 channels and nothing to watch 🙂

  17. That would blow up on them in 2 days; the Starbucks owner would tweet that they came and asked for footage… They have full access to all the mobile devices. If somebody thinks otherwise they must be nuts. After 9/11 there was a law that says any cryptography software producer has to provide a backdoor to the US gov.

  18. You don't seem to understand the article. If the key is stored in the Secure Enclave (and never leaves it), there is nothing to sniff. So Apple can do exactly nothing about it (except change security in their next devices).
