The story is definitely worth a read, though it may leave you shaking your head a bit. Ars's quoted security experts make some good points, but they do it in a strange way -- and they propose some awfully questionable fixes.
But maybe I'm too picky. To be honest, I didn't realize that there was even a question about who controlled the encryption key to iCloud storage. Of course Apple does -- for obvious technical reasons that I'll explain below. You don't need to parse Apple's Terms of Service to figure this out, which is the odd path that Ars's experts have chosen:
In particular, Zdziarski cited particular clauses of iCloud Terms and Conditions that state that Apple can "pre-screen, move, refuse, modify and/or remove Content at any time" if the content is deemed "objectionable" or otherwise in violation of the terms of service. Furthermore, Apple can "access, use, preserve and/or disclose your Account information and Content to law enforcement authorities" whenever required or permitted by law.

Well, fine, but so what -- Apple's lawyers would put stuff like this into their ToS even if they couldn't access your encrypted content. This is what lawyers do. These phrases don't prove that Apple can access your encrypted files (although, I remind you, they absolutely can), any more than Apple's patent application for a 3D LIDAR camera 'proves' that you're going to get one in your iPhone 5.
Without quite realizing what I was doing, I managed to get myself into a long Twitter-argument about all this with the Founder & Editor-in-Chief of Ars, a gentleman named Ken Fisher. I really didn't mean to criticize the article that much, since it basically arrives at the right conclusions -- albeit with a lot of nonsense along the way.
Since there seems to be some interest in this, I suppose it's worth a few words. This may very well be the least 'technical' post I've ever written on this blog, so apologies if I'm saying stuff that seems a little obvious. Let's do it anyway.
The mud puddle test
You don't have to dig through Apple's ToS to determine how they store their encryption keys. There's a much simpler approach that I call the 'mud puddle test':
- First, drop your device(s) in a mud puddle.
- Next, slip in said puddle and crack yourself on the head. When you regain consciousness you'll be perfectly fine, but won't for the life of you be able to recall your device passwords or keys.
- Now try to get your cloud data back.
If you can, then your provider can access your data without your keys. And it goes without saying: so can every random attacker who can guess your recovery information or compromise your provider's servers.
Now I realize that the mud puddle test doesn't sound simple, and of course I don't recommend that anyone literally do this -- head injuries are no fun at all. It's just a thought experiment, or in the extreme case, something you can 'simulate' if you're willing to tell your provider a few white lies.
But you don't need to simulate it in Apple's case, because it turns out that iCloud is explicitly designed to survive the mud puddle test. We know this thanks to two iCloud features. These are (1) the ability to 'restore' your iCloud backups to a brand new device, using only your iCloud password, and (2) the 'iForgot' service, which lets you recover your iCloud password by answering a few personal questions.
Since you can lose your device, the key isn't hiding there. And since you can forget your password, it isn't based on that. Ergo, your iCloud data is not encrypted end-to-end, not even using your password as a key (or if it is, then Apple has your password on file, and can recover it from your security questions.) (Update: see Jonathan Zdziarski's comments at the end of this post.)
You wanna make something of it?
No! It's perfectly reasonable for a consumer cloud storage provider to design a system that emphasizes recoverability over security. Apple's customers are far more likely to lose their password/iPhone than they are to be the subject of a National Security Letter or data breach (hopefully, anyway).
Moreover, I doubt your median iPhone user even realizes what they have in the cloud. The iOS 'Backup' service doesn't advertise what it ships to Apple (though there's every reason to believe that backed up data includes stuff like email, passwords, personal notes, and those naked photos you took.) But if people don't think about what they have to lose, they don't ask to secure it. And if they don't ask, they're not going to receive.
My only issue is that we have to have this discussion in the first place. That is, I wish that companies like Apple could just come right out and warn their users: 'We have access to all your data, we do bulk-encrypt it, but it's still available to us and to law enforcement whenever necessary'. Instead we have to reverse-engineer it by inference, or by parsing through Apple's ToS. That shouldn't be necessary.
But can't we fix this with Public-Key Encryption/Quantum Cryptography/ESP/Magical Unicorns?
No, you really can't. And this is where the Ars Technica experts go a little off the rails. Their proposed solution is to use public-key encryption to make things better. Now this is actually a great solution, and I have no objections to it. It just won't make things better.
To be fair, let's hear it in their own words:
First, cloud services should use asymmetric public key encryption. "With asymmetric encryption, the privacy and identity of each individual user" is better protected, Gulri said, because it uses one "public" key to encrypt data before being sent to the server, and uses another, "private" key to decrypt data pulled from the server. Assuming no one but the end user has access to that private key, then no one but the user—not Apple, not Google, not the government, and not hackers—could decrypt and see the data.

I've added the boldface because it's kind of an important assumption.
To make a long story short, there are two types of encryption scheme. Symmetric encryption algorithms have a single secret key that is used for both encryption and decryption. The key can be generated randomly, or it can be derived from a password. What matters is that if you're sending data to someone else, then both you and the receiver need to share the same key.
Asymmetric, or public-key encryption has two keys, one 'public key' for encryption, and one secret key for decryption. This makes it much easier to send encrypted data to another person, since you only need their public key, and that isn't sensitive at all.
But here's the thing: the difference between these approaches is only related to how you encrypt the data. If you plan to decrypt the data -- that is, if you ever plan to use it -- you still need a secret key. And that secret key is secret, even if you're using a public-key encryption scheme.
Which brings us to the real problem with all encrypted storage schemes: someone needs to hold the secret decryption key. Apple has made the decision that consumers are not in the best position to do this. If they were willing to allow consumers to hold their decryption keys, it wouldn't really matter whether they were using symmetric or public-key encryption.
So what is the alternative?
Well, for a consumer-focused system, maybe there really isn't one. Ultimately people back up their data because they're afraid of losing their devices, which cuts against the idea of storing encryption keys inside of devices.
You could take the PGP approach and back up your decryption keys to some other location (your PC, for example, or a USB stick). But this hasn't proven extremely popular with the general public, because it's awkward -- and sometimes insecure.
Alternatively, you could use a password to derive the encryption/decryption keys. This approach works fine if your users pick decent passwords (although they mostly won't), and if they promise not to forget them. But of course, the convenience of Apple's "iForgot" service indicates that Apple isn't banking on users remembering their passwords. So that's probably out too.
In the long run, the answer for non-technical users is probably just to hope that Apple takes good care of your data, and to hope you're never implicated in a crime. Otherwise you're mostly out of luck. If you're a tech-savvy user, don't use iCloud; try to find a better service that's willing to take its chances on you as the manager of your own keys.
I haven't said anything in this post that you couldn't find in Chapter 1 of an 'Intro to Computer Security' textbook, or a high-level article on Wikipedia. But these are important issues, and there seems to be confusion about them.
The problem is that the general tech-using public seems to think that cryptography is a magical elixir that can fix all problems. Companies -- sometimes quite innocently -- market 'encryption' to convince people that they're secure, when in fact they're really not. Sooner or later people will figure this out and things will change, or they won't and things won't. Either way it'll be an interesting ride.
Update 4/4: Jonathan Zdziarski tweets to say my 'mud puddle' theory is busted: since the iForgot service requires you to provide your birthdate and answer a 'security question', he points out that this data could be used as an alternative password, which could encrypt your iCloud password/keys -- protecting them even from Apple itself.
The problem with his theory is that security answers don't really make very good keys, since (for most users) they're not that unpredictable. Apple could brute-force their way through every likely "hometown" or "favorite sport" in a few seconds. Zdziarski suggests that Apple might employ a deliberately slow key derivation function to make these attacks less feasible, and I suppose I agree with him in theory. But only in theory. Neither Zdziarski nor I actually believe that Apple does any of this.