Wednesday, April 4, 2012

iCloud: Who holds the key?

Ars Technica brings us today's shocking privacy news: 'Apple holds the master decryption key when it comes to iCloud security, privacy'. Oh my.

The story is definitely worth a read, though it may leave you shaking your head a bit. Ars's quoted security experts make some good points, but they do it in a strange way -- and they propose some awfully questionable fixes.

But maybe I'm too picky. To be honest, I didn't realize that there was even a question about who controlled the encryption key to iCloud storage. Of course Apple does -- for obvious technical reasons that I'll explain below. You don't need to parse Apple's Terms of Service to figure this out, which is the odd path that Ars's experts have chosen:
In particular, Zdziarski cited particular clauses of iCloud Terms and Conditions that state that Apple can "pre-screen, move, refuse, modify and/or remove Content at any time" if the content is deemed "objectionable" or otherwise in violation of the terms of service. Furthermore, Apple can "access, use, preserve and/or disclose your Account information and Content to law enforcement authorities" whenever required or permitted by law.
Well, fine, but so what -- Apple's lawyers would put stuff like this into their ToS even if they couldn't access your encrypted content. This is what lawyers do. These phrases don't prove that Apple can access your encrypted files (although, I remind you, they absolutely can), any more than Apple's patent application for a 3D LIDAR camera 'proves' that you're going to get one in your iPhone 5.

Without quite realizing what I was doing, I managed to get myself into a long Twitter-argument about all this with the Founder & Editor-in-Chief of Ars, a gentleman named Ken Fisher. I really didn't mean to criticize the article that much, since it basically arrives at the right conclusions -- albeit with a lot of nonsense along the way.

Since there seems to be some interest in this, I suppose it's worth a few words. This may very well be the least 'technical' post I've ever written on this blog, so apologies if I'm saying stuff that seems a little obvious. Let's do it anyway.

The mud puddle test

You don't have to dig through Apple's ToS to determine how they store their encryption keys. There's a much simpler approach that I call the 'mud puddle test':
  1. First, drop your device(s) in a mud puddle. 
  2. Next, slip in said puddle and crack yourself on the head. When you regain consciousness you'll be perfectly fine, but won't for the life of you be able to recall your device passwords or keys.
  3. Now try to get your cloud data back. 
Did you succeed? If so, you're screwed. Or to be a bit less dramatic, I should say: your cloud provider has access to your 'encrypted' data, as does the government if they want it, as does any rogue employee who knows their way around your provider's internal policy checks.

And it goes without saying: so does every random attacker who can guess your recovery information or compromise your provider's servers.

Now I realize that the mud puddle test doesn't sound simple, and of course I don't recommend that anyone literally do this -- head injuries are no fun at all. It's just a thought experiment, or in the extreme case, something you can 'simulate' if you're willing to tell your provider a few white lies.

But you don't need to simulate it in Apple's case, because it turns out that iCloud is explicitly designed to survive the mud puddle test. We know this thanks to two iCloud features. These are (1) the ability to 'restore' your iCloud backups to a brand new device, using only your iCloud password, and (2) the 'iForgot' service, which lets you recover your iCloud password by answering a few personal questions.

Since you can lose your device, the key isn't hiding there. And since you can forget your password, it isn't based on that. Ergo, your iCloud data is not encrypted end-to-end, not even using your password as a key (or if it is, then Apple has your password on file, and can recover it from your security questions.) (Update: see Jonathan Zdziarski's comments at the end of this post.)

You wanna make something of it?

No! It's perfectly reasonable for a consumer cloud storage provider to design a system that emphasizes recoverability over security. Apple's customers are far more likely to lose their password/iPhone than they are to be the subject of a National Security Letter or data breach (hopefully, anyway).

Moreover, I doubt your median iPhone user even realizes what they have in the cloud. The iOS 'Backup' service doesn't advertise what it ships to Apple (though there's every reason to believe that backed up data includes stuff like email, passwords, personal notes, and those naked photos you took.) But if people don't think about what they have to lose, they don't ask to secure it. And if they don't ask, they're not going to receive.

My only issue is that we have to have this discussion in the first place. That is, I wish that companies like Apple could just come right out and warn their users: 'We have access to all your data, we do bulk-encrypt it, but it's still available to us and to law enforcement whenever necessary'. Instead we have to reverse-engineer it by inference, or by parsing through Apple's ToS. That shouldn't be necessary.

But can't we fix this with Public-Key Encryption/Quantum Cryptography/ESP/Magical Unicorns?

No, you really can't. And this is where the Ars Technica experts go a little off the rails. Their proposed solution is to use public-key encryption to make things better. Now this is actually a great solution, and I have no objections to it. It just won't make things better.

To be fair, let's hear it in their own words:
First, cloud services should use asymmetric public key encryption. "With asymmetric encryption, the privacy and identity of each individual user" is better protected, Gulri said, because it uses one "public" key to encrypt data before being sent to the server, and uses another, "private" key to decrypt data pulled from the server. *Assuming no one but the end user has access to that private key*, then no one but the user—not Apple, not Google, not the government, and not hackers—could decrypt and see the data.
I've added the boldface because it's kind of an important assumption.

To make a long story short, there are two types of encryption scheme. Symmetric encryption algorithms have a single secret key that is used for both encryption and decryption. The key can be generated randomly, or it can be derived from a password. What matters is that if you're sending data to someone else, then both you and the receiver need to share the same key.
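To make the symmetric case concrete, here's a toy cipher in Python -- a SHA-256-based keystream, illustrative only and emphatically not secure -- showing that one shared key drives both encryption and decryption:

```python
# Toy symmetric cipher (illustrative only, NOT real crypto): a
# SHA-256-based keystream, XORed with the data. The same secret key
# both encrypts and decrypts.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Expand the key into a pseudorandom byte stream via hashing a counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying this twice with the same key
    # returns the original plaintext, so encryption and decryption are
    # the same operation under the same shared key.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"shared secret key"
ct = xor_crypt(key, b"backup contents")
assert xor_crypt(key, ct) == b"backup contents"  # same key decrypts
```

Whoever holds `key` -- you, or your cloud provider -- can read everything encrypted under it.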

Asymmetric, or public-key encryption has two keys, one 'public key' for encryption, and one secret key for decryption. This makes it much easier to send encrypted data to another person, since you only need their public key, and that isn't sensitive at all.

But here's the thing: the difference between these approaches is only related to how you encrypt the data. If you plan to decrypt the data -- that is, if you ever plan to use it -- you still need a secret key. And that secret key is secret, even if you're using a public-key encryption scheme.
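The point is easy to see with textbook RSA over toy primes (illustrative only; real RSA uses enormous primes and padding): anyone can encrypt with the public key, but decryption still requires a private key that somebody has to hold.

```python
# Textbook RSA with tiny primes (a toy, not real crypto): anyone can
# encrypt with the public key (n, e), but decryption requires the
# private exponent d -- a secret that someone still has to keep.
p, q = 61, 53
n = p * q                   # public modulus
e = 17                      # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)         # private exponent, kept secret (Python 3.8+)

m = 42                      # message, encoded as a number < n
c = pow(m, e, n)            # encrypt with the PUBLIC key
assert pow(c, d, n) == m    # only the holder of d can decrypt
```

If Apple holds `d` on your behalf, the fact that encryption used a 'public' key changes nothing about who can read your data.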

Which brings us to the real problem with all encrypted storage schemes: someone needs to hold the secret decryption key. Apple has made the decision that consumers are not in the best position to do this. If they were willing to allow consumers to hold their decryption keys, it wouldn't really matter whether they were using symmetric or public-key encryption.

So what is the alternative?

Well, for a consumer-focused system, maybe there really isn't one. Ultimately people back up their data because they're afraid of losing their devices, which cuts against the idea of storing encryption keys inside of devices.

You could take the PGP approach and back up your decryption keys to some other location (your PC, for example, or a USB stick). But this hasn't proven extremely popular with the general public, because it's awkward -- and sometimes insecure.

Alternatively, you could use a password to derive the encryption/decryption keys. This approach works fine if your users pick decent passwords (although they mostly won't), and if they promise not to forget them. But of course, the convenience of Apple's "iForgot" service indicates that Apple isn't banking on users remembering their passwords. So that's probably out too.
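As a sketch of this approach, here's password-based key derivation using Python's standard-library PBKDF2 (the salt and iteration count are illustrative values, not anything Apple actually uses):

```python
# Deriving an encryption key from a password with PBKDF2 (stdlib).
# Salt and iteration count are illustrative values.
import hashlib, os

password = b"correct horse battery staple"
salt = os.urandom(16)       # stored alongside the ciphertext

key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

# Same password + same salt -> same key. Forget the password, and the
# key (and everything encrypted under it) is gone for good.
key_again = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)
assert key == key_again and len(key) == 32
```

Note the flip side: a provider who derives the key this way, server-side, necessarily sees your password -- the derivation only keeps the provider out if it happens on your own device.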

In the long run, the answer for non-technical users is probably just to hope that Apple takes good care of your data, and to hope you're never implicated in a crime. Otherwise you're mostly out of luck. If you're a tech-savvy user: don't use iCloud, and do try to find a better service that's willing to take its chances on you as the manager of your own keys.

In summary

I haven't said anything in this post that you couldn't find in Chapter 1 of an 'Intro to Computer Security' textbook, or a high-level article on Wikipedia. But these are important issues, and there seems to be confusion about them.

The problem is that the general tech-using public seems to think that cryptography is a magical elixir that can fix all problems. Companies -- sometimes quite innocently -- market 'encryption' to convince people that they're secure, when in fact they're really not. Sooner or later people will figure this out and things will change, or they won't and things won't. Either way it'll be an interesting ride.

Update 4/4: Jonathan Zdziarski tweets to say my 'mud puddle' theory is busted: since the iForgot service requires you to provide your birthdate and answer a 'security question', he points out that this data could be used as an alternative password, which could encrypt your iCloud password/keys -- protecting them even from Apple itself.

The problem with his theory is that security answers don't really make very good keys, since (for most users) they're not that unpredictable. Apple could brute-force their way through every likely "hometown" or "favorite sport" in a few seconds. Zdziarski suggests that Apple might employ a deliberately slow key derivation function to make these attacks less feasible, and I suppose I agree with him in theory. But only in theory. Neither Zdziarski nor I actually believe that Apple does any of this.
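A quick sketch of why security answers make poor keys (the candidate list and iteration counts here are made up for illustration):

```python
# Why security answers make poor keys: the answer space is tiny, so an
# attacker (or Apple itself) can simply try every plausible answer.
# Candidate list and iteration count are invented for illustration.
import hashlib

def answer_to_key(answer: str, salt: bytes, iters: int) -> bytes:
    # Derive a key from a normalized security answer via PBKDF2.
    return hashlib.pbkdf2_hmac("sha256", answer.lower().encode(), salt, iters)

salt = b"fixed-demo-salt"
target = answer_to_key("Baltimore", salt, 1000)   # user's real hometown

# Dictionary attack over likely "hometown" answers:
candidates = ["New York", "Chicago", "Baltimore", "Houston", "Phoenix"]
recovered = next(c for c in candidates
                 if answer_to_key(c, salt, 1000) == target)
assert recovered == "Baltimore"
```

A deliberately slow KDF (raising `iters` by orders of magnitude) multiplies the cost of each guess, but with only a few thousand plausible hometowns to try, it buys far less than it would against a strong password.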

11 comments:

  1. You left out an important piece of context: http://arst.ch/t06

    Same author, 13 days earlier, which basically asserts that most of your data in iCloud is encrypted and, by implication, safe. The author even goes so far as to point out that notes and email are not encrypted and that "it's technically possible for an unscrupulous Apple data center employee to rifle through your e-mail or notes" -- clearly implying the rest of the data is secure.

    The author then goes into a brief description of OSX's key derivation policies. This is despite the fact that rumor has it that iCloud runs on some combination of Azure and AWS -- and even if it did run on OSX, the techniques they outline are completely moot if you send your password to Apple. If you didn't send the password and did some computation locally, we wouldn't be speculating on what happens: we'd actually know, because we could look at what is going on in our own computers.

    Really, the article you're complaining about should 1) be a retraction of the linked one and 2) offer an explanation of what's wrong with their reasoning in the first article. Instead, they ignore any and all basic understanding of cryptography / actual physical locks, descend into reading the ToS, and compound their cryptographic errors.

    They should have just asked the following question: if your rented storage locker has the world's strongest lock on it, do you still need to trust the owner of the building if he has the key to the lock? Does the strength of the lock even matter? If he stores the keys in the same building as your storage locker, does it even stop anyone who breaks into the building from stealing your stuff?

    That answers the question "how safe is my data stored in iCloud" correctly without delving into things that neither the audience nor the author apparently fully understand.

    ReplyDelete
  2. I think that Mozilla Sync proves that you can have recoverability and security. I found the Ars post fine. It's just a question of how much you trust Apple, or cloud data providers in general.

    ReplyDelete
  3. This has to be one of the stupidest posts I've seen for a while.

    (a) EVERY iPhone owner knows what is being backed up to iCloud. It's a copy of their phone's data. You know. What used to be backed up to their computer.

    (b) Loss of user data is 99.999999% more likely to occur from losing the phone versus iCloud being compromised by a rogue employee.

    (c) Password forget systems are used by everybody. People forget passwords sometimes. It happens.

    ReplyDelete
    Replies
    1. Issues.


      a) I highly doubt that most people know that Apple stores anything. It's stored in "the cloud", not at Apple. Right? Right?


      b) I'll know when my device has been lost, but I will never know if my data has been compromised by Apple.

      c) People also post the same information on Facebook. It's by no means a good system.

      Delete
  4. What about using biometrics in an iForgot-like service? They cannot be brute-forced (I guess)

    ReplyDelete
  5. Mozilla Sync is encrypted cloud *sync* - it's not billed as storage or backup, and for good reason.

    IIRC, if you lose all your credentials from all your devices, the recovery method in Firefox Sync is to blow away all the data stored in the cloud and then rebuild it from one of the devices that has been syncing all along.

    Mozilla can't inspect that data, because the keys never leave your hardware and the encryption happens there. So, Mozilla has no way to recover anything - you have to manage your own backups.

    ReplyDelete
  6. In light of this incident: http://www.macrumors.com/2012/08/05/apple-support-allowed-hacker-access-to-reporters-icloud-account/

    Your post is even more relevant than it was before.

    ReplyDelete
  9. I like what facebook is doing with its friend-based account recovery. Not sure how your friends would feel about getting a subpoena.

    ReplyDelete
  10. It's not like any of this is a big gigantic mystery, it's been known for some time that Apple owns all of your stuff (at least, if you put it onto the iCloud) and then sells it to Russian companies and whoever else Apple can contract with and whoever wants to get their grubby hands on it.
    http://storify.com/AnonyOdinn/apple-udids-elcomsoft-belkasoft-but-wait-there-s-m
    https://twitter.com/AnonyOdinn/status/243048715722567680
    https://twitter.com/AnonyOdinn/status/204234839258447872

    ReplyDelete