While browsing some community websites, I noticed a few people talking about the security of *double* (or, more generally, *multiple*) encryption. Multiple encryption addresses the following problem: you have two (or more) encryption schemes, and you’re worried that one of them might get compromised. Surely if you encrypt with *both* at the same time, you’ll buy yourself an added safety margin.

Let me preface this by saying that multiple encryption addresses a problem that *mostly* doesn’t exist. Modern ciphers rarely get broken — at least, not in the Swordfish sense. You’re far more likely to get hit by malware or an implementation bug than you are to suffer from a catastrophic attack on AES.*

That said, you really *are* likely to get hit by malware or an implementation bug. And that’s at least one argument for multiple encryption — if you’re willing to encrypt on separate, heterogeneous devices.** There’s also the future to think about. We feel good about AES today, but how will we feel in 2040?

I note that these are problems for the extremely paranoid — governments, mostly — *not* for the typical developer. The majority of us should work on getting *single* encryption right. But this kind of thing isn’t ridiculous — the NESSIE standards even recommend it. Moreover, my experience is that when people start *asking* questions about the security of *X*, it means that they’re already *doing* X, and have been for some time.

So for all that, it’s worth answering some of these questions. And roughly speaking, the questions are:

- Am I better off encrypting with two or more encryption schemes (or keys)?
- Could I be *worse off*?
- If I have to do it, how should I do it securely?

I can’t promise to answer these questions *fully*, or in any particular order. But I do hope I can provide a little bit of insight around the edges.

**Preliminaries**

There are many ways to double encrypt, but for most people ‘double encryption’ means this:

SuperDuperEncrypt(KA, KB, M) = EncryptA(KA, EncryptB(KB, M))

This construction is called a *cascade*. Sometimes EncryptA and EncryptB are different algorithms, but that’s not really critical. What does matter for our purposes is that the keys KA and KB are independently generated.*** (To make life easier, we’ll also assume that the algorithms are published.)

A lot has been written about cascade encryption, some good and some bad. The answer to the question largely depends on whether the algorithms are simply block ciphers, or true *encryption* algorithms (*e.g.*, a mode of operation using a block cipher). It also depends on what security definition you’re trying to achieve.

**The good**

Let’s consider the positive results first. If either EncryptA or EncryptB is ‘semantically secure’, *i.e.*, indistinguishable under chosen-plaintext attack, then so is the cascade of the two. This may seem wonky, but it’s actually very handy — since many common cryptosystems are specifically analyzed under (at least) this level of security. For example, in the symmetric setting, both CBC and CTR modes of operation can be shown to achieve this security level, provided that they’re implemented with a secure block cipher.

So how do we know the combined construction is secure? A formal proof can be found in this 2002 paper by Herzberg, but the intuition is pretty simple. If there’s an attack algorithm that ‘breaks’ the combined construction, then we can use that algorithm to attack either of the two underlying algorithms: simply *pick our own key* for the other algorithm and simulate the double encryption on its ciphertexts.

This means that an attack on the combination *is* an attack on the underlying schemes. So if one is secure, you’re in good shape.
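The shape of that reduction fits in a few lines of code. Here `enc` is a toy XOR-pad stand-in for either layer, and `attack_via_cascade` is my own illustrative name — the point is only how the simulation works, not the cryptography:

```python
import hashlib, os

def _ks(key: bytes, n: int) -> bytes:
    # Toy keystream derived from SHA-256; illustration only.
    out, i = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        i += 1
    return out[:n]

def enc(key: bytes, m: bytes) -> bytes:
    # Toy XOR "encryption"; a stand-in for either layer of the cascade.
    return bytes(a ^ b for a, b in zip(m, _ks(key, len(m))))

def attack_via_cascade(cascade_attacker, challenge_ct: bytes):
    # Reduction: to attack EncryptB alone, pick our OWN outer key and
    # wrap EncryptB's challenge ciphertext. The result is a perfectly
    # well-formed cascade ciphertext, so any advantage the cascade
    # attacker has translates directly into an advantage against EncryptB.
    our_ka = os.urandom(16)
    simulated_cascade_ct = enc(our_ka, challenge_ct)
    return cascade_attacker(simulated_cascade_ct)
```

Whatever the cascade attacker learns from the simulated ciphertext, we learn about EncryptB’s challenge — which is exactly why the cascade can’t be weaker (in this security sense) than its strongest layer.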

**The not-so-good**

Interestingly, Herzberg also shows that the above result does *not* hold for all definitions of security, particularly strong definitions such as adaptive chosen-ciphertext security. In the symmetric world, we usually achieve this level of security using authenticated encryption.

To give a concrete (symmetric encryption) example, imagine that the inner layer of encryption (EncryptB) is authenticated, as is the case in GCM mode. Authenticated encryption provides both *confidentiality* (attackers can’t read your message) and *authenticity* (attackers can’t tamper with your message, or change the ciphertext in any way).

Now imagine that the outer scheme (EncryptA) *doesn’t* provide this guarantee. For a simple example, consider CBC-mode encryption with padding at the end. CBC-mode is well known for its malleability; attackers can flip bits in a ciphertext, which causes predictable changes to the underlying plaintext.

The combined scheme still provides some authenticity protections — if the attacker’s tampering affects the inner (GCM) ciphertext, then his changes should be detected (and rejected) upon combined decryption. But if his modifications only change the CBC-mode *padding*, then the combined ciphertext could be accepted as valid. Hence the combined scheme is ‘benignly’ malleable, making it technically *weaker* than the inner layer of encryption.
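To see CBC malleability concretely, here’s a toy CBC implementation over a hash-based Feistel “cipher” of my own devising (emphatically not a real cipher). Flipping one bit of a ciphertext block flips the same bit of the *next* plaintext block, while garbling the current one — which is exactly the lever a padding-tampering attacker uses:

```python
import hashlib

BLOCK = 16

def _xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def _F(k, half):
    # Toy Feistel round function built from SHA-256.
    return hashlib.sha256(k + half).digest()[:BLOCK // 2]

def toy_block_encrypt(key, block):
    # 4-round Feistel network; a stand-in for a real block cipher.
    L, R = block[:8], block[8:]
    for r in range(4):
        L, R = R, _xor(L, _F(key + bytes([r]), R))
    return L + R

def toy_block_decrypt(key, block):
    L, R = block[:8], block[8:]
    for r in reversed(range(4)):
        L, R = _xor(R, _F(key + bytes([r]), L)), L
    return L + R

def cbc_encrypt(key, iv, m):
    # m must be a multiple of BLOCK bytes (padding omitted for brevity).
    out, prev = b"", iv
    for i in range(0, len(m), BLOCK):
        prev = toy_block_encrypt(key, _xor(m[i:i + BLOCK], prev))
        out += prev
    return out

def cbc_decrypt(key, iv, c):
    out, prev = b"", iv
    for i in range(0, len(c), BLOCK):
        blk = c[i:i + BLOCK]
        out += _xor(toy_block_decrypt(key, blk), prev)
        prev = blk
    return out
```

Because plaintext block i+1 is computed as D(C[i+1]) ⊕ C[i], an attacker who flips bit j of C[i] flips exactly bit j of plaintext block i+1 — no key required.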

Do you care about this? Maybe, maybe not. Some protocols really *do* require a completely non-malleable ciphertext — for example, to prevent replay attacks — but in most applications these attacks aren’t world-shattering. If you do care, you can find some alternative constructions here.

**The ugly**

Of course, so far all I’ve discussed is whether the combined encryption scheme is *at least as* secure as either underlying algorithm. But some people want more than ‘at least as’. More importantly, I’ve been talking about entire encryption algorithms (*e.g.*, modes of operation), not raw ciphers.

So let’s address the first question. Is a combined encryption scheme significantly more secure than either algorithm on its own? Unfortunately the answer is: *not necessarily*. There are at least a couple of counterexamples here:

*The encryption scheme is a group.* Imagine that EncryptA and EncryptB are the same algorithm, with the following special property: when you encrypt sequentially with KA and KB, you obtain a ciphertext that can be decrypted with some *third* key KC.**** In this case, the resulting ciphertext ought to be at least as vulnerable as a single-encrypted ciphertext. Hence double-encrypting gives you no additional security *at all*. Fortunately modern *block ciphers* don’t (seem to) have this property — in fact, cryptographers explicitly design against it, as it can make the cipher weaker. But some number-theoretic schemes do, hence it’s worth looking out for.

*Meet-in-the-middle attacks.* MiTM attacks are the most common ‘real-world’ counterexample that comes up in discussions of cascade encryption (really, cascade *encipherment*). This attack was first discovered by Diffie and Hellman, and is a member of a class we call *time-space tradeoff attacks*. It’s useful in constructions that use a deterministic algorithm like a block cipher. For example:

DOUBLE_DES(KA, KB, M) = DES_ENCRYPT(KA, DES_ENCRYPT(KB, M))

On the face of it, you’d assume that this construction would be substantially stronger than a single layer of DES. If a brute-force attack on DES requires 2^56 operations (DES has a 56-bit key), you’d hope that attacking a construction with *two* DES keys would require on the order of 2^112 operations. But actually this hope is a false one — *if the attacker has lots of storage*.

The attack works like this. First, obtain the encryption *C* of some *known* plaintext *M* under the two unknown secret keys KA and KB. Next, construct a huge table comprising the encipherment of *M* under every possible DES key. In our DES example there are 2^56 keys; building the table takes a corresponding amount of effort, and the result will be astonishingly huge. But leave that aside for the moment.

Finally, try decrypting *C* with every possible DES key. For each result, check whether it appears in the table you just made. If you find a match, you’ve now got two keys, KA’ and KB’, that satisfy the encryption equation above.*****
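Here’s the whole attack in miniature, using a toy deterministic “cipher” with a 16-bit key so the table actually fits in memory. Everything below is my own illustrative construction (an XOR pad derived from SHA-256), not DES — but the time-space tradeoff is the same shape:

```python
import hashlib

def toy_encipher(key16: int, m: bytes) -> bytes:
    # Deterministic toy "block cipher" with a 16-bit key: XOR with a
    # key-derived pad. A stand-in for DES in this sketch only.
    pad = hashlib.sha256(b"toy" + key16.to_bytes(2, "big")).digest()[:len(m)]
    return bytes(a ^ b for a, b in zip(m, pad))

toy_decipher = toy_encipher  # XOR pad: deciphering is the same operation

def meet_in_the_middle(m: bytes, c: bytes):
    # Given C = E(KA, E(KB, M)) for known M:
    # 1. Tabulate E(kb, M) for every kb            (~2^16 operations)
    # 2. For every ka, check whether D(ka, C)
    #    appears in the table                      (~2^16 operations)
    # Total: ~2*2^16 = 2^17, vs. the 2^32 of naive brute force.
    table = {toy_encipher(kb, m): kb for kb in range(2 ** 16)}
    hits = []
    for ka in range(2 ** 16):
        mid = toy_decipher(ka, c)
        if mid in table:
            hits.append((ka, table[mid]))
    return hits
```

As the footnote below notes, the candidate pairs can include false positives; a second known plaintext/ciphertext pair weeds those out quickly.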

If you ignore storage costs (ridiculously impractical, though storage may also be traded for time), this attack will run you (2^56)*2 = 2^57 cipher operations. That’s *much* less than the 2^112 we were hoping for. If you’re willing to treat it as a *chosen-plaintext attack*, you can even re-use the table for many separate attacks.

*Plaintext distribution issues.* Maurer showed one more interesting result: in a cascade of *ciphers*, the entire construction is guaranteed to be as secure as the first cipher, but *not necessarily any stronger*. This is because the first cipher may introduce certain patterns into its output that can assist the attacker in breaking the second layer of encipherment. Maurer even provides a (very contrived) counterexample in which this happens.

I presume that this is the source of the following folklore construction, which is referenced in Applied Cryptography and other sources around the Internet:

UberSuperEncrypt(KA, KB, M) = EncryptA(KA, R⊕M) || EncryptB(KB, R)

Where || indicates concatenation, and R is a random string of the same length as the message. Since both R and R⊕M have a uniformly random distribution, this tends to eliminate the issue that Maurer notes. At the cost of doubling the ciphertext size!
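A sketch of this folklore construction: `toy_enc` below is a deterministic stand-in for either encryption layer, and all names are my own. The point is that each layer only ever sees a uniformly random string, so neither layer’s plaintext distribution can help the attacker:

```python
import hashlib, os

def _xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def toy_enc(key: bytes, m: bytes) -> bytes:
    # Toy deterministic pad (its own inverse); illustration only.
    pad = hashlib.sha256(key).digest()[:len(m)]
    return _xor(m, pad)

def uber_super_encrypt(ka: bytes, kb: bytes, m: bytes):
    # One-time-pad-split M into two shares, R and R xor M. Each share is
    # uniformly distributed on its own; encrypt one under each key.
    r = os.urandom(len(m))
    return toy_enc(ka, _xor(r, m)), toy_enc(kb, r)

def uber_super_decrypt(ka: bytes, kb: bytes, c1: bytes, c2: bytes):
    # Recover both shares and XOR them back together: (R xor M) xor R = M.
    return _xor(toy_enc(ka, c1), toy_enc(kb, c2))
```

An attacker must break *both* layers to learn anything about M, since either share alone is just uniform randomness — the price, as noted, is twice the ciphertext.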

Now the good news is that multiple encipherment (done properly) can *probably* make things more secure. This is precisely what constructions like DESX and 3DES try to achieve (using a single cipher). If you make certain strong assumptions about the *strength* of the cipher, it is possible to show that these constructions are harder to attack than the underlying cipher itself (see this analysis of DESX and this one of 3DES).
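As a structural sketch (not the real DESX), here’s FX-style key whitening — XOR a pre-whitening key into the plaintext and a post-whitening key onto the output of a single core cipher invocation. The core “cipher” below is a toy hash-based Feistel of my own; every name is a stand-in:

```python
import hashlib

BLOCK = 8

def _xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def _F(k, half):
    # Toy Feistel round function; illustration only.
    return hashlib.sha256(k + half).digest()[:BLOCK // 2]

def toy_encrypt(key, block):
    L, R = block[:4], block[4:]
    for r in range(4):
        L, R = R, _xor(L, _F(key + bytes([r]), R))
    return L + R

def toy_decrypt(key, block):
    L, R = block[:4], block[4:]
    for r in reversed(range(4)):
        L, R = _xor(R, _F(key + bytes([r]), L)), L
    return L + R

def desx_style_encrypt(k, k1, k2, m):
    # FX/DESX-style whitening: C = K2 xor E(K, M xor K1).
    return _xor(toy_encrypt(k, _xor(m, k1)), k2)

def desx_style_decrypt(k, k1, k2, c):
    return _xor(toy_decrypt(k, _xor(c, k2)), k1)
```

The whitening keys enlarge the effective key space against generic (black-box) attacks, which is what the cited analyses formalize — under strong, idealized assumptions about the core cipher.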

I warn you that these analyses use an unrealistic model for the security of the cipher, and they don’t treat multiple *distinct* ciphers. Still, they’re a useful guide — assuming that your attacker has no special attack against at least one of the underlying schemes. Your mileage may vary, and I would generally advise against assembling this sort of thing yourself unless you really know what you’re doing.

**In summary**

I’m afraid this post will end with a whimper rather than a bang. It’s entirely possible to combine encryption schemes in secure ways (many of which are *not* cascade constructions), but the *amount* of extra security you’ll get is subject to some debate.

In fact, this entire idea has been studied for quite a while under the heading of *(robust) combiners*. These deal with combining cryptosystems (encryption, as well as hashing, signing, protocols, etc.) in a secure way, such that the combination remains secure even if some of the underlying schemes are broken.

If you’re interested, that’s the place to start. But in general my advice is that this is not something that most people should spend a lot of time doing, outside of (perhaps) the government and the academic world. If you want to do this, you should familiarize yourself with some of the academic papers already mentioned. Otherwise, think hard about *why* you’re doing it, and what it’s going to buy you.

*Notes:*

* And yes, I know about MD5 and the recent biclique attacks on AES. That *still* doesn’t change my opinion.

** Note that this is mostly something the government likes to think about, namely: how to use consumer off-the-shelf products together so as to achieve the same security as trusted, government-certified hardware. I’m dubious about this strategy, based on my suspicion that all consumer products will soon be manufactured by Foxconn. Nonetheless, I wish them luck.

*** This key independence is a big deal. If the keys are related (worst case: KA *equals* KB) then all guarantees are off. For example, consider a stream cipher mode like CTR, where encryption and decryption are the same operation. If you use the same algorithm and key for both layers, you’d completely cancel out the encryption, *i.e.*: CTR_ENC(K, IV, CTR_ENC(K, IV, M)) = M.
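This cancellation is easy to demonstrate with a toy CTR-style stream cipher (SHA-256 keystream, my own construction, not a real cipher):

```python
import hashlib

def ctr_keystream(key: bytes, iv: bytes, n: int) -> bytes:
    # Toy CTR keystream: SHA-256(key || IV || counter). Illustration only.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def ctr_enc(key: bytes, iv: bytes, data: bytes) -> bytes:
    # Encryption and decryption are the same XOR operation.
    ks = ctr_keystream(key, iv, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# "Double encrypting" with the SAME key and IV cancels out entirely:
# the two identical keystreams XOR to zero, leaving the plaintext.
```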

**** Classical substitution ciphers (including the Vigenère cipher and the Vernam one-time pad) have this structure.

***** The resulting KA’ and KB’ *aren’t* necessarily the right keys, however, due to *false positives:* keys that (for a single message *M*) satisfy DES(KA’, DES(KB’, M)) = DES(KA, DES(KB, M)). You can quickly eliminate the bad keys by obtaining the encryption of a second message M’ and testing it against each of your candidate matches. The chance that a given false positive will work on two messages is usually quite low.

What if my goal in multiply encrypting is not to achieve higher security for the multiply encrypted data, but to arrange that some parties can only access part of the data? Concretely: EE(KA, KB, A, B) = E(KB, B || E(KA, A)); every recipient who has KA also has KB, but not everyone who has KB has KA. Therefore, B is (supposed to be) readable by more recipients than A. It sounds like the only risk here is if the outermost encryption operation is malleable?

This shows up in “onion routing” (e.g., Tor) and also in distributed capability schemes (e.g., Tahoe-LAFS). I've been working on both a lot lately, so it's near to my thoughts.

I proposed a scheme (usual disclaimers for hobby crypto apply) that builds a key expander out of the ciphers to be combined. The essence of the idea is that if cascading is used to generate KA and KB, then both ciphers must be broken to learn the base key K from KA and KB.

Thus, KA and KB are not independent, but the base key K cannot be learned without breaking both cipherA and cipherB. Under the “folklore scheme” this allows messages to be sent such that both ciphers must be broken to learn the message, while requiring only a “normal”-length key (e.g., two 128-bit ciphers using a single 128-bit key).

Full details here: http://crypto.stackexchange.com/a/749/10

“Let me preface this by saying that multiple encryption addresses a problem that mostly doesn't exist. Modern ciphers rarely get broken”

How do you know that modern ciphers rarely get broken? Is it because the information related to an attack against the cipher in question has not been published?

I don't see any reference to multiple encryption in that NESSIE document you've linked to. Perhaps it's been updated, or perhaps I've missed the reference. In any event, would you be so kind as to point me towards it, or to quote it?