On the Juniper backdoor

You might have heard that a few days ago, Juniper Networks announced the discovery of “unauthorized code” in the ScreenOS software that underlies the NetScreen line of devices. As a result of this discovery, the company announced a pair of separate vulnerabilities, CVE-2015-7755 and CVE-2015-7756, and urged their customers to patch immediately.

The first of these CVEs (#7755) was an authentication vulnerability, caused by a malicious hardcoded password in SSH and Telnet. Rapid7 has an excellent writeup of the issue. This is a pretty fantastic vulnerability, if you measure by the impact on the security of NetScreen users. But on the technological awesomeness scale it rates about a two out of ten, maybe a step above ‘hit the guy with a wrench’.

The second vulnerability is a whole different animal. The advisory notes that CVE-2015-7756 — which is independent of the first issue — “may allow a knowledgeable attacker who can monitor VPN traffic to decrypt that traffic.” This is the kind of vulnerability that makes applied cryptographers cry tears of joy. It certainly did that for me.

And while every reasonable person knows you can’t just drop “passive decryption vulnerability” and expect the world to go on with its business, this is exactly what Juniper tried to do. Since they weren’t talking about it, it fell to software experts to try to work out what was happening by looking carefully at firmware released by the company.

Now I want to be clear that I was not one of those software experts. IDA scares the crap out of me. But I’m fortunate to know some of the right kind of people, like Steve Checkoway, whom I was able to get on the job, mostly by encouraging him to neglect his professional obligations. I also follow some talented folks on Twitter, like H.D. Moore and Ralf Philipp Weinmann. So I was lucky enough to watch them work, and occasionally (I think?) chip in a helpful observation.

And yes, it was worth it. Because what Ralf and Steve et al. found is beyond belief. Ralf’s excellent post provides all of the technical details, and you should honestly just stop reading now and go read that. But since you’re still here, the TL;DR is this:

For the past several years, it appears that Juniper NetScreen devices have incorporated a potentially backdoored random number generator, based on the NSA’s Dual_EC_DRBG algorithm. At some point in 2012, the NetScreen code was further subverted by some unknown party, so that the very same backdoor could be used to eavesdrop on NetScreen connections. While this alteration was not authorized by Juniper, it’s important to note that the attacker made no major code changes to the encryption mechanism — they only changed parameters. This means that the systems were potentially vulnerable to other parties, even beforehand. Worse, the nature of this vulnerability is particularly insidious and generally messed up.

In the rest of this post I’m going to try to fill in the last part of that statement. But first, some background.

Dual EC DRBG

Pretty much every cryptographic system depends on a secure random number generator, or RNG. These algorithms produce the unpredictable random bits that are consumed by cryptographic protocols. The key word in this description is unpredictable: if an attacker can predict the output of your RNG, then virtually everything you build on it will end up broken.

This fact has not been lost on attackers! The most famous (alleged) example of deliberate random number generator subversion was discovered in 2007 by Dan Shumow and Niels Ferguson from Microsoft, when they announced the possibility of a backdoor in a NIST standard called Dual_EC_DRBG.

I’ve written extensively about the design of the Dual_EC generator and the process that led to its standardization. Omitting the mathematics, the short version is that Dual EC relies on a special elliptic curve point called Q, which — if generated by a malicious attacker — can allow said attacker to predict future outputs of the RNG after seeing a mere 30 bytes of raw output from your generator.
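
To make the trapdoor concrete, below is a minimal sketch of a Dual EC-style generator with a deliberately backdoored Q, written in pure Python (3.8+). Everything specific in it is an illustrative assumption: the trapdoor d, the seed, and the truncation (8 bits per block rather than the real generator’s 16, so the brute-force search finishes quickly). It is not Juniper’s code and not the exact NIST parameterization; it is only meant to show why knowing the hidden relationship between the points P and Q turns a short stretch of raw output into the generator’s entire future.

# Minimal Dual EC-style backdoor sketch (pure Python 3.8+). Illustrative only:
# d, the seed, and the 8-bit truncation are assumptions, not ScreenOS values.

# NIST P-256: y^2 = x^3 - 3x + b over GF(p)
p  = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
b  = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b
Gx = 0x6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296
Gy = 0x4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5

def ec_add(P1, P2):
    # point addition; None represents the point at infinity
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 - 3) * pow(2 * y1 % p, -1, p) % p
    else:
        lam = (y2 - y1) * pow((x2 - x1) % p, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, pt):
    # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

# "Backdoored" parameters: whoever chose P and Q also knows d with P = d*Q.
Q = (Gx, Gy)
d = 0xc0ffee                       # the trapdoor; never published, of course
P = ec_mul(d, Q)

TRUNC = 8                          # the real generator drops 16 bits per block;
MASK = (1 << (256 - TRUNC)) - 1    # 8 keeps this pure-Python demo fast

def next_state(s):                 # s_{i+1} = x(s_i * P)
    return ec_mul(s, P)[0]

def output_block(s):               # t_i = truncated x(s_i * Q)
    return ec_mul(s, Q)[0] & MASK

def recover_state(t1, t2):
    # The backdoor: from two consecutive output blocks and the trapdoor d,
    # recover the generator's internal state.
    for hi in range(1 << TRUNC):              # brute-force the dropped bits
        x = (hi << (256 - TRUNC)) | t1
        if x >= p:
            continue
        rhs = (x * x * x - 3 * x + b) % p
        y = pow(rhs, (p + 1) // 4, p)         # candidate sqrt (p = 3 mod 4)
        if y * y % p != rhs:
            continue                          # x is not on the curve
        s = ec_mul(d, (x, y))[0]              # d*(s_i*Q) = s_i*P -> next state
        if output_block(s) == t2:
            return s                          # confirmed against the next block
    return None

s = 0x123456789abcdef              # victim's secret seed (illustrative)
s = next_state(s); t1 = output_block(s)
s = next_state(s); t2 = output_block(s)
assert recover_state(t1, t2) == s  # all future output is now predictable

The final assert is the whole story: two output blocks plus the trapdoor d are enough to recover the internal state, and from there every “random” byte the generator will ever produce can be computed in advance.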

The NIST specification of Dual_EC comes with a default value for Q that was generated by the NSA. Nobody has ever been able to determine how NSA generated this value, but leaks by Edward Snowden in 2013 provide strong evidence that NSA may have generated it maliciously. It certainly doesn’t smell good.

But enough with the ancient history. We’re here to talk about Juniper.

Juniper ScreenOS — before the “unauthorized code”

Although it was not widely publicized before this week, Juniper’s ScreenOS devices have used Dual EC for some time — probably since before Juniper acquired NetScreen Technologies. Prior to this week, the only place you might have learned about this was on this tiny page posted by the company after the Snowden leaks.

The decision to use Dual EC is strange all by itself. But it gets weirder.

First, ScreenOS doesn’t use the NSA’s default Q. Instead, it uses an alternative Q value that was generated by Juniper and/or NetScreen. If Juniper had really wanted to rule out the possibility of surveillance, this would have been a good opportunity to do so. They could have generated the new constant in a “provably” safe way — e.g., by hashing some English-language string, as in the sketch below. Since we have no evidence that they did so, a conservative view holds that the Juniper/NetScreen constant may also have been generated maliciously, rendering it useful for eavesdropping.
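
As a concrete (and entirely hypothetical) illustration of what such a derivation could look like, the sketch below hashes a public English phrase together with a counter until the digest lands on a valid x-coordinate of the P-256 curve (the classic try-and-increment approach). The phrase itself is made up; the point is that anyone can re-run the derivation, which makes it very hard for whoever published Q to also be hiding a trapdoor behind it.

import hashlib

# P-256 field prime and curve constant b (curve: y^2 = x^3 - 3x + b)
p = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
b = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b

def point_from_phrase(phrase):
    # Try-and-increment: hash the phrase with a counter until the digest is
    # the x-coordinate of a point on the curve. The derivation is public and
    # repeatable, so nobody can have chosen the point to hide a trapdoor.
    counter = 0
    while True:
        x = int.from_bytes(
            hashlib.sha256(f"{phrase}|{counter}".encode()).digest(), "big")
        if x < p:
            rhs = (x * x * x - 3 * x + b) % p
            y = pow(rhs, (p + 1) // 4, p)     # candidate sqrt; p = 3 mod 4
            if y * y % p == rhs:
                return (x, y), counter
        counter += 1

# The phrase below is hypothetical -- any published English sentence would do.
Q, ctr = point_from_phrase("ScreenOS Dual EC Q point, generated 2008")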

Next, ScreenOS uses Dual EC in a strange, non-standard way. Rather than generating all of their random numbers with Dual EC (which would be slow), they only use Dual EC to generate a seed for a fast 3DES-based generator called ANSI X9.17. Since that generator is actually FIPS-140 approved and generally believed to be sufficient to the purpose, it’s not clear what value Dual EC is really adding to the system in the first place — except, of course, its usefulness as a potential backdoor.
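
For readers who haven’t met it, the ANSI X9.17 generator is a small 3DES construction: encrypt a timestamp, XOR the result with a secret seed and encrypt again to produce output, then XOR and encrypt once more to update the seed. Below is a minimal sketch, assuming the pycryptodome package; the key, seed, and timestamp values are illustrative stand-ins rather than anything taken from ScreenOS.

from Crypto.Cipher import DES3   # pip install pycryptodome

def _enc(key, block):
    # single-block 3DES encryption
    return DES3.new(key, DES3.MODE_ECB).encrypt(block)

def x917_block(key, seed, timestamp):
    # One ANSI X9.17 iteration: returns (8 output bytes, updated seed).
    t = _enc(key, timestamp)                                    # T  = E_K(DT)
    out = _enc(key, bytes(i ^ j for i, j in zip(t, seed)))      # R  = E_K(T xor V)
    new_seed = _enc(key, bytes(i ^ j for i, j in zip(t, out)))  # V' = E_K(T xor R)
    return out, new_seed

key = bytes(range(24))      # illustrative 3DES key
seed = b"\x07" * 8          # in ScreenOS, this seed is what Dual EC provides
out, seed = x917_block(key, seed, b"\x00" * 8)

The salient property is that someone who sees only the output block learns nothing useful about the seed, which is exactly why the raw Dual EC bytes were only ever supposed to enter this construction as the seed, not to appear on the wire themselves.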

The good news here is that the post-processing by ANSI X9.17 should kill the Dual EC backdoor, since the attack relies on the attacker seeing raw output from Dual EC. The ANSI generator appears to completely obfuscate this output, thus rendering Dual EC “safe”. This is indeed the argument Juniper made in 2013 when it decided to leave the Dual EC code in ScreenOS.

The problem with this argument is that it assumes that no other code could ever “accidentally” exfiltrate a few bytes of raw Dual EC output. Yet this is exactly the kind of threat you’d worry about in a deliberately backdoored system — the threat that, just maybe, the developers are not your friend. Thus Dual EC is safe only if you assume no tiny bug in the code could accidentally leak out 30 bytes or so of raw Dual EC output. If it did, this would make all subsequent seeding calls predictable, and thus render all numbers generated by the system predictable. In general, this would spell doom for the confidentiality of VPN connections.

And unbelievably, amazingly, who coulda thunk it, it appears that such a bug does exist in many versions of ScreenOS, dating to both before and after the “unauthorized code” noted by Juniper. This issue was discovered by Willem Pinckaers and can be illustrated by the following code snippet, which Steve Checkoway decompiled (see full context here):

void prng_generate()
{
  int index; // eax@4
  unsigned int bytes_generated; // ebx@4
  int v2; // eax@9
  int v3; // eax@12
  char v4; // ST04_1@12
  int time[2]; // [sp+10h] [bp-10h]@1

  time[0] = 0;
  time[1] = get_cycles();
  prng_output_index = 0; // this is a global
  ++blocks_generated_since_reseed;
  if ( !do_not_need_to_reseed() )
    // the routine below fills a buffer with raw Dual EC output
    prng_reseed(); // internally sets prng_output_index to 32
  for ( ; (unsigned int)prng_output_index <= 0x1F; prng_output_index += 8 )
  {
    // this loop is supposed to process the same buffer
    // through the ANSI (3DES) generator. However, since the
    // value of prng_output_index was set to 32 above, it never executes
    memcpy(prev_prng_seed, prng_seed, 8);
    memcpy(prev_prng_block, prng_block, 8);
    ANSI_X9_31_generate_block(time, prng_seed, prng_key, prng_block);
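    // ... (remainder of the decompiled routine omitted; see the full context linked above)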

Thus what comes out from this function is 32 bytes of raw Dual EC output, which is all you need to recover the internal Dual EC generator state and predict all future outputs.

So if this was the authorized code, what the hell was the unauthorized code?

The creepiest thing about CVE-2015-7756 is that there doesn’t seem to be any unauthorized code. Indeed, what’s changed in the modified versions is simply the value of the Q point. According to Ralf, this point changed in 2012, presumably to a value that the hacker(s) generated themselves. This would likely have allowed these individuals to passively decrypt ScreenOS VPN sessions.

In the more recent Juniper patch to fix the vulnerability, Q is simply set back to the original Juniper/NetScreen value.

The attacker also replaced some test vectors. But that appears to be it.

To sum up, some hacker or group of hackers noticed an existing backdoor in the Juniper software, which may have been intentional or unintentional — you be the judge! They then piggybacked on top of it to build a backdoor of their own, something they were able to do because all of the hard work had already been done for them. The end result was a period in which someone — maybe a foreign government — was able to decrypt Juniper traffic in the U.S. and around the world.

And all because Juniper had already paved the road.

So why does this matter?

For the past several months I’ve been running around with various groups of technologists, doing everything I can to convince important people that the sky is falling. Or rather, that the sky will fall if they act on some of the very bad, terrible ideas that are currently bouncing around Washington — namely, that our encryption systems should come equipped with “backdoors” intended to allow law enforcement and national security agencies to access our communications.

One of the most serious concerns we raise during these meetings is the possibility that encryption backdoors could be subverted. Specifically, that a backdoor intended for law enforcement could somehow become a backdoor for people who we don’t trust to read our messages. Normally when we talk about this, we’re concerned about failures in storage of things like escrow keys. What this Juniper vulnerability illustrates is that the danger is much broader and more serious than that.

The problem with cryptographic backdoors isn’t that they’re the only way that an attacker can break into our cryptographic systems. It’s merely that they’re one of the best. They take care of the hard work, the laying of plumbing and electrical wiring, so attackers can simply walk in and change the drapes.

This post made possible by Ralf Philipp Weinmann, H. D. Moore, Steve Checkoway, Willem Pinckaers, Nadia Heninger, Shaanan Cohney and the letter Q.

45 thoughts on “On the Juniper backdoor”

  1. Thank you very much for this, one of the best posts I have read all year, good writing to boot, not just the technical details.

    But your last two paragraphs wrap it all up and put a bow on it. Lastly, the CIO, CFO, CTO and CEO should be prosecuted and imprisoned. No accountability here whatsoever.

    I used to work with a guy who, strangely enough, was the biggest Juniper ScreenOS fanboy in the world. So ironically sad.

    But yes sir, INDEED, the sky is falling.

    and we are well past 1984, in the orwellian sense of things.

    Jim Abercromby

  2. Excellent article, and very clearly explained.
    On the “…that a back door intended for law enforcement could somehow become a backdoor for people who we don't trust” part, I admire your trust of law enforcement.

  3. I usually just describe this to non technical folks as “Sure, the NSA had a back door with a lock on it. And now the Chinese or Russians or ISIS or whomever re-keyed that lock for themselves and moved in to your router. Why would we make it so easy for them to do that?”

  4. Fantastic article. A great job of making non-conspiracy theory based arguments about how government back doors in crypto is a really really bad idea.

  5. A very well written article with an amazing conclusion. Sure most of our boxes are shipped with backdoors

  6. > Or rather, that the sky will fall if they act on some of the very bad, terrible ideas that are currently bouncing around Washington — namely, that our encryption systems should come equipped with “back doors” intended to allow law enforcement and national security agencies to access our communications.

    Quoting for emphasis. Backdoors are a bad idea, perpetuated by people who have a poor understanding of security trade-offs.

    Privacy and security are not opposites. Anyone who frames the argument that way is ignorant of computer security.

  7. I wonder *why* Juniper did their careful source code review — and how they found the changed DRBG constant. Could it be that the NSA found that they could no longer break into Juniper VPN connections?

  8. I would trust law enforcement keeps an alternative backdoor key around, just in case they lock themselves out.

  9. Great explanation!

    For non-technical folks that might not make it to the end of the article though, I think it would be helpful to summarize the last couple of paragraphs and what they say about the idea of “sanctioned backdoors for the good guys” and place it at the top of the article as a tl;dr.

  10. The timing of the announcement of these vulnerabilities is quite curious. It is just as there is a big public policy debate taking place about encryption back doors triggered by the Paris attacks. Is this merely coincidence, or is there something else going on in the background to force their hand?

  11. Thank you very much for your enlightening article.

    It allowed me to understand that a cryptographic backdoor is a double-edged sword, and we have to take more precautions before thinking of implementation.

  12. Great article! Going to read through the works you cited to learn more.

    Few grammatical issues to clean up:
    “This would likely have allowed them to passively decrypt and ScreenOS VPN sessions they were able to eavesdrop.”

    “To sum up, some hacker or group of hackers attacker noticed an existing backdoor…”
    Kyler

  13. Another twist would make this a perfect Hollywood movie: it was later revealed that Juniper was hacked by some crazy cryptographer, who just wants to give a proof of his anti-crypto-backdoor arguments. We all know that cryptographers would go to great lengths to prove their schemes, so this is not very surprising after all.

  14. If the old Q was the original backdoor, how would this go unnoticed for 3 years? Any thoughts on why this took 3 years to discover? One would think that it would be discovered earlier. Well, maybe not too many devices were updated with the new software.

  15. Wow, yet more plot twists!

    This opens another possibility then, that an insider realised the existence of a backdoor, and in 2012 changed the Q to a safe one nobody can decrypt!

    And as everyone now rushes to patch, it may put the original backdoor into place? They're effectively forced to, because of the other backdoor being closed at the same time in the same firmware update.

    In any case, how can anyone trust these devices at all? The blowback on other US tech companies too could be immeasurable.

  16. Excellent Article. Would be useful to check every vendor in this space to ensure this was not an industry-wide affliction.

  17. Just pointing out something nobody seems to say: their internal repositories must have been attacked successfully.

    They're p0wned. Who's to say other products from them don't have other or similar vulnerabilities?

  18. Excellent article, thanks. Especially useful for a non-technical, lay person with a limited brain like me…

  19. You* (or your template? it's set individually for every single line) has code samples set to display extra extra small. Please fix this so I can stop wishing you'll die in a fire. Thanks!

    *sorry to single you out because this is mind-bogglingly common and it also extends to anyone else who does the same

  20. I worked on commercial VPN software (but not at Juniper) — google will reveal who I am (but some of the search results are NSFW — but that isn't me!).

    We always wondered if/which of our coworkers were actually paid by intelligence agencies to insert subtle bugs. The only comforting thing is to remember “Never attribute to malice that which is adequately explained by stupidity.”

    There are so many ways to make a product insecure that I'm not sure that you need much conspiracy to insert backdoors. If I was the NSA, I'd focus on binary analysis (and source code analysis) to generate a good stream of vulnerabilities that I could exploit.

    This episode does indicate that the defensive side of the NSA did fall down on its job. It is supposed to protect the US Government and it didn't. If it found these vulns and decided not to inform Juniper to get them fixed then it implies that the existence of the vulns was a net positive to the US — probably because they were being actively exploited by the offensive side of the NSA. The alternative (which I think is more likely) is that the defensive side didn't know of the existence of these vulns. Maybe the NSA is a vast government bureaucracy that is just good at spending taxpayer dollars.

  22. Dave Taht: The real bug is the scoping of prng_output_index. It could have been global, as the comment says, or just static but with both prng_generate() and prng_reseed() in the same file. The coder should have made prng_output_index local on the stack frame of prng_generate() and all would have been fine. Safety critical code (avionics, medical devices, etc.) usually uses MISRA static code analysis to catch stuff like this. This code would not have passed a MISRA scan. Why is it that crypto code is not considered as safety critical as those other areas? Many times, it secures the communications link with such devices.

  23. >>Maybe the NSA is a vast government bureaucracy that is just good at spending taxpayer dollars.

    Maybe?

  24. Great article. Simple explanation of events.

    What I can't get my head around, though, is why this “government-must-insert-backdoor-here”-ideology has been such a big “thing” the last couple of years… Do the govs really think that the “bad people” will use potentially-already-eavesdropped-technology? Why would they?

    Nowadays a normally gifted 12-year-old can roll his own encryption algorithm with a standard computer and an open source compiler in a day or two. Who are we kidding? What am I missing? If Bob really, REALLY, wants to whisper something secret to Alice – why use a product under someone else's control, when they can easily make their own?

  25. Thank you, Sir, for this easy-to-understand and impactful explanation of what's encryption, how it affected Juniper and its devastating consequences if government legislate backdoors. This noob here finally gained an insight into the basic idea of encryption. Fascinating! TQ! 🙂

  27. I never worked for Juniper (or NetScreen)!

    More seriously, I have never deliberately inserted a backdoor into any product — but then I would say that!

    Are there any backdoors in any product that I have worked on? I don't know. At one company it was specifically against corporate policy to have backdoors in any product. However, it is much easier to have a policy than to be sure that the policy is followed.

    Did everybody on the team follow that policy? I hope so. The trouble with backdoors is that they can be very subtle — the real root cause of Juniper's problem in this case is the fact that the X9.17 generator was bypassed. I suspect that this was the actual backdoor and the bypass was (AFAIK) due to a global variable not being reset. In this way, it is similar to the GOTOFAIL bug (only that was a duplicate line of code).

    Have I worked on products with security vulnerabilities? Yes. We always endeavoured to fix those in a timely manner.

  28. The bug in prng_output_index is most probably another part of the backdoor, but probably inserted at another time to avoid suspicion. Without the bug, the internal state of the Dual_EC PRNG gets obfuscated, and its chosen/weak constants would be irrelevant.

    The following for-loop which does not get executed is probably intended to obfuscate and reassure – i.e. this is a partial *social-engineering* exploit played on the SW developers/reviewers who work with the code at Juniper.

    In other words, the usage of prng_output_index was probably a backdoor feature, not a bug. It looks intentional and obfuscated, and allows plausible deniability even after decompilation of the binary. It looks like a typical TLA approach.

  29. Oh man…. so many reasons not to roll your own crypto. 12 year old home rolled crypto beats the NSA, come on now.

  30. Great write-up! But what terrifies me the most is that apparently you have to convince people why installing backdoors is not good (malicious exploit), whereas the thought of the government installing it in the first place… Well, sure, why not?

  31. Exactly my thoughts, static code analyses are mandatory in safety critical code nowadays. Why the cryptography stays aside is beyond my comprehension.

  32. the firefox “No Small Text” plugin could help, with customizations by font class (e.g. fixed-width). There is also the “No Squint” plugin, but that changes (and remembers) all (unfortunately) of the font sizes on the website.

  33. They restored the *original Q* – so if this was an intentional backdoor, and that Q was the key for someone Juniper approved of, they have restored that access for that actor (and cut off access to those who rekeyed the backdoor for their own use)
