RSA chief believed cryptographers’ warnings on Dual EC DRBG lacked merit (2014) (jeffreycarr.blogspot.com)
185 points by jalcazar on Sept 3, 2021 | 41 comments


Two things real quick:

Art Coviello is a salesman who headed the company that bought RSA and took the name. It would be a little weird to expect him to meaningfully know what a cryptographer even is. The idea that Coviello would himself be weighing NIST against crypto eprints is pretty silly.

And, more importantly, the only important cite here is Shumow and Ferguson. Schneier didn't analyze Dual EC (he never did work in elliptic curves at all, and claimed not to trust their math); here, he's simply reporting on Shumow and Ferguson's paper, and he doesn't even say Dual EC was backdoored. Nor, for that matter, do the cites before Shumow and Ferguson.

(Before anyone jumps on my back about this: I basically shared Schneier's take on this, that Dual EC was too conspicuous to really be a backdoor, and that the right response was to ignore and never use it. I was wildly wrong about how prevalent Dual EC was --- I couldn't imagine any sane engineer adopting it, because it's slow and gross. If I'd known before the BULLRUN revelations that, for instance, every Juniper VPN box was using Dual EC, I'd have been a lot more alarmed and a lot less charitable about it. Oh well, live and learn.)


Not disagreeing with your take, but I think it's important to note that I just don't see it being possible that Art came up with his take without any input from folks in the company. I would imagine there were meetings where these talking points were constructed. Right?


I think it seems crazy now, but that's because we know a lot more about the practical applications of malicious RNGs; they aren't an abstract concern now. But they kind of were when the big debate was alleged to have happened at RSA.

Also: I'm naturally going to sound like I'm defending RSA here, and I am not. I feel like --- I'll probably be proven wrong by this in time because we live in a fallen world --- no major company in the world would in 2021 swap out a crucial cryptographic component for one DOD was demanding while cryptographers were making noise about how janky it is. That should have been the standard in 2007 or whatever, too.


> I think it seems crazy now, but that's because we know a lot more about the practical applications of malicious RNGs;

RNGs were understood to be the lynchpin of secure systems for decades, including long before 2007; and it was widely assumed, both then and now, that they were one of the most common vectors for attack by the NSA.

Why RSA added Dual_EC_DRBG is easy to explain in dollars & cents: 1) RSA was literally paid to add it, and 2) most of RSA's revenue comes, directly or indirectly, through government contracts (e.g. FIPS compliance, etc).

As for why RSA insiders didn't speak up: there are mountains of scholarship explaining why people just keep their heads down. Even if you were absolutely convinced beyond a shadow of a doubt that Dual_EC_DRBG was a backdoor, intelligent people are very good at rationalizing things. Anybody who has worked at a large company, including RSA, understands that your day-to-day work and the company's business is, as a practical matter, <10% technical and >90% everything else (sales, profit seeking, integration, etc). More importantly, if you're a company doing business in a space dominated by U.S. government requirements and processes, or even just patriotic, the NSA having a backdoor is hardly the worst thing in the world. There are amazing cryptographers in China. Even among the ones who fancy themselves world citizens, above the fray of nationalism, how many do you think would stick their heads out were they in a position to identify possible formal government attempts to manipulate technology?

Moreover, a backdoor doesn't necessarily mean insecure; it's not a categorical truth that any backdoor means broken security. That's just an engineering rule of thumb, built on the experience that securely maintaining the keys to a backdoor is supremely difficult, often more difficult than any other aspect. Nobody has yet come close to breaking Dual_EC_DRBG, AFAIU. From a purely technical perspective, Dual_EC_DRBG is still secure: the keys haven't leaked, and the algorithm remains as impenetrable as ever. At the end of the day, that's all the rationalization most people would ever need to keep their heads down. The "security" of Dual_EC_DRBG is a socio-political debate, not a technical one.


I disagree with basically all of this.

I disagree that cryptography engineers understood viscerally how good a target RNGs were or how viable a PKRNG would be (further evidence for that would be the contortions attackers have to go through to extract enough wire state from Dual EC to mount the most straightforward attacks). I think you can formulate an argument that any major cryptographic primitive is the "lynchpin", and indeed you see people doing that, for instance with the SIMON/SPECK block cipher designs --- block ciphers, after all, are the lynchpin of secure systems.
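Shumow and Ferguson's result is exactly a PKRNG backdoor: the public constants P and Q can hide a private relation that turns one raw output into the next internal state. A toy sketch of that shape, using ordinary modular exponentiation in place of elliptic-curve point multiplication (every parameter below is invented for illustration; this is an analogue, not the real Dual EC math):

```python
# Toy analogue of a Dual EC-style PKRNG backdoor. Modular exponentiation
# stands in for elliptic-curve scalar multiplication; all parameters are
# made up for illustration and are far too small to be secure.

p = 2**61 - 1           # toy group modulus
Q = 3                   # public base "Q"
d = 123456789           # the designer's secret: P = Q^d mod p
P = pow(Q, d, p)        # public base "P", published in the "standard"

def dual_ec_toy(seed, n):
    """Generate n outputs: the state update uses P, the output uses Q."""
    s = seed
    out = []
    for _ in range(n):
        s = pow(P, s, p)           # next internal state: s_i = P^(s_{i-1})
        out.append(pow(Q, s, p))   # emitted output:      r_i = Q^(s_i)
    return out

def attacker_predict(r_i, k):
    """Knowing d, recover the internal state from a single raw output
    and predict the next k outputs."""
    preds = []
    r = r_i
    for _ in range(k):
        s_next = pow(r, d, p)      # r^d = Q^(s*d) = P^s = the NEXT state
        r = pow(Q, s_next, p)      # so the next output is Q^(s_next)
        preds.append(r)
    return preds

outs = dual_ec_toy(seed=42, n=5)
# Seeing only outs[0], the backdoor holder predicts everything after it.
assert attacker_predict(outs[0], 4) == outs[1:]
```

The real attack is messier: Dual EC truncates bits from each output, so the attacker must brute-force them back before recovering the state, which is part of the contortions involved in extracting enough wire state.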

I agree, obviously, that RSA added Dual EC because DOD demanded it. But most of RSA's revenue didn't come from BSAFE, or even things that relied on BSAFE. They were a crappy token company that bought RSA, then built a bunch of multi-factor authentication stuff that had more to do with IP reputation than with cryptography.

I don't really buy that anybody working inside RSA was absolutely convinced that Dual EC was a backdoor. I sort of don't buy that anyone was really even seriously paying attention. I think people think of RSA as a cryptography company, but that is not at all what RSA was at the time this happened.

None of this matters, really. We arrive at the same place about RSA's culpability. But if you came to HN hoping to find someone to stick up for RSA's decision here, you haven't been paying attention to the tenor of this place. All you're going to get here is hair splitting; that's the interesting conversation we can actually have. There's no viable debate about whether adopting Dual EC was defensible. Even when I was saying I doubted Dual EC was a backdoor, I still didn't think using it was defensible.


I've got some direct personal experience in this one. A few key points from how I saw it play out inside:

- there was a lot of noise made about this by the BSAFE crypto team when it was first implemented (anecdotal, but I trust the people that were there, and the context below helps reinforce this). From what I heard, there was clear communication that adding EC drbg to the toolkits the way the NSA wanted was insecure.

- that happened before my time, but by the time I got there it was kind of an inside joke that EC drbg was an NSA backdoor (I think this was around 2010)

- the above was tempered by the fact that it was so horrendously slow, no one could imagine it being used

- even though RSA made it the default RNG for the toolkit, the first part of the documentation strongly suggested changing this default

- my memory is that this work on EC drbg funded development of the BSAFE SSL toolkits. So while the money may have been relatively small, it opened up a new product for BSAFE

The smoking gun and the bit that made it really obvious that something was off about this came in its use as part of the TLS toolkits.

There was an explicit, but unexplained, requirement that the _first 20 bytes_ of random generated during the handshake were sent unencrypted as part of the handshake.

EAY led that crypto team, they knew their stuff and they knew that this was off and there was no legitimate reason for doing this.

My take: this team knew what was happening and they made it clear to management. And really, the people who made the decision to take NSA money knew what it was and the implications, and went ahead anyway.

As a footnote, when we did the cleanup on this we found that in some of the toolkits the way the 20 bytes were sent was flawed, and would have meant that an attempted backdoor using this would have failed. Whether that was intentional or not, _shrug_.
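To sketch why exposing raw generator bytes in the handshake is such a gift: a backdoor holder who sees truncated output can brute-force the missing bits, recover the generator's internal state, and then predict every subsequent "random" value. A toy analogue (modular exponentiation standing in for curve arithmetic; all parameters invented, and deliberately tiny so the brute force is instant):

```python
# Toy sketch of state recovery from truncated PKRNG output. In real
# Dual EC the outputs are truncated x-coordinates of curve points; here
# modular exponentiation plays that role and the numbers are made up.

p = 2**31 - 1           # toy modulus (small on purpose)
Q = 7
d = 0xC0FFEE            # designer's secret, with P = Q^d mod p
P = pow(Q, d, p)

def step(s):
    """One generator step: returns (new_state, raw_output)."""
    s = pow(P, s, p)
    return s, pow(Q, s, p)

def truncate(r):
    return r >> 8       # drop the low 8 bits, like Dual EC drops bits

# The victim emits consecutive truncated outputs (the "random" bytes
# visible on the wire in consecutive handshakes).
s1, r1 = step(12345)
s2, r2 = step(s1)
s3, r3 = step(s2)
t1, t2, t3 = truncate(r1), truncate(r2), truncate(r3)

# The attacker brute-forces the 8 truncated bits of r1, checking each
# candidate against the next two truncated outputs.
recovered = None
for low in range(256):
    cand = (t1 << 8) | low          # candidate full output r1
    s_a = pow(cand, d, p)           # cand^d = P^s1 = next state, if right
    r_a = pow(Q, s_a, p)            # predicted r2
    if truncate(r_a) == t2 and truncate(pow(Q, pow(r_a, d, p), p)) == t3:
        recovered = s_a             # internal state pinned down
        break

assert recovered == s2              # attacker now walks the state forward
```

The real Dual EC case is harder than this toy because TLS only exposed 20 of the output bytes, which widens the brute-force space considerably; the point is only that raw, unencrypted generator output is exactly what this class of backdoor needs.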


This is great.

Just to be clear: the TLS integration and 20 bytes of random stuff was definitely a smoking gun; nobody thinks anything but that Dual EC is a backdoor after learning about it.

EAY is Eric A. Young? I didn't realize he'd worked on BSafe.


> I don't really buy that anybody working inside RSA was absolutely convinced that Dual EC was a backdoor. I sort of don't buy that anyone was really even seriously paying attention.

I guess the second part is a fair point, but for anyone who was paying attention - who read the spec and knew enough about cryptography to understand it - there was no question; Dual EC was clearly, obviously a backdoor[0].

0: With perhaps the remote possibility of being "not a backdoor" (AKA, a backdoor that NSA (provably?) didn't have keys to) so they could later say "see, you thought Dual EC was a backdoor but it wasn't; clearly people shouldn't believe you when you say we put a backdoor in $LESS_OBVIOUSLY_BACKDOORED_THING".


> for instance with the SIMON/SPECK block cipher designs --- block ciphers, after all, are the lynchpin of secure systems.

The lynchpin of a cipher is the key. That's the very definition --- proofs of security reduce to the question of whether you know the key or not.

Unless you exchange a database of one-time pads, you invariably need an RNG to generate keys for your ciphers. That's your lynchpin right there. The key is the lynchpin, and RNGs generate your keys. You don't need to feel it; it's cryptography 101. Granted, it's such a basic and fundamental aspect to secure systems that it usually gets lost in all the bike shedding.


You don't think that the NSA offering $10 million to make it a default should have been a smoking gun to the staff at RSA that was aware of the payment?


No, I imagine everyone was aware of the payment. I just think doing stupid shit to close GSA and DOD deals is the norm throughout enterprise software development; what I'm more curious about is whether anyone really gave a shit about this particular stupid thing.

I feel like a hurdle that people arguing the other side of this need to clear, and aren't, is that prior to BULLRUN really not many people were making that much noise† about Dual EC. It was not a secret that it was in BSAFE; it was, according to the post upthread, not just in the documentation, but in the documentation with a warning to disable it!

The tenor of the conversation changed sharply after BULLRUN and people connecting the dots on the TLS random data exposure. But the argument I keep seeing, and the one implicit in this post we're commenting on, is that nobody should have needed BULLRUN to start freaking the fuck out. I disagree with that argument, in a sense (obviously: everyone should have been freaking out.)

† Yes, people were making noise, but they made noise (and still do) about _NSAKEY too. There's a difference between then and now, and it's obviously not just because I finally agree with them now.


> From a purely technical perspective, Dual_EC_DRBG is still secure.

I think that depends on the techniques you're thinking of. The usual way of proving such a system is secure is to reduce a break to a solution of a bedrock problem like discrete log, and according to the second reference in the OP, "Cryptanalysis of the Dual Elliptic Curve Pseudorandom Generator", no such proof was provided in this case. I would say that without such a proof, it's not "technically" secure.


Just a quick thought:

I think it wasn't that long before this that the NSA had warned against some other crypto that was widely thought to be safe, and everyone later realized it had been a good thing.

Can it be that some people thought NSA were doing them a favour again?


You're probably thinking of DES, which happened long before Dual EC (and long before many of the people working at RSA started their careers). But you can see that effect even today, for instance with NSA's "deprecation" of Suite B cryptography and the shade that cast over conventional elliptic curve cryptosystems.

I don't think one can reasonably defend adoption of Dual EC as somehow hedging a bet that NSA had found vulnerabilities in trivial block-based CSPRNGs, though. I think that decision was essentially indefensible, even at the time it was made; it's just more clearly batshit now than it was then.


Ok, thanks. I appreciate your opinion on it and guess you are right.


> Art Coviello is a salesman who headed the company that bought RSA and took the name. It would be a little weird to expect him to meaningfully know what a cryptographer even is.

I don't expect any random person to know, but why would anyone spend that much money to buy that company without doing enough due diligence into what a cryptographer does? I don't imagine they'd be an expert in cryptanalysis, but you'd likely listen to your own cryptographers on RSA's staff, right?


I'm surprised by your opinion. I really found it perfectly designed to be a backdoor when I was reading about it. All the design decisions were there. I remember thinking: how did they think they could get away with that?

The Juniper backdoor was another confirmation of that, IMO.


Reading about this saga in Ben Buchanan's book "The Hacker and the State" made me realize how every government agency (NIST in this case) seems to be always second fiddle to the "needs" of the NSA/national security apparatus. It seems clear from the book that there was a point in time when they essentially just left it in the NSA's hands to develop, knowing it was probably not secure. Not exactly some huge revelation that the national security apparatus can exert power and leverage over other government groups, or even private companies, but the extent to which it happens was surprising.


Budiansky's Code Warriors emphasized the point that the NSA and its precursors have actively withheld information from the civilian government, including the president. Unfortunately, the very secrecy of it prevents us from knowing the full extent; we only know of the specific cases where it's been documented.


Another problem is how NIST should come up with standards. NIST is in charge of standards, but that means that they need to turn to subject matter experts for each separate field. They need to define the standards for everything from measuring weights, to chemicals in wastewater, to cryptography.

So then for each standard you then end up with the government equivalent of an open process where there are requests for comments, maybe a meeting or two to discuss, and trusted folks end up defining the bulk of the document with oversight from editors.

Where this breaks down is when the government's subject matter expert on crypto, the NSA, is interested in undermining the standards for its own specialty to serve its internal agenda.


It is difficult to get a man to understand something when his salary depends upon his not understanding it.

— Upton Sinclair


nullc's flagged comment may not have been the best way to get the point across, but it's an important point nevertheless. Conversations about the US intelligence community's repeated attempts to suppress and subvert modern encryption standards never seem to mention Crypto AG, perhaps the most egregious example we know about. A great article just came out that highlights some of the shenanigans:

https://spectrum.ieee.org/the-scandalous-history-of-the-last...

    ... In 1966, the relationship among CAG, the
    NSA, and the CIA went to the next level. That
    year, the NSA delivered to its Swiss partner 
    an electronic enciphering system that 
    became the basis of a CAG machine called 
    the H-460. Introduced in 1970, the machine 
    was a failure. However, there were bigger
    changes afoot at CAG: That same year, the
    CIA and the German Federal Intelligence 
    Service secretly acquired CAG for 
    US $5.75 million.
I'm surprised no one has submitted this one, actually.


> I'm surprised no one has submitted this one, actually.

They did: https://news.ycombinator.com/item?id=28378734


From the wonderful fortune(6) database:

  Anyone who is capable of getting themselves made President
  should on no account be allowed to do the job.
  -- Douglas Adams, "The Hitchhiker's Guide to the Galaxy"
I think the RSA chief can be trusted to do what's in the best financial interest of the RSA, even when that is in contradiction of the correct thing, so long as there's plausible deniability.

I'm glad this is being brought up and not forgotten.


It's important to remember that RSA received cash payments from the USG to backdoor this. It wasn't just an "oops, we were insufficiently vigilant". They actively participated.


It's slightly more complicated than that. There are three parties at play in this story: NSA, RSA, and NIST.

NIST was evaluating dual_ec_drbg for certification for government usage, with people from NSA heavily pushing it. NIST was actively contracting out to RSA to evaluate it for weaknesses. Here's the kicker: NSA then secretly paid RSA something like $10M to advocate strongly for dual_ec_drbg, behind the back of NIST. So you have one government agency spending money on a contractor, hoping for an honest expert opinion, while another government agency secretly pays the same contractor so it can sneak something past the first. It's insanity.

Sure, RSA should not have taken the money from NSA, but given that NIST crypto certification mostly matters to government implementations (and not to the private sector), isn't the bigger problem that NSA is happy to introduce backdoors into crypto exclusively used by other government agencies? It's traitorous.

Thomas Massie offered an amendment in the house to try to stop this. He gave a pretty good overview of the situation (he is an electrical and mechanical engineer from MIT) to the house when the amendment was brought to the floor.

https://www.govtrack.us/congress/votes/114-2015/h290

(strangely, the video has been sort of cut on cspan. It's supposed to be at 3:17:11 https://www.c-span.org/video/?326244-2/us-house-debate-fy-20... )


> They actively participated. And that, in my opinion, makes them a criminal enterprise.

Maybe not within a US context, for arguably the US government gave them a mandate for this deception. But within an international context they should probably be held accountable and barred from doing business abroad (as they are essentially an agent/extension of a US intel agency).

Never going to happen, of course. Not with how that whole industry operates. But that only shows how little the whole lot of them and their industry should be trusted in the first place.


I feel like this detail isn't emphasized enough in the coverage of RSA's participation with Dual EC. Wasn't it like $10 million?


This is important to remember when you see industry professionals paying money to RSA to attend their events, or, worse yet, speaking at them.

Supporting those who make us less safe is a clear signal about where your priorities lie.


We keep having this come up with some of the EC curves like NIST P-256 for example. There's no evidence that it is actually backdoored, but the consensus seems to be that the construction is suspicious, unlike the construction for SHA-2.

What do we do with it? Not many in a product development team that is interacting with other companies or organizations can meaningfully defend not using a NIST curve because it looks suspicious.


The link to the keynote wasn't resolving in the article, so here's the YouTube link: https://youtu.be/aB2gG-cRj10


My explanation of the backdoor: https://youtu.be/OkiVN6z60lg


RSA, being an American company, couldn't refuse the NSA's backdoors. Discovery of the backdoor hurt RSA's business, so it's understandable that RSA has beef with them.


Since when can they not refuse an NSA backdoor? Where does the mandate come from with which the NSA can supposedly instruct commercial/private entities to integrate technological backdoors? Does it even have such a legal mandate? I'm sure the NSA will argue that it does, but that doesn't mean it actually has it.


Some of the ways are already known: your company can be denied lucrative government contracts if you refuse. Or you might learn you can't export your products due to export restrictions. Other ways are known to exist, but details are not available yet --- go read about National Security Letters, or kangaroo "secret courts".


Not that I don't agree, but how do you know the secret courts are kangaroo?


The fact that it's secret.

Also that you aren't even allowed to show up to defend yourself. [1]

Also that they denied 11 out of 34,000 requests over a 35 year period.

Also that the judges are appointed by one person and don't even need congressional approval.

How could it possibly not be a kangaroo court?

[1] https://en.m.wikipedia.org/wiki/Ex_parte


Courts that aren’t adversarial are just interpreting law. Secrecy makes it worse by eliminating accountability by the petitioner and judge.

For a non-secret example, look at the Social Security “fair hearings”, where an administrative law judge basically listens to a petition and makes a decision. The standards vary significantly by locale.


Government buyers that are less important (e.g. state level tollways) would be mandated to buy the backdoored algorithm by having the federal government cook it into a specification of how to buy tollway equipment, for example. Once the backdoored algorithm is in the product suite, it can be put to work on a more tactical level.


> RSA, being American company, cannot refuse NSA’s backdoors.

The key is selling to the American government, and any entity related to it. But no, they can't mandate that RSA build anything. Of course, if RSA refuses, they'll find another company that will, pay it lots of money, then issue a certification requirement that only this particular backdoored algorithm is "approved", and wait for RSA to go out of business.


This wasn't added via a secret order, however. RSA had a business agreement with the NSA to add the backdoor; RSA was paid $10 million for this.



