Sold out for 10 million. RSA back door.
#1
I've always noticed how cheap it is to buy informers. $10 million for the keys to the nation is a bargain. It also demonstrates the danger of proprietary encryption software.

Quote:

Exclusive: Secret contract tied NSA and security industry pioneer

By Joseph Menn
SAN FRANCISCO Fri Dec 20, 2013 5:07pm EST

[Photo caption: An RSA SecurID dongle used for internet VPN tunnelling, Toronto, December 18, 2013. Credit: Reuters/Chris Helgren]




(Reuters) - As a key part of a campaign to embed encryption software that it could crack into widely used computer products, the U.S. National Security Agency arranged a secret $10 million contract with RSA, one of the most influential firms in the computer security industry, Reuters has learned.
Documents leaked by former NSA contractor Edward Snowden show that the NSA created and promulgated a flawed formula for generating random numbers to create a "back door" in encryption products, the New York Times reported in September. Reuters later reported that RSA became the most important distributor of that formula by rolling it into a software tool called BSafe that is used to enhance security in personal computers and many other products.
Undisclosed until now was that RSA received $10 million in a deal that set the NSA formula as the preferred, or default, method for number generation in the BSafe software, according to two sources familiar with the contract. Although that sum might seem paltry, it represented more than a third of the revenue that the relevant division at RSA had taken in during the entire previous year, securities filings show.
The earlier disclosures of RSA's entanglement with the NSA already had shocked some in the close-knit world of computer security experts. The company had a long history of championing privacy and security, and it played a leading role in blocking a 1990s effort by the NSA to require a special chip to enable spying on a wide range of computer and communications products.
RSA, now a subsidiary of computer storage giant EMC Corp, urged customers to stop using the NSA formula after the Snowden disclosures revealed its weakness.
RSA and EMC declined to answer questions for this story, but RSA said in a statement: "RSA always acts in the best interest of its customers and under no circumstances does RSA design or enable any back doors in our products. Decisions about the features and functionality of RSA products are our own."
The NSA declined to comment.
The RSA deal shows one way the NSA carried out what Snowden's documents describe as a key strategy for enhancing surveillance: the systematic erosion of security tools. NSA documents released in recent months called for using "commercial relationships" to advance that goal, but did not name any security companies as collaborators.
The NSA came under attack this week in a landmark report from a White House panel appointed to review U.S. surveillance policy. The panel noted that "encryption is an essential basis for trust on the Internet," and called for a halt to any NSA efforts to undermine it.
Most of the dozen current and former RSA employees interviewed said that the company erred in agreeing to such a contract, and many cited RSA's corporate evolution away from pure cryptography products as one of the reasons it occurred.
But several said that RSA also was misled by government officials, who portrayed the formula as a secure technological advance.
"They did not show their true hand," one person briefed on the deal said of the NSA, asserting that government officials did not let on that they knew how to break the encryption.
STORIED HISTORY
Started by MIT professors in the 1970s and led for years by ex-Marine Jim Bidzos, RSA and its core algorithm were both named for the last initials of the three founders, who revolutionized cryptography. Little known to the public, RSA's encryption tools have been licensed by most large technology companies, which in turn use them to protect computers used by hundreds of millions of people.
At the core of RSA's products was a technology known as public key cryptography. Instead of using the same key for encoding and then decoding a message, there are two keys related to each other mathematically. The first, publicly available key is used to encode a message for someone, who then uses a second, private key to reveal it.
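As a toy illustration of that idea (nothing to do with RSA's actual products, and with deliberately tiny primes chosen only for readability; real keys are thousands of bits long and use padding), here is the RSA algorithm itself in a few lines of Python: anyone can encrypt with the public pair (e, n), but only the holder of the private exponent d can decrypt.

Code:
# Toy public/private key demo. Tiny primes for readability only --
# never use numbers this small for anything real.
p, q = 61, 53                  # two secret primes
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

message = 42                   # must be smaller than n in this toy setting
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # only the private key d reveals the message

print(ciphertext, recovered)       # recovered == 42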
From RSA's earliest days, the U.S. intelligence establishment worried it would not be able to crack well-engineered public key cryptography. Martin Hellman, a former Stanford researcher who led the team that first invented the technique, said NSA experts tried to talk him and others into believing that the keys did not have to be as large as they planned.
The stakes rose when more technology companies adopted RSA's methods and Internet use began to soar. The Clinton administration embraced the Clipper Chip, envisioned as a mandatory component in phones and computers to enable officials to overcome encryption with a warrant.
RSA led a fierce public campaign against the effort, distributing posters with a foundering sailing ship and the words "Sink Clipper!"
A key argument against the chip was that overseas buyers would shun U.S. technology products if they were ready-made for spying. Some companies say that is just what has happened in the wake of the Snowden disclosures.
The White House abandoned the Clipper Chip and instead relied on export controls to prevent the best cryptography from crossing U.S. borders. RSA once again rallied the industry, and it set up an Australian division that could ship what it wanted.
"We became the tip of the spear, so to speak, in this fight against government efforts," Bidzos recalled in an oral history.
RSA EVOLVES
RSA and others claimed victory when export restrictions relaxed.
But the NSA was determined to read what it wanted, and the quest gained urgency after the September 11, 2001 attacks.
RSA, meanwhile, was changing. Bidzos stepped down as CEO in 1999 to concentrate on VeriSign, a security certificate company that had been spun out of RSA. The elite lab Bidzos had founded in Silicon Valley moved east to Massachusetts, and many top engineers left the company, several former employees said.
And the BSafe toolkit was becoming a much smaller part of the company. By 2005, BSafe and other tools for developers brought in just $27.5 million of RSA's revenue, less than 9% of the $310 million total.
"When I joined there were 10 people in the labs, and we were fighting the NSA," said Victor Chan, who rose to lead engineering and the Australian operation before he left in 2005. "It became a very different company later on."
By the first half of 2006, RSA was among the many technology companies seeing the U.S. government as a partner against overseas hackers.
New RSA Chief Executive Art Coviello and his team still wanted to be seen as part of the technological vanguard, former employees say, and the NSA had just the right pitch. Coviello declined an interview request.
An algorithm called Dual Elliptic Curve, developed inside the agency, was on the road to approval by the National Institute of Standards and Technology as one of four acceptable methods for generating random numbers. NIST's blessing is required for many products sold to the government and often sets a broader de facto standard.
RSA adopted the algorithm even before NIST approved it. The NSA then cited the early use of Dual Elliptic Curve inside the government to argue successfully for NIST approval, according to an official familiar with the proceedings.
RSA's contract made Dual Elliptic Curve the default option for producing random numbers in the RSA toolkit. No alarms were raised, former employees said, because the deal was handled by business leaders rather than pure technologists.
"The labs group had played a very intricate role at BSafe, and they were basically gone," said labs veteran Michael Wenocur, who left in 1999.
Within a year, major questions were raised about Dual Elliptic Curve. Cryptography authority Bruce Schneier wrote that the weaknesses in the formula "can only be described as a back door."
After reports of the back door in September, RSA urged its customers to stop using the Dual Elliptic Curve number generator.
But unlike the Clipper Chip fight two decades ago, the company is saying little in public, and it declined to discuss how the NSA entanglements have affected its relationships with customers.
The White House, meanwhile, says it will consider this week's panel recommendation that any efforts to subvert cryptography be abandoned.
http://www.reuters.com/article/2013/12/2...C220131220
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Gandhi, when asked about Western Civilisation.
#2

Report: NSA paid RSA to make flawed crypto algorithm the default

The NSA apparently paid RSA $10M to use Dual EC random number generator.

by Peter Bright - Dec 21 2013, 10:14am EST

Security company RSA was paid $10 million to use the flawed Dual_EC_DRBG pseudorandom number generating algorithm as the default algorithm in its BSafe crypto library, according to sources speaking to Reuters.
The Dual_EC_DRBG algorithm is included in the NIST-approved crypto standard SP 800-90 and has been viewed with suspicion since shortly after its inclusion in the 2006 specification. In 2007, researchers from Microsoft showed that the algorithm could be backdoored: if certain relationships between numbers included within the algorithm were known to an attacker, then that attacker could predict all the numbers generated by the algorithm. These suspicions of backdooring seemed to be confirmed this September with the news that the National Security Agency had worked to undermine crypto standards. Even so, the impact of the backdooring seemed low. The 2007 research, combined with Dual_EC_DRBG's poor performance, meant that the algorithm was largely ignored. Most software didn't implement it, and the software that did generally didn't use it.
One exception to this was RSA's BSafe library of cryptographic functions. With so much suspicion about Dual_EC_DRBG, RSA quickly recommended that BSafe users switch away from Dual_EC_DRBG in favor of other pseudorandom number generation algorithms that its software supported. This raised the question of why RSA had taken the unusual decision to use the algorithm in the first place, given the already widespread distrust surrounding it.
RSA said that it didn't enable backdoors in its software and that the choice of Dual_EC_DRBG was essentially down to fashion: at the time the algorithm was picked in 2004 (predating the NIST specification), elliptic curves (the underlying mathematics on which Dual_EC_DRBG is built) had become "the rage" and were felt to "have advantages over other algorithms."
Reuters' report suggests that RSA wasn't merely following the trends when it picked the algorithm and that, contrary to its previous claims, the company had inserted a presumed backdoor at the behest of the spy agency. The $10 million the agency is said to have paid was more than a third of the annual revenue RSA earned from the crypto library.
Other sources speaking to Reuters said that the government did not let on that it had backdoored the algorithm, presenting it instead as a technical advance.

http://arstechnica.com/security/2013/12/...efault/#p3
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Ghandi, when asked about Western Civilisation.
Reply
#3
So, They Lied (The NSA, That Is -- Again)
Reuters reporting here....
(Reuters) - As a key part of a campaign to embed encryption software that it could crack into widely used computer products, the U.S. National Security Agency arranged a secret $10 million contract with RSA, one of the most influential firms in the computer security industry, Reuters has learned.
The claim is that the NSA paid RSA, a commercial firm that (among other things) makes dongles for "secure" logins to places like banks and similar institutions, to insert a bad random number generator into their reference software and make it the default.
As a quick refresher, public-key cryptography relies on truly random numbers. If you can guess the sequence -- that is, if the numbers aren't truly random -- you can compromise the encryption. This is much easier than actually trying to break the code itself; think of it as a safe with a big, thick door and a nasty, un-pickable lock -- but because you want to break in you get the owner to install a cheesy $20 screen door on the side of the vault.
This would leave the keys generated by that software "guessable", and RSA was the publisher and owner of the code in question that then wound up -- and is probably still in -- hardware and software found basically everywhere.
RSA a few months ago "urged" its customers to stop using the compromised random generator.
But what of all the code that is out in the "wild" that has this software in it, and this random number generator, and is set to use it?
The bombshell isn't that the flaw was suspected, it is that it is now being alleged that the NSA paid RSA to make the code breakable -- on purpose. Whether RSA knew it was breakable at the time is unknown, but the NSA sure appears to have been fully aware of it, and if Reuters' reporting is correct they basically paid off the firm to insert it into their software that was then widely distributed to pretty much everyone.
So you want to trust companies based here in the US when it comes to cryptography eh?
Sounds like a good idea to me.
http://market-ticker.org/post=226971
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Ghandi, when asked about Western Civilisation.
Reply
#4
And then there was this from a few years back. Lockheed hacked using RSA keys... inside information? Collateral damage? An experiment? Testing the system? Or not...
Quote:

Lockheed Martin's Security Networks Were Hacked


Lockheed Martin, one of the world's largest defense contractors, was hit hard by hackers this week who used falsified SecurID electronic tokens to gain access. The breach threatens the security of vital data on present and future military technology.
Which, you know, sucks for us and our allies abroad who depend on Lockheed to help keep us safe during the ongoing violence in places like Iraq and Afghanistan.
It isn't clear what, if anything, was stolen during the breach. It isn't even clear what the hackers want, but the attacks are being traced back to a hacking campaign in March against the RSA Security arm of the EMC Corporation, an information storage firm. That one attack compromised the security of anyone using the RSA technology, which means most Fortune 500 companies, other military contractors, and even the Pentagon. Most of these companies have since taken action to address the security issues inherent here, but as Rick Moy, president of NSS Labs, told Reuters, "Given the military targets, and that millions of compromised keys are in circulation, this is not over."
Lockheed remains confident that their broader security systems already in place will have served to prevent or at least soften the blow from this breach. RSA, however, is at a crossroads. The fallout from this will likely result in newer, more secure, but more expensive technologies being relied upon in the future to prevent this from happening again.
The RSA breach did raise concerns about any security tokens that had been compromised, and EMC now faced tough questions about whether "they can repair that product line or whether they need to ditch it and start over again," he said.
Does that mean Lockheed will be investing in biometric scanners in the very-near future? Time will tell, as more details on just how serious this matter was are still on the way.
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Ghandi, when asked about Western Civilisation.
Reply
#5
For the techies.

http://www.tau.ac.il/~tromer/papers/acou...131218.pdf
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Ghandi, when asked about Western Civilisation.
Reply
#6
Makes you wonder about encryption systems like Pretty Good Privacy, doesn't it? In fact, it makes you wonder whether any security / encryption system is genuinely able to do what it claims...
The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14
#7
Well, I would never trust a proprietary software company to protect my privacy. At least with PGP and other open source software, people (that is, everyone) can see if anyone is trying to pull a swifty on them and do something about it. It is like a glass house: no place to hide anything nasty. Not so with proprietary software. In fact, I think this exposure might be the end of all such ventures; people are so over all this crap. In the end it will do the FOSS community a great service.
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Ghandi, when asked about Western Civilisation.
Reply
#8

How the NSA (may have) put a backdoor in RSA's cryptography: A technical primer

Published on January 06, 2014 12:00PM by Nick Sullivan.

There has been a lot of news lately about nefarious-sounding backdoors being inserted into cryptographic standards and toolkits. One algorithm, a pseudo-random bit generator, Dual_EC_DRBG, was ratified by the National Institute of Standards and Technology (NIST) in 2007 and is attracting a lot of attention for having a potential backdoor. This is the algorithm into which the NSA allegedly inserted a backdoor and then paid RSA to use.
So how is that possible? This is a technical primer that explains what a backdoor is, how easy it can be to create your own, and the dangerous consequences of using a random number generator that was designed to have a backdoor. This is necessarily a long technical discussion, but hopefully by the end it should be clear why Dual_EC_DRBG has such a bad reputation.

Backdoors

The concept of a backdoor has cast a shadow over the security industry for a long time. A backdoor is an intentional flaw in a cryptographic algorithm or implementation that allows an individual to bypass the security mechanisms the system was designed to enforce. A backdoor is a way for someone to get something out of the system that they otherwise would not be able to. If a security system is a wall, a backdoor is a secret tunnel underneath it.
Backdoors can be inserted by lazy programmers who want to bypass their own security systems for debugging reasons, or they can be created to intentionally weaken a system used by others. Government agencies have been known to insert backdoors into commonly used software to enable mass surveillance. Backdoors can be built into software or hardware, or even into the design of an algorithm.
In theory, a well-designed cryptographic system does not include a backdoor. In practice, it is hard to guarantee that a piece of software is backdoor-free. A backdoor was recently found in a widely-deployed version of D-Link router firmware. The backdoor allows anyone with knowledge of a secret user agent string to log in and modify settings on any router running the vulnerable software. The D-Link backdoor took a long time to find because the source code for the router software was not available to security researchers to examine. With open source software, a researcher can look directly at the part of the code that verifies authentication and check for backdoors.
Open source is a great tool for understanding how code works, but it is not a cure-all for finding backdoors in software. It can be difficult and time-consuming to fully analyze all the code in a complicated codebase. The International Obfuscated C Code Contest shows how code can be made extremely hard to understand. The Underhanded C Contest takes this even further, showing that benign-looking code can hide malicious behavior.
The translation step between human programming languages and machine code can also be used to insert a backdoor. The classic article "Reflections on Trusting Trust" introduced this idea back in 1984. The cryptographic community has recently banded together to audit the open source disk encryption software TrueCrypt for backdoors. One of the key steps in this audit is verifying that the machine code distributed online for TrueCrypt matches the source code. This requires re-building the audited source code with a fully open source compiler and making sure the machine code matches. Reproducible binaries help demonstrate that a backdoor was not inserted in the program's machine code by a malicious person or compiler.
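The final comparison step is simple in principle. A minimal sketch, assuming you have both the official download and your own rebuilt binary on disk (the file names below are hypothetical placeholders), is just a hash comparison:

Code:
# Minimal sketch of the "do the binaries match?" check in a reproducible-build
# audit. The file names are hypothetical placeholders for an official release
# and a binary rebuilt from the audited source with an open source compiler.
import hashlib

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

official = sha256_of("truecrypt-official-download.bin")
rebuilt = sha256_of("truecrypt-rebuilt-from-source.bin")
print("match" if official == rebuilt else "MISMATCH - investigate the toolchain")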

Random weakness

In some cases, even this might not be enough. For example, TrueCrypt, like most cryptographic systems, uses the system's random number generator to create secret keys. If an attacker can control or predict the random numbers produced by a system, they can often break otherwise secure cryptographic algorithms. Any predictability in a system's random number generator can render it vulnerable to attacks.
Examples of security systems being bypassed using flaws (intentionally created or otherwise) in random number generators are very common.
It's absolutely essential to have an unpredictable source of random numbers in secure systems that rely on them. This includes SSL/TLS, the fundamental security layer of the internet where session keys are generated using random numbers. If you design a random number generator that allows you to predict the output, and convince someone to use it, you can break their system. This kind of algorithmic backdoor is what we will create in this blog post.

A random stream that isn't

The digits of pi are quite random looking but they don't make a very good random number generator because they are predictable. Anyone who knows that someone is using the digits of pi as their source of randomness can use that against them. Convincing someone to use a pi-based random number generator is a difficult challenge.
Many pseudo-random number generators start with a number called a seed. The seed is the starting point for the internal state of the algorithm. The algorithm generates a stream of random numbers using some mathematical operation on the internal state. As long as the seed (and the subsequent internal state) are kept secret, the pseudo-random numbers output by the algorithm are unpredictable to any observer. Conversely, anyone who knows the state will be able to predict the output.
Linux uses a pool of numbers as the internal state of /dev/random, its pseudo-random number generator. Every time a program requests random data from the system, Linux returns a cryptographic hash of its internal state, computed with the algorithm SHA-1. This hash function is designed to be one-way: it is easy to compute, but very difficult to find the input given an output. It is so difficult that no one has ever published an inversion of a SHA-1 hash without knowing the input beforehand. This keeps the internal state of the random number generator secret.
The random data extracted by the hash function is then mixed back into internal state. Periodically, the hashes of the timestamps of "unpredictable" system events like clicks and key presses are also mixed in.
Here's the basic pseudo-random number generator construction (diagram omitted). In this construction:
F = SHA-1
G = SHA-1, with the output mixed back into the state via XOR
This construction is pretty standard. The internal state is kept secret, data is output via a one-way function, and the internal state is updated by mixing the data back into the state.
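As a purely illustrative sketch (this is not the real Linux /dev/random code, and the class name and seed handling are my own choices), here is roughly what that construction looks like in Python: a secret state, a one-way output function F, and a mixing step G.

Code:
# Toy version of the construction above: secret state, F = SHA-1 of the
# state as output, G = XOR the output back into the state.
import hashlib
import os

class ToyHashPRNG:
    def __init__(self, seed):
        self.state = hashlib.sha1(seed).digest()      # 20-byte internal state

    def next_bytes(self):
        out = hashlib.sha1(self.state).digest()       # F: one-way output
        self.state = bytes(a ^ b for a, b in zip(self.state, out))  # G: mix back in
        return out

    def stir(self, event):
        # periodically mix in hashes of "unpredictable" events (clicks, key presses)
        self.state = hashlib.sha1(self.state + event).digest()

rng = ToyHashPRNG(os.urandom(32))
print(rng.next_bytes().hex())
print(rng.next_bytes().hex())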
At any point, if an attacker can figure out the internal state, they can predict the output. The strategic choices for F and G are what make this construction safe. You do not lose randomness in the pool by XOR-ing in more data; the entropy of the state can only stay the same or grow.
If F and G were chosen to be two completely independent one-way functions, it would probably still be safe. Having SHA-1 as F and MD5 (a different hash function) as G would not be an unreasonable choice. The key word here is independent, but first, a sidestep into elliptic curves.

Elliptic Curves and one-way functions

In a previous blog post we gave a gentle introduction to elliptic curve cryptography. We talked about how this class of curves can be used for encryption and digital signature algorithms. We also hinted that elliptic curves could be used for generating random numbers. That is what we will describe here.
The reason elliptic curves are used in cryptography is the strongly one-way function they enable. As described previously, there is a geometrically intuitive way to define an arithmetic on the points of an elliptic curve.
Any two points on an elliptic curve can be "dotted" ("multiplied") together to get a new point on the curve. Dotting a point with itself any number of times is fast and easy to do, but going back to the original point takes a lot of computation. This operation can be used to create a nice and simple one-way function from a point P1:
Given a number n, output another number m:

  1. Dot P1 with itself n times to get another point Q
  2. Output the x-coordinate of Q as m
It's hard to go back from m to n, because that would be enough to solve the elliptic curve discrete logarithm problem, which is thought to be very, very hard to do.
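To make the two steps above concrete, here is a minimal Python sketch on a toy curve (y^2 = x^3 + 7 over the 17-element field). The curve, the point P1, and the scalar are illustrative values, not any standard's parameters; real curves use primes hundreds of bits long, which is what makes reversing the function infeasible.

Code:
# Toy curve y^2 = x^3 + 7 over F_17, and the two-step one-way function.
P_FIELD, A, B = 17, 0, 7
P1 = (15, 13)                                # a point on the toy curve

def ec_add(p, q):
    """'Dot' two curve points together (None stands for the point at infinity)."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None                          # vertical line: result is infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_FIELD) % P_FIELD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_FIELD) % P_FIELD
    x3 = (lam * lam - x1 - x2) % P_FIELD
    return (x3, (lam * (x1 - x3) - y1) % P_FIELD)

def one_way(n):
    """Step 1: dot P1 with itself n times.  Step 2: output the x-coordinate."""
    q = None
    for _ in range(n):                       # naive repeated dotting; fine for a toy
        q = ec_add(q, P1)
    return q[0]

m = one_way(5)
print(m)   # computing m from n is easy; recovering n from m is a discrete logarithm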
The metaphor used in the previous post was that the one-way function in elliptic curves is like playing a peculiar game of billiards. If someone were locked alone in a room they could play a certain number of shots and the ball would end up at a particular location. However, if you entered the room at some point and simply saw the position of the ball it would be very difficult to determine the number of shots the player had taken without playing through the whole game again yourself.
With this billiards analogy, we can think of this random number generator as a new bizarro game of pool. Consider two balls on the infinite elliptic curve billiards table, the yellow ball called P1 and the blue ball called P2.
[Image: yellow-billard.jpg][Image: blue-billard.jpg]
These two balls have specific points on the curve where they start. This is a two person game where one person is called the generator and the other is the observer. The generator has a secret number "n". The generator takes the ball P1 and performs n shots, and lets the observer see its final location. Then it takes P2 and performs n shots, taking the final location of P2 as a new value for n. Then P1 and P2 are reset to their original location and that's the end of the turn. Each turn the observer sees a new pseudo-random location for P1, and that's the output of the game.

There's a trap door in your one-way functions

In the Linux random number generator example above, SHA-1 is used as the one-way function. Let's consider what happens when we use our elliptic curve one-way function instead.
Looking back at the construction for a pseudo-random number generator above, we need to choose two functions to serve as F and G. The elliptic curve one-way function above seems to fit the bill, so let's use the functions defined by two points on the curve, P1 and P2. Each one-way function is hard to reverse, and if P1 and P2 are chosen randomly, they should be independent.
So how do we add a backdoor? The key is to choose P1 and P2 so that to any outside observer they look random and independent, but in reality they have a special relationship that only we know.
Suppose we choose P2 to be P1 dotted with itself s times, where s is a secret number. Then P1 and P2 are related, but it is hard to prove how, since finding s requires solving the elliptic curve discrete logarithm problem.
Given an initial state n, let's look at what the output becomes and what the state gets updated to.
The output is the x-coordinate of:
Q = P1 ◦ P1 ◦ … ◦ P1 (n times)
Then we get that the state S gets updated to:
P2 ◦ P2 ◦ … ◦ P2 (n times)
But P2 is just P1 dotted with itself s times, so the state is really
(P1 ◦ P1 ◦ … ◦ P1 (s times)) ◦ … ◦ (P1 ◦ P1 ◦ … ◦ P1 (s times)) (n times)
which is
P1 ◦ P1 ◦ … ◦ P1 (s × n times)
or, re-arranged,
(P1 ◦ P1 ◦ … ◦ P1 (n times)) ◦ … ◦ (P1 ◦ P1 ◦ … ◦ P1 (n times)) (s times)
Seeing that P1 dotted with itself n times is the output Q, we can write this as:
Q ◦ Q ◦ … ◦ Q (s times)
And since we know s and the output (and therefore Q), we can calculate the next internal state of the algorithm. The state is revealed and all subsequent bytes can be predicted. In just one round! Since given P1 and P2, finding s requires solving the discrete logarithm problem, you get to be the only one who knows this mathematical backdoor.
This can be described in terms of the billiards game from the last section. Remember, the output of one turn of the game is the location of P1 after n shots, and the generator's next secret number comes from the location of P2 after n shots. Knowing the value s is like knowing how many shots it takes to go from P1 to P2. This lets the observer cheat at the game. If you know where P1 lands after n shots, you can shoot s more times from that location to get the location of P2 after n shots. This gives you the generator's secret number and allows you to predict the next turn of the game.
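To see the whole thing end to end, here is a toy Python version of the backdoored generator and the attack derived above. It is not the real Dual_EC_DRBG: the curve is tiny, nothing is truncated, and the points, the seed, and the trapdoor value s are made-up illustrative values. But the trapdoor has the same structure, and whoever knows s can recover the state from a single output.

Code:
# Toy backdoored generator: output = x(state * P1), new state = x(state * P2),
# where the designer secretly chose P2 = s * P1.
P_FIELD, A, B = 17, 0, 7                     # toy curve y^2 = x^3 + 7 over F_17

def ec_add(p, q):
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None                          # point at infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_FIELD) % P_FIELD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_FIELD) % P_FIELD
    x3 = (lam * lam - x1 - x2) % P_FIELD
    return (x3, (lam * (x1 - x3) - y1) % P_FIELD)

def ec_mul(n, point):                        # dot `point` with itself n times
    result, addend = None, point
    while n:
        if n & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        n >>= 1
    return result

# The designer publishes P1 and P2, but secretly chose P2 = s * P1.
P1 = (15, 13)
s = 7                                        # the trapdoor, known only to the designer
P2 = ec_mul(s, P1)

def prng_step(state):
    """One round: output = x(state * P1), new state = x(state * P2)."""
    return ec_mul(state, P1)[0], ec_mul(state, P2)[0]

# The honest user seeds the generator and produces two outputs.
state = 5
out1, state = prng_step(state)
out2, state = prng_step(state)

# The attacker sees only out1, but knows s. Recover a point Q with that
# x-coordinate (either choice of y works, because Q and -Q give s*Q and
# -(s*Q), which share an x-coordinate); s*Q is exactly the next state.
y = next(v for v in range(P_FIELD) if (v * v - (out1 ** 3 + A * out1 + B)) % P_FIELD == 0)
recovered_state = ec_mul(s, (out1, y))[0]
predicted, _ = prng_step(recovered_state)

print("second output:", out2, " attacker's prediction:", predicted)   # they match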

Back to the real world

This toy random number generator may seem very simple and the backdoor might even seem obvious. The amazing fact is that our toy random number generator described above is Dual_EC_DRBG, almost exactly. It was published by the NSA with two "random" looking points P1 and P2. There is no indication of how these values were generated.
The values for the points P1 and P2 could have been chosen randomly or they could have been chosen with a deliberate relationship. If they were chosen deliberately, there is a backdoor. If they truly were chosen randomly, then finding the internal state is as difficult as breaking elliptic curve cryptography. Unfortunately, there is no way to tell whether the two points were chosen together or randomly without either solving the elliptic curve discrete logarithm problem or catching the algorithm's author with the secret backdoor value. This is the nature of a one-way trapdoor function.
The authors did not provide any proof of randomness for the two points P1 and P2. This could have easily been done by choosing P1 and P2 as outputs of a hash function, but they did not. This is just one of many flaws in the design of this algorithm.
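For illustration, a verifiable derivation could look something like the following sketch. The toy curve and seed strings are made up, and a real "nothing up my sleeve" procedure would also use a full-size curve and check that the derived point has large prime order; the idea is only that anyone can re-run the derivation and confirm nobody chose the points to hide a trapdoor relationship.

Code:
# Sketch of deriving curve points from public seeds via a hash
# (try-and-increment), so the choice is publicly verifiable.
import hashlib

P_FIELD, A, B = 17, 0, 7                     # toy curve y^2 = x^3 + 7 over F_17

def point_from_seed(seed):
    """Hash to a candidate x and bump a counter until it lands on the curve."""
    counter = 0
    while True:
        digest = hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        x = int.from_bytes(digest, "big") % P_FIELD
        rhs = (x ** 3 + A * x + B) % P_FIELD
        for y in range(P_FIELD):             # brute-force square root, toy field only
            if (y * y) % P_FIELD == rhs:
                return (x, y), counter       # publish the counter so anyone can re-check
        counter += 1

P1, c1 = point_from_seed(b"seed for P1")
P2, c2 = point_from_seed(b"seed for P2")
print(P1, c1, P2, c2)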
The evidence is mounting for Dual_EC_DRBG being well-suited for use as a back door. A working proof of concept backdoor was published in late 2013 using OpenSSL, and a patent for using the construction as "key escrow" (another term for backdoor) was filed back in 2006.
Up until recently, Dual_EC_DRBG was the default random number generator for several cryptographic products from RSA (the security division of EMC), even though cryptographers have long been skeptical of the algorithm's design. There are reports of impropriety connecting a $10 million investment by the United States government and RSA's decision to use this obscure and widely maligned algorithm in their widely-distributed products.

Looking Ahead

It is very difficult to implement a secure system. Backdoors can be introduced at the software, hardware or even algorithm level. Algorithms backed by standards are not necessarily safe or free of backdoors. Some lessons to take away from this exercise are:

  1. Even secure cryptographic functions can be weakened if there isn't a good source of randomness;
  2. Randomness in deterministic systems like computers is very hard to do correctly;
  3. Adding unpredictable sources of entropy can help increase randomness and, in turn, secure algorithms from these types of attacks.
At CloudFlare, we understand this fact and are working on ways to make sure that the randomness in our cryptographic systems is truly random. Steps include extracting entropy from the physical world, monitoring system entropy levels, using a hardware random number generator to mix in extra entropy, and not relying on a single random number generator as the source of all randomness.
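As a rough sketch of that last idea (this is not CloudFlare's actual code, and which sources to mix is an illustrative choice), combining several independent sources through a hash means the result is at least as unpredictable as the strongest source, even if one of them turns out to be weak:

Code:
# Mix several entropy sources through a hash so that one bad source
# cannot make the combined output predictable.
import hashlib
import os
import time

def gather_sources():
    sources = [
        os.urandom(32),                        # the operating system's generator
        time.time_ns().to_bytes(16, "big"),    # a little timing jitter
    ]
    try:
        sources.append(os.getrandom(32))       # a second kernel interface, where available
    except (AttributeError, OSError):
        pass
    return sources

def mixed_random(n_bytes=32):
    h = hashlib.sha256()
    for chunk in gather_sources():
        h.update(chunk)
    return h.digest()[:n_bytes]

print(mixed_random().hex())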

http://blog.cloudflare.com/how-the-nsa-m...cal-primer
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Ghandi, when asked about Western Civilisation.
Reply


Possibly Related Threads…
Thread Author Replies Views Last Post
  Another Whistleblower Steps Forth: 47 Hard-drives and 600 million Pages of Information David Guyatt 4 12,162 26-03-2017, 10:24 PM
Last Post: Peter Lemkin
  NZ Welcomed Back to Spy Network Magda Hassan 1 3,652 22-06-2014, 05:21 AM
Last Post: Peter Lemkin

Forum Jump:


Users browsing this thread: 1 Guest(s)