Hacker News new | past | comments | ask | show | jobs | submit login
Predicting the next Math.random() in Java (franklinta.com)
150 points by nilknarf on Sept 1, 2014 | hide | past | favorite | 56 comments



If you want cryptographic-quality random numbers, both Java and Javascript have them. Math.random() is simply a super-fast decent RNG.

Example:

    var buf = new Uint32Array(10); 
    window.crypto.getRandomValues(buf);
    console.log(buf);
Outputs things like:

    [4027145128, 258543382, 1205615760, 2665675208, 4033127244, 
     2280027866, 3983484449, 510932333, 1911490534, 2609399642]
This works in Chrome and FF.

IE11 has it as window.msCrypto.getRandomValues(...)

Java has SecureRandom:

http://docs.oracle.com/javase/6/docs/api/java/security/Secur...
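A minimal sketch of the Java side (the class name `SecureRandomDemo` and the helper `randomBytes` are mine, just for illustration):

```java
import java.security.SecureRandom;

public class SecureRandomDemo {
    // Returns n cryptographically strong random bytes.
    public static byte[] randomBytes(int n) {
        SecureRandom rng = new SecureRandom(); // self-seeds from the OS entropy source
        byte[] buf = new byte[n];
        rng.nextBytes(buf);
        return buf;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(randomBytes(16)));
        // SecureRandom extends java.util.Random, so the familiar API works too:
        System.out.println(new SecureRandom().nextInt(100));
    }
}
```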


> Math.random() is simply a super-fast decent RNG

Actually, java.util.Random is a horrible, horrible RNG. Its least significant bit (for example) is grotesquely nonrandom. See http://www.alife.co.uk/nonrandom/ for a wonderful demonstration of its awfulness. There are other serious bugs in java.util.Random which Sun, er, Oracle, has steadfastly refused to fix or revise for a decade now due to concerns about "backward compatibility" (in an RNG!)
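The nonrandomness is easy to check directly: java.util.Random is a 48-bit LCG, and the low bit of nextInt() is bit 16 of the internal state, so the sequence of low bits cycles with period at most 2^17 = 131072. A quick sketch (class and method names are mine):

```java
import java.util.Random;

public class LsbPeriod {
    // Hypothesis under test: the low bit of nextInt() repeats with the
    // given period, because it depends only on the low 17 state bits.
    public static boolean lsbRepeatsWithPeriod(long seed, int period) {
        Random r = new Random(seed);
        int[] first = new int[period];
        for (int i = 0; i < period; i++) first[i] = r.nextInt() & 1;
        for (int i = 0; i < period; i++) {
            if ((r.nextInt() & 1) != first[i]) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(lsbRepeatsWithPeriod(42L, 1 << 17));
    }
}
```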

> If you want cryptographic-quality random numbers

I think you're confusing "cryptographic-quality" with "high quality".

RNGs tend to break down into two different camps. First, there are RNGs meant for simulation of stochastic processes. These RNGs need to be fast, have good periods, and have a very high degree of statistical randomness in their output.

Second, there are RNGs meant for crypto. Speed is less important. Rather, it's important that these RNGs provide good statistical randomness, but it's also crucial that, given some M previous outputs of the RNG, you cannot predict or make any statistical claims about the next number, even if you know how the algorithm works. This isn't important for non-crypto RNGs.

For example, Mersenne Twister is a pretty high quality, widely used RNG. But it is not cryptographically secure: if you have 624 consecutive 32-bit outputs, you can perfectly predict the next number if you know it's MT producing them.

While there are many applications for crypto RNGs, for a great many tasks, a non-crypto RNG would be a better fit.

java.util.Random was meant for non-crypto tasks. And it completely fails at even those tasks.

But SecureRandom is really bad for those tasks too: it is slow. For example, my MersenneTwister implementation is about 9 times faster than SecureRandom; and my non-threadsafe MT implementation is about 36 times faster. That's a big deal if you're pulling down millions and millions of numbers.


> There are other serious bugs in java.util.Random which Sun, er, Oracle, has steadfastly refused to fix or revise for a decade now due to concerns about "backward compatibility" (in an RNG!)

Backward compatibility in a RNG is actually rather important, weirdly enough. For example, anything that uses deterministic seeds for repeatability of procedural generation or optimization. (Read: Minecraft, among other more important things).
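That repeatability contract is trivial to illustrate (helper names are mine):

```java
import java.util.Random;

public class SeedRepeatability {
    // Draws 'count' ints in [0, bound) from a generator seeded with 'seed'.
    public static int[] draw(long seed, int count, int bound) {
        Random r = new Random(seed);
        int[] out = new int[count];
        for (int i = 0; i < count; i++) out[i] = r.nextInt(bound);
        return out;
    }

    public static void main(String[] args) {
        // Same seed, same "world": both runs produce identical sequences.
        System.out.println(java.util.Arrays.equals(
                draw(123456789L, 10, 1000), draw(123456789L, 10, 1000)));
    }
}
```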

That being said, things like this are why I wish more programming languages had proper (read: fine-grained) versioning of libraries. "I need to link to a function with this signature, that is one of these versions or any version that has been declared to be compatible with them, preferably the latest version that is so". That sort of thing.

> For example, Mersenne Twister is a pretty high quality, widely used RNG. But it is not cryptographically secure... But SecureRandom is really bad for those tasks too: it is slow. For example, my MersenneTwister implementation is about 9 times faster than SecureRandom

By your own words, you're comparing apples to oranges here...


Also, when you're writing scientific code (Monte Carlo simulations, for example, or disordered systems), the folk wisdom is that you must keep track of what seed you used. I've never had to use these records myself, but I can imagine wanting to go back and reproduce exactly the same calculation, for (e.g.) debugging or verifying new code. Now you think about resurrecting some previous grad student's code that only works when used in exactly the right way and has documentation scattered through comments and notebooks---which is perfectly natural, since he probably didn't think anybody else would ever use it---and a change in PRNG algorithm could be immensely frustrating. Or think about trying to re-run old code to compare with new analytical results: you'd want to verify that you're getting exactly the same results, as a way of checking that there isn't some other bitrot hidden away somewhere.

(Now, why one would be using Java for Monte Carlo simulations I haven't the foggiest idea.)


Couldn't you solve this by tracking the version of Java (for example) used initially to develop the simulation?


Yeah, and this is why I'm very careful about recording the versions I use for a calculation (version number and/or commit for the language, library version numbers, and commit for my code). The thing is, that's not your first step: the first step is to try and get the thing to run on whatever Java (/Julia/Matlab/Mathematica/Fortran) you have installed. If that doesn't work, there are a thousand things it could be (improper usage, some missing file you had no clue you might need, change in libraries, change in OS/distro/CPUarch, out-of-memory with a new set of parameters); installing an old version of your language and trying it on that is not necessarily easy, and certainly not the first bug-finding step.

Philip Guo (pgbovine) addressed many of these problems in his thesis work, on Burrito (http://www.pgbovine.net/burrito.html ) and CDE (http://www.pgbovine.net/cde.html ). Both of these look very cool, but I've never managed to get over the hump and actually start playing with them.

At this point I should note that I'm actually in favor of fixing a bad standard-library PRNG---I just think it's important to understand (a) what could break, and (b) why users might be unhappy about the breakage. This isn't quite a Chesterton's Fence sort of situation, but it's similar in spirit.


Sure, but it kind of sucks if your whole program has to remain on Java 1.3 forever because you need a particular RNG.


Until a required library (or Java itself) has a security flaw and the new version does not support that version of Java.


> Backward compatibility in a RNG is actually rather important, weirdly enough. For example, anything that uses deterministic seeds for repeatability of procedural generation or optimization. (Read: Minecraft, among other more important things).

This seems like a very, very bad thing to rely on. Other than Java, how many other languages have guarantees in generator determinism from version to version as part of the language contract? Certainly Lisp considers it an antipattern.


Any language that allows the PRNG to be manually seeded, presumably?

I mean, if manually seeding the PRNG doesn't produce a deterministic result, you wouldn't make it part of the API.


Nope.

Essentially all languages allow manual seeding; almost none specify guarantees about RNG behavior. I'm pretty sure this is because determinism of a given process is important (everyone who does simulation needs that), but locking the language or library itself into potentially bad RNGs, in fact specifying them, is a really bad idea.


Counterexample: Python [1]

[1] http://stackoverflow.com/a/19179886/1814881


If it's specified, I don't mind people relying on it. That being said, Java is over-specified in many ways, this potentially being one of them.

I know Python does not have this guarantee.

Personally? There should really be a couple different RNGs in Java's standard library - all combinations that make sense of [insecure/secure, can set seed / can't set seed, thread-local/global] (local/global being without distinction if one cannot set the seed). The default being secure / can't set seed.


Cryptographic doesn't mean slow. If you generate into a large enough buffer (1-2 KB is good), ChaCha8 runs anywhere from ~1.5 cycles/byte (SSSE3, Wolfdale) to ~0.5 cycles/byte (AVX2, Haswell). Even unoptimized implementations are still fairly efficient at ~5 cycles/byte.

If you don't care about the cryptographic properties, it could be made even faster by dropping to 6 rounds and tweaking the output step.


An appropriate Xorshift generator is likely much better (and faster) than java.util.Random: http://xorshift.di.unimi.it/

I've used them to improve my hashing collision rate.
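For reference, here is a sketch of the xorshift64* variant published on that page (the shift constants and multiplier are the published ones; the class wrapper is mine):

```java
public class XorShift64Star {
    private long state; // must be nonzero

    public XorShift64Star(long seed) {
        // Fall back to an arbitrary nonzero constant for the all-zero seed,
        // which would otherwise get the generator stuck at zero.
        this.state = (seed == 0) ? 0x9E3779B97F4A7C15L : seed;
    }

    public long nextLong() {
        long x = state;
        x ^= x >>> 12;
        x ^= x << 25;
        x ^= x >>> 27;
        state = x;
        // Multiplier from Vigna's xorshift64* paper; scrambles the low bits.
        return x * 0x2545F4914F6CDD1DL;
    }
}
```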


If you want true random, then many CPUs/SoCs have environment-based TRNGs. The SoC used in the rPi can emit half a million random bits per second (that passed the statistical tests I threw at them), and many Intel chips have an RNG in them now (though I don't know if it is as easy to access in userland as the rPi's is).

If you need more than your current CPU provides then you could chain a bunch of them together, or pick a good PRNG for your main source and simply reseed that on a regular basis with the output of your TRNG.

There are some concerns with generators like those in the rPi as they are not documented, so you have no way of knowing exactly how good they are or if the manufacturer has somehow arranged for there to be some predictability - but if you are that paranoid you need to be building your own generator from scratch anyway!


The problem with this is that it's not ubiquitous, and there are multiple different implementations.

There's a reason why OSes provide a HAL.


How did Java end up with such a poor implementation? Surely these things were known when that library was written?


Java has a number of... questionable, shall we say, design decisions.

Some of it is legacy cruft - Random was first released in, what, 1995?

Some of it is fallout of it being designed by committee - "we need it to pass this bunch of certifications, which means we have to do this bunch of things even though it doesn't make much sense to do so".

And some of it is undoubtedly the classic execution speed / maintainability tradeoff - no one has unlimited dev time, Oracle included. An LCRNG is simple and well-known.


Also they cannot change it now, because its behaviour is documented and people certainly rely on it somewhere.

And it's about knowing what tools to use for what job anyway. If you just need to roll a few dice, Random is fine. If you need generated passwords, or cryptographic keys, you use SecureRandom. And if you need to run Monte Carlo simulations you'll just implement either MT19937 or one of the WELL generators.


I remember reading they pulled the implementation almost straight out of TAOCP, from good old Donald Knuth (if that's enough of a name drop). It might not be great, but at least it was from a reasonably respectable, academic source and not copy-pasta'd out of stackoverflow.


Devil's advocate, but why not rename "Random" to "AlmostRandom" instead, and call "SecureRandom" just "Random"? It's not unreasonable for a lay programmer to expect "random" to be exactly that; though, obviously, non-experts shouldn't be implementing crypto regardless. But since they will, despite everyone's best advice otherwise, why not reduce their chances of introducing bugs?


How far do you want to go? The Javadocs state it plain and clear that this is not cryptographically secure. Is AlmostRandom enough? Maybe we should force people to set an enum? new AlmostRandom(AlmostRandomCertification.YES_I_M_NOT_AN_IDIOT_AND_WONT_USE_THIS_FOR_CRYPTOGRAPHIC_PURPOSES)? You have to draw the line somewhere.


True, although honestly your tongue-in-cheek enum example is probably closer to where the line should be.


Also a nice reminder: "Never write your own crypto"


(Outside of education.)


Why not rename comparison functions, too, since they often aren't constant time? And all sorts of other primitives someone might accidentally use?


Because

1) Even SecureRandom is likely not completely random. Unless they seed with physical/unpredictable randomness.

2) It's much slower


> 1) Unless they seed with physical/unpredictable randomness.

They do.


That is a nice not-so-subtle reminder. When a PRNG says it is insecure, it is insecure. When a PRNG says it is secure, it might be - get someone very clever to check it first.


nitpick, Firefox doesn't use Rhino, it uses SpiderMonkey which is C++.

https://developer.mozilla.org/en-US/docs/Mozilla/Projects/Sp...


How dangerous can this prediction be? I can't stop thinking of poorly written, Java-backed gaming websites that deal in real money.


It has happened before!

See http://www.cigital.com/papers/download/developer_gambling.ph... for a very detailed explanation of how one figured out all the cards in Texas hold-em.


It's well known. If someone is doing anything with real money, and isn't doing something more secure than this, they've already been hacked, are already out of business, and are already not allowed to write code that deals with real money.


This is a convenient but flawed argument that assumes an equilibrium. Hacking is often (mostly?) about vulnerabilities that change over time.


This has happened in the past. Interesting read:

http://www.cigital.com/papers/download/developer_gambling.ph...


If one were to use such tactics to make money - would that be considered "hacking" and be illegal?


Real random numbers are useless because everything follows some distribution. Actually, random generators in programming languages should be called pseudorandom to avoid confusion.


> Real random numbers are useless because everything follow some distribution.

First, real random numbers are quite valuable. Second, yes, all numerical sequences follow a distribution, but one of those distributions is called the "normal distribution", and I think you may be able to guess what that refers to.

> Actually random generators in programming languages should be called pseudorandom to avoid confusions.

It would have been useful to explain the difference between the terms random and pseudorandom. "Pseudorandom" doesn't necessarily mean easily predictable or flawed, it means the sequence results from a deterministic algorithm and can be recreated exactly by restarting the generator with the same seed value.


Why is the normal distribution important here? (In programming, people often start from the uniform distribution, don't they?)


> Why is the normal distribution important here?

There may be a terminological confusion at work here. A normal or uniform number shows no internal pattern:

http://en.wikipedia.org/wiki/Normal_number

Quote: "In mathematics, a normal number is a real number whose infinite sequence of digits in every base b[1] is distributed uniformly in the sense that each of the b digit values has the same natural density 1/b, also all possible b^2 pairs of digits are equally likely with density b^−2, all b^3 triplets of digits equally likely with density b^−3, etc."

There are also pseudorandom generators whose purpose it is to generate results that agree with a normal or Gaussian distribution, for particular purposes.

http://www.design.caltech.edu/erik/Misc/Gaussian.html

Quote: "This note is about the topic of generating Gaussian pseudo-random numbers given a source of uniform pseudo-random numbers."

The problem here is that a normal number shows a uniform distribution of its digits among the possible values, and the term normal distribution is sometimes used to describe this outcome.

> In programming, people often start from the uniform distribution, don't they?

Yes, and as set out above, this starting point may be described in a confusing way.


I have never seen the term normal distribution used in connection with anything but the Gaussian normal distribution. Especially not in connection with normal numbers. Can you point to some examples of people using this sense?


By that logic nothing should ever be called random (in any context). There are things we can measure but don't understand the source of the entropy, but they aren't random we just haven't figured out the source yet. Most things we consider random are quantifiable and predictable with enough data (be it nature or computers).

Plus there are several definitions of the term "random" (English) which are in-line with the programming usage such as "random: a haphazard course."


I thought quantum processes were truly random.


As far as we know. That isn't the same thing as "it is".

If you assume that quantum processes are truly random, there are a bunch of ways for a computer to generate truly random noise - the easiest class being shot noise (connect a computer to a good camera in an almost pitch-black enclosure, or send a small current through a diode), but there are others.


Those sites likely won't be vulnerable even if they implemented everything with the insecure random function. The reason this works is that you're the only consumer of the Java randomness; as you add additional consumers it becomes vastly more difficult.

Consumers also don't need to be users; AI players and cards dealt will also consume randomness. You would also need to know the mapping from the random output into the game (e.g. in card games, are there multiple decks, each assigned one value of entropy? 2 decks, 3 decks, etc.? Plus any mappings or conversions will make this impossible, as you wouldn't know the real output of the random number generator).

Ultimately it will likely work pretty reliably locally, but as soon as you stick it on a web service then all bets are off.


In the event tptacek doesn't show up to explain how awfully wrong this is, I'll do my best to fill in.

An attacker who can predict your PRNG and guess a seeded value knows a potentially infinite number of future random numbers. Now all he needs to do is guess what random numbers, from a very small pool of possible numbers, will show up at what time. Devs often code assuming that PRNG numbers aren't predictable at all, so compromising your RNG is like setting your password to "hunter2" in a situation where nobody thinks about limiting the number of guesses.

Such attacks were famously used to steal a lot of money on PlanetPoker, one of the first poker websites. RNG attacks are also deadly in encryption, where it's reasonable for an attacker to be able to make millions of guesses from his laptop computer.
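The attack under discussion can be sketched concretely: java.util.Random's nextInt() exposes the top 32 of the 48 state bits, so two consecutive raw outputs let you brute-force the 16 hidden bits and then predict everything that follows. A minimal sketch (method names are mine; assumes you observe two consecutive, unmodified nextInt() values):

```java
import java.util.Random;

public class PredictRandom {
    private static final long MULT = 0x5DEECE66DL, ADD = 0xBL, MASK = (1L << 48) - 1;

    // Given two consecutive nextInt() outputs, brute-force the 16 hidden
    // low bits of the internal state and return the full 48-bit state
    // after the second output, or -1 if no candidate matches.
    public static long recoverState(int out1, int out2) {
        for (long low = 0; low < (1L << 16); low++) {
            long s1 = ((out1 & 0xFFFFFFFFL) << 16) | low; // candidate state after out1
            long s2 = (s1 * MULT + ADD) & MASK;            // one LCG step
            if ((int) (s2 >>> 16) == out2) return s2;
        }
        return -1;
    }

    // Advance the recovered state once to predict the next nextInt() output.
    public static int predictNext(long state) {
        long s = (state * MULT + ADD) & MASK;
        return (int) (s >>> 16);
    }

    public static void main(String[] args) {
        Random r = new Random();
        int a = r.nextInt(), b = r.nextInt();
        long state = recoverState(a, b);
        System.out.println("predicted=" + predictNext(state) + " actual=" + r.nextInt());
    }
}
```

The 2^16 search finishes in milliseconds; in rare cases more than one candidate state can match, so a third observed output can be used to disambiguate.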


This is more probable than you think. Java code is likely to use Collections.shuffle(). That method uses an internal Random which is used only for shuffling. Shuffling is unlikely to be used for purposes other than shuffling cards. There aren't that many ways to shuffle them either.


I worked for 6 years in a company which makes online (gambling) games. I can tell you that anyone who even thinks of using Collections.shuffle() for shuffling cards in production code is an amateur.


shuffle() does allow specifying a j.u.Random as its second parameter. If you don't use SecureRandom when you have to, it's your own fault.
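A sketch of that two-argument form (the deck representation and helper are mine):

```java
import java.security.SecureRandom;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SecureShuffle {
    // Returns a 52-card deck (0..51) shuffled with a CSPRNG. The one-arg
    // Collections.shuffle() falls back to a plain shared java.util.Random;
    // passing a SecureRandom makes the permutation unpredictable.
    public static List<Integer> shuffledDeck() {
        List<Integer> deck = new ArrayList<>();
        for (int i = 0; i < 52; i++) deck.add(i);
        Collections.shuffle(deck, new SecureRandom());
        return deck;
    }

    public static void main(String[] args) {
        System.out.println(shuffledDeck());
    }
}
```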


You'd have more luck with fixed-odds games. The APIs for those tend to expose the results of the RNG in ways that are fairly easy to piece together. Some of the API responses will literally tell you enough to know "we picked a random number between 0 and x, and the result was y".


Math.random() should be used only for tests. That's it. Performance sucks as it's shared. ThreadLocalRandom is a lot better if you need fast but not-quality random.
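For instance (the wrapper class is mine, just to show the API):

```java
import java.util.concurrent.ThreadLocalRandom;

public class DieRoll {
    // Each thread gets its own generator, so there is none of the
    // shared-state contention you get with Math.random().
    public static int roll() {
        return ThreadLocalRandom.current().nextInt(1, 7); // 1..6 inclusive
    }

    public static void main(String[] args) {
        System.out.println(roll());
    }
}
```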

And there is SecureRandom for security concerns.

Last fun fact: Math.random() and a Monte Carlo test introduced "CAS in Java" and all that followed with JSR 166.


Reminded me of an interesting Java Random issue with small seeds and power-of-two intervals:

  for(int i = 0; i < 256; i++) {
      System.out.println(new Random(i).nextInt(8));
  }
It returns the same number for all seeds.


I tested a similar attack against ApacheCommons' RandomStringUtil. Given a few bytes of output, I could recover the RNG state in 20 minutes on CPU.


As another commenter has said, Firefox doesn't use Rhino. Here's the relevant code in Firefox's JS engine.

http://dxr.mozilla.org/mozilla-central/source/js/src/jsmath....


Hah, funny! I recently did the same to circumvent CSRF-protection based on java.util.Random. Here's my solver in JS: https://peks.as/experiments/random/



