anomalroil's comments (Hacker News)

These precompiles were long overdue, IMO. Good thing to see them finally available on Mainnet, but I wonder if we'll see the migration away from the much weaker BN254 pairing-friendly curve to the stronger BLS12-381 one before post-quantum alternatives become available as precompiles ^^''


Depends if you want the RTX 5090 before next year or not, just like the collision :P


Too bad you didn't try it with much higher degrees.


The "infamous" reference generators from NIST 800-22 included linear, quadratic and cubic congruential generators only. A potentially vulnerable implementation that may have used this document as a reference would probably have only gone up to the cubic case. So I think it's unlikely that someone used a recurrence equation of higher degrees. But you never know. Also, the higher the degree, the more resources the attack will require. So, we opted for a balanced cost/benefit approach.
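For readers curious what attacking the degree-1 case looks like, here's a toy sketch (the modulus, multiplier, and increment are made up for illustration, not taken from NIST 800-22): with a known prime modulus, three consecutive outputs of a linear congruential generator are enough to recover its secret parameters by solving the recurrence.

```java
import java.math.BigInteger;

public class BreakLCG {
    public static void main(String[] args) {
        // Toy LCG x_{i+1} = a*x_i + c (mod m) with "secret" a and c.
        BigInteger m = BigInteger.valueOf(2147483647L); // known prime modulus
        BigInteger a = BigInteger.valueOf(16807), c = BigInteger.valueOf(12345);
        BigInteger x0 = BigInteger.valueOf(42);
        BigInteger x1 = a.multiply(x0).add(c).mod(m);
        BigInteger x2 = a.multiply(x1).add(c).mod(m);
        BigInteger x3 = a.multiply(x2).add(c).mod(m);

        // Recover a and c from three consecutive outputs:
        //   x3 - x2 = a*(x2 - x1)  =>  a = (x3 - x2)*(x2 - x1)^-1 mod m
        //   c = x2 - a*x1 mod m
        BigInteger aRec = x3.subtract(x2)
                .multiply(x2.subtract(x1).modInverse(m)).mod(m);
        BigInteger cRec = x2.subtract(aRec.multiply(x1)).mod(m);
        if (!aRec.equals(a) || !cRec.equals(c)) throw new AssertionError();
        System.out.println("recovered a=" + aRec + ", c=" + cRec);
    }
}
```

Higher-degree recurrences need more outputs and more algebra, which is the cost/benefit trade-off mentioned above.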


Yeah, at least to try it on that one signature with 3 million wallets on it. If it doesn't work with 30 or 100, it most likely means they're using a valid random number generator, but if not, damn. That's a lot of bitcoins.


It does but it does not! java.util.Random is not a CSPRNG at all and is terrible, so even though the nextInt() method uses rejection sampling, it still produces biased values and also completely fails to be "unpredictable", because java.util.Random is weak and predictable.
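To illustrate how predictable it is: java.util.Random's algorithm is specified in its Javadoc as a 48-bit LCG, so knowing the seed lets you reproduce every output yourself. A small sketch (the constants are the ones from the Javadoc; the seed 42 is arbitrary):

```java
import java.util.Random;

public class PredictRandom {
    // Constants specified in the java.util.Random documentation
    static final long MULT = 0x5DEECE66DL, ADD = 0xBL, MASK = (1L << 48) - 1;

    public static void main(String[] args) {
        // new Random(seed) internally scrambles the seed like this:
        long seed = (42L ^ MULT) & MASK;
        Random r = new Random(42L);
        for (int i = 0; i < 5; i++) {
            seed = (seed * MULT + ADD) & MASK;   // one LCG step
            int predicted = (int) (seed >>> 16); // next(32), i.e. nextInt()
            int actual = r.nextInt();
            if (predicted != actual) throw new AssertionError("mismatch");
            System.out.println(predicted + " == " + actual);
        }
        System.out.println("all outputs predicted");
    }
}
```

Recovering the 48-bit internal state from observed outputs (rather than the seed) is only slightly harder, which is why it must never be used for anything security-sensitive.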


Interesting! How come this wasn't fixed? Nobody noticed, or is it that java.util.Random is not meant for serious cryptographic use?

I know there are other parts of the Java standard lib that are so terrible [1] that people for years have recommended not using them, like anything with dates and timezones...

---

[1] or used to, haven't kept up with the latest Java versions. Maybe they fixed it.


Historically one of the first uses for computers was Monte Carlo simulations. In such simulations it's often important to be able to deterministically recreate sequences of "random" numbers. Thus, in almost all computer programming languages, "Random" means deterministic random values. This is a wide convention followed by practically every programming language.

If you want cryptographically secure random numbers, you typically call a function with a different name, one that has "secure" or "crypto" in a name somewhere (e.g., in the function or containing module/package).

This is a convention from the 1950s and has been consistent ever since. That naming-convention ship sailed before many of us were born.
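As a sketch of that naming convention in Java specifically (the class names are real; the scenario is illustrative): the reproducible generator is the plainly-named one, and the cryptographically secure one carries "Secure" in its name.

```java
import java.security.SecureRandom;
import java.util.Random;

public class RandomVsSecure {
    public static void main(String[] args) {
        // "Random": deterministic — same seed, same sequence
        // (exactly what you want for a reproducible Monte Carlo run).
        Random sim1 = new Random(2024L), sim2 = new Random(2024L);
        if (sim1.nextInt() != sim2.nextInt()) throw new AssertionError();

        // "SecureRandom": unpredictable — for keys, nonces, tokens.
        SecureRandom sr = new SecureRandom();
        byte[] key = new byte[16];
        sr.nextBytes(key);
        System.out.println("reproducible sim stream + "
                + key.length + "-byte secure key");
    }
}
```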


Speaking of Monte Carlo, here's a paper about how using bad randomness can lead to wrong simulation results: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2992609/ (or should I call them _biased_ ones?)


And as a counterpoint, if you use a well-chosen deterministic sequence, it can make some Monte Carlo methods produce better results faster than true randomness would (quasirandom sequences and all that).

So, interestingly, "bad randomness" =/= deterministic.

(not that you were claiming that, just adding this subtlety to the mix)


> Thus, in almost all computer programming languages, "Random" means deterministic random values.

Yes, I know how pseudo RNG works.

I just didn't realize this ran counter to doing secure crypto, so thanks for the explanation anyway :)


> is it that java.util.Random is not meant for serious cryptographic use?

Indeed it never was. SecureRandom (a subclass of Random, in a different package grouped with other cryptography-related functionality) was introduced in Java 1.1 in 1997.

> I know there are other parts of the Java standard lib that are so terrible [1] that people for years have recommended not using them, like anything with dates and timezones...

The original Java date classes (also from the 1.1 era) were functional and correct, but badly designed. A modern time API was introduced in Java 8 in 2014.


> The original Java date classes (also from the 1.1 era) were functional and correct, but badly designed. A modern time API was introduced in Java 8 in 2014.

I'm pretty sure they were neither correct nor functional. I've read pretty detailed examples of everything they got fundamentally wrong, which was a lot. Wrong, not as "this is cumbersome to use" but as in "this fundamentally misunderstands what dates are at a conceptual level, and produces wrong results". Unfortunately, I would have to google it now.

All I can find right now is Jon Skeet's summary [1], which leans more to the "avoid using java.util.Date because it's one giant pitfall" side of the argument. Though it does mention some fundamental problems with it. However, this is not the article I found back in the day, which was both more comprehensive and more critical.

> A modern time API was introduced in Java 8 in 2014.

I seem to remember even at that time people were still not recommending Java standard date classes and functions, but I may be misremembering. Or were all the replacements like joda time from a previous era?

---

[1] https://codeblog.jonskeet.uk/2017/04/23/all-about-java-util-...


> Or were all the replacements like joda time from a previous era?

Joda time essentially is the new standard Java date/time implementation. Joda was used as an out-of-stdlib proving ground to try out ideas and get feedback. The new Java date/time API is based on the learnings from Joda time.

My understanding from using the new (I shouldn't say "new"; it's been there since Java 8) date/time classes is that they're pretty good at representing dates, times, calendars, etc., but I wouldn't claim to be an expert on such matters.


The Jon Skeet article you're citing mentions only java.util.Date, and it is in fact absolutely fair to say that this class "fundamentally misunderstands what dates are at a conceptual level" in all of its functionality that goes beyond being a thin wrapper around a number of milliseconds.

But note that java.util.Date is from Java 1.0. The mistake was recognized, and in Java 1.1 we got java.util.Calendar and friends, which, as far as I know, does qualify as "functional and correct" but was ugly and annoying to use, as well as missing many useful features (like the concept of a timespan, or even just a date without a time-of-day).


(I assure you I'm not picking on you or your answer!)

I seem to remember the article which I unfortunately cannot conjure up right now demolished java.util.Calendar as well (again, as in "wrong", not as in "difficult to use"). I'm pretty confident of this because I never used Java 1.x at work, and when I read the article it was relevant to my day job.

Alas, memory fails me.


The contract of Random requires determinism, so the algorithm can never be changed.

> If two instances of Random are created with the same seed, and the same sequence of method calls is made for each, they will generate and return identical sequences of numbers. In order to guarantee this property, particular algorithms are specified for the class Random. Java implementations must use all the algorithms shown here for the class Random, for the sake of absolute portability of Java code


It isn't terrible, it is just not suitable for cryptographic applications. I suspect most of the time that people want a random number up to some value, it is not for cryptographic purposes.


> It isn't terrible

Mind you, I didn't call it terrible. However, the comment I was replying to -- coincidentally the author of TFA on modulo bias we are discussing -- did:

> java.util.Random is not a CSPRNG at all and is terrible, so even tho the nextInt() method is using rejection sampling, it's still producing biased values and also completely fails to be "unpredictable" because java.util.Random is weak and predictable.

and

> [...] using fast, but bad random generators such as Java's Random was shown to be an issue multiple times in the past already for things such as Monte Carlo, and so on, not just for things related to security.

Of course, "CSPRNG" means "cryptographically secure", but his comment made me think he considers it a terrible implementation regardless.


>his comment made me think he considers it a terrible implementation regardless.

Different people have different standards for what is "terrible". There are much better PRNGs that are just as fast. You should probably avoid java.util.Random if you care about the quality of the randomness it produces.

It's good enough for games that don't involve real money.
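To make the "much better PRNGs that are just as fast" point concrete, one example from the standard library itself (the comparison scenario is mine): java.util.SplittableRandom, available since Java 8, passes far more statistical tests than the 48-bit LCG behind java.util.Random at comparable speed.

```java
import java.util.Random;
import java.util.SplittableRandom;

public class FasterBetter {
    public static void main(String[] args) {
        // Same API shape, much better statistical quality.
        SplittableRandom better = new SplittableRandom(123L);
        Random legacy = new Random(123L);
        System.out.println(better.nextInt(100) + " vs " + legacy.nextInt(100));
        // Neither is cryptographically secure; use SecureRandom for secrets.
    }
}
```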


Well, the person who claims it's terrible is the author of the very thorough article we are discussing.

> You should probably avoid java.util.Random if you care about the quality of the randomness it produces.

That's pretty much the definition of "bad" ;) If you don't care about quality, I assume you could use a string of numbers without any randomness at all.

As for games, it depends. I'm sure one could make a guess it's bad for some kinds of non-betting games and simulations as well.


As you said: Java's Random is not meant for serious cryptographic usage, it's meant to be super fast. You have SecureRandom instead for cryptographic usage, and it's noticeably slower. Interestingly, using fast, but bad random generators such as Java's Random was shown to be an issue multiple times in the past already for things such as Monte Carlo, and so on, not just for things related to security.


That's why we have

public class SecureRandom extends Random


Lemire's technique is really nice, in general a good thing to learn about, since it's a bit mind bending how it's playing with intervals. Sadly last time I benchmarked it in code on x86-64 for cryptographic purposes, it wasn't faster than rejection sampling, or just using a large value and a modulo reduction: in all cases what is actually taking a lot of time is the call to get good quality randomness out of a CSPRNG, the rest being negligible in comparison.
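For reference, plain rejection sampling over a CSPRNG can be sketched like this (the helper name and bound are mine, not from the article): oversized draws that would cause modulo bias are simply rejected and redrawn.

```java
import java.security.SecureRandom;

public class RejectionSample {
    // Unbiased value in [0, bound): reject draws from the biased tail,
    // i.e. anything at or above the largest multiple of bound below 2^31.
    static int uniformInt(SecureRandom rng, int bound) {
        long limit = (1L << 31) / bound * bound;
        int v;
        do {
            v = rng.nextInt() & Integer.MAX_VALUE; // uniform in [0, 2^31)
        } while (v >= limit);
        return v % bound;
    }

    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom();
        for (int i = 0; i < 1000; i++) {
            int v = uniformInt(rng, 10);
            if (v < 0 || v >= 10) throw new AssertionError();
        }
        System.out.println("1000 unbiased draws in [0, 10)");
    }
}
```

As noted above, the dominant cost here is the CSPRNG call itself, so the choice between this and Lemire's trick barely shows up in cryptographic benchmarks.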


Notice that nowadays, unlike 2 years ago, people usually recommend using the last technique I presented there, in the last paragraph before the Conclusion: generate a random value of n bits where n − log2(p) > 128, so that the bias is too small to be exploitable in practice. It's simpler and, unlike rejection sampling, ensures your code cannot fall into an infinite loop in case your PRNG is broken. (I'd argue you might want your code to fail in that case anyway, but YMMV.)
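That oversample-then-reduce technique can be sketched with BigInteger (the helper name is mine; the 128 extra bits match the bias bound described above):

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class WideReduction {
    // Draw at least bitLength(p) + 128 random bits, then reduce mod p.
    // The result's statistical distance from uniform is below 2^-128,
    // negligible in practice — and there is no retry loop to get stuck in.
    static BigInteger modUniform(SecureRandom rng, BigInteger p) {
        byte[] buf = new byte[(p.bitLength() + 128 + 7) / 8];
        rng.nextBytes(buf);
        return new BigInteger(1, buf).mod(p);
    }

    public static void main(String[] args) {
        BigInteger p = new BigInteger("65537");
        BigInteger v = modUniform(new SecureRandom(), p);
        if (v.signum() < 0 || v.compareTo(p) >= 0) throw new AssertionError();
        System.out.println("uniform-enough value mod " + p + ": " + v);
    }
}
```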


The other virtue of this technique is that some of the more popular fast rejection sampling methods (e.g. Lemire's "nearly divisionless") leak a small amount of information via a timing side channel, because the divisionless fast path is _not_ unbiased.


Or rather, observing a network meant to produce randomness in a distributed way.


Yeah, I guess that's a fair way of describing threshold cryptography and a threshold network. There was actually already a proposal by Rabin in 2006 to create a "timelapse encryption service" (https://dash.harvard.edu/handle/1/26506434); they said they were planning on implementing and deploying such a service, but it never made it AFAIK. The main difference here is that it relies on an existing service that also provides something else, namely public randomness, and that this service has been running for 2 years and never had any disruption so far.

In any case, another assumption is that there are no quantum computers able to break your ciphertext, since every primitive used here is broken by quantum computers. So it would be ill-advised to use it to encrypt something that must stay undecryptable for 10, 20 or 50 years, since it might well be broken by then even if the network survived. But for near-future things, it works well.


Well, Timelock Encryption is "encrypting somewhat towards the future", and as explained in the talk, in 1996 "timelock puzzles" were proposed by Rivest, Shamir and Wagner as a "proof of work"-based system to achieve it. Sadly that's very sensitive to hardware evolution, and when Rivest tried to create a puzzle meant to last 35 years in 1999, it ended up being broken in 2019 after only 20 years. It was even doubly broken: once by a guy running it on his CPU for only 3.5 years, and once in only 2 months by an FPGA/ASIC implementation done by the Cryptophage collab between the Ethereum Foundation, Supranational and Protocol Labs... Other names that are also about achieving timelock encryption include "timelapse encryption" and "timed release encryption". Some people draw a distinction because, for instance, timelock puzzles are mostly about achieving a lower bound on the time required to solve them, but don't map well to a precise time. VDFs are promising tools to achieve timelock or timed release encryption, for sure.
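A toy version of the Rivest–Shamir–Wagner puzzle can be sketched as follows (tiny parameters for illustration only; real puzzles use a large RSA modulus and an enormous t): the puzzle creator, knowing φ(N), computes 2^(2^t) mod N with one cheap exponentiation, while a solver without φ(N) must perform t sequential squarings — which is exactly what makes the delay hardware-dependent.

```java
import java.math.BigInteger;
import java.util.Random;

public class TimelockPuzzle {
    public static void main(String[] args) {
        // Toy modulus from two small "secret" primes (deterministic seeds
        // for reproducibility; real puzzles use ~2048-bit factors).
        BigInteger p = BigInteger.probablePrime(40, new Random(1));
        BigInteger q = BigInteger.probablePrime(40, new Random(2));
        BigInteger n = p.multiply(q);
        BigInteger phi = p.subtract(BigInteger.ONE)
                .multiply(q.subtract(BigInteger.ONE));
        int t = 1000;
        BigInteger two = BigInteger.valueOf(2);

        // Creator's shortcut (needs phi): e = 2^t mod phi, key = 2^e mod n
        BigInteger e = two.modPow(BigInteger.valueOf(t), phi);
        BigInteger fast = two.modPow(e, n);

        // Solver's path: t inherently sequential squarings mod n
        BigInteger slow = two;
        for (int i = 0; i < t; i++) slow = slow.multiply(slow).mod(n);

        if (!fast.equals(slow)) throw new AssertionError();
        System.out.println("both paths agree on the puzzle key");
    }
}
```

The agreement of the two paths is just Euler's theorem; the security argument is that, without the factorization, no one knows how to shortcut the t squarings.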


That would be a pretty cool way of achieving timelock, sure, but might not be super practical.

