
Add SHA1 compute time source.


Based off https://github.com/GeorgeArgyros/Secure-random-bytes-in-PHP

padraic commented on Feb 24 '13

Looking at the implementation, I can't help but see the similarities with the already added MicroTime source: https://github.com/ircmaxell/RandomLib/blob/master/lib/RandomLib/Source/MicroTime.php#L67

It looks like it's doing basically the same thing, looping and adding new time data each iteration.

The problem I have with this particular implementation is that it's fairly complex (and hence difficult to really see what's going on), yet it doesn't have any significant entropy sources. So it's basically just throwing together a bunch of logic. Looking deeper into it, I see that the only actual entropy that enters the $entropy variable is timestamps. While that wouldn't be a deal breaker on its own, it appears to be artificially slowing down the loops just for the sake of making the timestamps less predictable. So a lot of CPU power is going to be wasted just churning. If it fed the gathered entropy back into itself, that would at least be something. But as it stands, it just burns CPU for seemingly no reason.

In the end, it looks like a LOT of undocumented code and complex algorithms for not much benefit, especially considering that there's already a source (above) based on microtime that uses a simpler, more standard gather-process-output algorithm and is pretty well documented.
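For the sake of illustration, here's a minimal, hypothetical sketch of what I mean by gather-process-output (this is just the general shape of the idea, not the actual MicroTime.php code): collect some timing samples in a loop, compress them with a hash, and emit the digest, feeding the previous output back into the next round.

```php
<?php
// Hypothetical sketch of the gather-process-output pattern.
// This is NOT the actual RandomLib\Source\MicroTime code.

function gatherMicrotimeEntropy($bytes)
{
    $result = '';
    while (strlen($result) < $bytes) {
        // Gather: start from the previous output (feed entropy back in)
        // and append a pile of timing samples plus a bit of local state.
        $buffer = $result;
        for ($i = 0; $i < 32; $i++) {
            $buffer .= microtime() . memory_get_usage();
        }
        // Process: compress the samples so raw timestamps never leak out.
        $result .= hash('sha512', $buffer, true);
    }
    // Output: exactly the number of bytes that was asked for.
    return substr($result, 0, $bytes);
}

var_dump(bin2hex(gatherMicrotimeEntropy(16)));
```

The mixing step is cheap; whatever entropy there is comes entirely from the timing samples, which is why burning extra CPU doesn't buy anything by itself.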

I'll leave this open for a little while in case anyone can see something that I missed, or give a good justification for it.

Additionally, if it were accepted, the strength would need to be reduced to VERYLOW, as there is no actual random entropy other than the timestamps (which is very minor)...

Thanks!

ircmaxell commented on Feb 25 '13

You're right that it burns CPU time and is microtime-based. Basically, it's relying on there being an uncertain delta between each pair of microsecond measurements performed over a fixed period of time. That keeps the quality low, but it should yield a little entropy each iteration. My own concern is its performance more than anything else: it's trading time (~20ms) for a large stack of deltas.
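Roughly, the idea is something like the following hypothetical sketch (not the code from the linked repo, just the shape of it): do a bit of SHA1 work, record the microsecond delta on each iteration until a ~20ms window closes, then compress the collected deltas.

```php
<?php
// Hypothetical illustration of the compute-time delta idea.
// This is NOT the code from GeorgeArgyros/Secure-random-bytes-in-PHP.

function computeTimeDeltas($windowSeconds = 0.02)
{
    $deltas   = '';
    $counter  = 0;
    $stop     = microtime(true) + $windowSeconds;
    $previous = microtime(true);

    while (microtime(true) < $stop) {
        // Burn a little CPU. The SHA1 work itself adds no entropy;
        // it only makes the timing of each iteration harder to predict.
        sha1(str_repeat('A', 1024) . $counter++);

        // Record the (uncertain) delta between consecutive measurements.
        $now = microtime(true);
        $deltas .= pack('d', $now - $previous);
        $previous = $now;
    }

    // Compress the stack of deltas into a fixed-size digest.
    return sha1($deltas, true);
}

var_dump(bin2hex(computeTimeDeltas()));
```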

I actually missed what you were doing in the MicroTime source - it's a similar idea, and I think the difference comes down to whether gathering more timing samples is actually better.

padraic commented on Feb 26 '13