bwbug

33 comments by bwbug

I'll think about how to do this. But the bottom line is that the "score" (whether expressed as a 0-100 score, or in terms of some entropy metric) will be...

> would welcome a double-check of my math

For the first table, your **max list length** values are only accurate to 2-3 significant digits, as a result of rounding errors...

Ha! Thank you for the kind words, but I was really just trying to figure out if I could get a little better precision on the "2.62" value. Either there's...

I think you can simplify the evaluation of `assumed_entropy_per_character`

Yes, `6.1` is limiting the result to two significant digits, because of the corresponding uncertainty in the 2-significant-digit value of 4.5 letters/word that was used in Shannon's calculation. If...
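A quick sketch of the significant-figure point, assuming the underlying division is a word-entropy estimate over Shannon's 4.5 letters/word (the 11.82 bits/word input here is my illustrative assumption, not a value confirmed in this thread):

```python
# Dividing by 4.5 (only 2 significant figures) caps the precision of the
# per-letter result at roughly 2 significant figures, no matter how many
# digits the numerator carries.
bits_per_word = 11.82    # assumed illustrative input
letters_per_word = 4.5   # 2 significant figures
bits_per_letter = bits_per_word / letters_per_word
print(bits_per_letter)   # only the leading ~2.6 is meaningful
```

Reporting more digits than the least-precise input supports would overstate the accuracy of the result.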

I played around a little with the entropy definition. If a corpus contains _N_ words that appear with _equal_ probability, then the Shannon entropy per word is log2(_N_), which kind...
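The equal-probability case can be checked numerically; here's a minimal sketch (the list size 7776 is just an illustrative diceware-style value, not one taken from the thread):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A corpus of N equally likely words has entropy log2(N) bits per word.
N = 7776  # illustrative only
uniform = [1 / N] * N
print(shannon_entropy(uniform))  # ≈ log2(7776) ≈ 12.92 bits
```

For any non-uniform distribution over the same N words, the entropy falls below log2(N), which is what makes the uniform case the natural upper bound.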

I think that the concept of an attacker requiring (on average) fewer than 26 guesses per character if they know your passphrase is in English is a _very_ important one....

So my math from last night seems to check out. Thus, if you (temporarily?) decide to 86 the "Shannon Line", I would recommend that you report the following value in...

> Is G related to a value I label as "assumed entropy per character"? If so how?

Yes: `assumed_entropy_per_character` = log2(_G_), and _G_ = 2^`assumed_entropy_per_character`

> Is G more accurately described as assumed...
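So the two quantities are base-2 log/antilog inverses of one another; a minimal sketch (using _G_ = 26 purely as an illustrative value):

```python
import math

# G guesses per character corresponds to log2(G) bits of assumed entropy,
# and exponentiating recovers G exactly.
G = 26.0
assumed_entropy_per_character = math.log2(G)   # ≈ 4.70 bits
recovered_G = 2 ** assumed_entropy_per_character
print(assumed_entropy_per_character, recovered_G)
```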

Haven't delved any further into the primary literature yet, but I found a nice synopsis of Shannon entropy in a post by [Abhimanyu Dubey on Quora](https://qr.ae/pvw4UK). Some excerpts:

> the entropy...