
An important distinction that often gets lost is that entropy is a property of a probability distribution. A system to generate passwords has entropy, one specific password does not.

It's possible to read "hunter2" from /dev/random.

To infer entropy from a single password, the best you can do is to see if it falls within the domain of some known, low-entropy systems. This works ok in practice, but is very far from perfect.
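For illustration, here is a minimal sketch of that kind of inference, assuming a naive classifier whose only "known systems" are uniform draws from a few character pools (the pool list is made up for the example; real estimators such as zxcvbn also check dictionaries, keyboard patterns, dates, and so on):

    import math
    import string

    # Illustrative "known low-entropy systems": generators that draw each
    # character uniformly from one of these pools (an assumption of the sketch).
    SCHEMES = {
        "lowercase letters": string.ascii_lowercase,
        "letters": string.ascii_letters,
        "letters + digits": string.ascii_letters + string.digits,
        "printable ASCII (no whitespace)": string.printable.strip(),
    }

    def pessimistic_entropy_bits(password):
        """Assume the smallest pool that could have produced the password and
        return the entropy of a uniform password of that length from it."""
        for name, pool in SCHEMES.items():
            if all(c in pool for c in password):
                return name, len(password) * math.log2(len(pool))
        return "unknown system", float("nan")

    print(pessimistic_entropy_bits("hunter2"))       # ('letters + digits', ~41.7)
    print(pessimistic_entropy_bits("correcthorse"))  # ('lowercase letters', ~56.4)

The second result shows the "far from perfect" part: "correcthorse" is two dictionary words, so its real guessability is far below the ~56 bits this classifier reports.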



To be fair, one specific password does tell you something about the entropy of the generator. If I can grab 1 MiB of output from your password generator, I can probably estimate its entropy quite well. 10-20 bytes? That tells you almost nothing (but still not nothing). So there must be some way of updating an estimate of the generator's entropy with every additional byte of output, and the confidence in that estimate grows (probably exponentially) with every new byte.
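As a sketch of why sample size matters, assuming the generator emits independent, identically distributed bytes, a plug-in (maximum-likelihood) entropy estimate converges as the sample grows; using os.urandom as the "generator" here is just for illustration:

    import math
    import os
    from collections import Counter

    def plugin_entropy_per_byte(sample):
        """Maximum-likelihood (plug-in) Shannon entropy estimate, in bits per
        byte. Assumes independent, identically distributed output bytes."""
        n = len(sample)
        return -sum((c / n) * math.log2(c / n) for c in Counter(sample).values())

    print(plugin_entropy_per_byte(os.urandom(20)))       # at most log2(20) ~ 4.3 bits
    print(plugin_entropy_per_byte(os.urandom(1 << 20)))  # ~7.9998 bits, near the true 8

With 20 bytes there can be at most 20 distinct values, so the estimate is capped around 4.3 bits per byte no matter how good the generator is; with 1 MiB the bias shrinks to a fraction of a millibit.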

But your password is 10-20 bytes, so you can say nothing about the generator.


Well, trying to infer the entropy of a password by classifying it into the lowest-entropy system that could have produced it actually works well against brute forcing, because brute forcing usually works by doing exactly that.
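A toy version of that attacker, assuming they exhaust smaller character pools and shorter lengths before moving to larger ones (the pool ordering is invented for illustration):

    import math
    import string

    # Illustrative attacker: exhaust small pools and short lengths first.
    ATTACK_ORDER = [
        string.ascii_lowercase,
        string.ascii_lowercase + string.digits,
        string.ascii_letters + string.digits,
    ]

    def worst_case_guesses(password):
        """Upper bound on guesses this attacker needs: everything up to and
        including the first pool that contains the whole password."""
        total = 0
        for pool in ATTACK_ORDER:
            total += sum(len(pool) ** length for length in range(1, len(password) + 1))
            if all(c in pool for c in password):
                return total
        return float("inf")  # outside every pool: this attacker never finds it

    print(math.log2(worst_case_guesses("hunter2")))  # ~36 bits of work
    print(math.log2(worst_case_guesses("HUNTER2")))  # ~42 bits: forces the bigger pool

In this model, classifying "hunter2" as "lowercase plus digits, length 7" predicts the attacker's work almost exactly, which is why the classification lines up with real cracking effort.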


That's the crucial point. You can have a super high entropy probability distribution straight from a quantum lava lamp, but if it gives you "hunter2" as a password, you should still not use that password, no matter the entropy of the distribution.


Right, I was just about to ask: entropy is a measure of how "surprised" I'll be by the next character in the password. It's not a property of the password, it's a property of me. "Γεια" (Greek for "hi") is very surprising to everyone here, since everyone would assume the first character to be an English letter, but not if you know that the user is Greek.
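To make the observer-dependence concrete, a small sketch with two invented per-character models, one for a reader who only expects Latin letters and one who also expects Greek (all probabilities are made up):

    import math

    def surprisal_bits(text, char_probs):
        """Total surprisal (-log2 probability) of `text`, character by character,
        under a given model; unmodelled characters are infinitely surprising."""
        total = 0.0
        for c in text.lower():
            p = char_probs.get(c, 0.0)
            total += float("inf") if p == 0.0 else -math.log2(p)
        return total

    latin_only  = {c: 1 / 26 for c in "abcdefghijklmnopqrstuvwxyz"}
    knows_greek = {**{c: 0.5 / 26 for c in "abcdefghijklmnopqrstuvwxyz"},
                   **{c: 0.5 / 24 for c in "αβγδεζηθικλμνξοπρστυφχψω"}}

    print(surprisal_bits("Γεια", latin_only))   # inf: Greek letters were "impossible"
    print(surprisal_bits("Γεια", knows_greek))  # ~22.3 bits: surprising but expected

Same string, wildly different surprisal, purely because the models (the observers) differ.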



