A couple of times I have come across a situation where I want to test whether some set of data is dirty, implying something else needs an update. Easy: just generate a hash, and later check whether the new hash is different. For this type of thing it may even be acceptable if the hash test isn't completely accurate.
I'm not even going to get into performance here (libstd's hash function being more secure than required). The problems I have:
`f64` doesn't implement `Hash`. Understandable, since there are distinct binary values which compare equal (0.0 and -0.0) and equal binary values which compare unequal (NaN). But still, for my uses neither of these cases really matters (simply hashing the binary value should be fine).
`HashMap` doesn't implement `Hash`. I'm guessing this is because to generate a stable hash value one would have to sort all entries first, which carries a significant performance cost. All the same, if I need the hash I either have to do that sorting myself or switch to a different collection (`BTreeMap` is a perfectly reasonable alternative much of the time).
The point is: it feels like there is scope for a separate hasher system here (and/or tweaks to the existing hasher, such as `Hash` support for `HashMap`), and I don't see one (after a quick search through