Followup that became too long for a reasonable edit:
The real burden of adaptive hashing is proving out the heuristics and benchmarking everything to make a convincing argument for its adoption. The current system, and the proposed FastHashMap type, have obvious, well-understood characteristics, so it’s a simple change that makes writing performant/deterministic Rust code a little bit better out of the box.
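To make the shape concrete, here’s a rough sketch of what a FastHashMap alias could look like: the standard HashMap with a fixed, unseeded hasher swapped in. The FNV-1a hasher below is just a placeholder of my own choosing, not part of any proposal; the actual type would presumably use whatever fast hash the benchmarks favor.

```rust
use std::collections::HashMap;
use std::hash::{BuildHasherDefault, Hasher};

// Placeholder deterministic hasher (FNV-1a). The point is only that the
// hasher is fixed rather than randomly seeded per process.
struct FnvHasher(u64);

impl Default for FnvHasher {
    fn default() -> Self {
        // FNV-1a 64-bit offset basis.
        FnvHasher(0xcbf2_9ce4_8422_2325)
    }
}

impl Hasher for FnvHasher {
    fn finish(&self) -> u64 {
        self.0
    }

    fn write(&mut self, bytes: &[u8]) {
        for &byte in bytes {
            self.0 ^= u64::from(byte);
            self.0 = self.0.wrapping_mul(0x0100_0000_01b3);
        }
    }
}

// Hypothetical FastHashMap: same API as HashMap, deterministic hashing.
type FastHashMap<K, V> = HashMap<K, V, BuildHasherDefault<FnvHasher>>;

fn main() {
    let mut map: FastHashMap<&str, u32> = FastHashMap::default();
    map.insert("answer", 42);
    assert_eq!(map.get("answer"), Some(&42));
}
```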
Also, regardless of adaptive hashing, there is still a desire for the non-deterministic/deterministic dichotomy. Deterministic hashing has perfectly valid use cases (see: rustc and reproducible builds). But having non-deterministic hashing by default avoids users accidentally relying on things we don’t guarantee (see: JS was forced to guarantee that objects behave like linked hash maps for iteration because people were just dumping them into ordered drop-downs).
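As a small illustration of the kind of thing the randomized default keeps people from relying on: the std HashMap’s iteration order is unspecified and typically changes from run to run, so nothing can quietly start depending on it.

```rust
use std::collections::HashMap;

fn main() {
    // The default HashMap uses a randomly seeded hasher (RandomState), so
    // iteration order is unspecified and usually differs between runs;
    // dumping keys straight into an ordered UI element won't stay stable.
    let map: HashMap<&str, u32> = [("a", 1), ("b", 2), ("c", 3), ("d", 4)]
        .into_iter()
        .collect();
    println!("keys this run: {:?}", map.keys().collect::<Vec<_>>());
}
```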
edit: also, for full information, pczarn is interested in re-investigating an adaptive hashing impl.