This is mostly curiosity, but is picking max over min arbitrary, or is there a good reason to do so? In particular, Python's `heapq` and Java's `PriorityQueue` are min-heaps.

Reasons why min is natural:

- a sorted array is a min-heap
- Dijkstra needs a min-heap
- a set of timers needs a min-heap
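The first point can be checked directly: in the usual implicit-tree layout, a parent's index is always smaller than its children's indices, so any sorted array satisfies the min-heap invariant for free. A minimal sketch (the `is_min_heap` helper is hypothetical, just for illustration):

```rust
/// Check the min-heap property on a slice viewed as an implicit binary
/// tree: the parent at index `i` has children at `2i + 1` and `2i + 2`.
fn is_min_heap(xs: &[i32]) -> bool {
    (0..xs.len()).all(|i| {
        let (l, r) = (2 * i + 1, 2 * i + 2);
        (l >= xs.len() || xs[i] <= xs[l]) && (r >= xs.len() || xs[i] <= xs[r])
    })
}

fn main() {
    // A sorted array is a min-heap: parents come before (and are <=) children.
    assert!(is_min_heap(&[1, 2, 3, 4, 5, 6, 7]));
    // A reverse-sorted array is a max-heap instead, so the check fails.
    assert!(!is_min_heap(&[7, 6, 5, 4, 3, 2, 1]));
}
```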

Reasons why max is natural:

- it is useful for heap-sort
- user-specified priorities need a max-heap
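The heap-sort connection is that repeatedly popping the maximum lets the sorted region grow from the back of the array, which is what makes an in-place ascending sort fall out of a max-heap. A sketch using std's `BinaryHeap` (the `heap_sort` helper is my own; std already packages the same idea as `BinaryHeap::into_sorted_vec`):

```rust
use std::collections::BinaryHeap;

/// Ascending sort via a max-heap: pop the maximum repeatedly and fill
/// the output from the back, so the sorted suffix grows toward the front.
fn heap_sort(v: Vec<i32>) -> Vec<i32> {
    let mut heap: BinaryHeap<i32> = v.into_iter().collect();
    let mut out = vec![0; heap.len()];
    for i in (0..out.len()).rev() {
        out[i] = heap.pop().unwrap();
    }
    out
}

fn main() {
    assert_eq!(heap_sort(vec![3, 1, 4, 1, 5]), vec![1, 1, 3, 4, 5]);
    // std exposes the same idea directly:
    let heap: BinaryHeap<i32> = vec![3, 1, 2].into_iter().collect();
    assert_eq!(heap.into_sorted_vec(), vec![1, 2, 3]);
}
```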

Additionally, while solving today's Advent of Code, I had a very fun but frustrating bug, where I implemented Dijkstra assuming that the heap is a min-heap, while it is in fact a max-heap. Because the code itself looked perfectly fine, debugging this was not trivial. I wonder if we could/should make it more obvious at the call-site that the heap is a max-heap. A straw man would be adding `pop_max`, `peek_max`, and `peek_max_mut` methods, and deprecating the unqualified versions.
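For reference, the straw man could be prototyped today as an extension trait; the names below are hypothetical, not std APIs, and simply forward to the existing methods. The second half of the example shows the current idiom for recovering a min-heap (e.g. for Dijkstra): wrapping keys in `std::cmp::Reverse`.

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

/// Hypothetical extension trait spelling out the straw-man names.
/// These are NOT std APIs; they just forward to `pop` and `peek`.
trait MaxHeapExt<T: Ord> {
    fn pop_max(&mut self) -> Option<T>;
    fn peek_max(&self) -> Option<&T>;
}

impl<T: Ord> MaxHeapExt<T> for BinaryHeap<T> {
    fn pop_max(&mut self) -> Option<T> {
        self.pop()
    }
    fn peek_max(&self) -> Option<&T> {
        self.peek()
    }
}

fn main() {
    let mut heap: BinaryHeap<i32> = [3, 1, 2].into_iter().collect();
    // The qualified name makes the max-heap behaviour visible at the call-site.
    assert_eq!(heap.pop_max(), Some(3));

    // Today's idiom for min-heap behaviour: wrap keys in `Reverse`.
    let mut min_heap = BinaryHeap::new();
    min_heap.push(Reverse(3));
    min_heap.push(Reverse(1));
    assert_eq!(min_heap.pop(), Some(Reverse(1))); // smallest first
}
```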