I know that I'm now moving this thread further off topic, but if you have lots of memory, look at a combination of dynamic perfect hashing and an adaptive ladder queue. Done right, you'll have an O(1) priority queue with the ability to cancel.
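To make that combination a little more concrete, here's a rough Rust sketch with big simplifications: a plain `std::collections::HashMap` stands in for dynamic perfect hashing, and a fixed array of FIFO buckets over small integer priorities stands in for the adaptive ladder queue (the real thing re-buckets its rungs dynamically). All the names here are made up for illustration; this is a sketch of the cost structure, not a real implementation.

```rust
use std::collections::{HashMap, VecDeque};

/// Hypothetical sketch only: buckets give cheap push/pop-min, and the hash
/// map gives expected O(1) lookup so entries can be cancelled without
/// scanning the whole queue.
struct BucketQueue {
    buckets: Vec<VecDeque<u64>>, // buckets[p] holds item ids with priority p
    index: HashMap<u64, usize>,  // item id -> its priority (i.e., its bucket)
    min: usize,                  // lowest bucket that might be non-empty
}

impl BucketQueue {
    fn new(max_priority: usize) -> Self {
        BucketQueue {
            buckets: vec![VecDeque::new(); max_priority + 1],
            index: HashMap::new(),
            min: 0,
        }
    }

    /// O(1): append to the bucket for this priority and record it in the map.
    fn push(&mut self, id: u64, priority: usize) {
        self.buckets[priority].push_back(id);
        self.index.insert(id, priority);
        self.min = self.min.min(priority);
    }

    /// Amortized O(1) over a run of pops: advance `min` past empty buckets.
    fn pop_min(&mut self) -> Option<u64> {
        while self.min < self.buckets.len() {
            if let Some(id) = self.buckets[self.min].pop_front() {
                self.index.remove(&id);
                return Some(id);
            }
            self.min += 1;
        }
        None
    }

    /// Expected O(1) hash-map lookup, then a scan of a single bucket.
    /// A production version would store the exact slot (or use tombstones)
    /// to avoid even that scan.
    fn cancel(&mut self, id: u64) -> bool {
        if let Some(priority) = self.index.remove(&id) {
            if let Some(pos) = self.buckets[priority].iter().position(|&x| x == id) {
                self.buckets[priority].remove(pos);
                return true;
            }
        }
        false
    }
}
```

The point is only the shape of the costs: push and cancel each touch one bucket plus one hash-map entry, and pop_min amortizes the bucket skipping over a sequence of pops.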
BUT, while fun to discuss, these digressions have nothing to do with the original topic at hand! (and I apologize for being the reason we digressed!)
So, getting back on topic, I'm not against remove(), but I'd have to see some real-world numbers to be convinced that it's worth the extra effort.
@BadAtUsernames, do you have proof that the elements that need to be canceled normally occur within some small distance of the head of the queue? E.g., if you know that the items you scan for when deleting are always within O(lg n) of the head[1], then we can reasonably say that in your use case remove() will always be an O(lg n) operation. Alternatively, if you call remove() only about O(lg m) times for every m push/pop operations, then we can make a hand-wavy argument that remove() in your case averages out to O(lg m). If both of those conditions are false, then I'm going to go with either @steffahn's solution or @Eh2406's solution. Life is too short to try to invent yet another mutable priority queue just to improve the Big-O performance.
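For the first case, here is a hypothetical sketch (not @steffahn's or @Eh2406's code) of the kind of bounded scan the footnote describes: a Vec-backed min-heap whose `remove_near_head` gives up after inspecting the first `limit` slots (e.g. limit ≈ lg n), so a successful cancellation costs O(lg n) to find the item plus O(lg n) to restore the heap property. `MinHeap` and `remove_near_head` are invented names for illustration.

```rust
/// Hypothetical Vec-backed min-heap, shown only to illustrate the bounded
/// scan argument above.
struct MinHeap {
    data: Vec<u64>,
}

impl MinHeap {
    /// Scan only the first `limit` slots of the backing vector; items stored
    /// deeper than that are simply not cancellable through this path.
    fn remove_near_head(&mut self, value: u64, limit: usize) -> bool {
        let n = self.data.len();
        let i = match self.data[..limit.min(n)].iter().position(|&x| x == value) {
            Some(i) => i,
            None => return false,
        };
        // Standard heap removal: move the last element into the hole,
        // then restore the heap property around it (O(lg n)).
        self.data.swap_remove(i);
        if i < self.data.len() {
            self.sift_down(i);
            self.sift_up(i);
        }
        true
    }

    /// Bubble the element at `i` up while it is smaller than its parent.
    fn sift_up(&mut self, mut i: usize) {
        while i > 0 {
            let parent = (i - 1) / 2;
            if self.data[i] < self.data[parent] {
                self.data.swap(i, parent);
                i = parent;
            } else {
                break;
            }
        }
    }

    /// Push the element at `i` down while it is larger than a child.
    fn sift_down(&mut self, mut i: usize) {
        let n = self.data.len();
        loop {
            let (l, r) = (2 * i + 1, 2 * i + 2);
            let mut smallest = i;
            if l < n && self.data[l] < self.data[smallest] {
                smallest = l;
            }
            if r < n && self.data[r] < self.data[smallest] {
                smallest = r;
            }
            if smallest == i {
                break;
            }
            self.data.swap(i, smallest);
            i = smallest;
        }
    }
}
```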
-
[1] Technically, you just have to know that you have at most O(lg n) elements to check, and you have to be able to access each of those elements in constant time for this analysis to work.