Rc is unsafe [mostly on 32-bit targets] due to overflow

OK, that clearly shows that the assumption made in Rc is wrong: available memory simply does not limit the number of not-yet-destructed instances (regardless of whether mem::forget is safe or not).

But the idea for an efficient fix to Rc seems to rely on a similar assumption: available memory limits the number of live Rc instances (“live” in the sense that their destructor can still be called later). This assumption is necessary if we want to use “a saturated counter can never again reach zero” to keep the decrement operation unchanged: only live instances can still decrement the count, each of them occupies memory, and therefore fewer than usize::MAX decrements can ever follow saturation. But is it true? @stebalien’s “wider than memory” store (a type that allows moving arbitrary Rust structs to disk and back) would violate this assumption.
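A minimal sketch of that scheme (hypothetical names, not the actual Rc internals, and ignoring the weak count): clone-style increments saturate at usize::MAX, while the drop-side decrement stays exactly as it is today, which is only sound under the assumption above.

```rust
use std::cell::Cell;

// Hypothetical stand-in for Rc's strong count; not the real std internals.
struct Counter(Cell<usize>);

impl Counter {
    // Clone path: saturate at usize::MAX instead of wrapping back to zero.
    fn increment(&self) {
        let n = self.0.get();
        if n != usize::MAX {
            self.0.set(n + 1);
        }
        // Once saturated, the count never grows again. The scheme then relies
        // on "fewer than usize::MAX live instances can fit in memory" to argue
        // that the unchanged decrement below can never bring it back to zero.
    }

    // Drop path: deliberately unchanged, no extra branch.
    fn decrement(&self) -> bool {
        let n = self.0.get() - 1;
        self.0.set(n);
        n == 0 // true means "last reference gone, free the allocation"
    }
}

fn main() {
    let c = Counter(Cell::new(usize::MAX - 1));
    c.increment(); // reaches usize::MAX
    c.increment(); // stays saturated instead of overflowing to 0
    assert_eq!(c.0.get(), usize::MAX);

    let last = Counter(Cell::new(1));
    assert!(last.decrement()); // normal case: last reference dropped
}
```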

With Rc, we would just need a test in the destructor to avoid decrementing the count if it's usize::MAX. With Arc, the destructor would avoid decrementing once the counter has reached the outer limits of its range.
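A rough sketch of that test (again with hypothetical names, not the actual Rc internals): the drop path pays one extra comparison per decrement, and a saturated allocation is simply leaked.

```rust
use std::cell::Cell;

// Hypothetical stand-in for Rc's strong count; not the actual std internals.
struct Counter(Cell<usize>);

impl Counter {
    // Drop path with the extra test: a saturated count is never decremented,
    // so it can never fall back to zero; the allocation is leaked instead of
    // being freed while references may still exist.
    fn decrement(&self) -> bool {
        let n = self.0.get();
        if n == usize::MAX {
            return false; // saturated: leak rather than risk a use-after-free
        }
        self.0.set(n - 1);
        n - 1 == 0 // true means "this was the last reference, free it"
    }
}

fn main() {
    let saturated = Counter(Cell::new(usize::MAX));
    assert!(!saturated.decrement()); // stays pinned at usize::MAX forever

    let last = Counter(Cell::new(1));
    assert!(last.decrement()); // normal case: last reference, free the value
}
```

For Arc the check presumably has to accept a whole saturation range at the top of the counter's range rather than a single sentinel value, since concurrent atomic increments can push the count past any one value before a thread notices.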

So we still have a slight tradeoff between the performance of Rc/Arc and the capabilities of safe Rust (the alternative being a rule like “wider-than-memory stores must not work with arbitrary types such as Arc”). In the interest of not adding completely arcane rules to the “how to write unsafe code” list, we should probably take the performance hit on decrementing.