I’m not sure whether this would just be papering over one instance of the issue or actually fix it, but:
What if the `Atomic*::*` methods were made atomic in and of themselves? (I.e., they would be guaranteed to always inline down to the atomic intrinsic instead of adding a stack frame.)
```rust
// Stupid bikeshed std-only syntax to guarantee this behavior
pub fn fetch_sub(&self, val: $int_type, order: Ordering) -> $int_type =
    unsafe { atomic_sub(self.v.get(), val, order) };

// Alternately, using guaranteed tail calls (TCO):
pub fn fetch_sub(&self, val: $int_type, order: Ordering) -> $int_type {
    unsafe { become atomic_sub(self.v.get(), val, order) }
}

// Or, if we can guarantee inlining 101%:
#[inline(required)]
pub fn fetch_sub(&self, val: $int_type, order: Ordering) -> $int_type {
    unsafe { atomic_sub(self.v.get(), val, order) }
}
```
This would mean there is no point at which a stack frame holds the `&Atomic*` while the value is being decremented.