Indeed! However, while they can be a source of bugs, I don't think that integer overflows/underflows (in Rust, anyway) are directly a source of undefined behavior, since Rust declares that they will have a two's complement (wrapping) result in optimized builds; they panic in debug builds. That said, it's really easy to overlook overflows, and I've sometimes thought about enabling the checks by default, e.g. in Rayon – or at least measuring the impact of doing so.
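For reference, a downstream build can opt in to overflow panics even in optimized builds via the Cargo profile; a minimal sketch:

```toml
# Cargo.toml of the final binary: keep overflow checks on in release
# builds (useful for measuring the runtime impact mentioned above).
[profile.release]
overflow-checks = true
```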
Overflows could invalidate bounds checks in unsafe code, though. When writing unsafe code, it would be nice to be able to write something like this:
```rust
if header_size + nelems * element_size > usize::max_value() {
    ... // Report memory allocation failure
}
```
That is, the compiler would evaluate the condition with arbitrary-precision arithmetic and return the mathematically exact result.
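(For what it's worth, on today's targets, where `usize` is at most 64 bits, one can get the exact comparison by widening to `u128`; this sketch is my own illustration, not something proposed above:)

```rust
/// Returns true if header_size + nelems * element_size would exceed
/// usize::max_value(), evaluated exactly by widening to u128.
/// Assumes usize is at most 64 bits, which holds on all current
/// Rust targets, so neither the product nor the sum can wrap in u128.
fn would_overflow(header_size: usize, nelems: usize, element_size: usize) -> bool {
    (header_size as u128) + (nelems as u128) * (element_size as u128)
        > usize::max_value() as u128
}
```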
It is possible to emulate this in various ways (using checked or saturating arithmetic), of course, but the result is rather verbose, and therefore difficult to review.
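A sketch of the checked-arithmetic emulation the paragraph alludes to (the function name and signature are mine, for illustration):

```rust
/// Computes header_size + nelems * element_size, returning None if
/// any intermediate step overflows. Correct, but noisier than the
/// hypothetical exact-evaluation version above.
fn total_size(header_size: usize, nelems: usize, element_size: usize) -> Option<usize> {
    nelems.checked_mul(element_size)?.checked_add(header_size)
}
```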
As you said, if you ensure memory safety elsewhere, then the precise definition of integer arithmetic does not matter for memory safety. So maybe the answer is that up-front size checks such as the one above are doomed anyway, and you should (even in unsafe code) perform size checks at the point where you allocate from the buffer, checking against the size that was actually allocated, even if that size was computed incorrectly.
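A minimal sketch of that "check at the point of use" style, with hypothetical names and layout of my own:

```rust
/// Returns the bytes of element `index`, or None if the computed
/// range overflows or falls outside the buffer that was actually
/// allocated. (Names and layout are illustrative, not from the post.)
fn element_bytes(
    buf: &[u8],
    header_size: usize,
    index: usize,
    element_size: usize,
) -> Option<&[u8]> {
    let start = header_size.checked_add(index.checked_mul(element_size)?)?;
    let end = start.checked_add(element_size)?;
    // `get` bounds-checks against buf.len(), so even a miscomputed
    // allocation size cannot lead to an out-of-bounds access here.
    buf.get(start..end)
}
```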