The issue is that a Rust version of `-fno-strict-aliasing` would do much more than the C/C++ `-fno-strict-aliasing`, because Rust provides much stronger guarantees (in the form of `&mut` references) than C/C++ do.
If I understand correctly, you believe that a Rust version of `-fno-strict-aliasing` would simply be a more permissive version of the language, and that anyone who doesn't want to use it can simply ignore it. Unfortunately, this is not the case. Consider this (contrived) program:
```rust
fn modify_vec(val: &mut Vec<u8>, f: impl FnOnce()) {
    val.push(25);
    f();
    assert_eq!(val.pop(), Some(25));
}
```
As a user of Rust, I can guarantee that the `assert_eq!` will always pass - I have a mutable reference to a vector, which means that I have exclusive access to it. This is despite the fact that I'm calling an arbitrary user-provided function - if that function were to try to modify `val` through a raw pointer, it would be undefined behavior, and would therefore unambiguously be at fault.
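To make this concrete, here is a minimal sketch of a well-behaved caller of `modify_vec` (the `main` function and the sample vector are my own illustration, not from the original post). The borrow checker itself rules out safe code that aliases `val` from inside the closure:

```rust
fn modify_vec(val: &mut Vec<u8>, f: impl FnOnce()) {
    val.push(25);
    f();
    assert_eq!(val.pop(), Some(25));
}

fn main() {
    let mut v: Vec<u8> = vec![1, 2, 3];

    // Safe code cannot capture `v` in the closure while `modify_vec`
    // holds the `&mut` borrow - the borrow checker rejects it:
    // modify_vec(&mut v, || v.push(99)); // ERROR: `v` borrowed twice

    // A closure that doesn't touch `v` is fine, and the internal
    // `assert_eq!` is guaranteed to pass.
    modify_vec(&mut v, || println!("callback runs"));
    assert_eq!(v, vec![1, 2, 3]);

    // The only way to break the internal assert would be `unsafe` code
    // that smuggles a raw pointer to `v` into the closure and writes
    // through it - and doing so is undefined behavior.
}
```

The point is that the only caller who can break the assertion is one who has already committed undefined behavior.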
Note that this is completely independent of whatever optimizations the compiler chooses to perform. What's important is the fact that `&mut T` implies exclusive access to a value of type `T`. Even if a user isn't aware that raw pointers exist in the language, they can still correctly reason about the behavior of `modify_vec`.
While this example might seem contrived, there are many types in the standard library that rely on `&mut T` being exclusive (e.g. `Cell::get_mut`). This is an important property of the language, regardless of whether or not the compiler chooses to exploit it for optimization purposes.
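As a quick illustration of the `Cell::get_mut` case (the values here are my own example): `Cell<T>` normally only allows replacing its contents through shared references, but with a `&mut Cell<T>` you can get a plain `&mut T` back out. That is only sound because `&mut` guarantees no other handle to the same `Cell` exists:

```rust
use std::cell::Cell;

fn main() {
    let mut c = Cell::new(10);

    // Through a shared &Cell<T>, you can only swap values in and out:
    c.set(c.get() + 1); // c now holds 11

    // With exclusive &mut access, get_mut hands back a direct &mut T.
    // This is sound precisely because &mut rules out any aliasing
    // Cell handle that could observe or mutate the value concurrently.
    *c.get_mut() += 1;

    assert_eq!(c.get(), 12);
}
```

If aliasing `&mut` references were ever allowed, `get_mut` as it exists today would be unsound.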
If I understand your proposal correctly, the Rust version of `-fno-strict-aliasing` would break the assumptions made by `modify_vec`. While it might seem clear that this is the 'fault' of whatever code is creating the aliasing mutable references, this becomes far less clear in a complex dependency graph. Two crates may make assumptions that seem reasonable in isolation ("this function will never be called with an aliasing mutable reference", or "using aliasing mutable references will only affect my code"), but that interact badly when brought together.
However, I definitely sympathize with your concern about peace of mind. Personally, I would support a way to disable any mutable-aliasing-related optimizations, provided that having aliasing mutable references is still always UB. Effectively, this would be a more fine-grained version of the optimization level - `unsafe` code that breaks on a higher optimization level is still wrong at a lower optimization level, but you as a user are free to choose a less aggressive set of optimizations.
With such an option, aliasing mutable references would still unambiguously be a bug in your code, and users would still be responsible for fixing them (with the help of tools like Miri). However, this would allow authors of binary crates to mitigate (but not eliminate!) the effects of UB, especially for code that they don't necessarily trust to be perfectly well-written. This would be similar in spirit to how `std::mem::uninitialized` now panics for uninhabited types - your code still has undefined behavior, but the compiler is choosing to make it not 'as bad' as the standard allows it to be.