Impl Default for pointers?

This whole topic, as well as the other referenced one regarding NonNull, shows the issue with Default: the very name seems to imply that there is a canonical value for a type, which is the one returned by Default::default().

When thinking about it, this is not the case for many types. I guess that with signed integers, the argument of “symmetry” (if we dismiss that MIN + MAX ≠ 0) makes 0 more special than MIN_VALUE or MAX_VALUE, and that it then propagates to unsigned integers for the sake of consistency.

But having bool : Default is already a weird thing to have; we could imagine this whole thread debating whether it should default to true or false.

The solution for this “debate” lies in the trait’s official description in the docs.

So, the idea becomes the following:

Problem of semantics

Default::default() constructs some valid value of type Self, but the actual choice of which value is arbitrary.

That is, it should be perfectly fine for <i32 as Default>::default() to return 42.

Hence, whether <*const T as Default>::default() returns ptr::null::<T>(), 1 as *const T, or mem::align_of::<T>() as *const T should not matter (yes, even an unaligned address should be fine, given the real semantics of Default). Imho we should try to choose the one most likely to cause a memory violation when dereferenced, and for that NULL seems like the established consensus.
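To make that concrete, here is a sketch of the three candidate values mentioned above (the function names are invented for this post; a real impl would pick exactly one of them):

```rust
use std::mem;
use std::ptr;

// Candidate 1: the null pointer, the established "invalid" consensus.
fn default_null<T>() -> *const T {
    ptr::null()
}

// Candidate 2: address 1 — valid per the "arbitrary" semantics,
// even though it is not necessarily aligned for T.
fn default_one<T>() -> *const T {
    1 as *const T
}

// Candidate 3: a dangling but well-aligned address.
fn default_aligned<T>() -> *const T {
    mem::align_of::<T>() as *const T
}

fn main() {
    assert!(default_null::<u32>().is_null());
    assert_eq!(default_one::<u32>() as usize, 1);
    assert_eq!(default_aligned::<u32>() as usize, mem::align_of::<u32>());
}
```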

However, if such a choice is so hard to make, then surely the problem lies within the Default trait.

A new trait, equivalent but for its naming, could be made, something like Arbitrary::arbitrary(), “special-cased” in its construction only for Option<T>, where it would be guaranteed to give None. For anything else, the value it creates should never be relied upon (an implementation would even be allowed to return a different value from call to call), except for the fact that it is not only valid but safe (i.e., this would not be mem::uninitialized, although for integer types mem::uninitialized would be a valid implementation!).
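On stable Rust, such a trait could be sketched as follows (everything besides the name Arbitrary is illustrative; the value 42 for i32 is deliberately arbitrary, as the proposal allows):

```rust
/// Sketch of the proposed trait: returns *some* valid, safe value;
/// callers must never rely on which one (except for Option).
trait Arbitrary {
    fn arbitrary() -> Self;
}

impl Arbitrary for i32 {
    fn arbitrary() -> Self {
        42 // any value at all would be a conforming implementation
    }
}

/// The single guaranteed case from the proposal: Option<T> yields None.
impl<T> Arbitrary for Option<T> {
    fn arbitrary() -> Self {
        None
    }
}

fn main() {
    assert!(Option::<String>::arbitrary().is_none());
    let _ = i32::arbitrary(); // some i32; unspecified in general
}
```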

For instance, testing whether something equals Default::default(), as suggested in this other thread, would become testing whether something equals Arbitrary::arbitrary(), which shows how absurd the very test is. For such use cases, Option<T> or a manually crafted special discriminant should be chosen instead; for pointer types, that means Option<NonNull<T>> instead of *const T, and instead of *mut T we should use … err, there is no NonNullMut!? That’s for another topic, however.
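A hypothetical illustration of that misuse (both function names are invented here):

```rust
// Treating the default as a sentinel conflates a legitimate 0 with "unset".
fn is_unset_bad(x: i32) -> bool {
    x == i32::default() // i32::default() is 0, a perfectly ordinary value
}

// Encoding "unset" in the type avoids the ambiguity entirely.
fn is_unset_good(x: Option<i32>) -> bool {
    x.is_none()
}

fn main() {
    assert!(is_unset_bad(0)); // a real zero is misread as "unset"
    assert!(!is_unset_good(Some(0))); // no ambiguity with Option
    assert!(is_unset_good(None));
}
```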

As some have stated, relying on mem::uninitialized/zeroed or even mem::MaybeUninit just because #[derive]ing Default is not possible, and implementing it by hand without helper crates is cumbersome (not everybody knows of / uses something like ::derivative), is worse than having <*const T as Arbitrary>::arbitrary() exist.


What’s the rationale for this special-case? It seems like with the semantics you describe this would be a valid implementation:

```rust
use std::sync::atomic::{AtomicBool, Ordering};
impl<T> Arbitrary for Option<T> {
    default fn arbitrary() -> Self {
        None
    }
}
impl<T: Arbitrary> Arbitrary for Option<T> {
    fn arbitrary() -> Self {
        static MAKE_SOME: AtomicBool = AtomicBool::new(false);
        if MAKE_SOME.fetch_xor(true, Ordering::Relaxed) {
            Some(T::arbitrary())
        } else {
            None
        }
    }
}
```

EDIT: Extending this into argumentum ad absurdum territory, this also seems like a valid implementation:

```rust
impl<T> Arbitrary for Vec<T> {
    default fn arbitrary() -> Self {
        Vec::new()
    }
}
impl<T: Arbitrary> Arbitrary for Vec<T> {
    fn arbitrary() -> Self {
        (0..usize::arbitrary()).map(|_| T::arbitrary()).collect()
    }
}
```

NonNull<T> is "*mut T but non-zero and covariant", there is no equivalent for *const T.

Yes, those would indeed not contradict the true Arbitrary definition (nor the current Default definition; I am just suggesting a rebranding of the trait for the sake of clarity).

However, regarding the particulars of Arbitrary / Default, we could imagine that, for ::std’s items (i.e., custom crates would be allowed to do as they choose), we could follow the current status quo of Default::default():

  • for the sake of performance, it chooses “arbitrarily” among the cheapest implementations, hence collections such as Vec, HashMap, VecDeque, etc. would be empty;

    • a corollary of this is Option<T> yielding None.
  • EDIT (for clarity): if such a direction is taken, I’d imagine that, for the sake of consistency among these “exceptions”, their implementations would not be overridable (i.e., no default fn arbitrary/default), thus requiring newtypes for people wanting their own behavior, as is the case in current Rust. Maybe this question deserves further exploration.

If people using raw pointers wanted covariance, they would have had no choice but to use *const T in the struct’s definition. In both the NonNull<T> and *const T cases, a cast to *mut T is then necessary when wanting to mutate the pointee.

Otherwise, there is currently no way to easily get a “defaultable” non-owning mutable raw reference:

  • for people using *mut T as a non-owning mutable reference, going from *mut T to Option<NonNull<T>> unsoundly makes the reference covariant.

  • That’s why, to get a defaultable non-owning mutable reference, it is currently needed to use:

    • either *mut T with a manual implementation of Default

    • or #[derive(Default)] on an Option<NonNull<T>> + PhantomData<*mut T>;

  • This is where a NonNullMut<T> / NonNullInvariant<T> would solve this.
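The second workaround above can be sketched as a newtype (the name RawMut is made up for this example):

```rust
use std::marker::PhantomData;
use std::ptr::NonNull;

// Option<NonNull<T>> gives a derivable Default (None) plus the non-null
// niche, while PhantomData<*mut T> restores the invariance of *mut T.
#[derive(Default)]
struct RawMut<T> {
    ptr: Option<NonNull<T>>,
    _invariance: PhantomData<*mut T>,
}

fn main() {
    let r: RawMut<u8> = RawMut::default();
    assert!(r.ptr.is_none());
}
```

Note that #[derive(Default)] also adds an unnecessary T: Default bound here; a manual impl would avoid it, which is yet more boilerplate.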


So far we’ve had a lot of armchair theorizing about a null Default for pointers causing problems. Those in favor of a null Default have shown some real world situations in which it would improve Rust code (enabling derive Default, reducing unnecessary verbosity, and providing a safer alternative to people using mem::zeroed).

I’d like to see someone provide a real world example in which a null Default for pointers would cause a problem or create real confusion. Not a contrived or unrealistic example. A real (or at least realistic enough) situation that shows a problem.

And then I’d like to weigh that against the arguments in favor of a null Default. I suspect that the net weight would be in favor of a null Default.


I think that is indeed the important question. Should Default be “an arbitrary initializer that won’t lead to undefined behavior”, or should Default be “an initializer that provides reasonable default behavior that won’t surprise people or make it easy to write buggy code”? If the former, this impl seems fine. If the latter, we’d need some careful thought about whether this impl helps more than it hurts, and makes it hard to misuse.


I think that any struct with nontrivial safety semantics wouldn’t try #[derive(Default)] in the first place. Default already is the “zero”/“empty” of most other std types, so I think it would make sense for <*[const|mut] _ as Default>::default to be ptr::null[_mut].
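That “zero”/“empty” convention is easy to check on a few std types:

```rust
fn main() {
    // Default is already the "zero"/"empty" value across std:
    assert_eq!(i32::default(), 0);
    assert_eq!(bool::default(), false);
    assert_eq!(String::default(), "");
    assert!(Vec::<u8>::default().is_empty());
    assert_eq!(Option::<u8>::default(), None);
}
```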

I definitely think that pointers currently not implementing Default serves as a lint. Any structure containing a pointer needs to consider what default makes sense for it, and we provide ptr::null/ptr::dangling to offer these two options for an invalid pointer default.
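For illustration, the manual impl that this forces today might look like the following (Node is a made-up type; null is chosen here, but a dangling pointer would be spelled the same way):

```rust
use std::ptr;

// A struct containing a raw pointer cannot #[derive(Default)] today,
// so the default value must be written out by hand.
struct Node {
    value: u32,
    next: *mut Node,
}

impl Default for Node {
    fn default() -> Self {
        Node {
            value: 0,               // u32's usual default
            next: ptr::null_mut(),  // explicit choice: null, not dangling
        }
    }
}

fn main() {
    let n = Node::default();
    assert!(n.next.is_null());
    assert_eq!(n.value, 0);
}
```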

Would it be possible instead to expand the capability of the inbuilt #[derive(Default)] to allow some of the #[derive(SmartDefault)] features? Specifically, allow #[default(«expr: impl Fn»)] on members?

So you could write

```rust
use std::ptr;

#[derive(Default)]
pub struct MyThing {
    #[default(ptr::dangling)]
    dangling: *const (),
    #[default(ptr::null)]
    null: *const (),
    #[default(|| 5 as *const _)]
    five: *const (),
}
```
I intend to publish this eventually, but too little time and too much to do 🙂