Impl Default for pointers?

Is there a reason not to add something like the following to libstd?

impl<T: ?Sized> Default for *const T {
    fn default() -> Self {
        ptr::null()
    }
}

impl<T: ?Sized> Default for *mut T {
    fn default() -> Self {
        ptr::null_mut()
    }
}

Null isn’t necessarily a reasonable default value for a pointer. Some users of pointers may expect a potential null; other users may expect a valid pointer and explode on a null.

I don’t want to suggest we shouldn’t do this, more that I can see arguments both ways. Do you have a specific use case in mind for this?

It might make more sense, for instance, to impl Default for a struct containing a pointer field, where you know the field can reasonably contain a null pointer.

Somewhat related: impl<T> Default for NonNull<T>?


I don’t see how a user could expect a default pointer, to which they have supplied no argument, to be non-null.

That’s not what I’m suggesting.

Rather, I’m suggesting that someone may have a struct, containing a raw pointer, which has the higher-level semantic that the pointer should not be null. In that case, the pointer doesn’t have a reasonable default, and it shouldn’t impl Default. (And the struct should not be able to derive Default.)

That’s like saying Option shouldn’t impl Default because the struct might have a higher level guarantee that the Option is never None. We’ve even developed an API that such a struct should be using instead of a raw pointer: NonNull, which encodes exactly that guarantee.


While I agree that such code/structs should potentially use NonNull, they don’t always (especially given that NonNull is new).

It doesn’t make sense to put an Option in a struct when you know that it must never be None. It does make sense to put a pointer in a struct that you expect to be valid.

One use case for this is safe initialization of FFI structs like glob_t. Typically this is initialized to all-zero/null, then passed to a function like glob that writes actual data to the struct.

Currently there are two options for constructing these values in Rust:

  • Write all the fields and values by hand. Requires (potentially large amounts of) boilerplate that would be easy to auto-generate.
  • Use std::mem::zeroed. Requires unsafe even though it returns a value that can be constructed in safe code (so the unsafe requirement adds no protection).

If pointers implement Default, then these structs can #[derive(Default)], and initialization becomes simple, safe, and free from boilerplate.
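To make the status quo concrete, here is a sketch of the boilerplate from the first bullet. GlobT is a simplified, illustrative stand-in for the real glob_t (the field names mirror POSIX but the struct is not complete); because raw pointers do not implement Default today, the impl must be written out by hand instead of derived:

```rust
use std::os::raw::c_char;
use std::ptr;

// Simplified, illustrative stand-in for a C struct like glob_t.
#[repr(C)]
struct GlobT {
    gl_pathc: usize,
    gl_pathv: *mut *mut c_char,
    gl_offs: usize,
}

// Raw pointers don't impl Default, so #[derive(Default)] fails on GlobT
// and this impl has to be spelled out manually for every such struct.
impl Default for GlobT {
    fn default() -> Self {
        GlobT {
            gl_pathc: 0,
            gl_pathv: ptr::null_mut(),
            gl_offs: 0,
        }
    }
}

fn main() {
    // All-zero/null initialization in safe code, ready to pass to glob().
    let g = GlobT::default();
    assert_eq!(g.gl_pathc, 0);
    assert!(g.gl_pathv.is_null());
}
```

If *mut T implemented Default, the entire impl block above would collapse to a single #[derive(Default)] attribute.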


It seems like this is one of those things that could have been done at any point in the last few years, which makes me feel like there must have been a rationale for not doing it. Anyone have an idea of who might remember such a conversation?

They could potentially be expecting align_of::<T>() as *const T, the usual dangling pointer, as in the thread @Ixrec linked. (Though that’s certainly not what I’d expect Default to do on a pointer.)
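For concreteness, that “usual dangling pointer” is what NonNull::dangling() produces: a non-null, well-aligned address that is nonetheless invalid to dereference (in current implementations it is numerically equal to the alignment of T):

```rust
use std::mem::align_of;
use std::ptr::NonNull;

fn main() {
    // Dangling: non-null and well-aligned, but never valid to dereference.
    let p: NonNull<u64> = NonNull::dangling();
    assert!(!p.as_ptr().is_null());
    // Well-aligned for u64.
    assert_eq!(p.as_ptr() as usize % align_of::<u64>(), 0);
}
```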

But “someone could write a bug” is not a good argument against providing an API! This API would be genuinely useful and is not inherently prone to bugs: you have to both use the wrong type, one that doesn’t encode the invariant you care about (step 1), and then write additional code that violates that invariant (step 2). And of course we already provide a mechanism for constructing invalid values like this, just one that is not as useful as this one would be.

This is the sort of pessimistic argument for making everyone’s lives harder that is so common on this forum, and so completely unconvincing to me.


There’s nothing pessimistic about it, it’s decades’ programming experience: if people are not required to think about violating an invariant, they won’t, and they’ll write bugs. Nobody is actively and maliciously trying to make people’s lives hard. If we want to encourage writing memory-safe code even in the presence of raw pointers, then giving users an invalid pointer without them needing to stop and think about it is a bad idea.

There are many legitimate use cases for raw pointers that can be null (i.e. they can’t be NonNull) but that doesn’t automatically imply that the default value should be null (as it’s an exceptional value), it doesn’t even mean that it makes sense as e.g. an initial value in the context where it’s used.

I’m thinking that preferring the convenience of not having to write a ptr::null and getting it by default over the requirement of ensuring that it makes sense for a pointer to be null is completely against Rust’s philosophy.


That’s the use case I would have liked not to have to write my own Default for, FWIW.


IMO raw pointers are just necessary evil and should be wrapped by other types as much as possible, and all the quality-of-life features should be implemented on that wrapper type. If you want null-by-default pointer, maybe you can use Option<NonNull> for it?


Is Option<NonNull<T>> guaranteed to work where FFI wants a *mut T or a *const T?

(BTW, are the FFI guarantees listed somewhere?)

Yes, Option<NonNull<T>> should work everywhere raw pointers work.
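A quick sketch of why this works: the niche optimization guarantees Option<NonNull<T>> is exactly pointer-sized, with None stored as the all-zero (null) bit pattern; and since Option<T> implements Default (as None) for any T, a struct holding one can derive Default today:

```rust
use std::mem::size_of;
use std::ptr::NonNull;

// Option<NonNull<T>> is pointer-sized, with None represented as null,
// so it is layout-compatible with *mut T across an FFI boundary.
#[derive(Default)]
struct Wrapper {
    // Defaults to None, i.e. a null pointer as seen from C.
    ptr: Option<NonNull<u8>>,
}

fn main() {
    assert_eq!(size_of::<Option<NonNull<u8>>>(), size_of::<*mut u8>());
    let w = Wrapper::default();
    assert!(w.ptr.is_none());
}
```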


I, for one, would love null-based Default for raw pointers.

The ergonomics of Option<NonNull> are a nightmare compared to raw pointers. And you lose mutability specifiers (there’s no *const T type-level equivalent for Option<NonNull> unless you write your own ConstNonNull wrapper around NonNull). And I believe ergonomics are important for safety; bad ergonomics don’t prevent bugs, they cause bugs.

Note, though, that Option<NonNull<T>> was only recently made FFI-safe, so if you’re not using nightly you’ll get a warning about it if you use it for FFI.


In my opinion, the very notion of a default value does not make sense for pointers. Even though the documentation does not explicitly say so, a default value seems meant to be a safe fallback. In other words, it should not require you to change it to something else in order for your program to work correctly. Think of it as a default configuration option. I do not see how a ptr::null default can be a safe default in Rust, since the first attempt to use it will most likely be the last in the lifetime of your program.

It may make sense in the context of some particular C API, because in C a null pointer is often used as an indicator of a missing value, a poor man’s None variant. But even in C it is just a convention, not a solid rule. Even worse, the well-known snprintf is designed to accept a null pointer for its output buffer parameter as an indication that we want it to just calculate how large the buffer should be, while at the same time passing a null pointer as the format string is undefined behaviour.

I like the definition of the term “default” in Cambridge Dictionary:

the thing that exists or happens if you do not change it intentionally by performing an action:

  • Unless something else is agreed, the default is to meet at the hotel at 7.00 p.m.
  • The computer will take 0 as the default value, unless you type in something different.
  • In default of (= because there is not) any better alternative, we will have to proceed with the original plan.

Two of the three examples explicitly state that something will happen even if you do not make a change, which makes sense to me.


I don’t buy this. I have always believed that the ergonomics around unsafe should be kept to a minimum, because in practice, asymptotically zero Rust programmers should be writing unsafe code. Making something easy to use will encourage people to play with it, and unsafe is not a toy. Plus, when it comes to unsafe, readability is paramount over ergonomics.


This seems a textbook example of disagreeing about words, rather than what anyone’s trying to say with the words.

I suspect everyone agrees that, for example: it should be easy to tell what’s going on in someone else’s unsafe code; replacing *ptr with dereference(ptr) would be merely adding noise and making overall comprehension harder; making raw pointers do auto-deref would be dangerous rather than helpful; and so on. Which of those fall under the word “ergonomics” is unclear but also unimportant.


Regarding the ergonomics of unsafe code, I would suggest much more specific guidance: make APIs hard to misuse. That’s not the same as “easy to use”.

I’m very much in favor of making it easy to do the right thing, including in an unsafe block. The question in this thread then becomes: does an implementation of Default make it easier or harder to do the wrong thing?