Pre-RFC: `#[derive(Default)]` on enums with a `#[default]` attribute

I’m not sure I’m able to fill in the details of the argument here. Breaking changes on a Default implementation for an enum… sounds like the enum is pub; otherwise changes to that Default implementation couldn’t have any impact in terms of breaking changes. Unless there’s some public-in-private going on here, this would mean that all the enum variants are visible, too (there’s no hiding of implementation details inside of an enum), in which case the user of the enum knows which variant the Default implementation returns (e.g. through pattern matching on it). Wouldn’t that mean that “changing the default variant” is always a breaking change anyway?
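For reference, a minimal sketch of the derive under discussion (the type and variant names here are made up), showing that a downstream user can observe, and therefore rely on, whichever variant the impl returns:

```rust
// Sketch of the proposed derive: `#[default]` tags the variant
// that the generated `Default` impl returns.
#[derive(Default, Debug, PartialEq)]
enum Mode {
    #[default]
    Fast,
    Slow,
}

fn main() {
    // The chosen variant is fully observable through pattern matching,
    // so downstream code can (and will) depend on it.
    assert_eq!(Mode::default(), Mode::Fast);
    assert!(matches!(Mode::default(), Mode::Fast));
}
```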


I think that's where we're disagreeing. Personally, I don't see it as a breaking change (semver minor only), as it's an implementation detail (someone relying on default should be able to work with any default). If there's consensus that this is not the case, I'd be fine with narrowing the bounds to the variant from the start.

Alright, that explains your argument. I’m not that familiar with the usual semver requirements that Rust crates are supposed to uphold. Surely, a decently documented crate would probably also document what the default is, in which case changing it obviously is a breaking change.

So some open questions would be

  • what’s the default if the docs don’t specify anything?
  • is it legitimate to explicitly document that the default value can change with minor releases?
  • how common are cases where you’d want to be changing the default variant of a public enum?

And also

  • are there any (realistic) use cases of public-in-private that would make the Default implementation bounds “visible/observable” but not expose the default variant that’s used?

It's not an implementation detail if the enum is public, because then the Default impl is part of the public API, and returning a different variant can be a breaking change, which is only allowed in minor releases if it's a bug fix.

You argue that the variant which Default returns shouldn't be relied upon. This is a valid opinion, but I believe it's not an opinion shared by many people in the Rust community. In my experience, Default impls return the most obvious default value that everyone agrees on. If there isn't an obvious default, then Default isn't implemented for the type. This means that it's very unlikely that the default variant changes later, even in a major version.


I know RFCs aren't meant to be a popularity contest, but in this case it seems pretty much inevitable as it's a matter of opinion as to whether it's a breaking change. Given that (so far at least) I'm the only one who's expressed explicit support for having the ability to change defaults without affecting bounds, let's put it to a poll.

What bounds should be required for #[derive(Default)] on enums?
  • All generic types should require Default
  • Only generic types used in the tagged variant should require Default (others have no required bounds)
  • Some fancier heuristic (please explain how this would work!)


In my view, the primary reason for using a derive macro is convenience. You can always fall back to writing a manual impl (or using an external derive-proc-macro crate like derivative). Therefore, I think we should only require that the tagged variant's type implements Default - I believe that this is what most users will expect to happen.
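A hand-written impl with the narrow bounds (names hypothetical) shows what "only the tagged variant's type" means in practice:

```rust
use std::fs::File;

enum Either<A, B> {
    Left(A),
    Right(B),
}

// What the derive could expand to under the "tagged variant only"
// rule, with #[default] on Left: only A is bounded by Default.
impl<A: Default, B> Default for Either<A, B> {
    fn default() -> Self {
        Either::Left(A::default())
    }
}

fn main() {
    // File implements no Default, yet Either<u32, File> does,
    // because B is left unconstrained.
    let e: Either<u32, File> = Either::default();
    assert!(matches!(e, Either::Left(0)));
}
```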

If we require all of the variant types to implement Default, I think we'll get a lot of bug reports of the form 'Why is Rust requiring this type to implement Default when it shouldn't need to?'. I suspect that answering 'because you might want to change the #[default] variant' will be unsatisfying to most users - if someone had wanted that kind of future-compatibility technique, they would probably have written a manual impl themselves.


My main reason to pick "all generic types" isn't that the default should be considered semver unstable (though I think it should be exactly as stable as struct defaults), but because I think #[derive(Default)] should apply the exact same bounds for enum derives as for struct derives.
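For comparison, the current struct derive bounds every type parameter, even where a looser bound would suffice (the type here is made up):

```rust
#[derive(Default)]
struct Pair<T, U> {
    first: T,
    second: Option<U>,
}

// Compile-time helper to check that a type implements `Default`.
fn assert_impls_default<T: Default>() {}

fn main() {
    assert_impls_default::<Pair<u8, u8>>();
    // The next line would NOT compile: the derive emits `U: Default`
    // even though `Option<U>: Default` holds for every U.
    // assert_impls_default::<Pair<u8, std::fs::File>>();
    let p = Pair::<u8, u8>::default();
    assert_eq!(p.first, 0);
    assert!(p.second.is_none());
}
```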

The counter argument is that the default of an enum is often a unit variant, even for generic enums. That by itself might be enough to convince me, tbh.

The "both worlds" solution that doesn't require also solving the struct case could be something like #[default(narrow)], but I honestly don't know how much I like it.


It's not very fancy, but the heuristic I'd think of is: require Default bounds for all the field types of the fields in the tagged variant.

I don't think that requiring Default bounds on generics used in the tagged variant has any advantage over this heuristic, besides the fact that the generated bounds produce better rustdoc output (in the cases where they - by chance - turn out to be the correct bounds in the first place). It's a bummer that derive macros are a syntactic thing, otherwise the compiler could simplify the bounds by applying unique blanket impls where they exist (like turning Arc<T>: Default into T: Default) and removing trivial bounds (like u32: Default or Vec<T>: Default). Maybe there's a chance that rustdoc itself could perform such a simplification step though. (Edit: I just had the thought that the situation is quite similar to auto traits, which seem to have decent bounds in rustdoc, so maybe this is more realistic than I initially thought.)
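A sketch of what bounding on field types (rather than on type parameters) buys, using a made-up type: `Vec<T>: Default` is a trivially true bound, so the impl applies to every T:

```rust
use std::fs::File;

struct Batch<T> {
    items: Vec<T>,
}

// Field-type bound: `Vec<T>: Default` holds for all T, so this impl
// is effectively unconditional, unlike a `T: Default` bound would be.
impl<T> Default for Batch<T>
where
    Vec<T>: Default,
{
    fn default() -> Self {
        Batch { items: Vec::default() }
    }
}

fn main() {
    // T = File has no Default impl, yet Batch<File> is Default.
    let b: Batch<File> = Batch::default();
    assert!(b.items.is_empty());
}
```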

I also see value in keeping similarity with structs. In particular, I would (as a user) probably expect single-variant-enums to do the same kinds of derived implementations as a struct with public fields, or at least any derive macro that doesn't do this makes the situation a bit more confusing.

I guess that's another advantage that your “bounds on generics used in tagged variant” heuristic has: For single variant enums it's the same as the “bounds on all generics” approach and thus in line with the current derive implementation for structs.


Yeah, for my custom derives I've been using bounds on field types rather than on generic parameters for a while now; I've always found the std library derives to be quite limited in that fashion. That a newtype around *const String does #[derive] Copy, Clone just fine, but that a newtype<S> around *const S is not Copy when S = String, has always seemed to me to be an inconsistency in the generics mental model.

If you want to use a stdlib derive and yet have these improved semantics, you need to jump through some hoops:

#[derive(Clone, Copy)]
pub struct MyPtr_<Ptr>(Ptr); // `Copy where Ptr : Copy`

/// This way we do get `Copy where *const S : Copy`, and thus _always_ `Copy`!
pub type MyPtr<S> = MyPtr_<*const S>;

So, the "smartest"/least restrictive solution for #[derive(Default)] is to bound on the field types present on that very variant: it's the exact set of necessary bounds.

  • One drawback to this approach (though I find the argument to be quite moot) is that it may be slightly easier to make breaking changes if the bounds are "too optimal" all the time 🤷‍♀️

But, since all the other derive usages are suboptimal already, alas, I'd say that we have to bound the generic type parameters instead, for the sake of consistency. Just know that we may end up with instances of:

// #[derive(Default)]
// pub enum Enum<T, …> {
//     // #[default]
//     A(Option<T>),
//     …
// }
#[derive(Default)]
pub enum Enum_<A, …> {
    #[default]
    A(A),
    …
}

pub type Enum<T, …> = Enum_<Option<T>, …>;

as we already have to do for other #[derive]s (mainly, PartialEq and Eq when wanting to have your type feature StructuralEq on stable Rust).

That being said, bounding only on the generic type parameters appearing on that variant could also lead to breaking changes: a change of implementation whereby the #[default] variant changes could thus cause bound changes for it being Default!

So, until we get better tools for more fine-grained control over these things[1], we'd have to settle for the least dangerous / least future-breaking approach: bounds on each and every generic parameter appearing on the enum 😞

[1] Note that since this suggested syntax naturally leads to future extensions where parameters could be fed to the attribute, we could perfectly envision offering a way to loosen these bounds in the future:

pub enum Enum<T, …> {
    #[default(where Option<T> : Default)]
    A(Option<T>),
    …
}

For those that have expressed interest in having the bounds be variant-based, how do you propose handling #[non_exhaustive] variants?

Please do not suggest the ability to manually declare the bounds — I've made clear that I have no desire to include that in this RFC, as it needs some design work that I'd rather have done at another point (trying to keep this RFC small). It's listed in future possibilities for a reason.

In exactly the same way. #[non_exhaustive] means that more variants can be added later. However, I still expect Default to only be implemented for an enum if there is an obvious default value that will not change with new variants. I think this is consistent with how #[non_exhaustive] already works: you can add new variants, but old variants should stay the same.

To be clear, I'm asking about placing the attribute on the variant, not the enum. It would be possible to do:

enum Foo<T> {
    #[default]
    #[non_exhaustive]
    Alpha,
    Beta(T),
}

Adding a field to the Alpha variant that uses T would be perfectly legal. What should the bounds be on the generated code?


Ah, I see that I indeed misread your question. You are right that this is a harder combination of attributes, and it's a good point to discuss now.

Personally, I still think that we should base the bounds only on the types in the specific variant. If both the #[default] and #[non_exhaustive] attributes are placed on one variant, then it would be best to give a warning. In this case, adding a field that uses only already-used generic types would not be a breaking change, but adding a field that uses a new generic type would.

I think this would be a reasonable way to deal with this combination of attributes because I think a default value should be the obvious value and therefore not normally be non_exhaustive. I see a default value as part of the public API and therefore changing it should be considered a breaking change.

Another reason I think it is best to only use the specific variant's types is that this is in most cases what the programmer wants. One example already mentioned is the Option<T> type. Here it is clear that None is the obvious default choice, and this will not change later. We also would not want Option<T> to only implement Default if T does. I think that this behaviour is what is wanted for most enums and therefore what derive should do. If people want to deviate from this general behaviour, they can of course always implement Default manually.
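The standard library already behaves this way for Option (File here is just a stand-in for any type without a Default impl):

```rust
use std::fs::File;

fn main() {
    // File does not implement Default, yet Option<File> does:
    // std's impl returns None and places no bound on T.
    let f: Option<File> = Option::default();
    assert!(f.is_none());
}
```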


That's why I'm asking the question.

The whole purpose of #[non_exhaustive] is to allow adding any field without it being a breaking change. Why should adding the #[default] attribute change this semantic? It seems far too subtle to me, as someone would almost certainly add a field using a new type, inadvertently changing the bounds on the derived Default impl.

This is a good point. I think it would make sense to disallow the #[default] attribute on non_exhaustive enum variants.

Updated to reflect the overwhelming view that the types used in the tagged variant is what determines bounds. #[non_exhaustive] and #[default] can no longer be used on the same variant due to forward-compatibility concerns. #[non_exhaustive] can still be used on the enum itself and any other variants.


Filed as RFC 3107!