Refining variance and traits

Now that we’ve had time to gain some experience with variance, I’ve come to think that we should subset the Variance RFC design and make all traits invariant with respect to their input types (and lifetimes). The primary motivation for supporting variance on traits was so that closure objects (e.g., Box<Fn(T)>) would exhibit variance, but in fact they do not. This is because the closure return type is now an associated type, and any trait with an associated type is always invariant with respect to its inputs.
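To make the invariance point concrete, here is a minimal sketch (the trait and method names are hypothetical, a simplification of the real Fn traits):

trait MyFn<Args> {
    type Output; // the return type is an associated type, not a type parameter
    fn call(&self, args: Args) -> Self::Output;
}

// Because Output is an associated type, this trait is invariant with
// respect to Args, and so no variance flows through closure objects
// built from it.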

The remaining use cases for variance on traits feel somewhat limited to me. One can write a Getter trait, for example:

trait Getter<A> { fn get(&self) -> A; }

But in this case the type A is almost certainly better modeled as an associated type. Similarly, one could rely on variance with Default to infer that, because &'static T: Default holds, &'a T: Default holds as well; but this doesn’t really occur in practice, and in any case it could typically be handled with higher-ranked trait bounds. (In the worst case, a need for variance can be worked around with a newtype wrapper.)
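For illustration, here is a sketch of the associated-type formulation of Getter, along with the higher-ranked workaround for the Default case (the function name is hypothetical):

trait Getter {
    type Value;
    fn get(&self) -> Self::Value;
}

// Rather than deducing &'a T: Default from &'static T: Default via
// variance, demand the bound at every lifetime up front:
fn needs_default<T: ?Sized>() where for<'a> &'a T: Default {
    // ...
}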

On the other hand, inferring variance for traits requires MarkerTrait and PhantomFn to ensure that all inputs are used. Given the lack of a strong use case, it doesn’t seem worth requiring these markers.

The effect in practical terms

Basically this change would mean that we can deprecate MarkerTrait and PhantomFn and remove them by 1.0. Trait matching would always be invariant. PhantomData would be unaffected.
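As a rough before/after sketch (the Tagged trait here is hypothetical):

// Before: an input not used in any method had to be constrained via
// PhantomFn (or MarkerTrait, for traits with no items at all):
//
//     trait Tagged<T> : PhantomFn<T> { }
//
// After: the otherwise-unused input is simply accepted.
trait Tagged<T> { }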

In principle, there is some loss of expressiveness, but I’ve done the experiment of implementing this in the compiler, and no existing code was affected other than tests. I wouldn’t expect any impact on code in the Cargo ecosystem either.

Future possibilities

If we opt to make all traits invariant now, we could still permit variance in traits later via some sort of opt-in. This might also be useful for associated types. We’d have to work out precisely what that means and how it should look, but presumably by then we’d have strong use cases to use as a model.

It may also be possible to go back toward an inferred system, but there are some corner cases (such as impls that might overlap if trait matching is variant) that could potentially break. I judge these unlikely to occur in the wild (but you never know). Another problem would be handling unconstrained cases (i.e., those that today require marker traits and phantom fns).

Thoughts?

Can one implement Peano numerals without marker traits, as in dimensioned? Or anything in shoggoth.rs?

Just removing PhantomData and PhantomFn would make the code compile.
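For what it’s worth, here is a minimal Peano-style sketch that needs no marker trait on the trait side under this proposal (PhantomData remains in the struct, since variance on data types is unchanged):

use std::marker::PhantomData;

struct Zero;
struct Succ<N>(PhantomData<N>); // struct parameters must still be used

trait Nat {
    fn value() -> u32;
}

impl Nat for Zero {
    fn value() -> u32 { 0 }
}

impl<N: Nat> Nat for Succ<N> {
    fn value() -> u32 { N::value() + 1 }
}

// <Succ<Succ<Zero>> as Nat>::value() == 2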

Update: this PR turned out to cause problems with some new traits that were, ironically, written by me. So perhaps I was wrong about variance being unimportant. I will do a bit more digging and return to this topic.

It seems the PR in question has already been merged. However, I’m still unclear on two points:

  • Why is PhantomFn removed, but not PhantomData?

  • What kinds of things were impacted by this change?

To answer the second question first: I am not aware of any real fallout from this change. I can certainly construct examples that would break, but I don’t have any from actual code found in the wild.

What was removed was specifically “trait variance”, which basically means being able to approximate lifetimes as part of trait bounds or object types. To put it another way: if I know that (e.g.) &'static T: SomeTrait, can I deduce from this that &'x T: SomeTrait must hold? The answer on nightly is basically “no”: you can’t ever use an impl written just for 'static with any other lifetime. Because most impls that reference lifetimes wind up being written generically over all lifetimes (i.e., impl<'a> SomeTrait for &'a T, not impl SomeTrait for &'static T), this doesn’t have much effect in practice.
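A concrete sketch of the difference (the trait and type are hypothetical):

trait SomeTrait {
    fn touch(&self);
}

struct Data;

// An impl written only for 'static:
impl SomeTrait for &'static Data {
    fn touch(&self) { }
}

fn shorter<'x>(r: &'x Data) {
    // r.touch(); // ERROR: `&'x Data` is not known to implement SomeTrait,
    //            // because trait matching will not approximate 'x to 'static.
    let _ = r;
}

// The generic impl has no such problem (shown commented out here, since it
// would overlap with the 'static impl above):
//
//     impl<'a> SomeTrait for &'a Data { fn touch(&self) { } }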

Now, as to the first question. The reason that PhantomData is still around is that variance on data types like Option is still quite useful. For example, if I have an Option<&'static Foo>, I can still use it where an Option<&'x Foo> is expected.
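A quick sketch of both points (Foo and the Handle type are hypothetical):

use std::marker::PhantomData;

struct Foo;
static FOO: Foo = Foo;

fn expects_short<'x>(opt: Option<&'x Foo>) { let _ = opt; }

fn demo() {
    let opt: Option<&'static Foo> = Some(&FOO);
    expects_short(opt); // fine: Option<T> is covariant in T
}

// PhantomData is how a data type declares variance for a parameter it
// does not otherwise store:
struct Handle<T> {
    raw: usize,               // the only actual data
    _marker: PhantomData<T>,  // makes Handle covariant in T
}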
