Pre-RFC: GAT variance

  • Feature Name: gat-variance
  • Start Date: 2023-11-04

Summary

Allow the variance of a trait's generic associated types to be optionally specified. Impls of that trait are rejected by the compiler if there's a mismatch between the requested variance and the actual variance of the concrete types.

Motivation

Currently, generic associated types are treated as invariant with respect to their generic parameters, restricting their usage in generic code that doesn't know the concrete type.

In practice, we would like to relax that assumption, and optionally enforce that the type be covariant or contravariant with respect to its generic parameters.

Consider the following trait that allows a type to be deref'd to a proxy reference type.

pub trait MyDeref {
    type Target<'a> where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_>;
}

For this trait to be as close as possible to std::ops::Deref, we would like MyDeref::Target<'a> to be covariant with respect to 'a, similarly to how &'a T is covariant with respect to 'a.
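The covariance of &'a T referred to above can be observed directly on stable Rust; a minimal sketch (function name is illustrative):

```rust
// &'a T is covariant in 'a: a reference with a longer lifetime
// coerces to one with a shorter lifetime.
fn shorten<'long: 'short, 'short>(r: &'long i32) -> &'short i32 {
    r // accepted because &'a i32 is covariant in 'a
}

fn main() {
    let x = 10;
    // &x lives as long as x; it coerces to any shorter lifetime.
    assert_eq!(*shorten(&x), 10);
}
```

This is exactly the behavior the pre-RFC wants to opt into for MyDeref::Target<'a>.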

Detailed explanation

Assume we want to write a trait that takes a reference to a type, and returns a proxy reference type to its contents. For example, it could take &Vec<T> and return a &[T], or take an &Option<Vec<T>> and return an Option<&[T]>.

We could write it like this:

use std::ops::Deref;

pub trait MyDeref {
    type Target<'a> where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_>;
}

impl<T: Deref> MyDeref for T {
    type Target<'a> = &'a <T as Deref>::Target where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_> {
        self.deref()
    }
}

enum Option<T> {
    Some(T),
    None,
}
use Option::*;

impl<T: MyDeref> MyDeref for Option<T> {
    type Target<'a> = Option<T::Target<'a>> where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_> {
        match self {
            Some(inner) => Some(inner.my_deref()),
            None => None,
        }
    }
}

While this works, the compiler assumes that MyDeref::Target<'a> is invariant with respect to 'a. This means that the following code is rejected:

fn foo<'a, T: MyDeref>(a: T::Target<'a>, b: T::Target<'a>) {}

fn bar<'a, 'b, T: MyDeref>(a: T::Target<'a>, b: T::Target<'b>) {
    foo::<T>(a, b)
}

Meanwhile, using the current Deref trait, the following is accepted:

fn foo<'a, T: Deref>(a: &'a T::Target, b: &'a T::Target) {}

fn bar<'a, 'b, T: Deref>(a: &'a T::Target, b: &'b T::Target) {
    foo::<T>(a, b)
}

If we want our MyDeref trait to be as close as possible to Deref, we would like the first snippet to be accepted. This means restricting the generic associated type to be covariant with respect to the lifetime parameter, and enforcing that restriction for every impl block.

Our suggested syntax for adding this is through an attribute:

pub trait MyDeref {
    type Target<#[covariant] 'a> where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_>;
}

This attribute allows the compiler to assume that Target<'a> is covariant with respect to 'a, so that the first example with foo and bar can compile, since both 'a and 'b can be shortened to a common lifetime.

For this assumption to be sound, the variance needs to be checked for every impl of the trait. This means that the following code should be accepted.

struct Int(i32);
struct IntRef<'a>(&'a i32);

impl MyDeref for Int {
    type Target<'a> = IntRef<'a> where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_> {
        IntRef(&self.0)
    }
}

On the other hand, the following code should be rejected, due to fn(&'a S) being contravariant over 'a.

struct S;

impl MyDeref for S {
    type Target<'a> = fn(&'a S) where Self: 'a;

    fn my_deref(&self) -> Self::Target<'_> {
        fn inner<'a>(s: &'a S) {}

        inner
    }
}
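The contravariance that would make this impl unsound can be checked on stable Rust today; a minimal sketch (names are illustrative):

```rust
// fn(&'a T) is contravariant in 'a: a fn pointer taking a
// short-lived reference coerces to one taking a longer-lived
// (here 'static) reference, not the other way around.
fn coerce<'short>(f: fn(&'short i32)) -> fn(&'static i32) {
    f // OK: fn(&'short i32) is a subtype of fn(&'static i32)
}

fn main() {
    fn takes_ref(_: &i32) {}
    let g = coerce(takes_ref);
    g(&7); // &7 is promoted to a 'static reference
}
```

Because the coercion runs in the opposite direction from &'a T, an impl using fn(&'a S) cannot satisfy a #[covariant] annotation.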

Our examples focus on covariant lifetimes, but contravariant lifetimes and variance with respect to generic type parameters can be similarly specified and enforced.
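Variance with respect to a type parameter works the same way and is also observable today; a minimal sketch using std's Option (as opposed to the local Option enum defined earlier):

```rust
// Option<T> is covariant in T, so Option<&'long i32> coerces to
// Option<&'short i32> whenever 'long outlives 'short.
fn shorten_opt<'long: 'short, 'short>(v: Option<&'long i32>) -> Option<&'short i32> {
    v // accepted because Option<T> is covariant in T
}

fn main() {
    let x = 5;
    assert_eq!(shorten_opt(Some(&x)), Some(&5));
}
```

Under this proposal, an impl assigning such a type to a GAT annotated covariant in its type parameter would be accepted.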

Drawbacks

None that I can think of.

Rationale and alternatives

As an alternative, we could use a keyword instead of an attribute to specify the variance. At the moment it is unclear to us if that would be a better option.

The impact of not doing this is that GATs will remain limited by the restriction of invariance with respect to their generic parameters, thus limiting their usability in generic code.

Prior art

?

Future possibilities

In the future, we would also like to suggest adding traits to the standard library that are similar to Deref/DerefMut and Index/IndexMut, but can return proxy reference types instead of being restricted to language references. Once that's done, and once all the unresolved questions with that design are solved, we could change the language's desugaring of &*val and &val[index] to use the new Deref[Mut]Proxy and Index[Mut]Proxy traits.

Explicit variance annotation has been discussed previously:

Personally, I think covariance and contravariance should be treated like auto traits. For example, consider the following:

trait WithGat {
    type Gat<T>;
}

impl WithGat for () {
    type Gat<T> = Option<T>;
}

struct Foo<T: WithGat, U> {
    a: T,
    b: <T as WithGat>::Gat<U>,
}

Under current rules, the above is invariant in both T and U, for any T and U, because the projection <T as WithGat>::Gat<U> forces invariance. However, Foo<(), U> could potentially be inferred covariant in U—analogous to how auto-trait inference works today.
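The current behavior can be checked with a sketch like the following; the commented-out function is the coercion that today's variance rules reject (names are illustrative):

```rust
trait WithGat {
    type Gat<T>;
}

impl WithGat for () {
    type Gat<T> = Option<T>;
}

struct Foo<T: WithGat, U> {
    a: T,
    b: <T as WithGat>::Gat<U>,
}

// Rejected today: Foo's variance is computed from its generic
// definition, so Foo is invariant in U even when T = () and the
// projection would normalize to the covariant Option<U>.
// fn shrink<'l: 's, 's>(f: Foo<(), &'l i32>) -> Foo<(), &'s i32> {
//     f
// }

fn main() {
    let f = Foo { a: (), b: Some(3) };
    assert_eq!(f.b, Some(3));
}
```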

A more trait-bound like model of variance would give extra flexibility in other ways. For example, it would allow constraining variance only at the use-site:

#![feature(trait_alias)]
// Imagine that only some `MyDeref` impls can provide covariance

trait MyDerefCovariant = MyDeref
where
    for<'a> (<Self as MyDeref>::Target<'a>: covariant_in<'a>);

fn foo<'a, T: MyDerefCovariant>(a: T::Target<'a>, b: T::Target<'a>) {}

fn bar<'a, 'b, T: MyDerefCovariant>(a: T::Target<'a>, b: T::Target<'b>) {
    foo::<T>(a, b)
}

I can see the benefits of that approach, but I feel like it could be overkill in this scenario. Genericity over variance sounds like it could be implemented in a way orthogonal to enforced variance.

I think there's possibly room in the language for both.

If you have variance bounds, enforced variance falls out of the design "for free":

pub trait MyDeref {
    type Target<'a> where Self: 'a + covariant_in<'a>;

    fn my_deref(&self) -> Self::Target<'_>;
}

Oh, that's neat! Yeah, I didn't consider the case of it being used as a bound inside the trait directly.

I'm no guru with much of this advanced stuff, so I'm commenting on the syntax alone. Having an attribute added to a lifetime, while clear in meaning, seems syntactically out of place compared to anything I've seen in Rust elsewhere, and it's a lot of text in a weird spot between the angle brackets.

It is possible to implement that today with a method fn shrink<'a: 'b, 'b>(value: Self::Target<'a>) -> Self::Target<'b>. Yes, it is not convenient, but is the use-case really that common?
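A minimal sketch of that shrink workaround (trait shape follows the pre-RFC; impl and helper names are illustrative):

```rust
pub trait MyDeref {
    type Target<'a> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_>;
    // Each impl must hand-write the subtyping step the compiler
    // would otherwise provide for free via covariance.
    fn shrink<'long: 'short, 'short>(v: Self::Target<'long>) -> Self::Target<'short>
    where
        Self: 'long;
}

impl MyDeref for i32 {
    type Target<'a> = &'a i32 where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_> { self }
    fn shrink<'long: 'short, 'short>(v: &'long i32) -> &'short i32
    where
        Self: 'long,
    {
        v // here the compiler's built-in covariance does the work
    }
}

fn both_same<'a, T: MyDeref + 'a>(_a: T::Target<'a>, _b: T::Target<'a>) {}

fn two_lifetimes<'long: 'short, 'short, T: MyDeref + 'long>(
    a: T::Target<'long>,
    b: T::Target<'short>,
) {
    // Works, but the conversion must be invoked manually at every call site.
    both_same::<T>(T::shrink(a), b)
}

fn main() {
    let x = 1;
    let y = 2;
    two_lifetimes::<i32>(&x, &y);
}
```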

A helper function does not compose well. For example, HashMap<u32, T::Target<'a>> should be a subtype of HashMap<u32, T::Target<'b>> if 'a is a subtype of 'b and T::Target is covariant in its lifetime; today it is not, because T::Target is treated as invariant. (Of course you can reconstruct the map, but that's not the point of the example, and it would be expensive.) Subtyping coercions are also guaranteed to be a no-op, which is not the case for your method.

Technically there is precedent of this with #[may_dangle]:

unsafe impl<#[may_dangle] 'a> Drop for ...

I don't think this syntax / fixed-lifetime bound would be able to properly express covariance.

I'd rather expect:

type Target<'a>
where
    Self : 'a,
    for<'b where 'a : 'b>
        Self::Target<'a> : IsSubTypeOf<Self::Target<'b>>,

Needless to say, we are years away from this becoming a thing. Variance of GATs, on the other hand, once given a syntax for it, is something that should be quite simple/fast to implement.

As a general rule, I've heard language designers such as Niko Matsakis point out that they prefer variance annotations over higher-ranked trait bounds, so even in a world where subtyping were expressible with bounds, having a more direct #[covariant] syntax would seem preferable.


For reference, working around this nowadays is quite cumbersome. Either every implementation needs to provide a "subtyping function" (which technically can do other stuff beyond subtyping), or a macro has to be involved.

Demo:
