- Feature Name: `gat-variance`
- Start Date: 2023-11-04
# Summary
Allow the variance of a trait's generic associated types to be optionally specified. Impls of that trait are rejected by the compiler if there's a mismatch between the requested variance and the actual variance of the concrete types.
# Motivation
Currently, generic associated types are assumed to be invariant by default, restricting their usage in generic code that doesn't know the concrete type.
In practice, we would like to relax that assumption, and optionally enforce that the type be covariant or contravariant with respect to its generic parameters.
Consider the following trait that allows a type to be deref'd to a proxy reference type:

```rust
pub trait MyDeref {
    type Target<'a> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_>;
}
```

For this trait to be as close as possible to `std::ops::Deref`, we would like `MyDeref::Target<'a>` to be covariant with respect to `'a`, similarly to how `&'a T` is covariant with respect to `'a`.
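
As a reminder, covariance over `'a` means that a value with a longer lifetime can be used wherever a shorter one is expected. A minimal sketch of this for `&'a T` (the `takes_short` helper is ours, purely for illustration):

```rust
// `&'a str` is covariant over `'a`: a longer-lived reference coerces to a
// shorter-lived one.
fn takes_short<'a>(s: &'a str) -> usize {
    s.len()
}

fn main() {
    let long: &'static str = "hello";
    // Accepted because `&'static str` can be shortened to any `&'a str`.
    assert_eq!(takes_short(long), 5);
}
```
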
# Detailed explanation
Assume we want to write a trait that takes a reference to a type and returns a proxy reference type to its contents. For example, it could take a `&Vec<T>` and return a `&[T]`, or take an `&Option<Vec<T>>` and return an `Option<&[T]>`.

We could write it like this:

```rust
use std::ops::Deref;

pub trait MyDeref {
    type Target<'a> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_>;
}

// A blanket `impl<T: Deref> MyDeref for T` would conflict with the `Option`
// impl below under today's coherence rules, so we implement `MyDeref` for
// `Vec<T>` directly.
impl<T> MyDeref for Vec<T> {
    // `<Vec<T> as Deref>::Target` is `[T]`, so this borrows the vector as a slice.
    type Target<'a> = &'a <Self as Deref>::Target where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_> {
        self.deref()
    }
}

impl<T: MyDeref> MyDeref for Option<T> {
    // Borrowing an `Option<T>` yields an `Option` of the inner proxy type.
    type Target<'a> = Option<T::Target<'a>> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_> {
        match self {
            Some(inner) => Some(inner.my_deref()),
            None => None,
        }
    }
}
```

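As a quick sanity check of the impls above, borrowing an `Option<Vec<i32>>` yields an `Option<&[i32]>` (the `main` below is ours, purely for illustration):

```rust
fn main() {
    let values: Option<Vec<i32>> = Some(vec![1, 2, 3]);
    // `my_deref` borrows the `Option<Vec<i32>>` as an `Option<&[i32]>`.
    let borrowed: Option<&[i32]> = values.my_deref();
    assert_eq!(borrowed, Some(&[1, 2, 3][..]));
}
```
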
While this works, the compiler assumes that `MyDeref::Target<'a>` is invariant with respect to `'a`. This means that the following code is rejected:

```rust
fn foo<'a, T: MyDeref>(a: T::Target<'a>, b: T::Target<'a>) {}

fn bar<'a, 'b, T: MyDeref>(a: T::Target<'a>, b: T::Target<'b>) {
    foo::<T>(a, b)
}
```

Meanwhile, using the current `Deref` trait, the following is accepted:

```rust
fn foo<'a, T: Deref>(a: &'a T::Target, b: &'a T::Target) {}

fn bar<'a, 'b, T: Deref>(a: &'a T::Target, b: &'b T::Target) {
    foo::<T>(a, b)
}
```

If we want our `MyDeref` trait to be as close as possible to `Deref`, we would like the first snippet to be accepted. This means restricting the generic associated type to be covariant with respect to the lifetime parameter, and enforcing that restriction for every `impl` block.
Our suggested syntax for adding this is through an attribute:

```rust
pub trait MyDeref {
    type Target<#[covariant] 'a> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_>;
}
```

This attribute allows the compiler to assume that `Target<'a>` is covariant with respect to `'a`, so that the first example with `foo` and `bar` can compile, since both `'a` and `'b` can be shortened to a common lifetime.
For this assumption to be sound, the variance needs to be checked for every `impl` of the trait. This means that the following code should be accepted:

```rust
struct Int(i32);
struct IntRef<'a>(&'a i32);

impl MyDeref for Int {
    // `IntRef<'a>` wraps an `&'a i32`, so it is covariant over `'a`.
    type Target<'a> = IntRef<'a> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_> {
        IntRef(&self.0)
    }
}
```

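For reference, `IntRef<'a>` satisfies the requested covariance because its only field is an `&'a i32`, which is itself covariant. A small sketch (the `shorten` helper is ours, purely for illustration) that type-checks precisely because of this:

```rust
// A longer-lived `IntRef` can be used where a shorter-lived one is expected.
fn shorten<'short, 'long: 'short>(r: IntRef<'long>) -> IntRef<'short> {
    r
}
```
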
On the other hand, the following code should be rejected, due to `fn(&'a S)` being contravariant over `'a`:

```rust
struct S;

impl MyDeref for S {
    // `fn(&'a S)` is contravariant over `'a`, so this impl violates the
    // requested covariance.
    type Target<'a> = fn(&'a S) where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_> {
        fn inner<'a>(_s: &'a S) {}
        inner
    }
}
```

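To see that contravariance concretely: for function pointers the coercion only goes the other way. The following sketch (the `lengthen` helper is ours, purely for illustration) compiles, while the analogous `shorten` direction would not:

```rust
// `fn(&'a S)` is contravariant over `'a`: a function pointer taking `&'short S`
// can be used where one taking the longer-lived `&'long S` is expected, since
// every `&'long S` is also a valid `&'short S`.
fn lengthen<'short, 'long: 'short>(f: fn(&'short S)) -> fn(&'long S) {
    f
}
```
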
Our examples focus on covariant lifetimes, but contravariant lifetimes and variance with respect to generic type parameters can be similarly specified and enforced.
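
As a sketch of what a contravariant annotation could look like (the `#[contravariant]` attribute name and the `MySink` trait are assumptions of ours, mirroring the proposed `#[covariant]` syntax):

```rust
pub trait MySink {
    // Hypothetical annotation: require every impl's `Callback<'a>` to be
    // contravariant over `'a`.
    type Callback<#[contravariant] 'a> where Self: 'a;
    fn callback(&self) -> Self::Callback<'_>;
}

struct Logger;

impl MySink for Logger {
    // `fn(&'a str)` is contravariant over `'a`, so this impl would satisfy
    // the requested variance.
    type Callback<'a> = fn(&'a str) where Self: 'a;
    fn callback(&self) -> Self::Callback<'_> {
        fn log(s: &str) {
            println!("{s}");
        }
        log
    }
}
```
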
# Drawbacks
None that we can think of.
# Rationale and alternatives
As an alternative, we could use a keyword instead of an attribute to specify the variance. At the moment it is unclear to us if that would be a better option.
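
For illustration, a keyword-based spelling might look like this (the keyword name and placement are only a sketch of ours):

```rust
pub trait MyDeref {
    type Target<covariant 'a> where Self: 'a;
    fn my_deref(&self) -> Self::Target<'_>;
}
```
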
The impact of not doing this is that GATs will remain invariant with respect to their generic parameters, thus limiting their usability in generic code.
# Prior art
?
# Future possibilities
In the future, we would also like to suggest adding traits to the standard library that are similar to `Deref`/`DerefMut` and `Index`/`IndexMut`, but that can return proxy reference types instead of being restricted to language references. Once that is done, we could change the language's desugaring of `&*val` and `&val[index]` to use the new `Deref[Mut]Proxy` and `Index[Mut]Proxy` traits, once all the unresolved questions with that design are solved.
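
For example, an `Index`-like proxy trait could look something like the following (the `IndexProxy` name and shape are an assumption of ours, not part of this proposal):

```rust
pub trait IndexProxy<Idx> {
    // Covariance would let generic code shorten the borrow, as with `MyDeref`.
    type Output<#[covariant] 'a> where Self: 'a;
    fn index_proxy(&self, index: Idx) -> Self::Output<'_>;
}
```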