Opting out of lifetime invariance in opaque types

Currently, traits with lifetime parameters are borrow-checked very conservatively in contexts where the concrete type is hidden. The lifetime parameter is assumed to be invariant, since information about how the concrete type actually uses the lifetime is not available.

This is usually not a problem; however, some designs that work fine with concrete types suddenly fail when owned trait objects[1] are introduced, due to the interaction between the forced lifetime invariance and drop checking.

Here's an example of where these restrictions create problems:


use std::{borrow::Cow, fmt::Debug, marker::PhantomData};

struct Holder<'a>(Vec<&'a dyn Debug>);

impl<'a> Holder<'a> {
    fn new() -> Self {
        Self(Vec::new())
    }

    fn add<T>(&mut self, x: &'a T)
    where
        T: Debug,
    {
        self.0.push(x)
    }
}

trait CompoundValue<'a> {
    fn add_all(&'a self, holder: &mut Holder<'a>);
}

#[derive(Clone)]
struct Value(String, Vec<usize>, Cow<'static, str>);

impl<'a> CompoundValue<'a> for Value {
    fn add_all(&'a self, holder: &mut Holder<'a>) {
        holder.add(&self.0);
        holder.add(&self.1);
        holder.add(&self.2);
    }
}

fn get_value<'a>(v: &Value) -> impl CompoundValue<'a> {
    v.clone()
}

fn get_value_ref<'a>(v: &Value) -> &impl CompoundValue<'a> {
    v
}

fn run() {
    let mut holder = Holder::new();

    // No type erasure, works as expected.
    let v = Value("String".into(), vec![1, 2, 3], Cow::Borrowed("Cow"));
    v.add_all(&mut holder);

    // Type erased, but the trait object isn't responsible for dropping the value, so it still works.
    let ref_v: &dyn CompoundValue = &v;
    ref_v.add_all(&mut holder);

    // Type erased, and the box is responsible for dropping the value. Without constraints on the variance of 'a, it can't be safely dropped.
    let boxed: Box<dyn CompoundValue> = Box::new(v.clone());
    boxed.add_all(&mut holder); // `*boxed` does not live long enough
                                // borrow might be used here, when `boxed` is dropped and runs the destructor for type `std::boxed::Box<dyn CompoundValue<'_>>`

    // Not technically type erased, but opaque types apparently don't leak this kind of variance either.
    let opaque = get_value(&v);
    opaque.add_all(&mut holder); // `opaque` does not live long enough
                                 // borrow might be used here, when `opaque` is dropped and runs the destructor for type `impl CompoundValue<'_>`

    // Once again the opaque type isn't responsible for drop, so this works
    let opaque_ref = get_value_ref(&v);
    opaque_ref.add_all(&mut holder);
}

/// An invariant lifetime type with a Drop impl
struct EvilValue<'a>(PhantomData<*mut &'a mut ()>);

impl Drop for EvilValue<'_> {
    fn drop(&mut self) {}
}

impl<'a> CompoundValue<'a> for EvilValue<'a> {
    fn add_all(&'a self, _: &mut Holder<'a>) {}
}

fn evil() {
    let mut holder = Holder::new();

    // EvilValue is invariant in 'a due to the PhantomData, and has a Drop impl. If you remove either of those properties, this will compile
    let evil = EvilValue(Default::default());
    evil.add_all(&mut holder); // `evil` does not live long enough
                               // borrow might be used here, when `evil` is dropped and runs the `Drop` code for type `EvilValue`
}

As far as I'm aware, there is currently no way to work around these errors other than erasing the lifetimes entirely, which is hard to do soundly.

Adding the ability to specify the required variance of the lifetime parameter would be one way to allow this sort of design. One obvious way to add that capability to the language would be a trait counterpart of PhantomData: a simple marker trait Variance<T> that could be used as a bound on the opaque type.

fn get_value<'a>(v: &Value) -> impl CompoundValue<'a> + Variance<&'a ()>

'a would be required to be covariant here, since the type passed to Variance uses 'a in a covariant position, which is consistent with how PhantomData is used to constrain type parameters inside a type. Variance could also be used as a supertrait, to force all implementers to have a particular variance.

Lifetime parameters on GATs have a similar problem, and there has been at least one thread on allowing variance to be specified on GATs. That thread proposed variance keywords, which could be clearer in trait contexts than a PhantomData-like solution, at the expense of introducing two very different ways to constrain variance in the language.


  1. or RPITs


Prior discussion: there was some bikeshedding on variance bound syntax in this other thread.


This actually existed before 1.0. https://github.com/rust-lang/rust/pull/23938


I think that's a syntactic precedent, but it doesn't seem like it has any effect on the way the lifetimes are handled in trait objects, etc.


I didn't communicate this particularly well in the original post, but I'm primarily interested in whether there are major issues with the concept of allowing opaque types to opt in to other kinds of variance, rather than in the specific syntax.
