Semantics of AsRef

For the sake of analyzing where exactly the current state of std is inconsistent, or where mistakes have been made, I would like to share some more theoretical considerations. Note that I'm not doing this to decide right now what to do or what to change in Rust in the short term, but to better understand, as a first step, what's wrong (I'll get back to what to do later in this post). For these considerations, I take the limitations of Rust's type system as a given, i.e. I assume that we have neither negative bounds nor specialization.

First, we can see that the blanket implementation of AsRef<U> for &T conflicts with a blanket implementation of AsRef<T> for T (reflexivity). Moreover, any blanket implementation of the form AsRef<U> for SomeTypeConstructor<T> would conflict with a blanket implementation for reflexivity. But blanket implementations of the form AsRef<T> for SomeTypeConstructor<T> do not. (Playground)
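
To make this concrete, here is a minimal sketch. It uses a local MyAsRef trait that mirrors AsRef (the orphan rule forbids adding such blanket impls to the real std::convert::AsRef from outside std) and a hypothetical Wrapper type; the conflicting impl is shown commented out:

```rust
// Local stand-in for `AsRef`; the orphan rule prevents us from adding
// blanket impls to the real `std::convert::AsRef` from outside std.
trait MyAsRef<U: ?Sized> {
    fn my_as_ref(&self) -> &U;
}

// Hypothetical type constructor, used only for illustration.
struct Wrapper<T: ?Sized>(T);

// Reflexivity as a blanket implementation.
impl<T: ?Sized> MyAsRef<T> for T {
    fn my_as_ref(&self) -> &T {
        self
    }
}

// A "pass-through" blanket impl like the following is rejected (E0119)
// next to the reflexive impl above, because the compiler cannot rule out
// that some `T` also implements `MyAsRef<Wrapper<T>>`:
//
// impl<T, U: ?Sized> MyAsRef<U> for Wrapper<T>
// where
//     T: MyAsRef<U>,
// {
//     fn my_as_ref(&self) -> &U {
//         self.0.my_as_ref()
//     }
// }

// An "unwrapping" blanket impl is accepted alongside reflexivity, because
// an overlap would require the impossible equation `T = Wrapper<T>`.
impl<T: ?Sized> MyAsRef<T> for Wrapper<T> {
    fn my_as_ref(&self) -> &T {
        &self.0
    }
}

fn main() {
    let w = Wrapper(42_i32);
    let _inner: &i32 = <Wrapper<i32> as MyAsRef<i32>>::my_as_ref(&w);
}
```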

Thus, given the constraints of Rust's type system, we have two choices:

  1. Make AsRef reflexive through a blanket implementation.
  2. Allow references, generic wrappers, smart pointers, etc. to act transparently, i.e. implement AsRef in the same way as the wrapped or targeted type does.

Let's for a moment assume that we decide to make AsRef reflexive through a blanket implementation. That certainly saves us some explicit implementations for concrete types. However, the benefit is limited, because:

  • We could also manually implement AsRef<T> for T for all T which support cheap reference-to-reference conversions from and to other types (e.g. str and Path).
  • In the generic case it is of no use, because wrappers, smart pointers, etc. will not pass that implementation through: e.g. Cow::<str> would not implement AsRef<Path> unless there existed a concrete implementation for that, as sketched below. (Playground)
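
To illustrate the second point with std as it is today (takes_path is just a hypothetical stand-in for APIs like std::fs::File::open):

```rust
use std::borrow::Cow;
use std::path::Path;

// Hypothetical stand-in for a typical generic API such as `std::fs::File::open`.
fn takes_path<P: AsRef<Path>>(path: P) {
    let _path: &Path = path.as_ref();
}

fn main() {
    let owned = String::from("some/file");
    let cow: Cow<'_, str> = Cow::Borrowed("some/file");

    takes_path(&owned); // OK: `String: AsRef<Path>`, and `&T` passes it through
    takes_path(&*cow);  // OK, but only after manually unwrapping to `&str`

    // Does NOT compile: `Cow<'_, str>` only implements `AsRef<str>`,
    // because std's `AsRef<T> for Cow<'_, T>` impl does not pass the
    // inner type's `AsRef` implementations through:
    //
    // takes_path(&cow);
}
```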

For that reason, I believe (given the constraints of Rust's type system) it was a good choice to refrain from providing a blanket implementation AsRef<T> for T. This makes it possible to implement AsRef<U> for &T as well as AsRef<U> for SomeTypeConstructor<T> in general.

But do we really want AsRef<U> for &T or AsRef<U> for SomeSmartPointer<T>?

We have three choices here:

  1. Provide a blanket implementation of AsRef<T> where T is the inner type (or target) for all of these (references as well as smart pointers).
  2. Provide a blanket implementation of AsRef<T> only for smart pointers (with target T) but implement AsRef<U> for &T where T: AsRef<U>.
  3. In all cases (references as well as smart pointers), provide a blanket implementation of AsRef<U> where the inner type T implements AsRef<U>.

The first and second options seem nice because they allow us to "unwrap" any type from a Cow<'_, T>, Box<T>, or Rc<T> even if the inner type does not implement AsRef<T> (i.e. T: !AsRef<T>). But as shown in the previous Playground, they make it impossible to pass an &SomeSmartPointer<U> in generic code where an &impl AsRef<U> is expected.
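
For example, with the current AsRef<T> for Box<T> implementation, generic code cannot forward a boxed value to an AsRef-based API and has to dereference manually. A minimal sketch, where takes_path and forward are hypothetical helpers:

```rust
use std::path::Path;

// Hypothetical stand-in for a typical generic API such as `std::fs::File::open`.
fn takes_path<P: AsRef<Path>>(path: P) {
    let _path: &Path = path.as_ref();
}

// Generic code that holds a `Box<T>` and only knows `T: AsRef<Path>`.
fn forward<T: AsRef<Path>>(boxed: &Box<T>) {
    // Does NOT compile: `Box<T>: AsRef<Path>` does not follow from
    // `T: AsRef<Path>`; std only provides `Box<T>: AsRef<T>`:
    //
    // takes_path(boxed);

    // Manual unwrapping is required instead:
    takes_path(&**boxed);
}

fn main() {
    forward(&Box::new(String::from("some/file")));
}
```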

Only the third choice seems to come with a true and consistent benefit for generic APIs as shown in my post #4 in this thread: Playground.
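
Here is a rough sketch of what the third choice buys us (not the linked Playground verbatim): a hypothetical PassThrough wrapper that forwards AsRef can be handed, directly or by reference, to any AsRef-based API that its inner type satisfies:

```rust
use std::path::Path;

// Hypothetical wrapper standing in for a smart pointer.
struct PassThrough<T: ?Sized>(T);

// Choice 3: forward every `AsRef<U>` implementation of the inner type.
impl<U: ?Sized, T: ?Sized + AsRef<U>> AsRef<U> for PassThrough<T> {
    fn as_ref(&self) -> &U {
        self.0.as_ref()
    }
}

// Hypothetical stand-in for a typical generic API such as `std::fs::File::open`.
fn takes_path<P: AsRef<Path>>(path: P) {
    let _path: &Path = path.as_ref();
}

fn main() {
    let wrapped = PassThrough(String::from("some/file"));

    // Both calls work, because `String: AsRef<Path>` is passed through:
    takes_path(&wrapped);
    takes_path(wrapped);
}
```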

I would therefore conclude that the following implementations of AsRef are indeed "mistakes":

But honestly, the whole subject is pretty complex and confusing, and I'm not sure if I made a mistake in my reasoning.

What to do?

First of all, I would appreciate some feedback on whether my reasoning above is correct or whether I made a mistake somewhere.

Assuming the analysis is correct, I think it's best to document these inconsistencies. Regarding my proposal to update the docs (see previous post), maybe it would even be wise to give specific advice not to repeat these mistakes in third-party crates (but I'm not sure about that).

I really would like to see whether these mistakes could be fixed in the future (or whether it can be shown that they are not mistakes after all). As I said here, simply solving this with coherence-impacting negative bounds or (an improved variant of) specialization doesn't seem to be possible, as outlined here:

Maybe this would be a feasible solution to improve things:

Either way, it's not going to be easy, I assume.
