Borrow inference/coercion for contents of Vec


I’ve heard that it’s better to have your function accept &str over &String, and &[T] over &Vec<T>.

The reasons for this seem like good ones, good enough that it seems reasonable to want the same for the contents of a Vec, not just the container. But there doesn’t seem to be any way to do this currently.

Is there anything preventing us from implementing an .as_borrowed() on Vec whose contents support the Borrow trait?

This would let us call test(&vector.as_borrowed()); in the example below and have it work, instead of having to write something like test(&vector.iter().map(|x| x.as_str()).collect::<Vec<_>>());

fn main() {
    let vector: Vec<String> = vec!["a".to_string(), "b".to_string(), "c".to_string()];
    // Proposed: test(&vector.as_borrowed());
    // What we have to write today:
    test(&vector.iter().map(|x| x.as_str()).collect::<Vec<_>>());
}

fn test(something: &[&str]) {
    println!("{:?}", something);
}

Playground link:

Evolved from this discussion:


Yes: the layout is incompatible, so it cannot be O(1).

But you can use iterators, or something like this:

fn test(something: &[impl AsRef<str> + std::fmt::Debug]) {
    for s in something {
        println!("{:?}", s.as_ref());
    }
}
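To make the point concrete, here is a self-contained sketch (reusing the function name from the thread): the generic function is monomorphized per element type, so one definition serves both owned and borrowed elements.

```rust
// Generic over anything that can be viewed as a &str; monomorphized
// separately for each concrete element type.
fn test(something: &[impl AsRef<str> + std::fmt::Debug]) {
    for s in something {
        println!("{:?}", s.as_ref());
    }
}

fn main() {
    let owned: Vec<String> = vec!["a".to_string(), "b".to_string()];
    let borrowed: &[&str] = &["a", "b"];
    test(&owned);   // instantiated with element type String
    test(borrowed); // instantiated with element type &str
}
```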


There’s also an RFC, postponed a while back, for being able to do safe, zero-cost coercions in a bunch of cases. Not for everything Borrow, but still for a lot of things:

In particular, I believe you’d be able to coerce &[String] into &[&str] under that RFC.


I suspect that will never work, as &str is 2/3 the size of String, so the meaning of a slice would need to change drastically.
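The size claim is easy to check with std::mem::size_of (the 3-word String layout holds on current rustc, though field order is not guaranteed):

```rust
use std::mem::size_of;

fn main() {
    // &str is a fat pointer: (data pointer, length) = 2 words.
    assert_eq!(size_of::<&str>(), 2 * size_of::<usize>());
    // String is (pointer, capacity, length) = 3 words on current rustc.
    assert_eq!(size_of::<String>(), 3 * size_of::<usize>());
}
```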

There are a variety of things that I could imagine like that, though, like safely converting a &[String] into a &[Vec<u8>].


Oh, my mistake. I was forgetting the capacity field for some reason. So in this use case, your suggestion of using a generic function is probably the best.


It doesn’t have to coerce into &str, just into something borrowed and consistent. And ideally into the same thing you coerce into outside of Vecs, though that may mean changing that convention as well if &str isn’t possible.

What about &[String] into &[&[u8]] (or even into &[&[&u8]], because you might as well have turtles all the way down)?

Although these are specific examples, it could be anything that’s easy to use, consistent, and doesn’t lose functionality. (Come to think of it, &[u8] doesn’t offer the same functionality as &str, so that might be a bad example.)
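That last parenthetical can be made concrete: a &str carries the UTF-8 invariant, which a plain byte slice loses. A small sketch:

```rust
fn main() {
    let s = String::from("héllo");
    let bytes: &[u8] = s.as_bytes();
    // &str carries the UTF-8 invariant, so character-level APIs work:
    assert_eq!(s.chars().count(), 5);
    // The byte slice only sees raw bytes; 'é' encodes as two bytes:
    assert_eq!(bytes.len(), 6);
    // Going back to &str requires re-validating UTF-8:
    assert_eq!(std::str::from_utf8(bytes).unwrap(), "héllo");
}
```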


Then what about inverting the convention, and coercing into &String?


Not supported, because size_of::<String>() != size_of::<&[u8]>(), so the layouts are fundamentally incompatible.

Doesn’t help, because the problem is the inequality, not smaller/larger.


Well, consider this: theoretically, it could be possible to coerce &[String] into something like &[(&str, usize)], just padding with an unused usize. But the problem is, this is still a different type than &[&str], so you still can’t have a function that can take both types of input at runtime. You could write a generic function that gets monomorphized so it can take either type – but you can already do that, the way @scottmcm suggested.

I suppose it would be possible using a type that was like a slice reference, except instead of just having a pointer and a size, it had a pointer, a size, and a stride (allowing the values to be spaced differently than the normal packing). But of course, that has runtime overhead compared to a regular slice, so you wouldn’t use it if you didn’t need to. As opposed to the slice-versus-Vec case, where if you have a Vec, using it as a slice is no more expensive than using the Vec (and perhaps cheaper!).
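A rough sketch of what such a strided slice might look like, using hypothetical names (`Strided`, `from_slice`, `get` are all made up for illustration):

```rust
use std::marker::PhantomData;
use std::mem::size_of;

// Hypothetical "strided slice": pointer + length + stride (in bytes).
// The stride lets elements be spaced differently than tight packing,
// at the cost of one extra word and a multiply on each access.
struct Strided<'a, T> {
    ptr: *const u8,
    len: usize,
    stride: usize,
    _marker: PhantomData<&'a T>,
}

impl<'a, T> Strided<'a, T> {
    // For an ordinary slice, the stride is just size_of::<T>().
    fn from_slice(s: &'a [T]) -> Self {
        Strided {
            ptr: s.as_ptr() as *const u8,
            len: s.len(),
            stride: size_of::<T>(),
            _marker: PhantomData,
        }
    }

    fn get(&self, i: usize) -> Option<&'a T> {
        if i < self.len {
            // Sound here because from_slice guarantees i * stride is in bounds.
            Some(unsafe { &*(self.ptr.add(i * self.stride) as *const T) })
        } else {
            None
        }
    }
}

fn main() {
    let v = vec![10u32, 20, 30];
    let s = Strided::from_slice(&v);
    assert_eq!(s.get(1), Some(&20));
    assert_eq!(s.get(3), None);
}
```

A real version would also need a constructor that picks a non-default stride (e.g. viewing the &str prefix of each String), which is exactly where the layout problems discussed below come in.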


This is… interesting. I imagine that we could extend the slice type [T] to include the stride as part of its type (let’s use Stride<T, const n: usize> as a strawman). The neat thing is that &Stride<T, n> has the same layout as &[T]. The only difference now is that an access is a pointer offset by n (known at compile-time), instead of by size_of::<T>(). This is just as fast as a normal slice access, and [T] becomes an alias for Stride<T, size_of::<T>()>. Of course, we’d need to require that n be at least as large as the size of T.

In your example, we could coerce &[String] into a &Stride<&str, { 3 * size_of::<usize>() }>. Of course, to take full advantage of this we’d need const generics, so the most general type you accept is

fn foo<T, const n: usize>(stride: &Stride<T, n>) { /* ... */ }
// imagine we add `_` as sugar for the "minimally constrained parameter"
// so that we can write this somewhat less painfully:
fn foo<T>(stride: &Stride<T, _>) { /* ... */ }

Even neater, with const generics you can define this type entirely in a crate!


I think even that wouldn’t quite work right, though, since Vec (and thus String) is basically (RawVec, Length), and thus you’d get a &str with the capacity of the string instead of its length. (Well, unless you’re unlucky and repr(rust) decides to reverse the order of the fields in Vec, in which case you can str::from_raw_parts(length as *const u8, ptr as usize), which is even worse.)


Yeah I realized this last night. It’s pretty unfortunate that there’s no way to enforce field-reordering to be layout-compatible with &str. It’d certainly be neat to be able to write #[repr(prefix(T))] or whatever, though I’m not sure how useful this is. We’re already pretty far in the weeds trying to make this work.


Yeah, my biggest feeling here is that while I’m sure there’s a way we could make this work, it feels like “just use an iterator” is the right answer (perhaps with better random-access support than exists today).
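One way “just use an iterator” looks in practice, as a sketch: accept any iterator of &str, so each caller pays only for its own adaptation and no new Vec is allocated.

```rust
// Accept any iterator yielding &str, rather than a slice of any
// particular element type.
fn test<'a>(items: impl IntoIterator<Item = &'a str>) {
    for s in items {
        println!("{}", s);
    }
}

fn main() {
    let owned: Vec<String> = vec!["a".into(), "b".into()];
    test(owned.iter().map(|s| s.as_str())); // borrows, no new Vec
    let slices: &[&str] = &["x", "y"];
    test(slices.iter().copied());
}
```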


It sounds like we’re already working on solving this issue in other areas of the language, so it’s likely it will come to parameters too eventually.

The mental model of patterns has shifted a bit with this change, to bring it into line with other aspects of the language. For example, when writing a for loop, you can iterate over borrowed contents of a collection by borrowing the collection itself:

let my_vec: Vec<i32> = vec![0, 1, 2];

for x in &my_vec { ... }

The idea is that an &T can be understood as a borrowed view of T, and so when you iterate, match, or otherwise destructure a &T, you get a borrowed view of its internals as well.
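The same mental model can be seen with plain destructuring (a small sketch of the match-ergonomics behavior described above):

```rust
fn main() {
    let pair = (String::from("hi"), 3);
    // Destructuring a &T yields borrowed views of its fields,
    // just like iterating over &my_vec yields &i32 items:
    let (s, n) = &pair;
    let _: &String = s;
    let _: &i32 = n;
    assert_eq!(s.as_str(), "hi");
    assert_eq!(*n, 3);
}
```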