I would agree here. Interestingly, though, the docs on `AsMut` give a somewhat awkward example:
> **Examples**
>
> Using `AsMut` as trait bound for a generic function we can accept all mutable references that can be converted to type `&mut T`. Because `Box<T>` implements `AsMut<T>` we can write a function `add_one` that takes all arguments that can be converted to `&mut u64`. Because `Box<T>` implements `AsMut<T>`, `add_one` accepts arguments of type `&mut Box<u64>` as well:
>
> ```rust
> fn add_one<T: AsMut<u64>>(num: &mut T) {
>     *num.as_mut() += 1;
> }
> let mut boxed_num = Box::new(0);
> add_one(&mut boxed_num);
> assert_eq!(*boxed_num, 1);
> ```
Wouldn't this be better written as follows?
```rust
fn add_one(num: &mut u64) {
    *num += 1;
}

fn main() {
    let mut boxed_num = Box::new(0);
    add_one(&mut boxed_num);
    assert_eq!(*boxed_num, 1);
}
```
The `.as_mut()` call is entirely superfluous in this example. I'm not sure about an API that takes an `impl AsMut<u64>`, though. Is there any real use case for it? What are the real use cases of `AsMut` anyway?
Note that the same oddities arise here as with `AsRef`:
```rust
fn add_one<T: AsMut<u64>>(num: &mut T) {
    *num.as_mut() += 1;
}

fn main() {
    let mut boxed_num = Box::new(0);
    add_one(&mut boxed_num);
    assert_eq!(*boxed_num, 1);

    let mut num = 0;
    //add_one(&mut num); // fails!
    let mut referenced_num = &mut num;
    //add_one(&mut referenced_num); // fails too!
}
```
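Both commented-out calls fail for the same reason: std provides no reflexive `impl AsMut<u64> for u64`, so neither `T = u64` nor `T = &mut u64` (whose blanket impl also bottoms out at `u64: AsMut<u64>`) satisfies the bound. A minimal sketch of what *does* satisfy it — the `Wrapper` newtype here is purely hypothetical, not from the docs:

```rust
// Hypothetical newtype that opts in to AsMut<u64> explicitly,
// the way Box<u64> does via std's impl.
struct Wrapper(u64);

impl AsMut<u64> for Wrapper {
    fn as_mut(&mut self) -> &mut u64 {
        &mut self.0
    }
}

fn add_one<T: AsMut<u64>>(num: &mut T) {
    *num.as_mut() += 1;
}

fn main() {
    let mut w = Wrapper(0);
    add_one(&mut w); // compiles, unlike `add_one(&mut 0u64)`
    assert_eq!(w.0, 1);
}
```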
Compare with:
```diff
-fn add_one<T: AsMut<u64>>(num: &mut T) {
-    *num.as_mut() += 1;
+fn add_one(num: &mut u64) {
+    *num += 1;
 }
 …
-    //add_one(&mut num); // fails!
+    add_one(&mut num); // works!
 …
-    //add_one(&mut referenced_num); // fails too!
+    add_one(&mut referenced_num); // works too!
```
where the previously failing lines now compile properly. (Playground)
I'm asking because I have been thinking about how to improve the documentation further. This is what I've come up with so far:
> **Generic Implementations**
>
> `AsRef` auto-dereferences if the inner type is a reference or a mutable reference (e.g.: `foo.as_ref()` will work the same if `foo` has type `&mut Foo` or `&&mut Foo`).
>
> Note that due to historic reasons, the above currently does not hold generally for all dereferenceable types, e.g. `foo.as_ref()` will not work the same as `Box::new(foo).as_ref()`. Instead, many smart pointers provide an `as_ref` implementation which simply returns a reference to the pointed-to value (but do not perform a cheap reference-to-reference conversion for that value). However, `AsRef::as_ref` should not be used for the sole purpose of dereferencing; instead '`Deref` coercion' can be used:
>
> ```rust
> let x = Box::new(5i32);
> // Avoid this:
> // let y: &i32 = x.as_ref();
> // Better just write:
> let y: &i32 = &x;
> ```
>
> Types which implement `Deref` should consider implementing `AsRef` as follows:
>
> ```rust
> impl<T> AsRef<T> for SomeType
> where
>     T: ?Sized,
>     <SomeType as Deref>::Target: AsRef<T>,
> {
>     fn as_ref(&self) -> &T {
>         self.deref().as_ref()
>     }
> }
> ```
The idea here is to discourage superfluous `.as_ref()` and `.as_mut()` calls where deref coercion is what should actually be used instead. But the existing example in `AsMut`'s docs would conflict with that recommendation.
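To sanity-check that the suggested forwarding pattern works, here's a runnable sketch. `MyPtr` is a hypothetical smart pointer invented for illustration; it parameterizes over the inner type directly instead of writing `<SomeType as Deref>::Target`, which is equivalent for this type:

```rust
use std::ops::Deref;

// Hypothetical smart pointer, for illustration only.
struct MyPtr<T>(T);

impl<T> Deref for MyPtr<T> {
    type Target = T;
    fn deref(&self) -> &T {
        &self.0
    }
}

// The forwarding impl suggested above: delegate as_ref to the target.
impl<T, U> AsRef<U> for MyPtr<T>
where
    U: ?Sized,
    T: AsRef<U>,
{
    fn as_ref(&self) -> &U {
        self.deref().as_ref()
    }
}

fn main() {
    let p = MyPtr(String::from("hello"));
    // Goes through `String: AsRef<str>` rather than merely dereferencing.
    let s: &str = p.as_ref();
    assert_eq!(s, "hello");
}
```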
So my question is: what's the actual use of `AsMut`? (Besides dereferencing `Box`es, for which it maybe shouldn't be used?)
Update:
PR #28811 added the non-transitive dereferencing `AsMut` (and `AsRef`) implementation(s) to `Box` (and `Rc` and `Arc`), saying that:
> These common traits were left off originally by accident from these smart pointers, […]
>
> […]
>
> These trait implementations are "the right impls to add" to these smart pointers and would enable various generalizations such as those in #27197.
If I understand right, then the referenced use case is being generic over `AsRef<[u8]>`. Aside from that being the non-mutable case, an implementation like
```rust
impl<T, U> AsMut<U> for Box<T>
where
    T: ?Sized + AsMut<U>,
    U: ?Sized,
{
    fn as_mut(&mut self) -> &mut U {
        self.deref_mut().as_mut()
    }
}
```
would do just fine (or even better, Playground) for that particular use case, because `AsRef` and `AsMut` are reflexive for slices.
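That transitive impl can't actually be written outside std for the real `Box` (orphan rule, plus it would overlap with std's existing `AsMut<T> for Box<T>`), so here's a sketch using a toy `MyBox` type — the type and the example around it are my own illustration, not from the PR:

```rust
use std::ops::{Deref, DerefMut};

// Toy box type, since the transitive impl can't be added to the real Box here.
struct MyBox<T: ?Sized>(Box<T>);

impl<T: ?Sized> Deref for MyBox<T> {
    type Target = T;
    fn deref(&self) -> &T {
        &self.0
    }
}

impl<T: ?Sized> DerefMut for MyBox<T> {
    fn deref_mut(&mut self) -> &mut T {
        &mut self.0
    }
}

// The transitive impl from the text, on MyBox instead of Box.
impl<T, U> AsMut<U> for MyBox<T>
where
    T: ?Sized + AsMut<U>,
    U: ?Sized,
{
    fn as_mut(&mut self) -> &mut U {
        self.deref_mut().as_mut()
    }
}

fn main() {
    let mut b = MyBox(Box::new(vec![1u8, 2]));
    // MyBox<Vec<u8>>: AsMut<[u8]>, via Vec<u8>: AsMut<[u8]>.
    let s: &mut [u8] = b.as_mut();
    s[0] = 9;
    assert_eq!(*b, [9, 2]);
}
```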
So maybe the example in the `AsMut` docs should be replaced to use an `AsMut<[u8]>` instead of `AsMut<u64>`. Generally, using `AsRef<T>` or `AsMut<T>` in cases where reflexivity is not implemented for a particular type `T` leads to odd behavior (because you cannot pass `T`, `&T`, or `&mut T`, as shown in the Playground above).
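For instance, a function generic over `AsMut<[u8]>` behaves much better, because `AsMut` *is* reflexive for slices (`impl<T> AsMut<[T]> for [T]`), so owned containers, arrays, and boxed slices all work. The `zero_first` helper below is hypothetical, just to illustrate the shape of such an API:

```rust
// Accepts anything convertible to &mut [u8]: Vec<u8>, [u8; N],
// Box<[u8]>, and (thanks to the reflexive slice impl) &mut [u8] itself.
fn zero_first<T: AsMut<[u8]> + ?Sized>(buf: &mut T) {
    if let Some(first) = buf.as_mut().first_mut() {
        *first = 0;
    }
}

fn main() {
    let mut v = vec![1u8, 2, 3];
    zero_first(&mut v); // Vec<u8>: AsMut<[u8]>
    assert_eq!(v, [0, 2, 3]);

    let mut a = [1u8, 2, 3];
    zero_first(&mut a); // [u8; 3]: AsMut<[u8]>
    assert_eq!(a, [0, 2, 3]);

    let mut boxed: Box<[u8]> = Box::new([1u8, 2, 3]);
    zero_first(&mut boxed); // Box<[u8]>: AsMut<[u8]>
    assert_eq!(*boxed, [0, 2, 3]);
}
```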
There aren't really many implementations of `AsMut` in std, so perhaps the following could be a suitable example for the docs: Playground. (Along with some descriptive text.)