`for _ in _` loops could have the same semantics as `while let Some(_) = _.next()`

Consider the case where you want to run over an iterator, but also need access to the iterator during iteration. An example:

let mut iter = something.iter().peekable();
while let Some(v) = iter.next() {
    if some_fn(v, iter.peek()) {
        // do something
    }
}
If you try to do this using `for v in iter` you get a borrow error, because the `for` loop consumes the iterator rather than borrowing it. Could the behavior of `for _ in _` be changed to more closely mirror the `while let` version, by only borrowing the iterator when getting the next element? This would have to happen at an edition boundary, I assume.
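To make the motivating pattern concrete, here is a compiling sketch of the `while let` form with illustrative names (`sums_with_next` is not from the original post):

```rust
// Sketch: iterate while also peeking ahead. This works because `next()` only
// borrows the iterator for the duration of the call.
fn sums_with_next(v: &[i32]) -> Vec<i32> {
    let mut out = Vec::new();
    let mut iter = v.iter().peekable();
    while let Some(&cur) = iter.next() {
        // `iter` is still usable inside the body, so peeking is fine:
        if let Some(&&next) = iter.peek() {
            out.push(cur + next);
        }
    }
    out
}
```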

How would this interact with `for x in collection` where `collection` doesn't implement `Iterator`?

For that matter, what about `for x in collection.iter()` - would the iterator still be somehow available inside the loop?


The current desugaring of `for x in y { z }` is like

    let mut iter = IntoIterator::into_iter(y);
    while let Some(x) = iter.next() {
        z
    }

so this wouldn't work naively, because `y` is always consumed by `into_iter()` (even though in some cases that call is the identity function). There might be some fancier rule that would work, but I doubt there's one that would be considered reasonable to put in the language.
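Writing the desugaring out by hand shows where the move happens; a minimal sketch (`collect_squares` is an illustrative name):

```rust
// Hand-written equivalent of `for x in v { out.push(x * x) }`.
fn collect_squares(v: Vec<i32>) -> Vec<i32> {
    let mut out = Vec::new();
    let mut iter = IntoIterator::into_iter(v); // `v` is moved here
    while let Some(x) = iter.next() {
        out.push(x * x);
    }
    out
}
```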



Maybe this could be enabled by a more advanced form of NLL that allows lifetimes to have holes in them. Then a user could write `for x in iter.by_ref()`, and since `x` doesn't borrow from the borrow created by `by_ref` (guaranteed by the contract of `Iterator::next`), `iter` would be available to borrow in the loop body, for example by calling `peek`.
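For comparison, `by_ref()` already works today when the iterator is only needed *after* the loop, just not inside its body; a sketch (function name is illustrative):

```rust
// `by_ref()` lends the iterator to the `for` loop instead of moving it,
// so it becomes usable again once the loop ends.
fn take_then_rest(v: Vec<i32>) -> (Vec<i32>, Vec<i32>) {
    let mut iter = v.into_iter();
    let mut first = Vec::new();
    for x in iter.by_ref().take(2) {
        first.push(x);
    }
    let rest: Vec<i32> = iter.collect(); // `iter` is available again here
    (first, rest)
}
```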

However, I'm not 100% sure that lifetimes with holes are sound or even possible. I'm way out of the loop on NLL.

I think I prefer the need for different constructs, as it's a clue that something besides the loop header might modify the iterator. For example, with `while let`, something in the body of the loop might call `iter.next()`, too.
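An example of a `while let` body that advances the iterator itself (a sketch; the skipping rule is made up for illustration):

```rust
// Skip the element that follows each negative number. This is only possible
// because the loop body can call `iter.next()` directly.
fn skip_after_negative(v: Vec<i32>) -> Vec<i32> {
    let mut out = Vec::new();
    let mut iter = v.into_iter();
    while let Some(x) = iter.next() {
        if x < 0 {
            iter.next(); // consume and discard the following element
        } else {
            out.push(x);
        }
    }
    out
}
```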


Thanks all for the comments.

I was thinking of this desugaring changing (over an edition boundary) to work on the iterator `y` directly when it already implements `Iterator`. This would not be backwards compatible for types that implement `Iterator` but whose `IntoIterator::into_iter` is not the identity.

I think the comments have put me off the original idea, because it increases complexity (different behaviour in different situations).

Such types don't exist, since for every `Iterator` type[1], the `IntoIterator` implementation is given by the standard library's blanket implementation.
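That blanket implementation is effectively `fn into_iter(self) -> I { self }` for any `I: Iterator`, which can be checked directly:

```rust
// For an existing iterator, `IntoIterator::into_iter` is the identity:
// calling it explicitly changes nothing about the iteration.
fn identity_demo() -> Vec<i32> {
    let iter = (1..4).map(|x| x * 2);
    let same = IntoIterator::into_iter(iter); // identity for iterators
    same.collect()
}
```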

There's other breakage, though. E.g. a closure `|| for _ in foo {}` would no longer move `foo`, so this closure could start to fail to be `'static`.

Edit: On that point, it should currently be fine to turn a type that implements `IntoIterator` into a full `Iterator` without breakage, i.e. semver-compatibly, but with the new desugaring such a change could cause breakage (due to the closure capturing behavior mentioned above). If closure capturing were the only problem (which I'm not sure it is), then special capturing rules might technically solve it, but at that point the increase in language complexity would be absurd for the little benefit of not needing to type a `Some` and a `.next()`.
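Under the current desugaring the capture really is a move, which is what makes such closures `'static`; a sketch of the pattern that would be sensitive to this change:

```rust
use std::thread;

// Because `for x in foo` consumes `foo`, the closure captures it by value
// even without the `move` keyword, so it satisfies the `'static` bound
// required by `thread::spawn`.
fn sum_on_thread(foo: Vec<i32>) -> i32 {
    thread::spawn(|| {
        let mut sum = 0;
        for x in foo {
            sum += x;
        }
        sum
    })
    .join()
    .unwrap()
}
```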

  1. ignoring unsized cases, which are irrelevant for the `for` desugaring ↩︎


If the `for` loop supported streaming iterators, there could be a version of `peekable` that iterates over tuples of `(current_moved, &next_borrowed)` elements, so you could do:

for (current, next) in iter.streaming_peekable() {
    // `current` is moved, `next` is borrowed
}
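`streaming_peekable` doesn't exist, but a non-streaming approximation of those lookahead tuples is possible today with std's `Peekable` (a sketch; names are illustrative, and the lookahead is copied rather than borrowed):

```rust
// Pair each element with a copy of its successor, using only std's Peekable.
fn with_lookahead(v: Vec<i32>) -> Vec<(i32, Option<i32>)> {
    let mut out = Vec::new();
    let mut iter = v.into_iter().peekable();
    while let Some(current) = iter.next() {
        out.push((current, iter.peek().copied()));
    }
    out
}
```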