let crash = foo[foo.len()];

In Greg’s KubeCon keynote, he gave the above example.

It makes me wonder why the compiler doesn’t catch this (and similarly predictable) out-of-bounds indexing into arrays and Vec. That would not only catch such beginner mistakes early; the same analysis can sometimes prove other accesses to be within bounds, which is one step toward optimizing away the run-time check.

N.b.: Compile-time array indexing bounds checks for constants discussed a similar problem, without giving a reason to postpone an avoidable catastrophe to runtime.
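A minimal sketch of the asymmetry being described (variable names are illustrative): a constant out-of-bounds index into a fixed-size array is already rejected at compile time by the deny-by-default `unconditional_panic` lint, while the same mistake on a Vec compiles and only panics at runtime.

```rust
fn main() {
    // For fixed-size arrays, a constant out-of-bounds index is a
    // compile error by default (`unconditional_panic` is deny-by-default):
    //
    //     let arr = [1, 2, 3];
    //     let _ = arr[3]; // error: this operation will panic at runtime
    //
    // For a Vec, the equivalent mistake compiles and only fails at runtime.
    let foo = vec![1, 2, 3];

    // Silence the default panic hook so the expected panic doesn't print.
    std::panic::set_hook(Box::new(|_| {}));
    let result = std::panic::catch_unwind(|| foo[foo.len()]);
    assert!(result.is_err()); // the out-of-bounds index panicked at runtime
}
```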

The reason this is not a compile error is for consistency: out of bounds accesses are defined as panics, not as invalid code. It would be inconsistent to have them panic in some cases and fail to compile in other cases.

I don't know why the unconditional_panic warning doesn't trigger.

4 Likes

Hmm, so improving the compiler would be a breaking change… I wonder if any code could possibly benefit from having this panic only at run time.

“Why the unconditional_panic warning doesn’t trigger” sounds like a bug? And once that is fixed, it would be a nice candidate for strict mode, where you can opt in to such things becoming an error.

The unconditional_panic warning is already set to deny by default, i.e. it turns into a compile error by default.

But it doesn't even trigger in the following case, so I don't know what the logic behind when it triggers is, since this clearly is an unconditional panic:

pub fn f() {
    panic!("things are bad")
}

This lint is for panics that are probably unintended, like 1 / 0 or [1, 2][10] (maybe it should be renamed to unintended_panic, or at least have its docs fixed: "The unconditional_panic lint detects an operation that will cause a panic at runtime." seems wrong).

The lint fires for out-of-bounds array accesses because arrays have their size known at compile time. But it could also fire for Vecs in some limited ways, on a best-effort basis. myvec[myvec.len()] seems like something that should be linted, indeed (and also let a = myvec.len(); myvec[a] and things like that).
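A sketch of the patterns such a best-effort lint could flag (names hypothetical), alongside the non-panicking alternative that already exists in std:

```rust
fn main() {
    let myvec = vec![10, 20, 30];

    // Patterns a best-effort lint could flag: indexing with the vec's own length.
    //
    //     myvec[myvec.len()]             // always panics
    //     let a = myvec.len(); myvec[a]  // same bug through a local
    //
    // The non-panicking alternative, slice::get, returns None instead:
    assert_eq!(myvec.get(myvec.len()), None);
    assert_eq!(myvec.get(1), Some(&20));
}
```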

Unconditional panics as a general feature are useful if you use todo!() or unimplemented!() to signal either that the function body is going to be written soon, or that it's intentionally not implemented. However, many people use panic!() for either of those situations (which is probably a mistake).
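To make the distinction concrete, here is a sketch (function name hypothetical) showing that todo!() carries its intent in the panic payload, which a plain panic!() call wouldn't:

```rust
// todo!() signals "this will be written soon" and panics with a
// standard message, unlike an ad-hoc panic!("...") call.
fn planned() -> u32 {
    todo!()
}

fn main() {
    // Silence the default panic hook so the expected panic doesn't print.
    std::panic::set_hook(Box::new(|_| {}));
    let err = std::panic::catch_unwind(planned).unwrap_err();
    let msg = err.downcast_ref::<&str>().unwrap();
    assert!(msg.contains("not yet implemented"));
}
```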

(Btw why is rustdoc saying that todo!() exists since 1.0.0? The source code clearly says #[stable(feature = "todo_macro", since = "1.40.0")])

7 Likes

There will literally always be cases that panic at runtime that the lint doesn't detect, because what's obvious to a human is different from what's obvious to reasonable-to-write-and-run lint code.

The lint is not a proof assistant; it's there to catch the simple and obvious cases.

(The optimizer is far more powerful at optimizing things away than the lint is.)

2 Likes

This seems like a great potential improvement.

5 Likes

Additionally, it follows from Rice's Theorem that any such check must have either false positives or false negatives.
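A sketch of why (all names and values illustrative): whether an index is in bounds can depend on arbitrary computation, which the compiler cannot decide in general. Here the index comes from a Collatz-style loop; the example uses get() so it stays panic-free either way:

```rust
// Counts the Collatz steps from n down to 1. Whether `steps(n)` is a valid
// index into an 8-element array is not decidable in general.
fn steps(mut n: u64) -> usize {
    let mut count = 0;
    while n != 1 {
        n = if n % 2 == 0 { n / 2 } else { 3 * n + 1 };
        count += 1;
    }
    count
}

fn main() {
    let v = [0; 8];
    // 6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1 is 8 steps,
    // so the index equals the length and is out of bounds.
    let i = steps(6);
    assert_eq!(i, 8);
    assert_eq!(v.get(i), None);
}
```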

3 Likes

This keeps coming up and I don't think it's a good argument. Technically it's correct: there will be false positives/negatives. But practically speaking, having false positives that reject valid code didn't stop us from developing the borrow checker; we just make a best effort! Regarding unconditional_panic, I think we can catch more obvious cases and let complex cases slide.

4 Likes

I've found the inverse issue, option_env! which is marked as 1.0.0 in the code but rustdoc renders it as being added later.

UPDATE: another one, stringify!

3 Likes

I think rustdoc is mixing up the versions from different use paths.

2 Likes

Sure! But:

  • you expect and program with the borrow checker in mind. It is a fundamental part of Rust without which it wouldn't work. Avoiding panics is not so central that many would be ok with working around false positives for them.

  • the borrow checker has some clear rules, designed to work for both users and the compiler, which "best effort" solutions usually don't. The borrow checker also offers backward compatibility guarantees, which exclude introducing new false positives.

2 Likes

That's why I advocated allowing false negatives (instead of false positives) in this case: we try to catch the simple and obvious cases with a deny-by-default lint, and simply let the complex cases compile.

I'm not sure whether adding more deny-by-default lints is considered a breaking change, though.

2 Likes

Given the way the lint is implemented today (evaluate every rvalue with CTFE and check if it panics), extending it to something like this is basically impossible with the same implementation, so you'd need to implement something entirely new to detect more cases.

But I'm sure that extensions to the lint that lint on obvious cases like this will be accepted just fine.

2 Likes

Does this mean rustc is doing constant folding in MIR optimizations, ahead of LLVM? Does this happen even in debug builds with no optimization? (Otherwise I don't see how this warning would get reported in debug builds)

I don't think this will satisfy everyone. If I were going to use this lint, I would want it to give me an assurance that code that passes the lint has no remaining panic calls, and I've seen several other people say similar things (I regret I can't remember who they were). Or possibly no remaining panic calls in a particular category; panics might be considered acceptable for memory allocation failure but not for array bounds checks, for instance.

That requires erring on the side of false positives rather than false negatives.

That just sounds like a different analysis altogether. This lint seems to fall more in the "teaching/catching silly mistakes" category.
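For codebases that do want the stricter "no panicking indexing at all" guarantee, one existing option is clippy's restriction lint clippy::indexing_slicing, which rejects every v[i] and forces the Option-returning API instead. A minimal sketch of that style (function name hypothetical):

```rust
// With `#![deny(clippy::indexing_slicing)]` in effect, `v[i]` would be
// rejected everywhere, so lookups go through `get` and handle the Option:
fn element(v: &[i32], i: usize) -> Option<i32> {
    v.get(i).copied()
}

fn main() {
    let v = [1, 2, 3];
    assert_eq!(element(&v, 2), Some(3));
    assert_eq!(element(&v, 3), None); // no panic, just None
}
```

This errs on the side of false positives (many of those indexing sites are provably fine), which is exactly the trade-off being discussed.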

5 Likes

You are thinking about a totally different lint. I was talking about "this is silly and logically wrong", not "I don't want (specific kind of) panic in my code".

3 Likes

It's just evaluating it for the lint, not as an optimization. There used to be an optimization as well that did the same thing but that's been replaced by a more principled approach to const propagation.