I was playing around with compiler explorer and I found some very strange optimizing behavior around unused arrays.
The code looks like this:

```rust
pub fn weird() {
    let _unused = [1_u32; 881];
}
```
For array sizes below 881 this generates the expected assembly, namely a single `ret`, but for sizes >= 881 some assembly is actually emitted. The threshold lies at different array sizes for different element types (it never happens with `u8`), and in all cases an array of 0s is always optimized away.
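For comparison, this is the zero-initialized variant, which in my tests is optimized away at every size (the function name is my own):

```rust
pub fn zeroed() {
    // An all-zero unused array is eliminated regardless of size,
    // even well above the 881-element threshold seen with [1_u32; N].
    let _unused = [0_u32; 2000];
}
```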
This is a table of maximum sizes of arrays that are optimized away for a few types:
| type | max. size |
|---|---|
| u16 | 2400 |
| u32 | 880 |
| u64 | 880 |
| u128 | 297 |
| &() | 900 |
| &str | 148 |
| (u{8, 16, 32, 128}, u{8, 16, 32, 128}) | 148 |
| (u64, u64) | 297 |
| (u8, u8, u8) | 99 |
| (u64, u64, u64) | 148 |
It's clear that the optimization depends at least somewhat on the size of the type, but the exact numbers are a mystery to me, and in any case the array should be optimized away completely in every situation. I should note that all of this data was gathered with `-C opt-level=3`.
Interestingly, if the array is initialized with 0s and then filled with 1s in a loop, the optimizer behaves correctly and eliminates the whole thing; the problem only occurs when using array literal initialization with `Copy` types.
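Concretely, this is the loop-based variant I mean, which per the observation above does get reduced to a single `ret` (the function name is my own):

```rust
pub fn weird_loop() {
    // Zero-init first, then fill with 1s in a loop. Unlike the
    // literal form [1_u32; 881], this whole body is optimized away
    // at -C opt-level=3.
    let mut _unused = [0_u32; 881];
    for x in _unused.iter_mut() {
        *x = 1;
    }
}
```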