I found this issue about changing the architecture for the compiler itself, but I haven't found anything about changing the default target for binaries built by the compiler to a newer baseline. This should have fewer of the distribution issues mentioned there, because it's only a default: users can override it to an older level if needed.
I think x86-64-v3 makes a good target, as it covers CPUs introduced over 10 years ago, similar to the change Red Hat is making for RHEL 10. In other words, the vast majority of users wouldn't observe any change aside from a small performance bump, and the rest can override the default pretty easily.
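For anyone who does need the older baseline, the override really is small. For example (a sketch; `x86-64` is LLVM's name for the baseline v1 CPU), in a project-level `.cargo/config.toml`:

```toml
# Opt back into the baseline x86-64 (v1) CPU for this project,
# overriding a hypothetical x86-64-v3 default.
[build]
rustflags = ["-C", "target-cpu=x86-64"]
```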
I was wondering whether there are any concerns with that, and what the process for making this change would look like?
A related idea in the same vein: change `cargo install` to use target-cpu=native by default, as it seems unlikely that those binaries ever get distributed off the machine they're built on.
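For reference, this is already possible as a manual opt-in today (ripgrep is just an example crate):

```sh
# Build the installed binary for the host CPU specifically;
# the result may not run on older machines.
RUSTFLAGS="-C target-cpu=native" cargo install ripgrep
```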
People do use cargo install into a specified root directory to produce images meant to run elsewhere; "native" would lead to breakage. It'd also make builds non-reproducible, in that the same build run on different systems would produce different results.
I'd love to see this happen. It would need a long lead time and plenty of announcements, and we might need to add a tier 2 target that still uses the old default.
I feel like those are CI cases that can update their flags. It's a fair point about reproducible builds, but is there infrastructure that Cargo leverages to validate that / is that planned to happen? If not, then I don't really understand the issue.
Not a hill I'm going to die on. Agreed on the announcements and tier 2 target. Pairing it with edition 2024 would have been ideal, but that ship has sailed. What would the next steps be?
IMO 10 years is not that long at this point. Some CPUs released more than 10 years ago are still perfectly usable today for office/browser usage.
For example, my dad still has a 2nd-gen Intel Core in his desktop PC, which only supports x86-64-v2. I also have an AMD FX-8350 (also x86-64-v2) lying around somewhere, and while I replaced it because compiling code was getting a bit too slow for what I wanted, it was still perfectly OK for many other tasks (including playing games).
I think trying to switch to x86-64-v2 first makes more sense, since it reduces the possibility of breakage. The lessons learned could then be applied to a future switch to x86-64-v3 when that becomes more reasonable.
As I understand it, this change of default doesn't remove any support; it just changes which hardware can run binaries produced with default settings. If that's accurate, my concerns aren't too severe, despite my having a desktop in use that doesn't support v3 (maybe not even v2, I'm unsure).
But a significant difference when it comes to such comparisons is that RHEL 8 will be supported for another 5 years or so and RHEL 9 will be supported into the 2030s, whereas the lifecycle of a supported Rust version is 6 weeks.
I also agree a move to v2 before v3 would be prudent.
Additionally, the default should not exceed the native target (otherwise binaries built and run on the same machine risk local miscompilation/unsoundness).
@quinedot that’s a very good point about not exceeding the native target. In that case, should we print a warning that the default is being auto-lowered on the user's behalf, or a hard error asking the user to explicitly select a target? I think a warning is fine, but just checking.
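To make the lowering concrete: the compiler would essentially have to do feature detection on the host against the chosen baseline. A minimal sketch of that kind of check, written with std's feature-detection macro and the features x86-64-v3 adds on top of v2 (illustrative only, not rustc's actual logic):

```rust
// Features x86-64-v3 adds over v2, per the x86-64 psABI levels.
#[cfg(target_arch = "x86_64")]
fn host_meets_x86_64_v3() -> bool {
    is_x86_feature_detected!("avx")
        && is_x86_feature_detected!("avx2")
        && is_x86_feature_detected!("bmi1")
        && is_x86_feature_detected!("bmi2")
        && is_x86_feature_detected!("f16c")
        && is_x86_feature_detected!("fma")
        && is_x86_feature_detected!("lzcnt")
        && is_x86_feature_detected!("movbe")
        && is_x86_feature_detected!("xsave")
}

#[cfg(not(target_arch = "x86_64"))]
fn host_meets_x86_64_v3() -> bool {
    false // not an x86-64 host, so the question doesn't apply
}

fn main() {
    if host_meets_x86_64_v3() {
        println!("host meets the v3 baseline; default is fine");
    } else {
        println!("host is below v3; warn and auto-lower the default");
    }
}
```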
I agree with @josh: this isn’t about removing support for a compilation target, just changing the default / tier 1 experience.
Also, I’m not convinced the risk profile of x86-64-v2 is any different from x86-64-v3, but I’m not going to argue about it if people feel strongly about moving to v2 first.
Note that while RHEL 9 moved to x86-64-v2, and RHEL 10 is moving to x86-64-v3, we're still on plain x86-64 in Fedora -- and I don't see that changing anytime soon. Many in the Fedora community do run it on old hardware, whereas enterprise customers are more likely to stick with older RHEL for their old hardware, which they can do because of the longer support period.
We need to be careful about the standard library though, at least while we don't have stable -Zbuild-std. The prebuilt std has to be built with the absolute minimum we want to support for that target, especially if the compiler's codegen default is higher.
Or we'd need to have two separate targets, one for current x86-64 and one for older x86-64, each of which would have its own build of the standard library.
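On nightly, that split can already be sidestepped by rebuilding std itself, which is roughly the mechanism a raised default could lean on once stabilized. A sketch (nightly-only; requires the rust-src component):

```sh
# One-time setup: rustup component add rust-src --toolchain nightly
# Rebuild core/alloc/std with the raised baseline instead of linking
# the prebuilt std that targets baseline x86-64:
RUSTFLAGS="-C target-cpu=x86-64-v3" \
  cargo +nightly build -Zbuild-std --target x86_64-unknown-linux-gnu
```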
I think we specifically need an x86-64-v1 target because the patents on it have expired, so I expect third parties to start making x86-64 CPUs. In particular, I'm currently working on a CPU that will hopefully support it: basically it'll have hardware acceleration for emulating any ISA at full speed (x86, RISC-V, PowerPC, etc.), and x86-64-v1 is one of the ISAs it'll hopefully support, with later x86-64 versions too if we can figure out how to do that without getting into legal trouble.
https://libre-chip.org/first_arch/index.html
It is likely that the Steam survey is skewed: it represents people using Steam, i.e. gamers, and gamers are more likely to have beefy hardware than those who don't game (since many games can be pretty demanding).
Unfortunately, I don't know of a good alternative data source. Anecdotally, I was using a Core 2 Duo Thinkpad at home for basic browsing and music playing until ca. 2017, and it was only two years ago that I switched my Sandy Bridge i5 desktop for a modern Ryzen. And I'm a software developer (so I too have reason to own powerful hardware), and I used to game as well (I don't really game much any more).
I don't see myself switching out either my current laptop or desktop for the foreseeable future. They work well enough, I prefer to use that money for other things.
While I don't know, I could also imagine that in parts of the world with lower average incomes than where I live, people would run older (or second hand) hardware for longer.
In conclusion: If you don't game (and you run Linux), your upgrade cycles can be very long. But I don't know of good data sources. Linux users tend to strongly dislike telemetry, for good reasons. But it also makes things like this hard. And even if you don't run Linux, many non-technical people will just stay on old unsupported OSes for quite a long time.
(I also have some even older computers, but that's because I have an interest in retro computing; I don't use my Pentium 3 as a daily driver, obviously.)
I mean, changing the default doesn't mean not supporting -v1 at all.
We could always offer -v1 and -v3 targets, pushing most people towards the latter unless they're specifically on old chips/OSes, just like how Rust no longer supports pre-Windows 10 on the default Windows target (see "Updated baseline standards for Windows targets" on the Rust Blog).
Possibly it's skewed towards older computers: a huge chunk of software runs on server hardware, which isn't covered by that survey. And letting servers be more energy-efficient is also an important factor.
Most Rust programmers will just pick the default for their binary releases (which is how a lot of Rust programs are installed, especially by people who aren't Rust programmers). Whatever computers we exclude from the default will be excluded from the majority of Rust programs.