There’s one point I want to make a little bit more crisply.
Throughout this discussion, I’ve talked about users who “don’t want to care” and just want an int that “works”. I think that isn’t the best explanation for what I have in mind.
Often, when I start working with an integer, I simply don’t know yet how big the integer will become. I could, of course, stop what I’m doing and think about it and try to figure it out, but I usually pick a really big integer size and restrict the size later once I have a better sense of how big the integer might become. In this sense, `i32` isn’t much different than `i16` or `i8`. If I have the cycles to think it through, I try to figure out whether the number I’m working with can fit into a smaller integer.
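As a concrete sketch of that workflow (the `count_lines` function and its bound are made up for illustration): start with a comfortably large type, and narrow it later with an explicit, checked conversion once the real range is clear.

```rust
// Hypothetical example: start wide, narrow later once the real range is known.
fn count_lines(text: &str) -> i64 {
    // First pass: pick a really big size so overflow isn't a concern yet.
    text.lines().count() as i64
}

fn main() {
    let n = count_lines("a\nb\nc");
    // Later, once we know the value fits, restrict the size explicitly;
    // `i32::try_from` fails loudly instead of silently truncating.
    let small = i32::try_from(n).expect("line count exceeded i32::MAX");
    println!("{small} lines");
}
```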
This is, in my mind, the use case for a default integer: not a sloppy “I don’t know anything about sizes” type, but an “I don’t know how big this will be, so let’s go with something reasonable” type.
I think if your experience is that 32-bit overflows are unicorns, your answer to “something reasonable” will be `i32`. If you’ve experienced gigabytes of data loss due to corruption caused by 32-bit overflow, you’ll tend towards `i64`.
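To make that overflow scenario concrete, here is a small, hypothetical sketch (the file sizes are invented): an `i32` byte counter gives out at roughly 2 GiB, a threshold ordinary data crosses all the time.

```rust
fn main() {
    // An i32 byte counter overflows past i32::MAX (~2.1 billion), i.e. ~2 GiB.
    let mut total: i32 = 0;
    for _ in 0..3 {
        let file_len: i32 = 1 << 30; // each "file" is 1 GiB
        // `checked_add` surfaces the overflow; plain `+` would panic in debug
        // builds and silently wrap in release builds.
        match total.checked_add(file_len) {
            Some(t) => total = t,
            None => {
                eprintln!("i32 overflowed at total = {total}; reach for i64");
                break;
            }
        }
    }
}
```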
Also, if your program is IO-bound, you’ll still get a lot out of the very strict memory management in Rust (making the costs of Rust generally worth it). You still might not care very much how big your integers are, and will be more willing to go with a suitably large integer if you aren’t yet sure how big they’ll get.
If your program is CPU-bound and integer-heavy, you probably care a lot more about integer sizes, and will be willing to spend more time up front carefully thinking about precisely how much space you actually need, and therefore will find the defaults far less useful.
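One back-of-the-envelope way to see why that care pays off (my illustration, not anything from the discussion above): the size you pick multiplies directly into memory footprint, and with it cache pressure.

```rust
use std::mem::size_of;

fn main() {
    // A million counters cost 1 MB as u8 but 8 MB as i64: in integer-heavy
    // hot loops, that can be the difference between fitting in cache and not.
    println!("u8:  {} bytes", size_of::<u8>() * 1_000_000);
    println!("i64: {} bytes", size_of::<i64>() * 1_000_000);
}
```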
With all of that said, that mostly argues for revisiting this question once we have reasonably fast `BigNum`s, which are possibly the right answer to the question: “I don’t yet know how big this will become, but I need an integer now.”
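Rust doesn’t ship a big integer in the standard library today, so as a purely illustrative sketch, here is roughly what that fallback might look like with the third-party `num-bigint` crate:

```rust
// Sketch of "I need an integer now, size unknown" with a big integer.
// Assumes the third-party num-bigint crate as a dependency.
use num_bigint::BigUint;

fn factorial(n: u32) -> BigUint {
    let mut acc = BigUint::from(1u32);
    for i in 2..=n {
        acc = acc * BigUint::from(i); // grows as needed, never overflows
    }
    acc
}

fn main() {
    // 30! overflows even u64; a big integer simply keeps growing.
    println!("{}", factorial(30));
}
```

The trade-off is exactly the “reasonably fast” caveat above: heap allocation and dynamic sizing are the costs that a fixed-size default like `i64` avoids.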