The question is: how far are we willing to extend the set of primitive types? If we have u512 without hardware support, why should we stop there? Why not also bake in u1024, u2048, and so on?
I think at this point it's unreasonable to keep extending the set of primitive integers ad infinitum. There should instead be a single, const-generic, preferably non-heap-allocating, arbitrary-width integer type for those who want to do big math. It could use intrinsics to lower to LLVM's arbitrary-width integers as well. This would resolve the endless debates over what does and doesn't belong in the core language.