As of today, Rust uses a naive implementation for converting integers to decimal strings in the integer Debug/Display implementations. So I propose adding an optimization for the most commonly used case, that is, converting to decimal characters.
The overall impact should be unmeasurable in most programs, but it might give us a minor speedup in serde and the rustc serializers.
I wrote a reasonably optimized version here. Further optimizations are possible, but I tried to keep the code size small (which I think is important); past this point it's a road of diminishing returns.
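To give a flavor of the approach, here is a minimal sketch of the core trick (not the exact code linked above, and written with current idioms rather than 1.2-era code): a 200-byte lookup table that emits two decimal digits per division, halving the number of divisions compared to the naive one-digit-at-a-time loop. The names `DEC_DIGITS_LUT` and `u64_to_decimal` are just for illustration:

```rust
// Pairs "00", "01", ..., "99" laid out back to back: pair i starts at
// byte offset i * 2 and spells the two-digit decimal form of i.
const DEC_DIGITS_LUT: &[u8; 200] = b"0001020304050607080910111213141516171819\
2021222324252627282930313233343536373839\
4041424344454647484950515253545556575859\
6061626364656667686970717273747576777879\
8081828384858687888990919293949596979899";

fn u64_to_decimal(mut n: u64) -> String {
    let mut buf = [0u8; 20]; // u64::MAX has 20 decimal digits
    let mut pos = buf.len();
    // Peel off two digits per iteration, writing from the back of the buffer.
    while n >= 100 {
        let rem = (n % 100) as usize;
        n /= 100;
        pos -= 2;
        buf[pos..pos + 2].copy_from_slice(&DEC_DIGITS_LUT[rem * 2..rem * 2 + 2]);
    }
    // Handle the remaining one or two leading digits.
    if n >= 10 {
        let rem = n as usize;
        pos -= 2;
        buf[pos..pos + 2].copy_from_slice(&DEC_DIGITS_LUT[rem * 2..rem * 2 + 2]);
    } else {
        pos -= 1;
        buf[pos] = b'0' + n as u8;
    }
    String::from_utf8(buf[pos..].to_vec()).unwrap()
}

fn main() {
    assert_eq!(u64_to_decimal(0), "0");
    assert_eq!(u64_to_decimal(42), "42");
    assert_eq!(u64_to_decimal(u64::MAX), "18446744073709551615");
}
```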
The benchmarks are nothing scientific, but I tried to avoid the most common pitfalls. Timing includes the formatting machinery (though the buffer is always preallocated), so it should be reasonably close to the real-world code path.
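The harness looks roughly like this (a simplified sketch, not the exact benchmark code; it needs a nightly toolchain for the unstable test crate). Each iteration formats through write!, so the fmt machinery is included in the timing, while the String buffer is preallocated once and reused so allocation stays out of the measurement:

```rust
#![feature(test)]
extern crate test;

use std::fmt::Write;
use test::{black_box, Bencher};

#[bench]
fn stdlib_64(b: &mut Bencher) {
    let mut buf = String::with_capacity(4096); // preallocated once, reused
    b.iter(|| {
        buf.clear();
        // A spread of magnitudes so every digit-count branch is exercised.
        for &n in &[0u64, 9, 99, 12_345, u64::MAX] {
            write!(buf, "{}", black_box(n)).unwrap();
        }
        black_box(buf.len())
    });
}
```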
Running with Rust 1.2 nightly on x64 Linux - Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz:

```
test tests::new_08    ... bench:    562 ns/iter (+/- 111)  (-18%)
test tests::new_16    ... bench:   1424 ns/iter (+/- 60)   (-08%)
test tests::new_32    ... bench:   3342 ns/iter (+/- 92)   (-16%)
test tests::new_64    ... bench:   7692 ns/iter (+/- 373)  (-48%)
test tests::stdlib_08 ... bench:    626 ns/iter (+/- 12)
test tests::stdlib_16 ... bench:   1540 ns/iter (+/- 113)
test tests::stdlib_32 ... bench:   3887 ns/iter (+/- 72)
test tests::stdlib_64 ... bench:  11436 ns/iter (+/- 317)
```