I'm working on something that builds on top of the fraction crate. The fraction crate nicely implements Display for its fractional and decimal types.
Now, the delimiter in a fraction is 0x2F '/' (ASCII SOLIDUS) instead of 0x2044 '⁄' (FRACTION SLASH), and for good reason: '/' is far easier to type than '⁄' and probably has much better font support too. However, FRACTION SLASH carries the proper Unicode semantics, and supporting fonts render fractions written with '⁄' better: 1⁄2. Thus, provided you have a supporting font, ⁄ displays much better than /. If you don't have a proper font, though, it ends up as an ugly replacement box, defeating the purpose of Display. It gets even worse if I try to write 1½ the proper Unicode way, as 11⁄2: with font support this renders nicely, but without it the user will not understand the program's output at all. I would say this is a design decision that bubbles up all the way to the end user: which character sets are supported?
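To make the trade-off concrete, here is a minimal sketch (plain std, no dependency on the fraction crate) that prints the same value with both delimiters; which line "looks right" depends entirely on the font of the output medium:

```rust
fn main() {
    let (numer, denom) = (1u64, 2u64);
    println!("{}/{}", numer, denom);         // 0x2F ASCII SOLIDUS: 1/2
    println!("{}\u{2044}{}", numer, denom);  // U+2044 FRACTION SLASH: 1⁄2
    println!("1{}\u{2044}{}", numer, denom); // mixed number 1½ written as 11⁄2
}
```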
I would propose the following options for an unambiguous choice of symbols when implementing Display (a rough sketch in code follows the list):
- ASCII: use only ASCII characters. A structure implementing an ASCII trait is guaranteed to be backward-compatible with any legacy C/C++ code that doesn't handle Unicode.
- Multilingual: use only characters from natural-language scripts, not special Unicode blocks.
- Semantic: use the Unicode code points with the most appropriate semantic meaning.
- Visual: use any combination of Unicode characters to make the output as visually appealing as possible.
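As a rough sketch of what those options could look like as traits, here are two of the four tiers. Everything below is hypothetical (nothing like it exists in the fraction crate today); each tier is exposed through an adapter newtype so it can be used anywhere Display is expected:

```rust
use std::fmt;

// Hypothetical per-tier formatting traits; Multilingual and Visual
// would follow the same pattern.
trait AsciiFmt {
    fn fmt_ascii(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result;
}
trait SemanticFmt {
    fn fmt_semantic(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result;
}

// Adapter newtypes: each tier plugs into anything expecting Display.
struct Ascii<'a, T>(&'a T);
struct Semantic<'a, T>(&'a T);

impl<T: AsciiFmt> fmt::Display for Ascii<'_, T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        self.0.fmt_ascii(f)
    }
}
impl<T: SemanticFmt> fmt::Display for Semantic<'_, T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        self.0.fmt_semantic(f)
    }
}

// Stand-in for the crate's fraction type.
struct Frac {
    numer: u64,
    denom: u64,
}

impl AsciiFmt for Frac {
    fn fmt_ascii(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}/{}", self.numer, self.denom) // 0x2F only: legacy-safe
    }
}
impl SemanticFmt for Frac {
    fn fmt_semantic(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}\u{2044}{}", self.numer, self.denom) // U+2044
    }
}

fn main() {
    let half = Frac { numer: 1, denom: 2 };
    println!("{}", Ascii(&half));    // 1/2
    println!("{}", Semantic(&half)); // 1⁄2 (font permitting)
}
```

The existing Display implementation would stay untouched; callers opt into a tier explicitly.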
The problem is then not necessarily in the traits themselves, but rather in the aliases that would be needed to go with them.
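To illustrate what I mean by the alias cost: in the fraction crate, Fraction is itself a type alias (for GenericFraction&lt;u64&gt;, if I recall the source correctly), so if each tier becomes a wrapper type, every public alias needs a per-tier twin. All names below are hypothetical stand-ins:

```rust
// Stand-in for the crate's generic type; the real crate exports
// `GenericFraction<T>` plus convenience aliases around it.
struct GenericFraction<T>(T, T);

// Owning variants of the hypothetical per-tier wrappers sketched above.
struct Ascii<T>(T);
struct Semantic<T>(T);

// The existing style of alias...
type Fraction = GenericFraction<u64>;
// ...and the extra aliases each display tier would drag in.
type AsciiFraction = Ascii<Fraction>;
type SemanticFraction = Semantic<Fraction>;
```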