Having used 2 * "foo" in Python, I must admit it's something I could definitely get behind; it can be really handy, for example when computing padding.
However, 1 + "bar" feels weird. In C, the result is "ar": it's just pointer arithmetic, and 5 + "bar" is undefined behavior, since the literal occupies only four bytes ('b', 'a', 'r', '\0') and you may not even form a pointer more than one element past the end.
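A minimal C sketch of that pointer arithmetic (the UB case is left as a comment rather than executed, since running it proves nothing):

    #include <stdio.h>

    int main(void) {
        /* "bar" decays to a pointer to its first character, so adding 1
           simply offsets that pointer: it now points at the 'a'. */
        const char *p = 1 + "bar";
        printf("%s\n", p); /* prints "ar" */

        /* The literal occupies 4 bytes: 'b', 'a', 'r', '\0'.
           4 + "bar" is the one-past-the-end pointer, still legal to form;
           5 + "bar" is undefined behavior even without dereferencing it. */
        return 0;
    }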
Similarly, 2u + 1i might seem logical, but what about 2u + (-3i)? In that case it would feel more natural for the result to be signed, because in everyday use you are far more likely to end up with a negative quantity than with one so big it does not fit in a signed 32- or 64-bit integer.
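For comparison, here is how C resolves that mix today (a minimal sketch assuming the common 32-bit int): the usual arithmetic conversions pick the unsigned type, with the wrap-around that implies.

    #include <stdio.h>

    int main(void) {
        unsigned int u = 2u;
        int i = -3;
        /* Under the usual arithmetic conversions, i is converted to
           unsigned: -3 becomes UINT_MAX - 2, so the sum wraps around. */
        printf("%u\n", u + i); /* prints 4294967295, not the intuitive -1
                                  a signed result type would give */
        return 0;
    }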