std::ascii in core?

Is there a reason std::ascii isn’t in core? It doesn’t seem to rely on I/O or alloc.

To give some background: I’m writing a library for formatting byte arrays using std::ascii::escape_default, and there’s no reason not to make it #[no_std], except that std::ascii isn’t in core.
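Roughly the kind of thing the library does - a minimal sketch (escape_bytes is a made-up name here, and the String output is only for the demo; escape_default itself never allocates):

```rust
use std::ascii;

// Render a byte slice as printable ASCII, escaping other bytes the way
// Rust byte literals do (b'\n' becomes the two characters `\` and `n`).
fn escape_bytes(bytes: &[u8]) -> String {
    bytes
        .iter()
        .flat_map(|&b| ascii::escape_default(b))
        .map(|b| b as char)
        .collect()
}

fn main() {
    assert_eq!(escape_bytes(b"hi\n\x7f"), "hi\\n\\x7f");
}
```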

AsciiExt has methods using Self::Owned that allocate for str and [u8], but I suppose those implementations could live in the alloc crate while the trait itself is in core.
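For reference, a simplified sketch of the trait’s shape (abridged from memory, not the exact definition):

```rust
pub trait AsciiExt {
    // For str this is String and for [u8] it is Vec<u8>, which is what
    // drags in allocation; for u8 and char it is just the type itself.
    type Owned;
    fn is_ascii(&self) -> bool;
    fn to_ascii_uppercase(&self) -> Self::Owned;
    fn to_ascii_lowercase(&self) -> Self::Owned;
    fn eq_ignore_ascii_case(&self, other: &Self) -> bool;
}
```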

Is liballoc allowed to impl AsciiExt for str if neither the trait nor str is defined in liballoc? (I know the answer is no for normal crates, but do the standard library crates have a way around this?)
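For ordinary crates, coherence forbids it - a sketch with made-up names:

```rust
// Crate `a` owns the trait:
pub trait AsciiLike {
    fn is_ascii_like(&self) -> bool;
}

// A downstream crate `b`, which defines neither the trait nor str, cannot
// write this impl; the compiler rejects it with E0117 ("only traits
// defined in the current crate can be implemented for arbitrary types"):
//
//     impl a::AsciiLike for str {
//         fn is_ascii_like(&self) -> bool { self.is_ascii() }
//     }
```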

Oh, right... but the #[lang = "str"] impl str block is already in alloc, so I think it might work.

Oh right - my use case only uses escape_default, so I hadn’t looked at those methods. It’s more difficult than I thought.

It would be great if it did - you never know when you need to escape ascii on microcontrollers! :slight_smile:

It should also be possible to put escape_default and EscapeDefault in core, AsciiExt in alloc, and then re-export them together in std::ascii.
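A sketch of that layout, modeled here as nested modules (the real homes would be core::ascii, alloc::ascii, and std::ascii):

```rust
mod core_ascii {
    // Would live in core: a small fixed-size buffer, no allocation.
    pub struct EscapeDefault;
    pub fn escape_default(_c: u8) -> EscapeDefault {
        EscapeDefault
    }
}

mod alloc_ascii {
    // Would live in alloc: the Owned associated types (String, Vec<u8>)
    // need the heap.
    pub trait AsciiExt {
        type Owned;
    }
}

mod std_ascii {
    // std::ascii re-exports both halves, so downstream code is unchanged.
    pub use super::alloc_ascii::AsciiExt;
    pub use super::core_ascii::{escape_default, EscapeDefault};
}
```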


This would solve my use case, but I feel some of the methods on AsciiExt are useful without alloc, for example to_ascii_uppercase on a u8 (sketched below). Maybe it isn't worth the hassle of including them in core, though.

EDIT: or maybe it's simply not possible because of the coherence rules for impls.
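Concretely, what I mean by useful without alloc, with the API as it stood at the time (nothing here allocates):

```rust
use std::ascii::AsciiExt; // the trait had to be in scope back then

fn main() {
    // For u8, Self::Owned is u8, so this never touches the heap:
    assert_eq!(b'a'.to_ascii_uppercase(), b'A');
    assert_eq!(b'!'.to_ascii_uppercase(), b'!'); // non-letters unchanged
}
```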

We’ve recently moved AsciiExt’s methods off of the trait and into the individual types as inherent impls.
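With the inherent impls, no trait import is needed, and the u8 methods are defined in core, so for example this builds in a #[no_std] crate:

```rust
#![no_std]

pub fn upcase(b: u8) -> u8 {
    b.to_ascii_uppercase() // inherent method on u8, available in core
}
```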


In that case, I'll just file an issue suggesting the same move for escape_default and EscapeDefault.
