Likewise, building up data in (representation-specified) structures in memory and then streaming it out verbatim to disk or network is a pretty common pattern. You can argue that it's an antipattern, and I'd tend to agree, but it's easy and readable, so I think it's here to stay. That seems like something worth making easy to get right, and hard to get wrong.
Yes, I think it is an antipattern. Beyond the data-representation mismatch itself, the data-validation and therefore security implications are significant (you can say security doesn't matter for your application, but it is always bad if UB occurs).
Making every such structure a byte array and doing explicit endian-swapping reads and writes on it is certainly an option, but it's pretty clunky.
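For concreteness, here is what that byte-array approach looks like in plain std Rust, using `to_be_bytes`/`from_be_bytes`. The struct and wire layout are made up for illustration:

```rust
// Hypothetical wire format: a 2-byte big-endian id followed by a
// 4-byte big-endian length. We serialize into a plain byte buffer
// with explicit endian conversions instead of streaming the struct's
// in-memory representation out verbatim.
struct Header {
    id: u16,
    len: u32,
}

impl Header {
    fn to_bytes(&self) -> [u8; 6] {
        let mut buf = [0u8; 6];
        buf[0..2].copy_from_slice(&self.id.to_be_bytes());
        buf[2..6].copy_from_slice(&self.len.to_be_bytes());
        buf
    }

    fn from_bytes(buf: &[u8; 6]) -> Self {
        Header {
            id: u16::from_be_bytes([buf[0], buf[1]]),
            len: u32::from_be_bytes([buf[2], buf[3], buf[4], buf[5]]),
        }
    }
}

fn main() {
    let h = Header { id: 0x0102, len: 0x0A0B_0C0D };
    let bytes = h.to_bytes();
    // Big-endian byte order regardless of host endianness.
    assert_eq!(bytes, [0x01, 0x02, 0x0A, 0x0B, 0x0C, 0x0D]);
    let back = Header::from_bytes(&bytes);
    assert_eq!(back.id, h.id);
    assert_eq!(back.len, h.len);
}
```

It works and has no UB, but every field needs a hand-written offset and conversion, which is exactly the clunkiness being complained about.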
There are probably proc-macro derive solutions for this already, or at least one could be implemented, so you wouldn't have to write it all manually.
`#[repr(bigendian)]` and `#[repr(littleendian)]` would be attachable to any of the fixed-width built-in integer types. They'd only be valid where `#[repr(C)]` or `#[repr(packed)]` is already in effect.
I feel this is a rather arbitrary attribute. Attributes also don't play well with generics.
An alternative to an attribute is an auto trait, or a regular trait with its own derive macro. What is the reason to prefer attributes over traits?
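As a sketch of the trait-based direction, here is roughly what a derive macro could expand to. The trait name `ToWire` and its shape are invented for illustration; the `Header` impl is written by hand but mirrors what `#[derive(ToWire)]` would generate:

```rust
// Hypothetical trait for types with a defined big-endian wire encoding.
trait ToWire {
    fn write_be(&self, out: &mut Vec<u8>);
}

// Base impls for the fixed-width integers, using std conversions.
impl ToWire for u16 {
    fn write_be(&self, out: &mut Vec<u8>) {
        out.extend_from_slice(&self.to_be_bytes());
    }
}

impl ToWire for u32 {
    fn write_be(&self, out: &mut Vec<u8>) {
        out.extend_from_slice(&self.to_be_bytes());
    }
}

struct Header {
    id: u16,
    len: u32,
}

// What a derive would generate: delegate to each field in declaration
// order, so the encoding follows the struct layout without any unsafe
// reinterpretation of memory.
impl ToWire for Header {
    fn write_be(&self, out: &mut Vec<u8>) {
        self.id.write_be(out);
        self.len.write_be(out);
    }
}

fn main() {
    let mut buf = Vec::new();
    Header { id: 0x0102, len: 0x0A0B_0C0D }.write_be(&mut buf);
    assert_eq!(buf, [0x01, 0x02, 0x0A, 0x0B, 0x0C, 0x0D]);
}
```

Unlike an attribute, the trait composes with generics (`fn send<T: ToWire>(value: &T)`), and the endianness choice lives in the trait rather than in the type's memory representation.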