Idea: trait bounds on constants?

In RustCrypto we have a generic implementation of the CCM mode. This mode is generic over two integer parameters (tag and nonce sizes), and the algorithm is defined only for certain values of them. With typenum, which we currently use, defining this restriction is easy enough: we simply introduce (sealed) traits implemented only for the valid sizes. But what should we do in a hypothetical migration to const generics?
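
For context, here is a minimal sketch of that sealed-trait pattern over typenum sizes (illustrative names only, not the exact RustCrypto API):

use core::marker::PhantomData;
use typenum::{U7, U8, U13};

mod sealed {
    pub trait Sealed {}
}

// Implemented only for the nonce sizes CCM allows, so an invalid
// size simply fails to compile.
pub trait NonceSize: sealed::Sealed {}

impl sealed::Sealed for U7 {}
impl NonceSize for U7 {}
impl sealed::Sealed for U8 {}
impl NonceSize for U8 {}
// ... impls for U9 through U12 elided ...
impl sealed::Sealed for U13 {}
impl NonceSize for U13 {}

pub struct Ccm<C, N: NonceSize> {
    cipher: C,
    _nonce_size: PhantomData<N>,
}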

One way is to abuse the proposed const_evaluatable_checked syntax:

struct Ccm<C, const N: usize, const M: usize>
where
    C: BlockCipher,
    [(); is_valid_nonce_size(N)]: Sized,
    [(); is_valid_tag_size(M)]: Sized,
{ .. }

const fn is_valid_nonce_size(n: usize) -> usize {
    match n {
        7..=13 => 0,
        _ => panic!("invalid CCM nonce size"),
    }
}
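
A matching is_valid_tag_size looks the same; assuming the even tag sizes from RFC 3610, something like:

const fn is_valid_tag_size(m: usize) -> usize {
    match m {
        // valid CCM tag sizes per RFC 3610
        4 | 6 | 8 | 10 | 12 | 14 | 16 => 0,
        _ => panic!("invalid CCM tag size"),
    }
}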

The code can be improved a bit by the introduction of hypothetical require clauses:

struct Ccm<C, const N: usize, const M: usize>
where C: BlockCipher
require is_valid_nonce_size(N), is_valid_tag_size(M)
{ .. }

const fn is_valid_nonce_size(n: usize) -> bool { ... }

But it still feels quite weird to restrict the space of valid constant parameters using const fns. We effectively use an imperative approach instead of the usual declarative one. This approach also does not allow third-party crates to extend the set of allowed const parameters (we don't need that in CCM's case, but it could be really useful in some use cases).

Const parameters behave like types in many regards, so how about allowing trait bounds on const parameters?

struct Ccm<C, const N: usize, const M: usize>
where C: BlockCipher, N: NonceSize, M: TagSize
{ .. }

trait NonceSize {
    // potential helper methods tied to a constant.
    // constants will not be able to implement traits with
    // methods containing `self` parameters.
    fn foo();
}

impl NonceSize for const 4usize { .. }
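
Under this strawman, the extensibility point mentioned above would also fall out naturally: a third-party crate could widen the set of accepted values simply by adding its own impls (hypothetical syntax, values purely illustrative):

// in a downstream crate:
impl NonceSize for const 16usize { .. }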

Of course, this method is only practical when the number of accepted values is relatively small.

Unresolved questions:

  • Should we distinguish, in a trait's definition, between general traits and those intended to be implemented by constants?
  • Calling syntax for trait methods on constants.

nalgebra has an interesting approach / prior art here, in that they have roughly the equivalent of

struct Ccm<C, const N: usize, const M: usize>
where
    C: BlockCipher,
    Const<N>: NonceSize,
    Const<M>: NonceSize,
{ ... }

trait NonceSize { .. }
struct Const<const N: usize> { .. }

impl NonceSize for Const<4> { .. } 

So this shows that it's already possible to "lift" const parameters into type parameters that can implement traits, etc.
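
For the record, a fully compilable version of that lifting pattern with today's const generics might look roughly like this (illustrative, not nalgebra's actual API):

// Zero-sized wrapper that lifts a const parameter into the type system.
pub struct Const<const N: usize>;

pub trait NonceSize {}

// Implemented only for the allowed sizes.
impl NonceSize for Const<7> {}
impl NonceSize for Const<8> {}
// ... impls for 9 through 12 elided ...
impl NonceSize for Const<13> {}

pub struct Ccm<C, const N: usize>
where
    Const<N>: NonceSize,
{
    cipher: C,
}

Any use of Ccm with an unsupported N then fails the Const<N>: NonceSize bound at compile time.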

In the face of that, I don't think actually allowing trait impls for const values is too far out there.

That said, I'd need to see where const bounds in general go to know whether using that system would be a better approach. const_evaluatable_checked as it stands is (AIUI) just a stopgap working solution while we work on something better.

The motivation is compelling and I wholeheartedly agree we should prefer the declarative approach.

However, I think a better solution that doesn't extend traits to values (edge cases and high complexity imo) is to go back to old-school ranges. I'm talking about Pascal/Ada style range types, not to be confused with what we have in Rust.

I'll leave the syntax discussion/bikeshedding aside for now, but the idea and semantics are as follows: allow the user to define a new (sub-)type of integers. It could be a mathematical range or a set of values. You could then use that type in the usual manner, such as implementing traits for it and whatnot.

Straw man example:

type MyValues : u8  = {2, 4, 6, 10..=20};

Of course, there are a couple of subtleties with this. We'd probably need to carefully define subtyping relationships and the allowed operations on these types.

Edit: As @CAD97 said, this is uplifting values into types. And if it can be done in a library without additional syntax, even better!

I proposed const generics actually just being syntax sugar for types on Zulip:

hmm, what if <const N: usize> was just syntax sugar for <N: core::const::Const<usize>>?

I don't think that's quite right. By uplifting to types I meant that we ought to stick with declarative constraints over types (and not values), and that the set of values should be defined by a type, as is normal practice.

In other words, Rust's design should strive to adhere to the SRP: we shouldn't need to do procedural computation over types, and we equally shouldn't need complex declarative constraints over values. Put differently, we shouldn't mix the two domains if possible.

The above example wants to define a set of valid nonce size values, yeah? Well, that's the role of a type. It should be:

struct CCM<C: BlockCipher, const N: NonceSize, const M: NonceSize> {...}

With a type defined somehow like in my previous post:

type NonceSize : usize = {2, 4, 6}; // whatever is appropriate here

This is still a work in progress though and not even complete on nightly.

Rust already has unsafe union types, used for FFI with C. They could have an additional safe form for the above case.

union NonceSize : usize {... List of values... }

Unlike the existing unsafe union, which lists fields, this form lists values which must belong to the parent type (usize in this case). This is safe because all the values have the same memory size.

You could have a runtime value of this type and of course a const value:

let x: NonceSize = ... ;
const y: NonceSize = ... ;

Does that not look more like an enum than a union?

I agree it would be nice to have ergonomic type-level ranges and sets. With them the CCM example could look like this:

// strawman syntax
type NonceSizes = Range<7usize..=13>;
// it could be worth restricting set values to a single type
type TagSizes = Set<4usize, 6, 8, 9, 10, 12, 14>;
struct Ccm<C: BlockCipher, const N: NonceSizes, const M: TagSizes> { .. }

// for ergonomics it would be worth allowing automatic conversion
// of { integer } to ranges and sets
type CcmInstance = Ccm<Aes128, 7, 4>;

But this approach does not solve the potential extensibility use case in third-party crates, though I am not sure whether that's important enough.

It seemed to me that since an enum is a discriminated sum type, it is a bit less suitable. The set was supposed to contain untagged values.

There's already a POC crate for a ranged type published. The Set type is trickier to implement in a library since we don't have variadic generics yet.
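
For reference, the core of such a ranged type can already be sketched with const generics (illustrative only, not the actual POC crate's API):

// A usize constrained to MIN..=MAX; new() returns None for
// out-of-range values, which a const context can turn into a
// compile-time check.
pub struct Ranged<const MIN: usize, const MAX: usize>(usize);

impl<const MIN: usize, const MAX: usize> Ranged<MIN, MAX> {
    pub const fn new(value: usize) -> Option<Self> {
        if MIN <= value && value <= MAX {
            Some(Self(value))
        } else {
            None
        }
    }

    pub const fn get(&self) -> usize {
        self.0
    }
}

// e.g. type NonceSizes = Ranged<7, 13>;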
