Pre-RFC: Extending where clauses with limited formal verification

..would conditions then become part of a function type? Functions/closures can be arguments to methods.. And so conditions can be in co- or contra-variant position.

In the proposal I made, the only action required of the compiler is to not repeat an assertion when the exact same predicate is used, so the conditions are explicitly not part of the type system. The goal was to allow dead-simple conforming implementations. This means that generating assertions at the top/bottom of the function is probably already enough, assuming that the optimizer is able to remove duplicates across function calls. Otherwise, all pre- and post-conditions can be generated in the caller, which makes the de-duplication of assertions even easier. Of course a smart compiler could propagate those assertions into the type system or some kind of AST decoration, use them for optimization, smartly decide between generating them on the caller or callee side, and so on, but it is not required to do so.
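To make the intended desugaring concrete, here is a minimal sketch in today's Rust. The function name and conditions are invented for illustration; the point is only that a hypothetical `where n > 0` clause on values could lower to plain assertions at the top and bottom of the body:

```rust
// Sketch: a hypothetical `fn checked_div(a: u32, n: u32) -> u32
// where n > 0, result <= a` could desugar to nothing more than this.
fn checked_div(a: u32, n: u32) -> u32 {
    assert!(n > 0);          // pre-condition, checked on entry
    let result = a / n;
    assert!(result <= a);    // post-condition, checked on exit
    result
}

fn main() {
    assert_eq!(checked_div(10, 3), 3);
}
```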

I guess not in the case of a function passed as a parameter to another function? Since assertions are not part of the function type the compiler won't be able to insert them before invoking a function pointer..

What about function pointers?

The above model is too simplistic.

Post-conditions can be unconditionally generated at the end of the function.

For the pre-condition, instead of generating assembly like this:

    // stack pointer and register set-up
    // rest of the code
    // tear down

Then this could be generated:

    // stack pointer and register set-up
    // assertion
    goto __1
    __1:
    // stack pointer and register set-up (the foo_unchecked entry point)
    // rest of the code
    // tear down

And then callers that can prove the pre-condition can replace the call to foo with a call to foo_unchecked. A call through a function pointer will always go through the normal, checked version.
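In plain Rust the same scheme can be approximated with two functions instead of two entry points into one (all names here are hypothetical):

```rust
// Checked entry: verifies the pre-condition, then runs the shared body.
fn foo(n: u32) -> u32 {
    assert!(n > 0); // pre-condition
    foo_unchecked(n)
}

// Unchecked entry: for callers that have already proven `n > 0`.
fn foo_unchecked(n: u32) -> u32 {
    100 / n
}

fn main() {
    // A function pointer always refers to the checked version.
    let f: fn(u32) -> u32 = foo;
    assert_eq!(f(4), 25);
    // A caller that proved the pre-condition may skip the check.
    assert_eq!(foo_unchecked(5), 20);
}
```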

You can actually use inlining as a way to do that. See the construction I describe in Idea: make assert!() a keyword - REJECTED - #8 by scottmcm

This whole thing looks like just some syntactic sugar for strategically placed assert!()s, that you somehow expect to be "removed" anyway. Unless we get full-fledged dependent typing, there's no way arbitrary expressions can be tied to types. At that point, doing this whole preconditions-postconditions dance really doesn't gain much of anything.

Alright, so the gain would be easier machine-discoverability. But I still don't get why that needs to be a core language feature? You can slap an attribute-like proc-macro on any function. Calling the proc-macro can transform the function body in an arbitrary way, so it might as well add assertions – it can even parse and modify the where clause however it pleases.

And then we have syn which can parse all Rust syntax including attributes, so it can be used by those external tools to gather the information from the attributes. Ideally, the proc-macro crate that performs the transform would actually expose its parsing API in order to provide a typed AST that 3rd-party tools can reuse.
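As a rough approximation of what such a macro could expand to, here is a `macro_rules!` sketch; a real implementation would be an attribute proc-macro built on `syn`, and the macro name, syntax, and example function are all invented:

```rust
// Sketch only: wraps a function body in pre/post-condition assertions.
macro_rules! with_contract {
    (pre: $pre:expr, post: |$ret:ident| $post:expr,
     fn $name:ident($($arg:ident : $ty:ty),*) -> $r:ty $body:block) => {
        fn $name($($arg: $ty),*) -> $r {
            assert!($pre, "pre-condition violated");
            let $ret: $r = $body;
            assert!($post, "post-condition violated");
            $ret
        }
    };
}

with_contract! {
    pre: n > 0, post: |out| out <= a,
    fn div(a: u32, n: u32) -> u32 { a / n }
}

fn main() {
    assert_eq!(div(9, 2), 4);
}
```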

Whoa, whoa. Please let's not get into subtyping just yet. With great subtyping come great problems, such as variance and the like.

Interestingly, Rust once had something similar in the form of typestates, which were removed back in 2012. The documentation explaining this feature is still very interesting to read though!


Is there something explaining what led to the removal of "TypeStates" from Rust? Anyone care to quickly summarize the arguments and conclusions?


This Stack Overflow thread describes some of the reasoning: rust - What is typestate? - Stack Overflow


This is exactly the point that would be interesting to set straight. What @robinm suggested

  • on the one hand expressly is not a type system change
  • yet this relation between a trait and its impl hugely resembles subtyping

Could somebody (selfish plea) help me understand whether this is a type system change or not? Is there really any subtyping relation between a trait and the structs on which it is implemented? Would there be any more co- and contra-variance involved, beyond what is already spelled out, if the feature was implemented?


I think it is not, presuming we exclude any lifetime semantics, but it mostly depends on propositional vs. semantic subtyping.

Could somebody (selfish plea) help me understand, if this is a type system change or not?

Where clauses acting only as runtime assertions wouldn't change the type system, at least as long as assertion failures don't surface as a possible Result of a function. But they do change the semantics of the program, because they are able to abort it.

This was part of the problem with the C++ contracts proposal: you could simply disable assertion checks, but what happens if the optimizer transforms the code under the assumption that the preconditions hold, even when they don't?

Either we can't optimize but can disable the checks, or we can optimize but are required to retain the pre-condition checks.
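The hazard can be sketched in Rust (function and condition invented for illustration): if the runtime check can be compiled out while the optimizer is still told the pre-condition holds, a violation becomes undefined behavior:

```rust
fn div_assuming(a: u32, n: u32) -> u32 {
    // The runtime check exists only in debug builds...
    debug_assert!(n > 0);
    // ...but the optimizer is promised the pre-condition always holds.
    if n == 0 {
        // Reaching this is undefined behavior: both halves of the
        // "disable checks vs. optimize" dilemma in one place.
        unsafe { std::hint::unreachable_unchecked() }
    }
    a / n
}

fn main() {
    assert_eq!(div_assuming(8, 2), 4); // fine: pre-condition holds
}
```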

I think where clauses should be mandatory and not be disabled by unsafe tags, otherwise we could likewise state:

    unsafe i:i32="Hello World"

Hi there!

...doesn't this sound a bit cryptic?


It is possible, but:

  • Even if your function is pure, it can still receive non-deterministic values, which requires parameterizing all unknowns; e.g. the length after an array concatenation is m+n, then m+n+o, and so on, and branching makes it even worse: m+n+o | q+p.
  • Inferring such parameters and letting them bubble up implicitly causes additional compile-time cost, and the preconditions become much harder to satisfy as more unknowns come in that need to be tracked; this could perhaps be alleviated by subsuming them with one non-deterministic variable and two bound variables.
  • Existing traits may not be able to be equipped with these constraints, as it might break existing implementations.
  • Verifying that the collection updates its length doesn't guarantee that the collection indeed adds the desired number of elements. For that we need a more powerful logic, which may become undecidable.
  • Arc may require implicit synchronization, and bubbling up constraints prolongs the synchronization phase.
  • Specialization requires solving a (possibly non-)linear integer problem.
  • Semantics of control structures like for and while also have to be incorporated into the compiler's knowledge, e.g.

        for i = 0..n, j = 0..(n-1) { array.add(a) }           // => array.length += n*(n-1)

        for i = 1..(n+1) { array.extend(anotherArray[0..i]) } // => array.length += n*(n+1)/2
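The second loop's claimed length can at least be spot-checked in plain Rust for a concrete n (the array contents here are made up; the source array is assumed to have at least n elements):

```rust
fn main() {
    let another_array: Vec<u32> = vec![10, 20, 30, 40, 50];
    let n = 4;
    let mut array: Vec<u32> = Vec::new();
    for i in 1..(n + 1) {
        array.extend(&another_array[0..i]); // adds i elements on iteration i
    }
    // 1 + 2 + ... + n = n*(n+1)/2
    assert_eq!(array.len(), n * (n + 1) / 2);
}
```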

Touché. I mean, do we consider x<=y simply as membership of (x,y) in the relation <= (propositional logic, 0/1), or do we consider properties of <= like reflexivity and transitivity, expressed in higher-order logic (HOL, defining the semantics)?

The latter is more subtle, as it allows simplifying, for example, (x<=3) && (x<=4) to (x<=3), but it is generally undecidable and harder to solve.
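The propositional reading of that example can at least be spot-checked by exhaustive testing over a small domain; the semantic/HOL reading is what a solver would have to prove in general:

```rust
fn main() {
    for x in -10i32..=10 {
        // (x <= 3) && (x <= 4) simplifies to (x <= 3)
        assert_eq!((x <= 3) && (x <= 4), x <= 3);
    }
}
```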