Start of an effects system RFC (for async etc) - is there any interest in this?




Note that one of the main inspirations for the effects system here was this paper.


perhaps some effects need a context arg for some functions

Just picking on this example, the idea is that the handler is the context, so effect methods would take a &self arg.

how does this actually get implemented? Monomorphisation of every function per effect combination?

That’s the idea, yeah. There would be a global set of pointers to the handler data for each currently active effect, though.


Are any of today’s stable and unstable mechanisms for allocation, panicking, async I/O, and blocking I/O forward-compatible with this? What would it take to make them compatible?

I’ll need to track down all the relevant APIs and check. Briefly: for allocation, panicking, and blocking IO, existing standard library functions could be re-implemented to just call the underlying effect. The core/std libraries would come equipped with handlers which are present when main is called. Async IO would need to be redone: something like mio would have to become part of the Io interface so that a non-blocking effect handler could be implemented.

Pretend I have a library that subscribes to Twitter updates…

The problem with this example is that you’re calling process_updates twice in parallel. If process_updates takes a &mut self (which it should), then this is statically impossible: the task suspended on Io::print_stdout is still mutably borrowing the TwitterFeed. In general, any race condition which is hard or impossible to cause today using threads should be equally hard or impossible to cause when you swap threads for green threads; you should get the same compile errors.


@canndrew I’ve made a GH repo dedicated to effects, with the main focus on const polymorphism, but I’d like to get a deep understanding, with examples, of how everything translates into a “core calculus” and how effects work in general. You’re welcome to participate, and thanks for the Frank paper link.


Right, clearly I should’ve used a Cell<T> in there. It’s possible that sync->async transformations can only introduce bugs when interior mutability is involved.

But that doesn’t really refute the point that a sync->async transformation could introduce severe logic bugs. In particular, the part about swapping threads for green threads sounded confused to me.

I’m not talking about swapping real threads for green threads; I agree it shouldn’t be possible for that to introduce any new bugs. I’m talking about swapping completely synchronous code with somehow async/threaded code. I’m talking about code which has no race conditions at all being transformed into code which has race conditions (that aren’t data races). Maybe this is only possible in code with UnsafeCells, but that’s still a lot of code.

It’s probably worth reiterating that I’m still suspicious that all the other proposed effects have similar counterexamples. For instance, any code which is “not panic-safe” would presumably become buggy if the ambient panic handling were swapped from abort to real panics. I suspect there are similar examples for allocator swapping, though I lack the experience with low-level allocator code to come up with one.


Effects systems are super interesting, and when combined with their counterparts, coeffects, they have the potential to simplify and unify a vast number of different concepts present in Rust, while also adding expressive power. That said, it would be a very difficult task to cleanly integrate them into Rust as it stands, especially now with async and futures well underway. In a new language the result might be elegant simplicity, but in Rust it could mean complexity and workarounds, given all the existing features of the language.

That being said, I’d be very interested to see any progress and creative thinking around this! Just be on the lookout for how this could simplify existing concepts in the language. I’m also particularly interested in how this could be made fast. Most of the current effect systems do lots of intermediate allocations - I want something that compiles down to the equivalent code I’d write in regular Rust. It would be very interesting to see how this could be done!

Btw, if you want to learn more about what the Multicore OCaml folks are doing with algebraic effects, check out this presentation by Leo White.


I don’t think the sync/async split is that huge a problem. Here’s my suggestion for solving it at the crate level.

It’s possible to make a crate that runs tokio in the background and exposes a generic function that takes a future and blocks the caller until that future resolves, so you get the sync version for free once you’ve implemented the async version.

I’d guess that these days sync IO is only used for convenience, legacy code, or the low-level implementation of async, and even for a small number of IO operations tokio’s performance compares well with traditional blocking IO, if it isn’t the same or better.


A lightweight effect system that I think would be very useful is a mechanism for top-down constraint propagation and annotation.

When you annotate a fn as const, or let’s say nopanic, you promise and guarantee that you will not call or reference any fn that does not also abide by the same constraint.

There’s no monomorphisation, type gymnastics, or polymorphism involved. A limited set of constraints could ship with the language, and verification of all of them could be unified under one mechanism.


An effects system for Rust would be really useful just because effects are so prevalent in the language. However, I share aturon’s concern that it would have to be a new language concept orthogonal to the existing type system (and lifetime and reference systems), and thus the language complexity would grow substantially. Languages like Idris and Haskell (and probably Scala) have type systems that are powerful enough to encode an effects system in a reasonably efficient way without adding additional language complexity.

And even if Rust’s type system were to become powerful enough to encode a usable effects system, it would be a breaking change to introduce it, since basically all of the stdlib would have to be rewritten in a monadic (or similar) style. So I think the train for this alternative has already left the station.