Volatile and sensitive memory

Sorry for dropping out of this thread. I've just read the entire thing and wanted to throw in my two cents.

First, a meta-comment: I don't believe we can expect a "definitive answer" to the questions that @briansmith is raising at this time, because the "Rust memory model" is not yet defined. As @arielb1 says, we "ought to" define one -- and we are working on settling it, but it is a complex equation with a lot of variables, and frankly one with an unclear priority. As a starting point, I've been going through all the discussion threads and so forth, trying to gather up a list of important examples along with the relevant discussion. This can be found in this repository:

At the moment, these examples are culled directly from discussion threads. I do plan to go back over them and try to eliminate duplicates / simplify / coalesce. Then I hope we can evaluate various proposals and see how they "score" against the examples. I've added the examples from this discussion as well.

All that being said, @briansmith needs to write some code today. For the time being, we have de facto adopted LLVM's "volatile access model". Personally, I am pretty comfortable with this, and I would not expect major changes. This implies to me that @huon's example of a volatile wrapper is currently correct (as @huon points out, the VolatileCell example is not, though for a reason that is orthogonal to volatility).
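
For reference, a wrapper of that shape might look roughly like the sketch below. This is not @huon's exact code, just a minimal illustration built on `std::ptr::read_volatile` / `std::ptr::write_volatile`; the `Volatile` name is mine.

```rust
use std::cell::UnsafeCell;
use std::ptr;

/// A cell whose contents are only ever accessed through volatile
/// reads and writes, which the optimizer may not elide or duplicate.
pub struct Volatile<T: Copy> {
    value: UnsafeCell<T>,
}

impl<T: Copy> Volatile<T> {
    pub fn new(value: T) -> Volatile<T> {
        Volatile { value: UnsafeCell::new(value) }
    }

    /// Perform a volatile read of the contained value.
    pub fn get(&self) -> T {
        unsafe { ptr::read_volatile(self.value.get()) }
    }

    /// Perform a volatile write of the contained value.
    pub fn set(&self, value: T) {
        unsafe { ptr::write_volatile(self.value.get(), value) }
    }
}
```

The `UnsafeCell` is what makes mutation through `&self` legal; the volatile accessors are what keep the optimizer from touching the value behind our back.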

The key point is that the compiler will not randomly introduce reads of an &T -- it will only introduce reads that it can prove will happen at some point anyway. This is (I believe) true even with the dereferenceable attribute. So if you have a field x: T and it is only ever read via a volatile read, then the compiler will not introduce spurious, non-volatile reads of it. I believe this is LLVM's (and hence Rust's) current semantics (@briansmith seems to have come to the same conclusion).
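
To make that concrete, here is a hedged sketch of the sensitive-memory case (the `Key` type and its methods are hypothetical, not from this thread): the field is only ever read through `read_volatile`, and the final wipe uses `write_volatile`, so the zeroing cannot be discarded as a dead store.

```rust
use std::ptr;

/// Hypothetical container for key material; `bytes` is only ever
/// accessed through volatile reads and writes.
struct Key {
    bytes: [u8; 32],
}

impl Key {
    /// The sole read path for `bytes`: a volatile read that the
    /// optimizer may not elide or duplicate.
    fn byte(&self, i: usize) -> u8 {
        unsafe { ptr::read_volatile(&self.bytes[i]) }
    }

    /// Zero the key with volatile writes, so the wipe is emitted even
    /// though `bytes` is never read again afterwards.
    fn wipe(&mut self) {
        for b in self.bytes.iter_mut() {
            unsafe { ptr::write_volatile(b, 0u8) };
        }
    }
}
```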

So TL;DR I think @briansmith should adopt a volatile wrapper like @huon's example. It will work fine today. It may need some adjustment in the future, but that seems fairly unlikely, and should not affect consumers of the API.
