Hey! We’re trying not to break stuff now, and I’m working on a testing regimen to help: building the crates published on crates.io against betas and nightlies. It’s not refined yet, but it already produces some useful results, so I’m going to start giving them visibility by posting them here.
Today I have results for our first week of trying hard not to break stuff, and they are pretty good! Comparing last week’s beta to last night’s nightly, I’ve found two regressions.
The leveldb regression is the result of deny(warnings). This is tough to avoid on our end, since we need to be able to introduce new warnings.
The second is more mysterious. Some pointer conversion was made illegal. It’s not clear offhand exactly which change caused it, but in @nikomatsakis’s opinion it is arguably a legitimate bugfix.
itertools doesn’t appear in the list (it works fine in both versions). Not very important, but might that be a sign of a bug in the tooling?
Maybe you could cache package fetching? I think your runs create visible spikes in the download statistics: hundreds of downloads per popular package, with big spikes on April 3rd and 10th.
There was also a regression in glium introduced a few days after the beta (lifetime issues); however, I worked around the problems in order to continue working on the library.
EDIT: see the Travis build of this PR. Build 2547.2 tests exactly the same things as 2547.5, but 2547.2 uses the nightly while 2547.5 uses the beta.
The borrow checker is a bit mystical to me, so I don’t usually report these problems, as I’m never sure whether they come from my code or from a bug in rustc.
EDIT2: I spent an hour on it but didn’t manage to simplify it.
Hi, author of leveldb here. I just disabled the warning check in leveldb-0.7.1.
I’m not sure whether all lints will be documented here, but given that the suggestion to deny warnings is circulating in the community, we might want to document that it poses a risk to forward compatibility. /cc @steveklabnik
Isn’t that just fixing the symptom? I think it would be interesting to indicate a minimum compatible language level for the whole body of code (which would also mean failing if a function is too new). This could be reused by the lint system.
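A sketch of what such a marker might look like in the crate manifest. This is purely hypothetical with respect to this proposal, though Cargo did later grow a rust-version key of roughly this shape:

```toml
[package]
name = "leveldb"
version = "0.7.1"
# Hypothetical "minimum compatible level": the compiler and the lint
# system could check the code against this baseline, erroring on
# newer APIs and capping newer lints at warnings.
rust-version = "1.1"
```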
The problems caused by changes to lints are actually temporally reversed compared to problems caused by normal changes. Lint changes will mean that code compiles on 1.1 but not on 1.2 (and probably not on any 1.x, x ≥ 2), whereas language/library changes will mean that code that compiles on 1.2 won’t compile on 1.1 (but will compile on all 1.x, x ≥ 2).
That is, code would have to specify “maximum Rust version 1.1”, and this would essentially have to be specified for all code ever: who knows what sort of amazing warn-by-default lints we’re going to add in the future.
So, yes, strictly speaking it is just fixing a symptom of the fundamental problem that there will be many versions of the Rust compiler/language. But the fix for that “problem” (locking code to exactly the compiler versions it has been tested with) doesn’t seem tenable given the intended way for the ecosystem to work: easy upgrades from 1.x to 1.(x+1).
I don’t quite follow. I understand the reversal, but not the imminent problem. I probably interpret the marker differently.
Say I specify “compatible down to Rust 1.1”. I expect no newer APIs to work (they are not compatible), but newer lints to warn instead of throwing errors (to provide forward compatibility with newer Rust versions). Once I publish this piece of code, its state is frozen forever, so I will be able to run it for quite a while.
Now, before releasing the next version, I start porting to the version I want to support: all warnings become errors, and I can start fixing them.