I support the basic idea of working towards required 2FA and/or required code signing. I don't think either of them is too cumbersome if implemented correctly; however, there are a lot of tricky details in implementing them correctly.
So while the basic outline of the proposed plan in the original comment looks reasonable, I think that, as @bascule says, it will first require an opt-in implementation and sufficient testing, and the plan will probably have to change over time.
In the meantime, let's think about particular threat models. The NPM issue came about because someone with access to upload a popular package had a password that was shared with other, compromised accounts. This let the attacker upload a compromised version of that popular package, which could then steal credentials from the developers who installed it.
crates.io doesn't use its own passwords; it delegates authentication to GitHub. But the equivalent situation for crates.io is that someone's GitHub password gets compromised, allowing the attacker to add a new access token and then upload packages with it.
There's been some discussion of whether it's possible to check with GitHub whether 2FA is enabled, and many people have pointed out that organizations are able to enforce it. So one quick and easy way to enforce the 2FA requirement would be to require uploaders to be members of a "Crates.io Users" organization that has 2FA enforcement enabled, with membership automated by a bot.
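For concreteness, here's a rough sketch (in Rust, using `reqwest` and `serde_json`) of what such a bot might check, assuming it holds an org-owner token. The endpoint paths and the `two_factor_requirement_enabled` field come from GitHub's REST API; the org name and token plumbing are placeholders:

```rust
use reqwest::blocking::Client;
use serde_json::Value;

const ORG: &str = "crates-io-users"; // hypothetical org name

/// Check whether the org enforces 2FA for all members.
/// Org owners can read this flag via GET /orgs/{org}.
fn org_enforces_2fa(client: &Client, token: &str) -> reqwest::Result<bool> {
    let org: Value = client
        .get(format!("https://api.github.com/orgs/{ORG}"))
        .bearer_auth(token)
        .header("User-Agent", "crates-io-2fa-bot")
        .send()?
        .json()?;
    Ok(org["two_factor_requirement_enabled"] == true)
}

/// Invite (or confirm) a member via PUT /orgs/{org}/memberships/{username}.
/// GitHub rejects the invite if the org requires 2FA and the user lacks it.
fn invite_member(client: &Client, token: &str, user: &str) -> reqwest::Result<bool> {
    let resp = client
        .put(format!("https://api.github.com/orgs/{ORG}/memberships/{user}"))
        .bearer_auth(token)
        .header("User-Agent", "crates-io-2fa-bot")
        .send()?;
    Ok(resp.status().is_success())
}
```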
Then we get to what to do about the fact that once one person is compromised by a bad package, their tokens can be used to compromise their crates. The original post suggests also requiring 2FA for publish, but very few 2FA or signing methods can actually guarantee that the person is using a different device rather than the same one, short of requiring hardware tokens of some sort. So any kind of 2FA will at least have to trust that users choose a reasonably independent second factor.
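To make that concrete: a TOTP second factor is just a code derived from a shared secret and the current time (RFC 6238), so the verifier has no way to tell whether the code came from a phone or from the same compromised laptop. A minimal sketch, using the `hmac` and `sha1` crates:

```rust
use hmac::{Hmac, Mac};
use sha1::Sha1;
use std::time::{SystemTime, UNIX_EPOCH};

/// RFC 6238 TOTP: 30-second time step, 6 digits.
fn totp(secret: &[u8], unix_time: u64) -> u32 {
    let counter = unix_time / 30;
    let mut mac = Hmac::<Sha1>::new_from_slice(secret).expect("HMAC accepts any key length");
    mac.update(&counter.to_be_bytes());
    let digest = mac.finalize().into_bytes(); // 20 bytes for SHA-1
    // Dynamic truncation per RFC 4226 section 5.3
    let offset = (digest[19] & 0x0f) as usize;
    let code = u32::from_be_bytes([
        digest[offset],
        digest[offset + 1],
        digest[offset + 2],
        digest[offset + 3],
    ]) & 0x7fff_ffff;
    code % 1_000_000
}

fn main() {
    let secret = b"shared-secret-on-both-devices";
    let now = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_secs();
    // A keylogged laptop and a clean phone holding the same secret
    // produce exactly the same code; the server can't tell them apart.
    println!("{:06}", totp(secret, now));
}
```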
Of course, concerns about one package being able to compromise others come down in large part to the terrible security model of desktop workstations, in which every process has access to all data owned by the user running it, and generally has enough access to install keyloggers or the like to intercept even one-time passwords.
Ideally, building a Rust library shouldn't give the build script access to a user's secrets, nor should running the resulting binary give access to them, unless the user specifically granted it.
Solving this problem is probably out of scope for Cargo and Crates.io, as it requires moving to a model much more akin to mobile devices or capability-based systems like Fuchsia, and I don't know of any system that controls access like this while remaining flexible enough for a developer workstation, let alone one that is common across all Tier-1 platforms.
Without solving that problem, adding 2FA to the publish step as well, used correctly with a separate device, does largely cut down on the risk from malware that simply hoovers up credentials for later use. It doesn't protect against a full persistent threat that will MITM or keylog your every action, but it does protect against simpler ways of compromising keys: tokens left on discarded hard drives, home directories accidentally checked into public version control, or exploits that can read data from your system but not write or control anything. So it is still an improvement over a token sitting permanently on disk.
Signing has additional benefits: it authenticates the crate author to the user directly (at the very least with TOFU-based identity verification) without requiring any trust in the crate repository, whereas 2FA puts all the trust for verifying authenticity in the crate repository. However, I don't know of many widespread, reasonably standard ways to require that a second device be involved without specialized hardware tokens. Is it possible to easily secret-share between your laptop and phone, so that you have to push an "OK" button on your phone to perform a signing operation?
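As a toy illustration of the secret-sharing idea (not something you'd deploy as-is): a 2-of-2 XOR split means neither device alone holds the key, but a naive scheme like this still reassembles the key on one device at signing time, which defeats the purpose. A real deployment would want a threshold signature scheme (e.g. FROST), where each device contributes a partial signature and the full key never exists in one place. Sketch using the `rand` crate:

```rust
use rand::RngCore;

/// Split a key into two shares; each alone is indistinguishable from noise.
fn split(key: &[u8]) -> (Vec<u8>, Vec<u8>) {
    let mut laptop_share = vec![0u8; key.len()];
    rand::thread_rng().fill_bytes(&mut laptop_share); // uniformly random pad
    let phone_share: Vec<u8> = key.iter().zip(&laptop_share).map(|(k, r)| k ^ r).collect();
    (laptop_share, phone_share)
}

/// Both shares are required to recover the key.
fn combine(a: &[u8], b: &[u8]) -> Vec<u8> {
    a.iter().zip(b).map(|(x, y)| x ^ y).collect()
}

fn main() {
    let signing_key = [0x42u8; 32]; // stand-in for a real private key
    let (laptop, phone) = split(&signing_key);
    assert_eq!(combine(&laptop, &phone), signing_key);
}
```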
I think that 2FA and signing are probably the most important mitigations for these types of attacks, but it's also worth thinking about similar attacks, and about potential solutions to them which might also help with this type of attack.
Some attacks to think about:
- Actual developer of a popular package goes rogue, publishes a malicious version of their package
- Honest developer gets bad code merged somehow, publishes a version with it
- Crates.io itself is compromised
- Crates.io gets MITMed, by a nation-state that can issue its own certs, or by someone who exploits a weak link in the WebPKI chain
Some other potential strategies for dealing with them:
- Be able to display and audit diffs between package versions more easily (actual crates.io package versions, not Git, since what's on crates.io might differ from Git; see the first sketch after this list)
- Be able to display and audit diffs between Git and the actual published crate (I'm sure this would far more often reveal cases where people had accidentally published code that hadn't been committed to Git, or was from a different commit than the tagged version, than it would find vulnerabilities, but that's also useful)
- 2-person signoff rule (could only apply to packages with more than one maintainer, and probably only really useful for packages with at least 3, to avoid each person being a single point of failure for publishing new versions of the crate)
- Sandboxing builds and/or capability-based permissions for executables (probably requires better platform support)
- Delay releases for a cool-off period during which they can be audited; a simple version of this could even be implemented client-side, where cargo would not resolve newer compatible point releases until they were a certain age, though you could still explicitly choose them by pinning that version in your Cargo.toml (see the second sketch after this list)
- Have some hand-picked trusted versions of crates as a "distribution", curated by auditors doing at least a cursory look for funny business, who are also generally looking for good development and security processes from crate authors (the "Linux distribution" model)
- TUF-style signing
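On the diff-auditing idea: since `.crate` files are just gzipped tarballs (downloadable from crates.io's static download endpoint, `https://crates.io/api/v1/crates/{name}/{version}/download`), a first cut needs nothing more than `tar` and `diff`. The paths here are placeholders:

```rust
use std::process::Command;

/// Unpack two downloaded .crate archives and show a recursive unified diff.
fn diff_crates(old: &str, new: &str) -> std::io::Result<()> {
    for (archive, dir) in [(old, "/tmp/audit/old"), (new, "/tmp/audit/new")] {
        std::fs::create_dir_all(dir)?;
        // .crate files are plain .tar.gz archives; strip the top-level
        // `name-version/` directory so the trees line up for diffing.
        Command::new("tar")
            .args(["xzf", archive, "--strip-components=1", "-C", dir])
            .status()?;
    }
    Command::new("diff")
        .args(["-ru", "/tmp/audit/old", "/tmp/audit/new"])
        .status()?;
    Ok(())
}
```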
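And a sketch of the client-side cool-off: among versions matching the requirement, take the newest one older than the cutoff, except that an exact pin in Cargo.toml bypasses the filter. This uses the `semver` crate; the `(version, age in days)` index shape is an assumption for illustration:

```rust
use semver::{Op, Version, VersionReq};

/// Resolve `req` against an index of (version, age_in_days) entries,
/// skipping releases younger than `min_age_days` unless exactly pinned.
fn resolve(req: &VersionReq, index: &[(Version, u64)], min_age_days: u64) -> Option<Version> {
    // An exact pin (`=1.2.3` in Cargo.toml) bypasses the cool-off filter.
    let pinned = req.comparators.iter().any(|c| matches!(c.op, Op::Exact));
    index
        .iter()
        .filter(|(v, age)| req.matches(v) && (pinned || *age >= min_age_days))
        .max_by(|a, b| a.0.cmp(&b.0)) // newest remaining candidate
        .map(|(v, _)| v.clone())
}

fn main() {
    let req = VersionReq::parse("^1.2").unwrap();
    let index = vec![
        (Version::parse("1.2.3").unwrap(), 40), // 40 days old
        (Version::parse("1.2.4").unwrap(), 2),  // published 2 days ago
    ];
    // With a 14-day cool-off, resolution stays on the older 1.2.3.
    assert_eq!(resolve(&req, &index, 14), Version::parse("1.2.3").ok());
}
```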