Proposal: The Rust Platform

This project is already largely underway, see

2 Likes

Thanks, everybody, for the excellent feedback so far – both here and on various other forums. I’ve been reading and digesting all the comments.

To state the obvious, it’s quite clear that the overall response to the proposal as written is negative. Which is fine! I think people are raising a lot of good points. On the other hand, there is definitely room for improvement in the areas under discussion (e.g. discoverability, maturity, interoperability), and many people have been proposing lighter-weight approaches for doing so.

Given the large number of comments here and elsewhere, and the repeated themes, I’m not going to try to respond individually. Instead, I’m going to put together a short follow-up post, gathering the downsides people have pointed out, looking more closely at the goals, and summarizing some of the alternatives being proposed.

9 Likes

The way it is described above seems like this would attract organizations that don't want to contribute significant effort towards maintaining the libraries they use--i.e. freeloaders. But, that's not the kind of organization that we really need right now, in the world of Rust library development. We need organizations that are willing to get their hands dirty and improve stuff with the people who are building the libraries, or who are willing to at least sponsor the maintenance of the libraries in a meaningful way.

I was just talking with a friend who does a lot of node.js stuff, and he said almost all the negativity in his project comes from people who are demanding fixes be backported and/or new NPM releases of his library be made immediately, and who never contribute positively. I've seen how that works myself firsthand. That's not a position I want to encourage Rust library authors to get into.

2 Likes

This is exactly what I was going to type up, but even more eloquently said. I would add that if the Rust language committee, or whoever would be in charge of such a list, sees a gap in tooling or libraries or what have you, then there should be empowerment to fill it in an appropriate way until something else supersedes it. By simply having a well-curated list that gets a bit of official support (documentation and the like), no one who is just upgrading a version gets put out by something no longer working as expected (say, behind a corporate firewall). They will still have to figure out the normal path to get their libraries, and when the community starts preferring a different library, they still get the old library they were pleased as plums with.

If the issue is how to make it easier to get libraries for people behind firewalls (or some otherwise restricted environment), then address that issue explicitly.

@briansmith

There is no incentive as a library author to publish a 1.0 crate. When you publish 1.0, people expect you to start mangling your code to maintain backward compatibility.

While we should definitely also talk about ways to encourage crates to stabilize, I'd like to at least make sure we fully understand the end-user experience before switching topics. Both are important and need to be understood.

You also seem to have a very strong take against crates.io as a concept, at least insofar as it becoming a place that people can find stable, supported packages. I think that's a concern we're trying to address, and I'd love to hear your thoughts in terms of how best to address it.

Could you point to some study, even an informal one, that shows this is true? In my experience, it isn't a significant timesink.

While I'm not sure I understand what you're asking about specifically here, my comment is more about the basic mechanics. It's about building out of composable parts. If the crates don't work together well, you spend time stitching them together. Having a shared vocabulary also means we spend more time building the new thing rather than recreating the old one.

The survey results underscore this: major programming areas like UIs, networking, and async I/O are exactly the areas that need strong support.

To the extent that this gives crates an incentive to publish 1.0, it seems coercive (no doubt unintentionally) and not in the best interests of the library maker.

Are you saying here that you're worried library authors will rush to hit 1.0 before they're ready? That's definitely a valid concern, though I hope some of the curation ideas mentioned here help ameliorate it.

1 Like

Publishing a 1.0 creates more work for the library author in terms of support costs and maintaining backward compatibility and whatnot. (Otherwise, the version numbers are meaningless.) Including a library in a curated set on the condition that it publish a 1.0 means the author has to choose between doing that additional work or being an outcast. Both are bad choices for people who just want to keep making the code better. It would be great to find other incentives.
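For concreteness, the extra obligation at 1.0 is mechanical, not just social: cargo's default (caret) version requirements change meaning once you cross 1.0. The crate names below are placeholders, just to show the two cases side by side:

```toml
[dependencies]
# Pre-1.0: "0.2" matches 0.2.x only. A 0.3.0 release is treated as
# breaking, so a 0.x author can still ship incompatible changes on
# every minor bump without breaking existing users' builds.
foo = "0.2"

# Post-1.0: "1.2" matches any version >= 1.2.0 and < 2.0.0, so every
# 1.x release the author ships must stay backward compatible.
bar = "1.2"
```

That asymmetry is exactly the "additional work" in question: after 1.0, compatible maintenance is the contract, not a courtesy.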

I don't understand this reasoning. Not endorsing a library is not some sort of punishment (surely someone who doesn't manage such a crate is not an 'outcast'), and the Rust team has the right to define the qualities of a library which are the minimum for endorsement, and stability and backward compatibility are very reasonable qualities to require.

There are existential questions about why anyone would want to maintain an open source library at all, but those seem outside the scope of this conversation.

I certainly agree this would be awesome. My experiences with other ecosystems have not made me confident that it happens often, though :confused:

1 Like

This might be a simplistic question, but I’ll ask it anyway:

Say my library is already using a large fraction of the rust-platform, as individual dependencies. Will there be any advantages in “migrating” to the rust platform? The obvious one seems to be that I will no longer have to investigate any compatibility issues between various crates. But are there any others?

A number of good points have been raised above and elsewhere, so I’ll try to avoid repeating those and just offer some brief suggestions, noting first that I’m not hostile to the ideas raised here, but I don’t think they’re particularly necessary, and I think some of the other pain points brought up here (e.g. the UI on crates.io) and on HN and Reddit are as or more important than whether we have a blessed meta-package or not.

  1. I strongly endorse the idea of enabling “meta-packages” without expressly endorsing any. Making the build and packaging tooling robust around this will enable the community to coalesce around the things which are valuable, and those can overlap in unique ways. Some of the best stuff going on in the Python ecosystem is around exactly those kinds of community-driven sets-of-libraries/metapackages, but the official tooling has lagged so badly that there are entire alternative distribution tools that have grown up around them (e.g. Conda etc.).

  2. Put a high priority on supporting and shipping the in-work tool for supplying docs in a centralized location (presumably, ultimately, as part of crates.io), and work at a cultural level to make that the default location for documentation.

    • This will help enormously with discovery, as well as providing the opportunity for integration with the “docset” tools out there (Dash/Zeal/etc. and anything compatible with that). I know I find that really nice when I want to go look at something for Elixir: you can just pull up hexdocs.pm, and those docs can be integrated into any docset tool.

    • As part of that effort, build tools to support the docs/various-content.md model which is used for the Rust docs themselves (which both @steveklabnik and I have taken abortive stabs at in the past) and make those integrate with the documentation hosting as well.

    It is hard to overstate how significant an impact I think this will have on the community.

  3. Look at building out something like Ember Observer. When I’m evaluating an Ember add-on for integration into our application, I evaluate it there first. This is a community-maintained resource, and there are definitely some things that would need to be handled differently for Rust crates than for Ember add-ons—but there are more than a few similarities, too.

Each of those could very well be integrated in with the broader idea you’ve thrown out, but they can also be useful regardless of whether any official Rust Platform shipped, and would help address many of the same underlying issues.


Separate from all of that: I think the idea of shipping a small set of standard tools (rustfmt, clippy, etc.) is great, and that should definitely happen. Small support applications feel very different to me (and, I suspect, to most other developers) than do libraries, especially libraries which can basically function as standard libraries.

6 Likes

And, even if a library was on 1.0, if a rust platform became available and I used it, I’d expect to be able to continue using it for a reasonable amount of time…but inclusion in a platform doesn’t mean the maintainer has the bandwidth to support an LTS version of their library with only cherry-picked fixes (and there’s at least one popular open source project with plenty of resources that simply doesn’t ‘do’ patch branches - if you’re not on the latest version and are having an issue, tough). So now you need to decide what compatibility guarantees a platform has and figure out if included projects can apply for exceptions either because of important fixes or lack of resources. If I were a maintainer of a popular library (I’m not, so I may be wrong), I might actively not want to be included in a platform due to the possible additional burden.

Taking a thought from upthread: if some team or company wants to make a platform, they could release a -platform crate and just pub extern everything. At that point, I would consider them to have agreed to act as the gatekeepers for any issues that require them to negotiate with upstream library authors, and possibly to branch off themselves. If this (being willing to shoulder work themselves, rather than always deferring to/pushing work upstream) seems like a huge burden and nobody wants to do it, maybe that’s a red flag!
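To make that concrete, such a hypothetical -platform crate could be nothing more than a manifest pinning a vetted set of versions, plus a lib.rs of re-exports. All names and version numbers below are made up for illustration:

```toml
# Cargo.toml of a hypothetical "acme-platform" meta-crate.
[package]
name = "acme-platform"
version = "1.0.0"

[dependencies]
# The gatekeepers pin versions they promise to keep working together
# (and to patch or fork themselves if upstream won't cooperate).
libc = "0.2"
regex = "0.1"
```

The lib.rs would consist only of re-exports (`pub extern crate libc;` and so on), so downstream users depend on the one meta-crate and get the whole vetted set, while the meta-crate's maintainers absorb the coordination work.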

The dependency bloat problem (mentioned upthread) also worries me. I’ve never written a line of haskell, and my single experience with it was downloading it to build a tool, at which point it seemed like half the world was being compiled and I ended up with a 100MB binary. Some crates feel like they’re on this road (which I can live with, the creators of these crates are delivering immense value) but I don’t know if the tradeoff is right for a platform (even with pre-compilation).

I haven’t really mentioned anything other than libs (e.g. compiler, cargo, and particularly cross-compilation targets) because I think there’s important lower hanging fruit/prerequisite work for them all - third parties right now can’t easily/officially swap out/release replacement parts of the toolchain, and a platform feels like it’d add to this list.

That's basically what happens anyway when an enterprise/lts distro includes a set of packages. It becomes a "platform" which that distro promises to maintain.

1 Like

Yes, I had linux distros and rails LTS in mind for this bit. But the blog post isn’t keen on the libs team doing it for a couple of reasons. My off-the-cuff suggestion was for ‘lib platforms’ to be always unofficial (just a normal crate, maybe with some bells and whistles) and require whatever team/company doing it to have the resources to support a distro model (rather than the proposed libs team collaboration model). I have a few reasons for not being a fan of the collaboration idea, but I’ll wait for the next blog post.

I like this approach (better than the simpler version I proposed on reddit). The reason I prefer this over the simple meta-crate alternative that many others brought up is the fine-grained control over which crates are fetched and compiled. I have a few things to add (let’s call such a tight group of crates a clique):

[cliques]
platform = "2.6"
security = "1.1"
piston = "1.9"

[dependencies]
libc = "platform"
regex = "platform"
crypto = "security"
...

A few more advantages apart from the ones I mentioned on reddit:

  • Distributed curation of various cliques
  • Minimize effects of “endorsement”

But of course this brings an additional layer of version conflicts, and adds to the complexity; such as crate collisions (with conflicting versions) between cliques. One way to adapt to this is to strictly follow a few conventions. One that comes to my mind:

  • Every clique version should be compatible with at least one version of the “platform”.
  • Every clique version should maintain a list of rustc versions it compiles with.

To make this easy for clique maintainers, the “platform” clique should contain only a very few, extremely popular and useful crates (maybe around 10-20) with a high degree of maintenance guarantee.

To make this easy for general users, maybe cargo should suggest clique/crate versions when there are conflicts, based on the compatibility metadata for rustc and the platform.
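As a sketch of what that suggestion machinery might look like: given metadata mapping each clique version to the rustc releases it is known to compile with, cargo could pick the newest compatible clique. Everything here is made up for illustration (and the lexicographic version comparison is a simplification of real semver ordering):

```rust
use std::collections::HashMap;

/// Hypothetical compatibility metadata: for each clique version,
/// the rustc versions it is known to compile with. Returns the
/// highest clique version that lists the given rustc release.
fn suggest_clique(
    compat: &HashMap<&'static str, Vec<&'static str>>,
    rustc: &str,
) -> Option<&'static str> {
    compat
        .iter()
        .filter(|(_, rustcs)| rustcs.iter().any(|r| *r == rustc))
        .map(|(v, _)| *v)
        // Lexicographic max stands in for proper semver ordering here.
        .max()
}
```

With metadata like `{"2.5": ["1.8.0", "1.9.0"], "2.6": ["1.9.0", "1.10.0"]}`, a user on rustc 1.8.0 would be pointed at clique 2.5, and a user on 1.9.0 at 2.6.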

Update: -ed!

Update 2: – OR –

Spare library users from dealing with such complexity altogether, and let these “cliques” be used exclusively for automatic version matching and suggestion by cargo.

Users would not specify which cliques they depend on. Instead, every crate would carry membership metadata for the various cliques, for use by cargo’s resolver.

PS: Sadly, the documentation concern is completely ignored!

I would point out that there are a few algorithms around "scoring", and simply counting the number of +/- votes is just the simplest of all.

Be it clicks or downloads or whatever, it is relatively easy to weight them by when they occurred, so that the further in the past they are, the less weight they carry.

Similarly, there are algorithms for consensus, which use probabilities to decide whether the votes gathered so far can yield a significant "average" yet. That avoids having a single 5-star vote push a library into first position, or having one 5-star and one 0-star vote combine into a 2.5-star average instead of waiting for more votes to settle.
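A minimal sketch of those two ideas, time-decayed weights and a prior-weighted (Bayesian) average, with the half-life and prior values as arbitrary placeholders rather than recommended settings:

```rust
/// Exponential time decay: a vote cast `age_days` ago counts less.
/// `half_life` (in days) is a tunable knob, not a standard value.
fn decayed_weight(age_days: f64, half_life: f64) -> f64 {
    0.5f64.powf(age_days / half_life)
}

/// Bayesian average: pull the raw mean toward a prior until enough
/// votes accumulate, so one early 5-star vote can't dominate the
/// ranking and sparse extreme votes don't settle prematurely.
fn bayesian_average(votes: &[f64], prior_mean: f64, prior_weight: f64) -> f64 {
    let n = votes.len() as f64;
    let sum: f64 = votes.iter().sum();
    (prior_weight * prior_mean + sum) / (prior_weight + n)
}
```

With a prior mean of 3.0 and a prior weight of 5, a single 5-star vote yields roughly 3.3 rather than 5.0, and the score only approaches 5.0 as consistent votes accumulate.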


I personally think that there are various metrics which matter, with regard to a library:

  • its maturity/stability: is it likely to evolve into something very different?
  • its overall quality (generally linked to maturity): how happy are the users, how many bugs are opened, what are their age, ...?
  • its activity: an inactive library can be a risk (no maintainer, no one remembers how it works)
  • ...

The score will generally correlate with the happiness of its users (though often only during the honeymoon period, before the tougher issues have surfaced).

On the other hand, endorsement by a large community (as materialized by inclusion in a well-regarded package) speaks to maturity, stability, and a commitment to maintenance.

For winapi I am holding back from 1.0 because I don't feel I have perfected the design yet, and that is mainly due to there being a lot of features I want that aren't implemented yet such as unions and alignment and packing and bitfields, many of which even have accepted RFCs! The Rust Platform would do nothing to fix that. The only thing that would fix that is for MIR to stop blocking everything and all the features I need to finally be implemented and stabilized. Then I can finally settle on a good design and hit 1.0.

I would love it if crates.io had some way to see popular keywords so I could know which keywords I should match up to. The case sensitivity of keywords is also kind of awkward.

I like this idea.

3 Likes

To expand on the question of why so many crates aren’t 1.0 yet: 95% of existing gamedev-related crates today will segfault if you do something wrong (i.e., functions are marked safe when in reality they are unsafe). Publishing a memory-unsafe crate is okay, but version 1.0 usually means that the crate is production-ready, which is obviously not the case if it can crash at any time.
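A minimal illustration of the mismatch described above (toy functions, not from any real crate): both versions compile, but only the second one's signature tells the caller that passing a bad pointer is undefined behavior.

```rust
/// A misleading API: marked safe, but it dereferences a raw pointer,
/// so calling it with a null or dangling pointer is undefined
/// behavior. Nothing in the signature warns the caller.
fn deref_unsound(p: *const i32) -> i32 {
    unsafe { *p }
}

/// The honest version: the `unsafe` keyword pushes the validity
/// requirement into the caller's court, where it belongs.
unsafe fn deref_sound(p: *const i32) -> i32 {
    *p
}
```

The first style is what "functions are marked safe when in reality they are unsafe" looks like in practice, and it is a reasonable thing to want fixed before calling a crate 1.0.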

I’ve posted a follow up as a separate discuss post. Thanks again for the feedback, everyone!

I'm imagining something akin to Ruby Toolbox. It categorizes gems and shows statistics: how many favorites and downloads a gem has, how recently it's been updated, how active it is in general, and so on. These statistics can help people discover gems for the use cases they need, and perhaps offer some ways to choose among similar ones. There's no curation or official blessing involved.

1 Like

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.