Follow-up: the Rust Platform


A bit of a meta-comment: when talking about large standard libraries, people sometimes point to Python as an example of how a large standard library boosts adoption (“batteries included”), but one thing people rarely seem to bring up is that it only gained a packaging system fairly late in its lifetime (Python began development in 1991, and the first package-management tool was probably in 2005). People clamoured to add things to the standard library because, despite all the downsides, it was still less painful than trying to distribute packages any other way. And sure enough, as Python’s packaging tools have matured, there’s been less pressure to include things in the standard library (although even today, they’re still less mature than, say, Cargo).

As @aturon says, the important things are the existence and discoverability of high-quality third-party libraries. For Python, reaching that goal involved having a large, standardised collection of packages, but Rust+cargo is in a very different place, and that goal may lie in a very different direction.


An amendment to the meta-comment: in Python you must also distribute dependencies to the end user of your program, which makes the cost of a non-standard-library dependency higher even if Python had perfect packaging tools.


I understand where you’re coming from, but… how do two libraries communicate with each other then?

By your statement, the only way two libraries can talk to each other is either:

  • exposing std types
  • having the user write scaffolding to translate from one library type to the other library type

The first one is extremely restrictive; the second is extremely boilerplate-heavy (wasteful, error-prone, and a performance issue).

I am afraid that having a library accept, in its public API, a type created by another library (such as a Matrix) cannot be avoided. To push the reasoning to an extreme: a proprietary codebase will never put its XyConsumer type in the standard library, and yet many of the libraries defined in its repositories might need to access such a consumer.

Composition of 3rd-party libraries is necessary.

That being said, circumventing the orphan rule may not be the best solution. After all, Rust has 0-cost newtypes.

Today automatic derivation does not work for such newtypes, because it expects the inner type to implement the trait, whereas the lack of an implementation is precisely the reason the newtype is needed in the first place. Still, that’s just a shortcoming of derivation today, and maybe it could be solved generically.

(That said, there’s still the issue that the newtype requires wrapping/unwrapping; Deref, AsRef, From, etc. can probably help here.)


In terms of improving Cargo search, it may be worthwhile to consider whether a crate includes a repository link, description, or documentation when sorting search results.


Java is very successful (especially in the enterprise world) at applying @withoutboats’s suggestion.

J2EE is a set of additional industry standards that go beyond the regular SDK (language + stdlib). It contains standardized APIs for many global concerns that would otherwise require 3rd-party libs to communicate, such as logging, DB access, serialization, dependency injection, etc.

Look at DI for example: first came Spring, then other containers followed; eventually a common practice and idiom was established and was deemed worthy of standardization. Java does not actually provide any DI container, but it does define an industry-standard API (via its JCP) that 3rd-party vendors (Spring) conform to.

Rust could follow a similar path, where common interfaces, abstractions and idioms are derived from existing best practices and added to the Rust distribution. I’d also like to adopt the distinction between the stdlib, which holds the most minimal basic building blocks and abstractions, and a more encompassing set of interfaces as a “Rust Platform”. In any case, the implementations should still be provided by 3rd-party crates, and the “Platform” will contain only the trait definitions that are the required glue for libs to communicate.


Thank you for really taking the time and effort to incorporate all the feedback with an open mind. This followup post and proposal really solidify my trust in the people behind Rust, and in the future of the platform.


Not sure if this has already been suggested, but what about having a set of quality criteria for packages to be included in ‘The Platform’ which can be automatically tested?

For example:

  • a minimum level of API coverage
  • a README and Users Guide
  • unit tests
  • standalone sample code
  • integration tests (*)
  • more…?

There should be enough metadata to process this automatically, and the system could provide feedback to developers about which areas need improvement before their package is eligible for inclusion.

This avoids the problem of having to bless a single ‘category leader’ (e.g. a particular set of database bindings) and puts all packages on an equal footing. The more these things can be automated, the easier it will be to detect regressions and let packages compete on their merits.

As for integration testing, any package declaring a dependency on another should not only declare the required version but actively test for compatibility. For example, if chrono depends on serde then it should test the features it requires.

An automated build system can then track the most recent version for which a build works, and not update the version in the Platform set of packages until all dependent packages build successfully. Obviously this is a non-trivial problem (!) but certainly doable, and doesn’t rely on manual curation or testing.
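A hypothetical sketch of what that declaration might look like in a crate’s Cargo.toml (crate names and versions are illustrative, not a real chrono manifest):

```toml
[package]
name = "chrono"
version = "0.3.0"

[dependencies]
serde = "1.0"   # the required version, as declared today

# Integration tests under tests/ would exercise exactly the serde
# features chrono relies on; the automated build system then tracks
# the newest serde release for which those tests still pass.
```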


I think we should all collect some detailed statistics about the dependencies inside our (non-open-source) software written in Rust. Which crates are used together? Which versions? The numbers would tell us what such a platform should contain.

I didn’t say this before: I think Python’s batteries are not great, because some of the APIs are extremely awful. Look at Tk, logging, etc. Users were forced to carry them anyway.

I like the metapackages idea; it’s cool and it retains choice. Often I just need something simple like a crate re-export feature:

```toml
libc = { version = "...", pub = true }
```


Yeah, and this is one of many things the platform metapackage could solve. If we end up not providing it in the default installation, this may be one specific problem we do not solve - you’ll need to do at least one more step to get the common components necessary to do simple things. But that step might be as simple as adding a one-liner to your Cargo.toml pulling in an unofficial collection of common crates. It’s not as simple as it could be, but still an improvement over the status quo.

Shipping a platform metapackage by default in some distributions (Debian) but not others (rustup) seems ineffective since nobody will be able to rely on it being available.

This is one avenue, yeah, but even if we go this sort of minimal, interface-heavy route, I am always inclined not to literally put (big) new things directly in std. For example, I don’t want anything related to databases in std. If we were to define e.g. an ODBC-like interface for Rust and distribute it to everyone, I would want it in a different crate. At the same time, the way we distribute std is basically an artifact of its development process and the fact that it has unstable hooks directly into the compiler. Any crates ‘above’ std I strongly prefer to distribute through our standard mechanism (cargo).

Yeah, this is something we can do, and part of the intent - having a path from ‘new crate’ to ‘crate listed in an official directory’ to ‘crate that is an official part of Rust’ gives us increasing leverage over various quality issues and improves the quality of the whole ecosystem. And as long as the criteria are objective, it shouldn’t cause too much teeth-gnashing about favoritism.

Lots of great feedback here. Thanks, all.


I totally agree and it was the whole premise of my comment that the stdlib should remain as minimal as it is today.

All I’m saying is that if we want to have an endorsed official “Rust Platform” meta-package, then it should contain at most the standardized APIs & interfaces.
Actually, I’d go further and suggest that we have an entire set of such officially endorsed meta-packages, each standardizing a specific area. So e.g. we could have a “Rust-DB” cargo meta-package that defines the official ODBC-like API.


Support! Rust is already not easy to pick up. Let the platform give enough sugar to enough people to spark a “chemical reaction”. I very much want to see more people focus on a smaller group of libs, which would give me something like a “standard answer” and more active, more useful discussion of similar problems. People need to have enough conversations before they can go to deeper ones, and I think this is a tradeoff that should be made right now. I think Rust must have some “killer framework” or “killer application” within three years to take a big step forward, and bringing people together around a “Rust Platform” is a good way to achieve that. Then, once Rust has gone further, there will be time to fix the disadvantages of this choice.


As a first step towards discoverability and blessed packages, don’t underestimate the value of an opt-in from package owners. If, for instance, a package owner has the ability to tag their package as a platform-candidate or non-platform-candidate, this might take care of some of the work.

I have a number of packages in another language’s shared package ecosystem. Most, I have added because I think there’s a chance they could be helpful to others and there were no appropriate alternatives at the time, but I wouldn’t want most of my packages in a “platform” if I could avoid it.


I really like how the Rust Platform attempts to strike a balance between ad-hoc community libraries and pushing everything into the standard library. I think a lot of the criticism coming out of the resulting big/small standard library debate has to do with worries that this is an excuse for the standard library to stagnate. As much as we all hate it when a shitty API makes it into the standard library, we also hate it when incompatible APIs fracture the ecosystem unnecessarily.


Good perspective. There are some considerable downsides to having your work pulled into the official distribution, for sure, including stagnation, loss of control, and increased maintainership responsibilities.


Thank you for collecting the feedback!

I especially love this part. If discoverability, integration, or quality can become better by using/enhancing Cargo, that would be great! For someone who is new to the language, it would help a lot if crates in Cargo could be flagged as something like “curated”/“blessed”/“de-facto standard” by a group of core community members. Maybe this flagging could be re-evaluated again after x months, so the flags stay up to date.


I mean, someone has to care.

Sure, but why can’t that someone be a third party glue crate? I’d like to be able to have a chrono_diesel crate exist completely independently in the ecosystem.

Like always, I think modifying the coherence rules to enable more blanket impls is the solution here. We should have a base library (std presumably) defining the trait DateTime

I think the problem quickly becomes more general than what the standard library should be expected to cover. Take UUID as another example.


You’ll have to provide a way to adjudicate who is allowed to be the glue crate, and to guarantee that there can be only one such crate. I don’t know of any way other than what we have now: ‘one of the two crates involved depends on the other one and provides the impls.’

Consider the case in which two different crates, chrono_serde and serde_chrono, are both uploaded and both become popular in the ecosystem. Both provide an impl Serialize for DateTime. The types in these impls are identical; there’s no way to make them coherent. These crates present a hard split in the ecosystem: my dependency tree cannot include both of them. Every time someone provides an orphan impl, it represents a potential hard split in the ecosystem. This is so much worse than what we have now.

There are three properties it would be desirable for the language to have, and you get to choose two:

  • Impls do not need to be specifically imported (e.g. you only need to have Extend and HashMap in scope to merge two hashmaps, you don’t also need to import the specific impl block).
  • When you build any two libraries together, you do not get coherence errors.
  • Anyone can implement any trait for any type.

Rust chose the first two. We could consider dropping the second for the third, but I think the second is clearly superior to the third.

This is definitely true, but we’re up against like the fundamental laws of information theory or something, so all we can do is find ways to mitigate it.


I didn’t completely read this thread, but regarding the “Improving discoverability” part, we could take some inspiration from existing curated library listings. They help a lot when choosing libraries for certain areas.


Please continue the discussion on orphan impls in “Vague proposal: extending coherence with workspaces”.

I don’t think it’s relevant here, and there’s been a lot of discussion in the other thread.


@aturon, at the bottom of the article: Scala may split its stdlib into a core and a platform.