The Rustdoc Redux

Hard to say.

RUSTFLAGS="-Z save-analysis" CARGO_TARGET_DIR="target/rls" cargo build

is the command it runs to generate the analysis data. Maybe try adding -Z time-passes or --verbose to see where the time goes?

Neat project @steveklabnik! Good luck.


Hm, must’ve been antivirus or something. Tried on another computer and the timing was much more reasonable (about 140 seconds overall). The save-analysis pass took about 60 seconds of the compilation time.


Right, that also confirms my observations. What happens with traditional web sites is:

  1. The web browser makes a request to the site (say, the Wikipedia article on HTTP/2)
  2. The website does some processing and replies with an HTML page containing the content
  3. The browser then requests supporting assets like images or CSS styles.

What happens right now with crates.io is:

  1. The browser makes a request to crates.io (the Rust package registry)
  2. The web server replies with an HTML page that is essentially blank plus a <script> tag for loading Ember. I think it's the same for all crates.
  3. The browser requests, parses and loads the Ember JavaScript. I'm not very familiar with web development, but most probably this is already cached by the browser, so I think this step is pretty fast.
  4. Ember inside the browser makes an XHR request for a JSON document
  5. The server does some processing and replies with the JSON
  6. Ember renders that content by traversing the JSON and modifying the DOM

Reloading with Ctrl+R (so with a warm cache), the process from 1 to 6 takes roughly 1 second until I get the "blue" line in my network tab (the DOMContentLoaded event of the page) on my desktop machine with Firefox 54. In comparison, loading the Wikipedia page for HTTP/2 (which has far more content than the crate page!) through the traditional model takes roughly 0.48 seconds until the blue line. The blue line seems to be strongly correlated with when I see the page's content.

So there is clearly a difference here, and it might not be the direct fault of loading just the JavaScript (step 3); I think it's the combination of the steps above, as each of them takes time. I have also observed that the web server is slower (each network request takes hundreds of milliseconds, while on Wikipedia it takes roughly 10-20 ms).
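The cost of those chained steps can be sketched with some made-up numbers (the round-trip time and processing costs below are assumptions for illustration, not measurements of crates.io):

```javascript
// Rough model: each sequential round trip adds a full network latency.
// All numbers are assumed for illustration only.
const RTT_MS = 100; // assumed network round-trip time per request

function traditionalLoadMs() {
  // One response carries the rendered HTML with the content.
  return RTT_MS;
}

function spaLoadMs(jsCached) {
  let total = RTT_MS;              // steps 1-2: fetch the blank HTML shell
  if (!jsCached) total += RTT_MS;  // step 3: fetch the framework JS (skipped on warm cache)
  total += 20;                     // step 3: parse and boot the framework (assumed)
  total += RTT_MS;                 // steps 4-5: XHR for the JSON data
  total += 10;                     // step 6: client-side render (assumed)
  return total;
}
```

The point being: even with a warm JS cache, the client-rendered model pays at least two sequential round trips before any content appears, while the traditional model pays one.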

I don't know how this can be fixed. Maybe through Ember FastBoot, maybe by sending all the needed JSON inline inside a <script> tag with the HTML page at step 2, maybe by sending it via HTTP/2 server push. But I don't think this should be ignored, nor do I think that rustdoc should ship with this issue.
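The inline-JSON option might look something like this (a sketch; the element id and data shape are made up, not anything crates.io or rustdoc actually does):

```javascript
// The server would embed the data in the initial HTML response, e.g.:
//   <script type="application/json" id="crate-data">{"name":"serde","version":"1.0.0"}</script>
// The client can then read it without a separate XHR round trip.
// `doc` is anything with a getElementById method (the real DOM, or a stub for testing).
function readInlineData(doc) {
  const el = doc.getElementById("crate-data");
  return el ? JSON.parse(el.textContent) : null;
}
```

This removes steps 4-5 entirely at the cost of a slightly larger initial HTML payload.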


I find it disappointing that concerns about the use of Ember are ignored or waved away. We all know that the argument that "you can use a different frontend" is disingenuous, as 99% of users will be content with the default. If the official documentation uses the new rustdoc, then it becomes almost impossible to view Rust documentation in any other way.

I also find the claim that using Ember will not impact performance to be laughable. As others have stated, crates.io is a prime example of this, but perhaps it is true that this is due to a lack of development. What about this forum? When I signed up to post this, it took a full minute before the page finished loading, and then I had to click on a notification before I could even view this thread. Can we really say that this is due to a lack of development? If more development were done, how much slower would it be than rendering HTML on the server?

Another potential concern is UX regressions. As others have noted, the use of Ember will make viewing Rust documentation more painful for users of plugins like NoScript. How might this change impact vision-impaired users? I have heard that web pages requiring JavaScript are often more difficult for vision-impaired users to navigate. What is the cost of ensuring that documentation displayed using Ember is navigable by vision-impaired users?

I’m glad that rustdoc is being rewritten to make it easier to work on, but I implore all involved to reconsider the usage of JavaScript in as large a capacity as has been proposed.


I’m so happy to see this happen.

I have been working on almost exactly the same thing where I work, hoping that perhaps the wider Rust community would find it useful once I’d finished. I had not heard about this initiative, but I’m super excited it exists now!

I have a lot of issues with current-day rustdoc, especially the design and presentation of the HTML output and the lack of good machine-readable output. I’d been taking a stab at these myself, but I’d love to contribute here instead!

Would IRC be a better place to talk about this specifically?


The crate information can be shipped in the same network transfer that also pulls Ember. It’s not like we want to spin up a web server for this; in fact I watched @steveklabnik make sure you could just publish a bundle of docs without needing to put it on its own server. (Coincidentally, I specifically asked about making sure cargo doc && sudo cp -r target/doc /srv/http/doc would still work, right as he was working on that part.)

EDIT: Ah, one thing I guess was not clear: the new rustdoc is already intended to output JSON, so the intent would be to just leverage that functionality.

If you want to coordinate over IRC, that would be great! You can find us all in #rust-dev-tools. (Though we usually talk about rustdoc in #rust-docs, which has less traffic >_>)

Otherwise, sending suggestions via GitHub issues, or even just chatting within this thread, would work just as well!


I’d like to add my voice to the “please don’t require JavaScript” pleas. I have no issue with fancy features implemented on top using JS, but I really think that without it you should still get reasonable documentation with working links, both for local copies and online docs. (I am regularly annoyed by the blank page on crates.io with NoScript or in non-graphical browsers.)

I’m not sure if anything else is being proposed, but if the docs serve blank JavaScript-requiring pages in the future, that would be sad.


I think it is awesome that you are working on this - thanks a lot! :slight_smile:

I’ve previously made a very minor contribution to the current rustdoc and I did find the inline HTML rendering difficult. It seemed deciding what to render was all mixed up with deciding how to render it. I think proper separation of the back-end processing and front-end rendering will make rustdoc much easier to work on.

I do wonder what benefits client-side rendering (with Ember or otherwise) will bring over plain-old server-side rendering (with something like the handlebars crate). Is there some planned functionality where client-side rendering would be advantageous?

I worry that requiring Node will make it harder to contribute to the front-end, potentially for little benefit. I know I have previously been put off casually contributing to crates.io because setting up a development environment seemed complex. (In crates.io’s case, this complexity may be justified by the benefits of using Ember; I don’t know enough about what it does with it.)

In either case, I’d happily contribute a front-end which statically generates HTML if/when pluggable front-ends are possible.


I just played with Gutenberg, and its template engine seems pretty nice.
@steveklabnik What are your thoughts on supporting a templating engine like Tera for rich docs?
It might introduce complexity, but I’m thinking rustdoc could export some Tera macros and import the defaults for the user.
I imagine templates could help with referencing other structures and types (whether by the user directly or by the tool after parsing the docs), without having to create the links yourself.

No one is ignoring performance, not for rustdoc and not for crates.io.


97.6% of screen reader users have JavaScript enabled. There are a number of techniques for making pages that load content with JavaScript accessible. I'm sure Steve will be keeping them in mind and will be soliciting feedback from the people in our community who use screen readers. This design and testing would have to be done no matter the technology; it's quite possible to create static HTML pages that aren't accessible.
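One such technique, as a sketch (the element id below is made up, not anything rustdoc does): after client-side rendering finishes, write a short status message into an aria-live region so screen readers announce that new content has arrived.

```javascript
// Announce dynamically loaded content to screen readers via an aria-live region.
// Assumes the page contains: <div id="a11y-status" aria-live="polite"></div>
// `doc` is anything with a getElementById method (the real DOM, or a stub for testing).
function announce(doc, message) {
  const region = doc.getElementById("a11y-status");
  if (!region) return null;
  region.textContent = message; // screen readers announce changes to aria-live regions
  return region.textContent;
}
```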


+1 for KaTeX (or at least some version of LaTeX)


Unless you also want to do a build, you should use

RUSTFLAGS="-Z save-analysis" CARGO_TARGET_DIR="target/rls" cargo check

which should take quite a bit less time.


This makes me happy, because it's exactly the kind of thing Inch CI (a documentation coverage tool with badges containing progress bars) has as a prerequisite for adding a new language.

As a very UX- and PR-oriented guy, I treat such a lazy first impression (a blank page) as an indicator that a company is probably too incompetent to trust with my business.

...and that's for sites which are web applications in the truest sense of the word and, thus, truly do require JavaScript for anything beyond a static "Welcome to our product. Here are some feature blurbs and screenshots to encourage you to enable JavaScript" page.

For something as simple and minimally interactive as documentation, I see it as flat-out inexcusable for a JavaScript requirement to be the default.

...especially when it could be interpreted as the makers of a high-performance language with strong static guarantees having so little faith in said principles as to write native applications in NW.js. (eg. Christian Heilmann has written quite a few good blog posts on the fragility and limitations of reinventing bits of the browser in JavaScript.)

Yep, as Rust developers, we definitely are incompetent at Ember, thank you for noticing! We have some contacts in the Ember community who are currently helping us to improve our use of Ember and eventually get us using fastboot. This has taken time and people, I'm sure you understand how that is and can cut us some slack.


You actually quoted one of my points, then reacted to another.

The bit you quoted was specifically about presenting pages which are either completely blank (the worst) or, like, contain nothing but a message asking people to enable JavaScript. (ie. Nothing about what the site is and why users would want to enable it.)

A truly blank page (with not even an “enable JavaScript” message) implies incompetence, given how much effort has clearly gone into fine-tuning the marketing impact of what gets displayed once you do enable JavaScript.

A page with a visibly customized base template, but nothing beyond the “enable JavaScript” message (eg. not even a header or footer telling you what the site is), implicitly says “We have the technical know-how to present a case for enabling JavaScript; we just don’t care enough about your potential reasons for not wanting it to do so.” That is a message into which a disdainful tone could easily be read.

A very bad first impression to make, I’m sure you’ll agree.

As for the “especially when it could be interpreted as the makers of a high-performance language with strong static guarantees having so little faith in said principles as to write native applications in NW.js” that you seem to be reacting to, here’s what I worry about people reading into it:

  1. Rust is supposed to be chasing C and C++ when it comes to efficiency, yet these people seem to see no problem in reinventing and/or breaking built-in functionality of the browser (something which began as a cross-referencing document viewer) in its inherently less efficient internal scripting language.

  2. Rust is supposed to be about reliability, but reinventing browser-internal functionality or making it dependent on JavaScript is notoriously fragile. They must not really care as much as they claim to.

  3. Rust is supposed to be about compile-time verifiability, yet these people see no problem in sabotaging that for the core functionality by indirecting the much more statically-verifiable HTML behind a layer of Unityped, Turing-complete JavaScript.

  4. “Everyone knows what a hog Firefox is. This is just more proof that, be it incompetence or lack of will, Mozilla chronically doesn’t practice what they claim to preach.”

(As you might have guessed, that last one isn’t intended to be a rational argument but, rather, an impression that’s difficult to change once established.)

…and, since the ecosystem is centred around Cargo and crates.io, that can then make it more difficult to convince people coming from languages like C and C++ to trust Cargo to handle dependencies.

P.S. Sorry for being so tactless in my initial post.

To add a counterpoint to the complaints here: I for one think using Ember for the frontend is a great idea.

NoScript is an edge case; very, very few people actually use it, and catering to this crowd is going to leave a vastly inferior product for everyone else. Most of today’s web requires JS anyway. Trying to avoid JavaScript seems quixotic to me.

Using a static page generator for this would still require a great deal of JavaScript to make it work. At some point doing everything in JavaScript is just easier than a mish-mash of JS and static HTML.

As someone who has used static generators extensively, and also worked on SPA web applications extensively, I vastly prefer the latter. Static generators are a pain to work with, are universally badly maintained, the code always ends up messy, and you’re still not getting the quality you really want.

Ember is a well maintained framework with a large community, great learning resources, solid tooling and development workflows, and is essentially built precisely for this kind of thing: taking some data and rendering it. It’s a really good choice for this project.


I’ve worked with Sphinx and Doxygen in the past, and I think they can serve as examples of functionality that’s expected to be in rustdoc, especially if you want it to be used in projects for main documentation and guides.

I think the most important thing that’s strangely absent from rustdoc is cross-referencing. Referring to a type, a method, or a guide chapter is essential for documentation. For example, Doxygen automatically turns the fully qualified name of a method into a link to the corresponding section in the docs. Sphinx also allows creating and referring to arbitrary labels.
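A minimal sketch of what automatic cross-referencing could look like (the flat ".html" path scheme below is a hypothetical illustration, not rustdoc's actual URL layout):

```javascript
// Turn a fully qualified Rust path like "std::vec::Vec" into a doc URL.
// The path-to-file mapping here is an assumed scheme for illustration;
// a real implementation would also need to know the item's kind (struct,
// trait, fn, ...) and whether the path resolves at all.
function docLink(fullyQualifiedPath) {
  const segments = fullyQualifiedPath.split("::");
  return segments.join("/") + ".html";
}
```

With something like this in the renderer, writing a fully qualified name in a doc comment could be turned into a working link automatically, instead of authors hand-writing relative URLs.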