There are two serious problems with Rust’s documentation:
Rust-related search queries are polluted by outdated copies of old books, old libstd docs, unfinished content that shipped with the docs in 2015, and a ton of second-edition redirects and stubs lacking content.
There’s some deadlock or organizational blind spot around this issue. It has been reported many, many times, but hasn’t resulted in any meaningful action.
The SEO problem:
Many Rust-related queries in search engines surface the old first edition of the book, and stubs from a removed copy of the second edition that was meant only for offline doc users. Occasionally, libstd links from historical Rust versions also pop up.
In some cases the situation is so bad that the current version of the book isn’t in the search results at all! I presume it’s being incorrectly removed as duplicate content, losing to the “there’s no book here” stubs.
The unfixability problem:
One way to solve this problem would be to block the undesirable copies in robots.txt (there are also other ways, like redirects and canonical links, but those solutions ended up being rejected or postponed for various reasons).
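As a rough illustration, the robots.txt could look something like this. The exact paths are my assumptions about where the stale copies live, not a confirmed list, so treat this as a sketch of the approach rather than a ready-to-upload file:

```
# Hypothetical robots.txt for doc.rust-lang.org (paths are illustrative)
User-agent: *
# Old first edition of the book, superseded by the current edition
Disallow: /book/first-edition/
# Removed second-edition copy whose "no book here" stubs rank in search
Disallow: /book/second-edition/
# Point crawlers at the current docs sitemap, if one exists
# Sitemap: https://doc.rust-lang.org/sitemap.xml
```

Blocking via robots.txt only stops crawling; already-indexed pages may linger until search engines recrawl, which is why redirects or canonical links would be the more thorough fix.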
The robots.txt file is not in any public repo. I’ve heard on Discord that it’s manually uploaded to an S3 bucket. Is there a procedure for changing it? If not, could we create one?
Having the exact same copy of the book for packaging (offline use) and for the online version makes sense in general, but in the case of the book reorganization it has created an absurd situation: the #1 thing you find online is a message that there’s no book for offline use.
Responses I’ve heard to this imply that doc deployment is unfixable, and that the docs are doomed to the most inflexible, primitive hosting that can’t do anything. Really? Can’t the online deployment rm -rf the folder intended for offline use only? Can I set up a server for you that supports 301 redirects? Can I give you a new robots.txt file to upload? Maybe Rust needs an SEO-WG?
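To make the 301 suggestion concrete: if the docs were fronted by anything configurable (nginx, CloudFront, etc.), the stale editions could be permanently redirected to the current book in a few lines. This is a sketch assuming nginx and assuming the old editions live under `/book/first-edition/` and `/book/second-edition/`; neither the server nor the paths are confirmed:

```
# Hypothetical nginx snippet: send stale book editions to the current book
# with a permanent (301) redirect, so search engines transfer ranking.
location ~ ^/book/(first|second)-edition(/.*)?$ {
    return 301 https://doc.rust-lang.org/book/;
}
```

Unlike a robots.txt block, 301s actively tell search engines which page supersedes the old one, so the stubs would eventually drop out of the index in favor of the current edition.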