A Rust-to-Rust declarative ABI

Hear me out: at the moment, Rust can use

extern "C"

to import functions from compiled C libraries

As such, what if we allowed the same with compiled Rust libs? Perhaps not an extern "Rust", but a more native approach: downloading a static (or dynamic) library and then having a .rs file that acts akin to a .d.ts file in TypeScript, declaring the functions with types and all of that cool stuff. (I am aware all of that would require each exported function to have a #[no_mangle] before it, or adding a #[no_mangle] on a mod to apply to all of its children, allowing the lib export of whole modules, or introducing a #[mangle] attribute with the opposite effect, mangling functions that would otherwise be unmangled. As such, might I suggest that libraries have no_mangle as the default and binaries do not, in order to keep the benefits of mangling and allow libs to link with binaries with little change.)

However, each of those functions would be implemented in the compiled static library, and code would then link against that lib, creating what is essentially a Rust-to-Rust ABI and allowing for fast, small distribution of libraries.

This would give the #[no_mangle] attribute more uses than just unsafe FFI bindings, instead allowing for safe Rust-to-Rust bindings as well.
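For comparison, this is roughly what the existing extern "C" route looks like: export with #[no_mangle], then declare the function "header-style" in an extern block. A minimal sketch, with the definition and the declaration squeezed into one file for brevity (the function name add is just an illustration; normally the two halves would live in separate crates):

```rust
// The library side: exported with a stable, unmangled symbol name.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

// The consumer side: a "header"-style declaration of the same symbol.
mod ffi {
    extern "C" {
        pub fn add(a: i32, b: i32) -> i32;
    }
}

fn main() {
    // Calling through the declaration is unsafe: the compiler trusts the
    // signature we wrote and cannot check it against the real definition.
    let sum = unsafe { ffi::add(2, 3) };
    println!("{sum}");
}
```

Note that nothing checks the declaration against the definition, which is exactly why a hypothetical .d.rs scheme would want compiler support rather than hand-written declarations.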

For example, Cargo could download these definition files along with a compiled lib file uploaded to crates.io, a git repo, etc.

This could increase build speed and likely reduce initial compile times on projects that use a lot of libraries (it may also substantially decrease data usage). And it would reduce the reliance on Cargo as a build system and allow easy distribution of Rust libs.

This sounds like I'm suggesting Rust header files, which isn't wrong, but there are many benefits to such a system. There is probably a better way to do it than what I've suggested; this is just what came to my head, and if anyone has a better way of achieving the same goal, I encourage you to share it below.

TL;DR: it would be cool to have a .d.rs file to declare the functions in a static lib so that code can be linked against them (assuming the no_mangle option is used or becomes the default).

edit: my brain is weird, and just spits out ideas, and grammar

Ok. I'm not quite sure what you're looking for, but I think all of the things you're asking for either exist or seem to be a bit of a misunderstanding of how programs are linked.

As such, what if we allowed the same with compiled Rust libs? Perhaps not an extern "Rust", but a more native approach: downloading a static (or dynamic) library and then having a .rs file that acts akin to a .d.ts file in TypeScript, declaring the functions with types and all of that cool stuff.

This exists (something comparable to a TypeScript declaration file, at least): it's the output of rustc --emit=metadata. The compiler can generate it on its own, and it is the part of an .rlib that dependents need in order to understand what was in the crate. It is little more than a binary-formatted "header", so to speak. There is no reason to cache this remotely, since it can be generated from source (perhaps your remote build system cache wants to cache them, but that's orthogonal).

(I am aware all of that would require each exported function to have a #[no_mangle] before it, or adding a #[no_mangle] on a mod to apply to all of its children, allowing the lib export of whole modules, or introducing a #[mangle] attribute with the opposite effect, mangling functions that would otherwise be unmangled. As such, might I suggest that libraries have no_mangle as the default and binaries do not, in order to keep the benefits of mangling and allow libs to link with binaries with little change.)

Rust is intimately aware of its own mangling scheme; the names in an rlib are already mangled, and Rust knows how to unmangle them. (An rlib is just your typical .a static library but with extra stuff that rustc knows how to read.)

This could increase build speed and likely reduce initial compile times on projects that use a lot of libraries (it may also substantially decrease data usage). And it would reduce the reliance on Cargo as a build system and allow easy distribution of Rust libs.

The largest sink in compilation time is LLVM, unfortunately. This doesn't help with that. You could pre-generate all of the metadata for all of the crates (very fast) and then lower each to assembly in parallel, but this destroys cross-crate inlining if you don't use LTO. I don't think Cargo has an option to do this, but I could be wrong. Cargo on its own falls over way before you get to the point where this scheme would be helpful, I suspect, though it seems like a sensible approach for a more powerful system like Bazel...

I repeat, this notion of a "header" already exists, and you can ask rustc to generate it.

TL;DR: it would be cool to have a .d.rs file to declare the functions in a static lib so that code can be linked against them (assuming the no_mangle option is used or becomes the default).

You can't link code without actually compiling the library. What it sounds like you're saying is "link against", which is already something you can do. rlib metadata has all the information you need here.


my god im smort

thank you why don't I just rtfm

you have just made my passion project possible


The important limitation of any such scheme (using extern "Rust", which is the default if you don't write extern) is that the "Rust" ABI is completely unstable.

If you only ever use one compiler, and only ever use one set of compiler flags, the resulting object files are safe to link together, as they calculate the same ABI for extern "Rust" items and functions.

But the moment you change a compiler flag or upgrade the compiler, details of the ABI are no longer guaranteed to match (due to e.g. (but not limited to) niche optimizations, PGO, whole program analysis, field randomization, MIR optimization fuel, ...), and you're no longer guaranteed to be able to safely link objects together (though it may incidentally end up working).
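To make one of those details concrete, here is a small sketch of a niche optimization you can observe today. The layout it demonstrates is what current rustc produces, not a stable promise (unlike, say, Option<&T>, whose null-pointer optimization is guaranteed):

```rust
use std::mem::size_of;

fn main() {
    // Current compilers pack Option's discriminant into bool's unused bit
    // patterns (a "niche"), so Option<bool> fits in a single byte.
    assert_eq!(size_of::<Option<bool>>(), 1);

    // This is an observation about today's rustc, not a language guarantee:
    // a future compiler may lay this out differently, which is one reason
    // objects built by different compiler versions can't safely share
    // extern "Rust" items.
    println!("Option<bool> is {} byte(s)", size_of::<Option<bool>>());
}
```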

If the general purpose is just reusing compilation time effort, check out sccache or even just setting $env:CARGO_TARGET_DIR to a shared directory.


The idea to distribute pre-built binaries on crates.io has come up before. But as other people have mentioned, the monomorphization + cross-crate inlining + unstable ABI issues make it hard to do in general.

For procedural macros, which have long compile times but are otherwise self-contained, there's a proof of concept which bundles the macro as WebAssembly: watt.

Cargo also supports pipelining, where we spit out the "header" part first, so that higher-up packages can start to build while the rest of the original crate is still compiling. (The linked post says it's experimental, but it's stable and enabled by default now 🙂)
