Why are we not stripping symbols on Mac and Linux binaries by default?

For some context, nightly Cargo currently has a profile option to control which symbols are stripped.

Until recently, stabilization was blocked because stripping did not work on macOS. This was fixed last month. The tracking issue for stabilizing this option is: Tracking issue for `-Z strip=val` option · Issue #72110 · rust-lang/rust · GitHub

There was some discussion in the tracking issue of enabling stripping by default for release builds, after the option is stabilized.
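For reference, the unstable profile option looks roughly like this (a sketch; it requires a nightly toolchain, and the exact syntax may change before stabilization):

```toml
# Cargo.toml -- opting in to the unstable `strip` profile option on nightly
cargo-features = ["strip"]

[profile.release]
strip = "symbols"   # or "debuginfo"; "none" disables stripping
```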


Personally, I prefer unstripped binaries, which can at least be understood to a limited extent even if you don't have debug info – which is sometimes painful to obtain even for open source software, and generally impossible for proprietary software. (Proprietary software often strips its binaries anyway, but it's nice when it doesn't.) Symbols are especially helpful in combination with a binary reverse engineering tool.


Could it become e.g. a Cargo.toml setting? That way it could be set according to the requirements of the binary crate's author.

It can be an option, but at least for release builds, binaries should be stripped by default.

I disagree. Even in release mode, symbol names are useful. What if your program crashes or hangs in production? Without symbols, you won't have a chance of finding where the crash happened. Stripping debuginfo other than symbol names in release mode, when debuginfo is not explicitly enabled for any crate, makes sense to me, though.

I think ultimately any deployment that cares about things at this level should be setting most, if not all, of these settings manually, and should at a minimum review the default settings to make sure they're appropriate for its specific use case.

If you ask what the default should be not for the people who are going to check anyway, but for the people who aren't going to check and just rely on the defaults being reasonable, then it makes sense not to strip by default, so that backtraces retain some amount of utility.

The people for whom stripping is required will check that the binary is stripped. The people for whom stripping is optional will appreciate the extra information in the backtrace.

Addendum: the default with opt-level s or z but no explicit strip setting should almost certainly be to strip, as an extension of the above reasoning: if you're asking for the smallest possible binary (via opt-level), then the reasonable default is to strip out the debug tables as well.


I'm going to go ahead and link

in case anyone here hasn't seen the current state.

With opt-level=s/z, when you need every last byte (think microcontrollers), I don't think it is uncommon to have a post-processing step that moves the symbol table to a separate debug file, so you still have the symbol table, just not consuming space in the executable.


How do you keep the symbols in Windows release binaries?

On Windows, symbols are in a separate pdb file.

I believe we reached a consensus that default options should not surprise people who don't even know the option exists. For example, in the stdlib, [T]::sort() is a stable sort and HashMap's default hasher has DoS resistance.

I think people will be surprised if their production process crashes without backtrace information.

There will also be people surprised that their release binaries are 10x the size they expect.

No matter what we do, the defaults will never be perfect for anyone. We should endeavor to make the defaults work well for many people, and make it easy to change them.

Personally, I think we should default to separate debug symbols. That way, you have debug symbols available if you need them, but not directly in the binary.
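Cargo has been growing an option in this direction; a sketch of what a separate-debug-symbols configuration might look like (the `split-debuginfo` profile option was still unstable at the time of writing, and the available values and platform support vary):

```toml
[profile.release]
split-debuginfo = "packed"   # debug info lands in a separate file (e.g. a .dSYM bundle on macOS)
```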


I agree with this as well.

There's [profile.release] debug = x for controlling this.

You can use [profile.release] debug = 1 if you want some symbols in release executables.

However, [profile.release] debug = false including any debug symbols in the executable is IMHO simply a bug. If it's configured not to add symbols, then it shouldn't.
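For context, the `debug` levels in a Cargo profile map roughly as follows (a sketch; see the Cargo profiles documentation for the authoritative list):

```toml
[profile.release]
debug = 1   # 0/false = no debug info, 1 = line tables only, 2/true = full debug info
```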

Not all symbols are debug symbols. debug=false cannot suppress all symbols, certainly not in staticlib or rlib crates (and dylib/cdylib crates, though there they get turned into dynamic symbols). Symbols and symbol references are emitted into object files by the compiler, and linkers don't remove the symbol table when generating final targets (shared objects or executables).
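The distinction is easy to see with any compiled binary; a minimal sketch using a C toolchain and nm (the file and function names here are illustrative):

```shell
# A function compiled WITHOUT -g still gets an entry in the symbol table.
cat > demo.c <<'EOF'
int helper(void) { return 42; }
int main(void) { return helper(); }
EOF
cc -o demo demo.c              # no -g: no DWARF debug info is generated
cp demo demo.stripped
strip demo.stripped            # a full strip removes the symbol table too

nm demo | grep helper          # present: symbols are not debug info
nm demo.stripped 2>/dev/null | grep helper || echo "gone after strip"
```

In other words, `debug = false` controls the DWARF debug info, while the linker's symbol table is a separate thing that only stripping removes.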