Why are we not stripping symbols on Mac and Linux binaries by default?

I think ultimately any deployment that cares about binaries at this level should set most, if not all, of these settings explicitly, and should at a minimum review the defaults to confirm they're appropriate for its specific use case.

If you ask what the default should be, not for the people who will check anyway but for the people who won't check and will simply rely on the defaults being reasonable, then it makes sense not to strip by default, so that backtraces retain some utility.

The people for whom stripping is required will check that the binary is stripped. The people for whom stripping is optional will appreciate the extra information in the backtrace.


Addendum: with opt-level s or z but no explicit strip setting, the default should almost certainly be to strip. This is an extension of the above reasoning: if you're asking for the smallest possible binary (via opt-level), then the reasonable default is to strip out the debug tables as well.
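For the deployments that should be setting these knobs explicitly, a Cargo profile along these lines covers both the size optimization and the strip behavior (a sketch; pick the `strip` value that matches your backtrace needs):

```toml
[profile.release]
opt-level = "z"    # optimize aggressively for size
strip = "symbols"  # strip symbols and debug info; use "debuginfo" to keep
                   # symbol names so backtraces remain somewhat useful
```

`strip = true` is shorthand for `"symbols"`; `"debuginfo"` is the middle ground this thread's backtrace argument points at.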
