The correct solution seems to be to not erase lifetimes but instead pass them through to monomorphization, perhaps by taking monomorphization out of trans and running it as a separate MIR pass beforehand.
The only thing that matters, however, is whether lifetimes are 'static and how they compare to each other. This means they can be represented as abstract numbers without region information, together with a partial order between them. Monomorphization then needs to normalize them: the first lifetime that appears in the signature being monomorphized is renamed to '0, the first different one to '1, and so on, and the result, augmented with the ordering information, is used as the key for monomorphization (and symbol mangling).
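A minimal sketch of that normalization, assuming lifetimes have already been lowered to abstract numeric ids with a reserved id for 'static (all names here are hypothetical, not rustc internals): rename ids by order of first appearance in the signature, remap the outlives pairs the same way, and use the pair of results as the key.

```rust
use std::collections::BTreeMap;

// Hypothetical reserved id for 'static; it is never renamed.
const STATIC: u32 = u32::MAX;

/// Canonicalize the lifetime ids appearing in a signature to 0, 1, 2, ...
/// in order of first appearance, and remap the partial-order (outlives)
/// pairs accordingly. Signatures that differ only in lifetime naming then
/// produce the same key for monomorphization and symbol mangling.
fn normalize(
    sig_lifetimes: &[u32],
    outlives: &[(u32, u32)],
) -> (Vec<u32>, Vec<(u32, u32)>) {
    let mut map: BTreeMap<u32, u32> = BTreeMap::new();
    let mut next = 0u32;
    let mut canon = Vec::new();
    for &lt in sig_lifetimes {
        let id = if lt == STATIC {
            STATIC
        } else {
            *map.entry(lt).or_insert_with(|| {
                let n = next;
                next += 1;
                n
            })
        };
        canon.push(id);
    }
    // Remap ordering pairs; pairs mentioning a lifetime that does not
    // occur in the signature are dropped in this simplified sketch.
    let mut order: Vec<(u32, u32)> = outlives
        .iter()
        .filter_map(|&(a, b)| {
            let ra = if a == STATIC { Some(STATIC) } else { map.get(&a).copied() };
            let rb = if b == STATIC { Some(STATIC) } else { map.get(&b).copied() };
            Some((ra?, rb?))
        })
        .collect();
    order.sort();
    (canon, order)
}

fn main() {
    // Two signatures differing only in lifetime naming normalize equally.
    let a = normalize(&[7, 3, 7], &[(7, 3)]);
    let b = normalize(&[1, 9, 1], &[(1, 9)]);
    assert_eq!(a, b);
    println!("{:?}", a);
}
```

Keying on the normalized form means, for example, that `fn f<'a, 'b>(&'a T, &'b T, &'a T)` with `'a: 'b` and the same signature with the lifetimes swapped collapse into one instantiation.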
This basic normalization should already reduce code generation explosion, but an extra pass could be added before monomorphization that determines which lifetime comparisons specialization could actually depend on, allowing a more powerful normalization that takes only those into account.
Then the lifetime assignment rules need to be made more precise, probably defined such that a lifetime is assigned, if possible, to be equal to the largest other lifetime it can be equal to (so if it can be 'static, it should be). Or, taking a global view, lifetimes are assigned so that the number of distinct lifetimes, including 'static, is minimized. I'm not totally sure these criteria are correct, though.
Of course, all of this assumes that lifetime specialization is a desirable feature in the first place, which would mean, for instance, being able to write a function that takes a &'a T and turns it into a Cow, either assigning to the Borrowed variant or cloning into the Owned variant depending on 'a.
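To make the motivation concrete, here is what that looks like in today's Rust, where the two behaviors must be separate functions (the names are mine, for illustration). The hypothetical specialized version would be a single `fn to_static<'a>(s: &'a str) -> Cow<'static, str>` that picks the cheap Borrowed path whenever 'a happens to be 'static and falls back to cloning otherwise.

```rust
use std::borrow::Cow;

// Borrowing path: the input already lives forever, so no allocation
// is needed. Under lifetime specialization this would be the body
// chosen when 'a == 'static.
fn to_static_if_static(s: &'static str) -> Cow<'static, str> {
    Cow::Borrowed(s)
}

// Cloning path: 'a may be shorter than 'static, so the data must be
// copied out. This would be the body chosen for all other 'a.
fn to_static_by_cloning(s: &str) -> Cow<'static, str> {
    Cow::Owned(s.to_owned())
}

fn main() {
    assert!(matches!(to_static_if_static("hi"), Cow::Borrowed(_)));
    let short_lived = String::from("hi");
    assert!(matches!(to_static_by_cloning(&short_lived), Cow::Owned(_)));
}
```

Since the choice of variant affects observable behavior (allocation or not), this is exactly the kind of code that forces monomorphization to distinguish 'static from other lifetimes rather than erasing them.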