Display for Duration

Isn't the standard library being too cautious about giving Duration a Display impl?

I understand the concern about SystemTime. Dates are messy and need localization, but Duration is very simple and has an obvious universal representation, already implemented in its Debug format.

Duration's existing debug formatting seems perfectly acceptable to me as an output for Display too. It's using a standard SI unit, and fractional seconds aren't more of an internationalization issue than existing Display of f32. I can imagine users may want to display Duration in other ways, like H:M:S or natural-language phrases, but Display doesn't promise to be the end of all formatting. bool's Display has already set the bar very low, and Duration can already do better than that.
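For reference, this is what Duration's Debug impl already produces on stable Rust — it picks the largest SI unit that fits and trims trailing zeros:

```rust
use std::time::Duration;

fn main() {
    // Debug picks the largest fitting SI unit: s, ms, µs, or ns.
    println!("{:?}", Duration::from_millis(100));    // 100ms
    println!("{:?}", Duration::new(2, 100_000_000)); // 2.1s
    println!("{:?}", Duration::from_micros(1));      // 1µs
    println!("{:?}", Duration::from_nanos(250));     // 250ns
}
```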


It's using a standard SI unit, and fractional seconds aren't more of an internationalization issue than existing Display of f32.

I think it's a bit more than that: 100ms would be 100мс in Russian; unit names are translated.

That being said, Display is quite inadequate for providing i18n anyway: we already have true and false, and we already use English error messages in the Display impls of various error types. So I think it's OK for Display in general to assume a US/international locale, and we should probably just document that.


And "0.5s" would be "0,5s" in German, for example.


However f32: Display is already not localized that way, so it would be consistent for Duration to also not have a localized decimal separator.
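A quick check confirms the existing impls are locale-independent — floats always use a decimal point and bool always prints English words, regardless of the system locale:

```rust
fn main() {
    // Display for floats always uses '.' as the decimal separator,
    // and bool always prints English words, independent of locale.
    assert_eq!(0.5f32.to_string(), "0.5");
    assert_eq!(true.to_string(), "true");
}
```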


I guess the question is where to draw the line between 0.5 and 12/31/2022. I don't mind the first, but would mind the latter. My personal impression is that 2.1s is somewhere in between. However, in my case:

  • I always use a decimal point instead of a decimal comma, because I'm so used to it from programming.
  • "s" is the abbreviation for seconds in German as well.

People from other backgrounds might cringe if they see an "s" because perhaps it's the abbreviation for hours in their language. I'm just speaking hypothetically to raise attention to that issue (not sure if it's really an issue; I don't speak many languages).

There are some things in std which later turned out to be a not-so-good idea. So taking "X is also implemented in std" as an argument may be misleading (not wanting to say it's the case here, I'm really not sure).


"s" is the official SI abbreviation of the unit, for what it’s worth, like J or Pa. But I’m not sure what the standard has to say about unit names in non-Latin alphabets (I guess it does have something to say at least). Still, being standardized by the SI carries a fair bit more weight than just being a conventional abbreviation in a few languages.


I like the argument that it should follow the SI & ISO 31-0 typographic conventions.

Note that Debug doesn't. Correct SI is "1 s", not "1s". I think keeping it one white-space-separable token is good for Debug, but perhaps Display should be "1.234 567 ms" instead of "1.234567ms".

(Apparently they allow , or . for the decimal separator, so we'd have to pick one of those, but . is pretty consistent with what we do in other Displays, so it'd probably be that one.)
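As a rough illustration, the ISO 31-0 style digit grouping could be bolted onto an existing decimal rendering. This is just a sketch with a hypothetical helper, not a proposed impl:

```rust
// Hypothetical helper: regroup the fractional digits of a plain decimal
// string into blocks of three, as ISO 31-0 suggests ("1.234567" -> "1.234 567").
fn group_fraction(s: &str) -> String {
    match s.split_once('.') {
        None => s.to_string(),
        Some((int, frac)) => {
            let grouped: Vec<String> = frac
                .as_bytes()
                .chunks(3)
                .map(|c| String::from_utf8_lossy(c).into_owned())
                .collect();
            format!("{}.{}", int, grouped.join(" "))
        }
    }
}

fn main() {
    assert_eq!(group_fraction("1.234567"), "1.234 567");
    assert_eq!(group_fraction("12"), "12"); // no fraction, unchanged
}
```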

And my interpretation is that the unit symbols should not be translated.

The symbols for quantities, printed in italic type, are generally single letters of the Greek or Latin alphabet and are recommendations. The symbols for units, printed in roman (upright) type, are mandatory (see chapter 5). [translated from the French]

EDIT: Regarding the comment after this, if we're following correct SI then it's definitely μs.


Since Display has stronger stability promises, we would have to agree on how to abbreviate microseconds, at least.

EDIT: Regarding the EDIT above, I'm also on team μs, even though copy-paste is easier than typing it.


If there are really stability guarantees regarding this decision, then it doesn't feel good (to me), because there are many "style" questions that cannot be corrected later (space vs no space, μ vs u, etc).

I would like to ask: What is the gain of implementing Display for Duration? That should be compared with the possible downsides. Maybe the downsides are small, but so is the gain (perhaps?).

I'll repeat my argument: I think this is too cautious. It's a bikeshed detail, and if it's not perfect, it doesn't matter.

People can debate spaces and ASCII vs Unicode in terminal forever, which means that Duration will never have any good-enough Display implementation just out of paralysis of the endless discussion.

I suggest stabilizing whatever it uses for Debug today, as-is, just to avoid bikeshedding. The details are just not important. Display is not a trait for getting formatting perfect. It already formats every single type suboptimally: numbers don't have thousands separators, floats have US typography and ASCIIfied exponent notation, bool hardcodes English, Display for Path does not match displayNameAtPath on macOS, and Display for errors is ad-hoc and weird with chaining. If someone cares whether durations should have a space or not, or needs to localize them, they will have to use something other than Rust's Display, which is little more than Debug without brackets.
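Concretely, "stabilize what Debug does today" would amount to a one-line delegation. Since user code can't impl Display for std's Duration (orphan rule), this sketch uses a hypothetical newtype:

```rust
use std::fmt;
use std::time::Duration;

// Hypothetical wrapper: Display simply reuses the existing Debug formatting.
struct HumanDuration(Duration);

impl fmt::Display for HumanDuration {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Debug::fmt(&self.0, f)
    }
}

fn main() {
    println!("{}", HumanDuration(Duration::from_millis(1500))); // 1.5s
}
```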


Since Display has stronger stability promises,

Imo this is a mischaracterization of the API. While Debug explicitly disclaims any stability, most Display impls don't actually make any positive promise either. There are unsettled previous debates about how stable they should be, whether round-tripping through FromStr should be guaranteed, and other things. Some have definitely changed before (e.g. SocketAddrV6) and may change again. Stability could also vary between the default and the alternate form.

Additionally, there is currently some conflation between Display and ToString due to the blanket impl (which was a mistake, imo). If specialization were available, they could be separated again, and we could say Display is for humans and ToString is for round-tripping.

We definitely have the option to implement Display without promising that it'll remain forever unchanged, especially if we don't offer a matching FromStr.


Random thought: if we impl Display for Duration, at some point we'd probably impl FromStr for Duration as well.


Not necessarily. For example, one could do a compact, approximate, human-friendly impl that only shows the 2 most significant units. E.g. 1 year + 1 day + 5 minutes displays as 1y1d, not 1y1d5m, while 1 year + 1 minute displays as 1y1m. For a human, "1 year + a tiny bit more" is usually all that's needed to eyeball something.

Obviously that kind of thing would be very inadequate for round-tripping, and so a FromStr impl wouldn't be appropriate — or at least it should use a different format. And in std we do have the privilege of being able to use specialization, so we could offer a separate ToString impl if necessary.
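A sketch of such a "two most significant units" formatter. The unit sizes here are assumptions for illustration only (1 y = 365 days, 1 d = 24 h), and `approx` is a hypothetical helper, not a proposed API:

```rust
use std::time::Duration;

// Hypothetical formatter: show only the 2 most significant nonzero units.
// Assumed unit sizes: y = 365 days, d = 24 h (calendar subtleties ignored).
fn approx(d: Duration) -> String {
    const UNITS: [(u64, &str); 5] = [
        (365 * 86_400, "y"),
        (86_400, "d"),
        (3_600, "h"),
        (60, "m"),
        (1, "s"),
    ];
    let mut secs = d.as_secs();
    let mut parts = Vec::new();
    for &(size, name) in &UNITS {
        if secs >= size {
            parts.push(format!("{}{}", secs / size, name));
            secs %= size;
            if parts.len() == 2 {
                break; // stop after the two most significant units
            }
        }
    }
    if parts.is_empty() {
        parts.push("0s".to_string());
    }
    parts.join("")
}

fn main() {
    // 1 year + 1 day + 5 minutes: the minutes are dropped.
    println!("{}", approx(Duration::from_secs(365 * 86_400 + 86_400 + 300))); // 1y1d
    // 1 year + 1 minute: intermediate zero units are skipped.
    println!("{}", approx(Duration::from_secs(365 * 86_400 + 60))); // 1y1m
}
```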

I don't think approximating by default is a good idea. That's what the precision field of the format string is for. If the precision field is not set, the full precision should be displayed, just like it is for f64 (sort of).


Note that there is an ISO 8601 duration encoding (unfortunately RFC 3339 doesn't specify a more sensible subset of it), so users may conceptually expect a FromStr impl to parse that.


The ISO 8601 duration includes calendar durations (like "P1M"), however.

So that's a reasonable expectation for a chrono::Duration, but not for std::time::Duration which cannot represent such durations.

Java Duration uses ISO 8601 duration formatting regardless. It is limited to non-calendar durations in the sense that Duration::toString returns a PT representation, while Duration::parse converts calendar-based units inexactly (e.g., a day is taken to be exactly 24 hours).

Java Instant uses ISO 8601 extended date + time representation in UTC (a.k.a. ISO instant) for both Instant::toString and Instant::parse.

Both seem quite sensible to me and definitely usable for implementing the Debug trait in Rust, maybe even good enough for implementing Display. After all, they are rather low-level representations, and I personally would completely accept that. I would not even object if parsing Duration were limited to only the PT part, unlike Java's acceptance of calendar units (converted to seconds under the hood), which might sometimes be surprising.
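A simplified sketch of what an ISO 8601-style serialization of a std Duration could look like. It emits only a seconds component (Java's Duration::toString also splits out hours and minutes), and `to_iso8601` is a hypothetical helper:

```rust
use std::time::Duration;

// Hypothetical, simplified ISO 8601 duration serialization using only the
// seconds component ("PT…S"); Java's toString also emits H and M parts.
fn to_iso8601(d: Duration) -> String {
    if d.subsec_nanos() == 0 {
        format!("PT{}S", d.as_secs())
    } else {
        // Zero-pad nanoseconds to 9 digits, then trim trailing zeros.
        let frac = format!("{:09}", d.subsec_nanos());
        format!("PT{}.{}S", d.as_secs(), frac.trim_end_matches('0'))
    }
}

fn main() {
    println!("{}", to_iso8601(Duration::from_secs(5)));      // PT5S
    println!("{}", to_iso8601(Duration::from_millis(1500))); // PT1.5S
}
```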


This, so hard.

There's a point where the amount of effort spent debating a feature and the time the feature is delayed outweighs any possible gains from choosing one strategy over another, and at that point we should just shrug, pick one implementation and stick with it.

Bikeshedding debates aren't just annoying, they're a trap for the developer brain.


Regarding the ISO-8601 format:

I'm quoting from ISO-8601:2000 final draft, which is not the latest version, so some of this might be outdated.

I believe the "P" prefix should not be there for Duration. The "P" prefix is used when a duration is part of a "time interval", but is not part of the "duration" itself:

Time-interval: a portion of time between two time-points. These time-points are respectively labelled “start” and “end”. Time intervals may be specified by these two time-points, by one of these time-points and the temporal distance between the points or by the temporal distance between these points only. [...]

Duration: a quantity (“length”) of time. [...]

In basic and extended format the complete representation for duration shall be nYnMnDTnHnMnS or nW. [...]

A time interval is expressed according to the following rules: [...]

b) for 5.5.1 b), c) and d) the designator [P] shall precede, without spaces, the representation of the duration

So a duration of 5 seconds would be T5S, not PT5S.

It's OK to assume days have exactly 24 hours, because ISO-8601 is agnostic about the time standard -- it doesn't have to be UTC or anything:

This International Standard does not assign any particular meaning or interpretation to any data element that uses representations in accordance with this International Standard. Such meaning will be determined by the context of the application. [...]

day: a time-unit of 24 hours

It's even allowed to assume that a month has 30 days!

month: unit of time of 28, 29, 30 or 31 days

NOTE In certain applications a month is regarded as a unit of time of 30 days.

year: a time-unit of 12 months, considered to approximate the duration required for one revolution of the earth around the sun

Display probably shouldn't use months and years, but FromStr could assume that months are 30 days and years are 12 × 30 = 360 days (hmm...). Or it could just reject months and years as ambiguous.

For fractions of a second, commas or periods are allowed, but commas are preferred. So if Duration were to use ISO-8601, maybe it should use a comma in Display?

If necessary for a particular application a decimal fraction of hour, minute or second may be included. If a decimal fraction is included, lower order components (if any) shall be omitted and the decimal fraction shall be divided from the integer part by the decimal sign specified in ISO 31-0: i.e. the comma [,] or full stop [.]. Of these, the comma is the preferred sign.

I am in possession of ISO 8601-1:2019, which is the latest standard. The "P" prefix is required.

Not quite (emphasis mine):

For example, the durations of a year, month, week, day, hour or minute may depend on when they occur [in a Gregorian calendar, a calendar month can have a duration of 28, 29, 30, or 31 days; in a 24-hour clock, a clock minute can have a duration of 59, 60, or 61 seconds, etc.]. Therefore, the exact duration can only be evaluated if the exact duration of each is known.

No such preference is indicated in ISO 8601-1:2019. It may be indicated in ISO 80000-1, which is referenced, but I am not in possession of that.
