We can potentially go as far as accepting an arbitrary pattern or expression in this position (except for type ascription) and reporting an ambiguity error only if the expression contains more than one identifier.
```
FIELD_PAT = IDENT: PAT | PAT
FIELD_EXPR = IDENT: EXPR | EXPR
```
e.g.
```rust
let field = ...;
let s = Struct { *&(-(field))? }; // OK, equivalent to `Struct { field: *&(-(field))? }`
let z = Struct { field + other_field }; // ERROR, not clear which one is the field
```
There’s a natural limit on complexity, since it’s hard to come up with anything complex that contains only one identifier, so it should be OK from a readability point of view in practice.
With type ascription (always type ascription…) K { field: T } is ambiguous in expression position, so I suppose that’s a thing to ban. What about K { self.foo }? Should that be K { foo: self.foo }? (self is a keyword, I guess?) If method calls are OK (K { foo.bar() } -> K { foo: foo.bar() }), I think you’ll wind up with a lot of potentially wild combinations, possibly with some .await and ? too.
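For concreteness, here’s a minimal sketch of the expansions being asked about, written with today’s explicit field syntax (the shorthand forms exist only in the proposal and are shown in comments); `K`, `Foo`, `foo`, `bar`, and `fallible` are placeholder names invented for illustration.

```rust
struct K { foo: usize }

struct Foo;
impl Foo {
    fn bar(&self) -> usize { 42 }
}

impl K {
    fn build(&self, foo: Foo, fallible: Result<usize, ()>) -> Result<K, ()> {
        // Proposed shorthand: `K { self.foo }`
        let _by_self = K { foo: self.foo };
        // Proposed shorthand: `K { foo.bar() }` (the single identifier `foo` picks the field)
        let _by_call = K { foo: foo.bar() };
        // `?` (and presumably `.await`) could show up in field position too
        Ok(K { foo: fallible? })
    }
}
```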
Can you elaborate on what this would mean, precisely?
Field punning with & seems ambiguous to read in a way that punning without & doesn’t. For example, when you write &baz in a struct literal, that feels like it could mean “let the field baz be a reference to the local baz” (baz: &baz), or it could mean “let &baz be the local baz” (baz: *baz)? Generally only one of those will typecheck, but this still feels ambiguous to read to me.
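To spell out the two readings with explicit field syntax (the struct and field names here are placeholders invented for illustration):

```rust
struct ByRef<'a> { baz: &'a i32 }
struct ByValue { baz: i32 }

fn readings() {
    // Reading 1: "the field `baz` is a reference to the local `baz`".
    let baz = 1;
    let _a = ByRef { baz: &baz };

    // Reading 2: "`&baz` is the local `baz`", so the field stores the dereferenced value.
    let baz = &2;
    let _b = ByValue { baz: *baz };
}
```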
For pattern matching, I prefer ref. And for literals, I find the ambiguity confusing to read.
I understand that, but it’s not hard to read it as Foo { *bar: bar, &baz: baz } instead, because both field and value have the same name. I understand that’s not the intended reading, but that ambiguity makes it easy to stumble when reading.
In struct literals this doesn’t seem ambiguous to me at all, in light of what C# does (as @scottmcm mentioned, new Foo { foo.bar.baz } is new Foo { baz = foo.bar.baz }). It’s the “innermost” thing that needs to match, and the whole expression is the thing that gets evaluated and stored. This behaves exactly the same as any other expression, while your alternative interpretation feels more like something you would expect from a pattern.
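A Rust transliteration of that reading, with explicit field syntax; all types and names here are placeholders invented for illustration.

```rust
struct Bar { baz: u32 }
struct FooLocal { bar: Bar }
struct Target { baz: u32 }

fn build(foo: FooLocal) -> Target {
    // Under the C#-style reading, `Target { foo.bar.baz }` would mean:
    // the innermost name `baz` selects the field, and the whole expression
    // is what gets evaluated and stored.
    Target { baz: foo.bar.baz }
}
```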
It’s the use in patterns that feels weird to me. I might get used to let Foo { &baz } = ... because it simply transplants existing pattern syntax into field position, preserving a useful property of patterns: it matches actual concrete structure in the object (a pointer). But let Foo { *bar } = ... is problematic because *bar is not a pattern: there is no “anti-pointer” in the object’s structure to match, and *bar can never be a full pattern anyway (that’s why it’s a binding mode spelled ref bar instead).
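For reference, here are the existing explicit spellings that the proposed shorthands would abbreviate; `Foo`, `bar`, and `baz` are placeholder names.

```rust
struct Foo<'a> { bar: String, baz: &'a i32 }

fn destructure(foo: Foo<'_>) {
    // `&baz` is a real pattern: it matches through the reference stored in
    // the `baz` field. `ref bar` is a binding mode, not structure: it borrows
    // the field instead of moving it, which is what `*bar` would be trying
    // to express.
    let Foo { baz: &baz, bar: ref bar } = foo;
    let _: i32 = baz;
    let _: &String = bar;
}
```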
Hmm… yeah, I do see your point. For me, the bigger improvement in ergonomics would come from doing this for struct literals, rather than pattern matching.
An alternative for struct literals would be an auto ref/deref, such that you still write Foo { bar }, which is expanded to Foo { bar: &bar } or Foo { bar: *bar } if necessary.
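A minimal sketch of what the compiler would insert under that alternative (the names are placeholders invented for illustration):

```rust
struct ByRef<'a> { bar: &'a u32 }
struct ByValue { bar: u32 }

fn build(ptr: &u32) {
    let bar = 5;
    // Written `ByRef { bar }` under the proposal; the compiler inserts `&`:
    let _a = ByRef { bar: &bar };

    let bar = ptr;
    // Written `ByValue { bar }` under the proposal; the compiler inserts `*`:
    let _b = ByValue { bar: *bar };
}
```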
I’m personally not in favor of autoref/deref. I have found that things like default match bindings make my life much harder, because what you write has all sorts of weird semantics behind it. I would rather have to go through an extra compiler error but know what the type of the binding is.
I’m not too keen on this approach either… in Fortran, C, and C++, the proliferation of flags and optional features means that it can be a pain to get anything to compile and run reproducibly, because you have to find the right incantation.
Strongly agreed. One of the best things about Rust from a developer experience point of view—and very much by contrast with the aforementioned languages—is that by and large you just run cargo build, and there’s not much notion of different strictness/warning levels/etc. (Modulo Clippy and a few other lints, perhaps, but that’s a rather different story.) Also, my experience with TypeScript in the last few years has me very sad about configurations that change the strictness of a compiler—and it has more than once ended up with some very surprising intersections of (un)strictness along the way.