Python chose `@` for matrix multiplication. How about also giving it to Rustaceans?
This has been discussed before, please use the search. The short answer is that it would have more drawbacks than benefits.
Can you speak to why you think it's needed? What's wrong with overloading `*` to work with matrices too? We have static types, so we don't need different operators for different types.
@H2CO3 I see there is a previous thread, "Add a matrix multiplication operator like Python?", about `mat_mul` not being commonly used. Would you please elaborate on the drawbacks?
> Can you speak to why you think it's needed?
Why not? For Python people switching to Rust, it's a welcome sign.
> What's wrong with overloading `*` to work with matrices too?
Maybe a half-second hold-back for people new to a library: is it an elementwise or a matrix multiplication?
This is an argument for adding every popular feature from every language ever, and thus isn't persuasive for anything.
Even with normal linear algebra matrices, per-element-pair multiplication is an important operation, and thus one must still pick between "dot multiplication" and "matrix multiplication".
I agree that wanton operator overloading can help create unreadable messes, but esoteric operators (perhaps ones that you cannot see a use for) can be extremely helpful in growing a language and making it useful and convenient for sometimes very specific purposes. I personally can see a use for really quite a lot of operators, and I use decent portions of that list on a regular basis.
Is that so? I don't remember coming across this operation in a 2-semester linear algebra course.
I'd be interested in what its applications are (for 2-dimensional matrices specifically).
Err, I'm confused here. If you can add every popular feature and they all fit together well, then it's probably a good idea.
The burden for new features is generally on those requesting them. Keep in mind that Rust is its own language. Python having an operator is not sufficient reason for Rust to implement it, no matter how "popular" it is there.
Why do you believe this is a common enough operation to warrant its own operator, and why do existing options not work?
@pipehappy helpful note: you can highlight text in a post, and then a Quote button will pop up which will insert a quote of the text you're replying to (like in @jhpratt's post) to make it clear what part of your post is the quote you're replying to, and what part is your reply. Alternatively, you can start a line with `>` to insert a block quote without a forum quote.
Also interesting to note is that there isn't a matrix type in std, so adding `@` as an operator would be adding an operator without any implementations on std types.
What you want to search for to see the downsides @H2CO3 alludes to is the more general idea of "custom operators." The key arguments against custom operators are that they make it harder to read code when using operators not drilled into our heads since grade school (this is arguable; well-known operators in a specific domain can make certain things easier to read), but also specifically concern binding strength / precedence.
With the standard arithmetic operators std defines, we (mostly, barring horrible gotcha posts that are deliberately ambiguous to gain interactions and go viral) agree on the order of operations between them. For custom operators, you either need to:
- have no precedence, and always require bracketing, removing the benefit of using operators;
- allow the custom operator impl to define binding strength, and somehow deal with that in the parser; or
- pre-define a binding strength (probably just straight left-associative) for custom operators and just be wrong for a number of them.
And defining precedence isn't even good enough, because you can still get multiple valid parses if you allow defining the same operator over different types with different associativity/binding strength. (Orphan rules may prevent this from being problematic? I'm really not certain.)
Consider as an example `A @ B @ C`, with `A @ B => T`, `B @ C => U`, `T @ C => Q`, and `A @ U => Z`. Either `(A @ B) @ C` or `A @ (B @ C)` is a valid parse, depending on the associativity and binding strength in effect where all of these operators are defined.
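To make the ambiguity concrete, here is a toy sketch (all the type names and the `matmul_*` functions are hypothetical stand-ins, since `@` does not exist in Rust) showing that both parses are well-typed but produce different result types:

```rust
// Hypothetical operand and result types from the example above.
#[derive(Debug, PartialEq)]
struct A;
#[derive(Debug, PartialEq)]
struct B;
#[derive(Debug, PartialEq)]
struct C;
#[derive(Debug, PartialEq)]
struct T;
#[derive(Debug, PartialEq)]
struct U;
#[derive(Debug, PartialEq)]
struct Q;
#[derive(Debug, PartialEq)]
struct Z;

// The four overloads `A @ B => T`, `B @ C => U`, `T @ C => Q`,
// and `A @ U => Z`, written as plain functions.
fn matmul_ab(_: A, _: B) -> T { T }
fn matmul_bc(_: B, _: C) -> U { U }
fn matmul_tc(_: T, _: C) -> Q { Q }
fn matmul_au(_: A, _: U) -> Z { Z }

// The left-associative parse `(A @ B) @ C` yields a `Q`...
fn parse_left(a: A, b: B, c: C) -> Q {
    matmul_tc(matmul_ab(a, b), c)
}

// ...while the right-associative parse `A @ (B @ C)` yields a `Z`.
fn parse_right(a: A, b: B, c: C) -> Z {
    matmul_au(a, matmul_bc(b, c))
}
```

Both functions compile, so the compiler cannot reject either parse on type grounds; the language would have to pick one by fiat.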
So, we pull back from "full custom operators" to just some small defined set with defined precedence and associativity... but... whose? That's the other problem with operators beyond the basic ones: there's less than no practical agreement on what their associativity and relative binding strength is (and far from agreement on usage).
Adding one operator on its own can seem harmless enough, but as soon as you ask "why this operator and not others" the complications come from every direction. And you can't pick a reasonable subset of operators (whereas e.g. for identifiers, Unicode defines what an identifier should be),[1] beyond the simple mathematical operators we already have, without playing favorites between equally reasonable options.
So the burden falls on proving that adding something new to the language is worth the extra cost of teaching it, and with the downsides of not having a clearly best design, it's very hard to pass.

[1] Actually, the same UAX #31 defines `Pattern_Syntax` as a potential set of codepoints to be used for syntactical elements in a programming language, leaving all others non-semantic to the language. This can be used to define e.g. what codepoints are allowed in custom operators in a mostly natural-language-agnostic manner, but it does nothing to resolve the inherent difficulty of actually supporting custom operators without a) them being abused and b) having to deal with all of the issues of associativity and binding strength. As a note, Haskell, which uses custom operators a lot, created its own search engine to search for its custom operators. That's another big downside of custom operators: searchability in conventional search engines is next to none.
@CAD97 Thanks! This is quite informative.
> Also interesting to note is that there isn't a matrix type in std, so adding `@` as an operator would be adding an operator without any implementations on std types.
This looks like an ecosystem perspective, and it depends on how you envision it. Matlab and Mathematica have it built in; there are also implementations in libraries, like NumPy.
> The key arguments against custom operators are that they make it harder to read code when using operators not drilled into our heads since grade school
Agreed. Something that may help here is the assumption that people will not use hard-to-read code on a daily basis.
> also specifically about binding strength / precedence.
With a specific use case in mind, can we just make it straight left-associative?
> beyond the simple mathematic operators we already have, without playing favorites between equally reasonable options.
No, I feel there are no equal options. Some are favored, others are not.
> So the burden falls on proving that adding something new to the language is worth the extra cost of teaching it, and with the downsides of not having a clearly best design,
Using `@` rather than something like `.*` is a reduction in education cost.
> Python having an operator is not sufficient reason for Rust to implement it, no matter how "popular" it is there.
Agreed. The Python part is about why `@` may be the option rather than other symbols.
> Why do you believe this is a common enough operation to warrant its own operator, and why do existing options not work?
I can't say whether it's common enough or not; I don't have numbers. Let me put the 'why' part like this:
Adding an operator `@` allows an implementation to support both the elementwise product and the dot product at the operator level in the same unit of code.
This gets to CAD's point, though: Why stop there? If elementwise and dot product are worth having operators, why not cross product too? Or tensor product?
And the calculus is different for a unityped language like Python and a multi-typed language like Rust. In Rust we can have https://doc.rust-lang.org/nightly/std/simd/struct.Simd.html#impl-Mul<Simd<f32%2C%20LANES>> for elementwise, for example.
> This gets to CAD's point, though: Why stop there? If elementwise and dot product are worth having operators, why not cross product too? Or tensor product?
If you want to go ahead, sure you can. But do you?
"Common gradelevel operators" is a strongly supported stopping point for adding a fixed list of operators into the language.
If you want to add any more, you need to show why the new set of operators is more justified than this pre-existing set. (And you also need to explain why it's better than "your set, but with my least favorite new operator swapped out for my favorite.")
From the dialogue, it seems there are two parts to the discussion:
Can we have one more operator in std::ops, intended to support the dot and elementwise products, together with std::ops::Mul?
The current spec treats `*` as arithmetic multiplication. By least surprise, a tensor library author may overload std::ops::Mul as the elementwise product as well. Meanwhile, the dot product is also a common operation that may want to overload std::ops::Mul. Adding one more operator allows these two operations to each have their own symbol to overload. Without it, people will fall back to function calls.
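A minimal sketch of that fallback, using a toy `Mat2` type of my own (not any real library's API): `*` is overloaded as the elementwise product via `std::ops::Mul`, while the matrix/dot product has no operator left to claim and ends up as a named method:

```rust
use std::ops::Mul;

// Toy 2x2 matrix type for illustration only.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Mat2([[f64; 2]; 2]);

// `*` is claimed by the elementwise (Hadamard) product.
impl Mul for Mat2 {
    type Output = Mat2;
    fn mul(self, rhs: Mat2) -> Mat2 {
        let mut out = [[0.0; 2]; 2];
        for i in 0..2 {
            for j in 0..2 {
                out[i][j] = self.0[i][j] * rhs.0[i][j];
            }
        }
        Mat2(out)
    }
}

impl Mat2 {
    // With no second multiplication operator available,
    // matrix multiplication falls back to a named method.
    fn matmul(self, rhs: Mat2) -> Mat2 {
        let mut out = [[0.0; 2]; 2];
        for i in 0..2 {
            for j in 0..2 {
                for k in 0..2 {
                    out[i][j] += self.0[i][k] * rhs.0[k][j];
                }
            }
        }
        Mat2(out)
    }
}
```

A library could of course make the opposite choice (operator for matrix multiplication, method for elementwise); the point is that only one of the two gets the symbol.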
One discussion is about popularity: if no one uses it, then don't do it. The argument is twofold. First, there are languages supporting both products at the operator level, like Python, Matlab, and Mathematica; one common label you can see here is that they are math-model heavy. Second, does Rust fit here? I think Rust fits and people will use it.
The other discussion is about other tensor operations. It's a subjective topic, but there are common behaviors and expectations. People feel that syntax sugar for two kinds of product is enough, as suggested by other implementations.
Another concern is about use cases beside multiplication, which may cause hard-to-read code. Readability has higher priority these days; hard-to-read code won't be popular. If some hard-to-read code is popular, then that's a sublanguage.
That one more operator can be `@`.
The bottom line is that another symbol would also fit. But 1. `@` is already in the language, and its description is in the bottom section of that chapter in the book; 2. it's used for the dot product in another language, thus less surprise.
To my mind, the principle of least surprise suggests that neither form of multiplication should use `*`, because either a reader or writer could mistakenly assume which form of multiplication it means.
For the record, `@` already has a meaning in Rust, as a binary operator in patterns. The fact that this meaning has nothing to do with matrix multiplication is rather an argument against using `@`.
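For reference, this is the existing pattern-binding meaning of `@` (a standard Rust feature; the `classify` function is just an illustrative example of mine):

```rust
fn classify(n: i32) -> &'static str {
    match n {
        // `@` binds the matched value to `d` while also testing
        // that it matches the range pattern `0..=9`.
        d @ 0..=9 => {
            let _ = d; // the binding is usable in this arm
            "single digit"
        }
        _ => "other",
    }
}
```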
Also note that you seem to be using the term "dot product" wrong. It's IMO rather unfortunate that Python's `.dot` method, as well as Rust's ndarray, use a method called "dot" for matrix multiplication (even though, for two vectors, the dot product [aka "scalar product"] can be the same as matrix multiplication, if the left one is made into a row vector and the second into a column vector).
Also note that I've personally never seen "@" being used for matrix multiplication before in any programming language that I knew (I obviously don't know Python all that well).
It's even worse than that! The dot product has a scalar output, while matrix multiplication between a row vector and a column vector produces a 1x1 matrix, which, for us programmers, is different than a scalar. Moreover, if you multiply together two 1x1 matrices, all 3 options are valid (dot product, elementwise multiplication, and matrix multiplication...)
I know. I didn't want to go into this level of detail. I had a professor for linear algebra who cared a lot about differentiating things like matrices from vectors or functions. Linear transformations are not the same as matrices; 1×n matrices or m×1 matrices are not the same as vectors. It's often formally inconsistent, or at least quite nontrivial, to "identify" things in the ways it's often done in mathematics.
Still, it is nonetheless common to do these "identifications" in mathematics, and it's not really problematic if it just means that you treat these things as "almost the same" and mean to have implicit conversions between them when necessary; but this approach only works when there's a human around who can "intuitively" tell you where implicit conversions are necessary. It's the same as lots of other ambiguities in standard mathematical notation. If you do want to apply mathematical notation/convention to programming, you need to disambiguate by introducing more/different notation, and/or by carefully overloading based on consistent and unsurprising rules, and there's often more than one way to design a system like this.
I don't know how Python handles it, but I wouldn't be surprised if, through the magic of dynamic typing, 1×1 matrices would automatically become scalars.
For ndarray, there are separate vector and matrix types, so the overloaded `dot` operation is just a trait operation that does vector-vector multiplication yielding a scalar, matrix-vector or vector-matrix multiplication yielding a vector, or matrix-matrix multiplication yielding a matrix.
Note that if you do interpret 1×1 matrices as basically-the-same as scalars, then of course the three operations also produce the same result.
I do consider elementwise multiplication quite unnatural if you're working with actual matrices in the context of linear algebra, not just with "arbitrary" multidimensional array data. In this sense, I do find it quite logical how `ndarray` makes `*` be elementwise on its `Array2` type, while `nalgebra` uses matrix multiplication on its `DMatrix` type and other `Matrix<…>` types (all are two-dimensional, but many are fixed-size).
Everyone else has made really good points on why this is probably not a good idea, so I'm going to come at the problem from a completely different angle: how common is `@` on keyboards worldwide?
I ask because of the problems that APL ran into, which were solved by making special keyboards so you could type the symbols without having to memorize what keys mapped to what operators. The `@` symbol is probably a safe bet, if for no other reason than everyone's email address has the symbol in it, but assuming that your proposal was accepted, which symbol is used for the next operator? And the one after that? Etc.?
If you decide to give up `@` and other single-character operators, you might use something like `#k"mat_mul"`[1] to create an operator, but at that point I'd really prefer to see `a.mat_mul(b)`.
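A sketch of what that method-based alternative could look like, with a hypothetical `MatMul` trait and a toy matrix type (neither exists in std or in any particular crate):

```rust
// Hypothetical trait: method-call syntax instead of a new operator.
trait MatMul<Rhs = Self> {
    type Output;
    fn mat_mul(self, rhs: Rhs) -> Self::Output;
}

// Toy 2x2 integer matrix for illustration only.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Mat2([[i64; 2]; 2]);

impl MatMul for Mat2 {
    type Output = Mat2;
    // Plain matrix multiplication; `a.mat_mul(b)` reads unambiguously,
    // with no precedence or associativity questions to settle.
    fn mat_mul(self, rhs: Mat2) -> Mat2 {
        let mut out = [[0; 2]; 2];
        for i in 0..2 {
            for j in 0..2 {
                for k in 0..2 {
                    out[i][j] += self.0[i][k] * rhs.0[k][j];
                }
            }
        }
        Mat2(out)
    }
}
```

Chaining also stays explicit: `a.mat_mul(b).mat_mul(c)` spells out the left-associated grouping that an operator would leave implicit.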
Also, please don't say 'Unicode'. Yes, I know that every possible math symbol is going to be encoded in there somewhere, but if it isn't on my keyboard, I don't want to deal with it.
[1] There was an RFC at one point about reserving a namespace for new keywords, but I can't find it. I hope I got it right!