For swizzles like this, the frunk crate can be used.
For units of measurement there are a few crates available on crates.io.
Note that it's already possible to write

```rust
(v.x, v.x, v.y, v.y) = (1.0, 1.0, 2.0, 2.0);
```

today, and that compiles fine.
You might also be interested in the nalgebra-glm crate. Your micro example could better be written as:

```rust
// assuming `use nalgebra_glm as glm;` and nalgebra's swizzle methods
let a = glm::vec4(1.0f32, 2.0, 3.0, 4.0);
let mut b = a.xxy().push(a.z);
(b.x, b.y) = (b.z, b.w);
```
There are no semantics in the proposal that can't be provided by a library. The only new thing added by the OP here is syntax, namely:
- Type constructor functions (e.g. `vec4`) which are overloaded for any sequence of scalar and vector values that has the correct number of components.
- Swizzle field access and assignment.
- Looser implicit type coercions around vectors (the only example usage of which is the constructor functions, which are likely just taking any type which can `as`-coerce to the inferred type, and likely defaulting to `f32` instead of `f64` for unspecified-type `{float}` values used for vectors).
Of the ones named by the OP:
- cgmath is still used, but has gone five years without an update and still disclaims "not all of the functionality has been implemented yet." I think it's fair to call it superseded.
- nalgebra aims to be a highly generic linear algebra crate that supports vectors and matrices of any type and any size, including dynamically sized and struct-of-array-of-struct uses. (Instead of the `f32vec4` type being `f32x4`, nalgebra suggests using e.g. `Vector4<f32x4>`, where you make SIMD vectors the element of the algebraic vector.) This generality has a significant impact on ergonomics/DevX by way of compile time, type errors (e.g. big generic type names instead of simple named structs), and documentation (although this did get a bit better with generated method documentation for type aliases).
- glam deliberately defines only the concrete types it considers useful, in the simple obvious manner, for the explicit reason that this results in a simpler-to-use library than nalgebra's much more complicated and involved setup. Personally, I see people choosing glam unless they get the not-invented-here urge to edit/improve either library's API surface, or to create an nth standard that surely everyone will adopt. (I have been tempted in the past.)
- vek appears to be an attempted middle ground. I'm not too sure about it.
- The notable extra example of bevy_math just directly reexports glam's types, although IIRC it used to wrap them and provide a custom API surface instead.
I understand that maybe none of the existing ones are it, but if one claims to know the one correct design for this in core, it might be good to at least demonstrate that in a crate first -- and then there's still the question of whether it needs to move into std at all.
My default answer there is that if the syntax is worth having, it should be a general thing available to all libraries, not something special just for these types.
And thus the RFC for that syntax (like swizzle pseudo-fields) needs to justify its own existence with the vector types being just one possible motivation.
(For example, it needs to explain why a .zyxw() method wouldn't be good enough, since this isn't GLSL.)
Use for what purpose? Game dev? That wouldn't work well for std. What if I want to make a 4D game (I know of at least one 4D puzzle game), or a visualiser for hypercubes projecting into 3D?
When doing some Monte Carlo simulations some years ago I needed to represent the state as (iirc) 6D vectors.
And what about other fields such as quantum physics, chemistry etc? I don't know what vector lengths would be useful in those, but I think a comprehensive survey should be done before putting anything into std. Gamedev/graphics isn't everything, and there are uses for small vectors outside this. Not everything falls neatly into large-and-sparse or 4D-is-enough.
I would really like to see this concern addressed in the RFC, I suspect the current design would end up useless to a lot of folk outside of gamedev. I would prefer the dimensionality to be a type level parameter.
glam's set of types is different, but a choice of types that is directly precedented would be those available in GLSL, namely (using OP's name construction):
`boolvec2`, `boolvec3`, `boolvec4`, `i32vec2`, `i32vec3`, `i32vec4`, `u32vec2`, `u32vec3`, `u32vec4`, `f32vec2`, `f32vec3`, `f32vec4`, `f64vec2`, `f64vec3`, `f64vec4`, `f32mat2`, `f32mat3`, `f32mat4`, `f64mat2`, `f64mat3`, `f64mat4`, `f32mat2x2`, `f32mat2x3`, `f32mat2x4`, `f32mat3x2`, `f32mat3x3`, `f32mat3x4`, `f32mat4x2`, `f32mat4x3`, `f32mat4x4`, `f64mat2x2`, `f64mat2x3`, `f64mat2x4`, `f64mat3x2`, `f64mat3x3`, `f64mat3x4`, `f64mat4x2`, `f64mat4x3`, and `f64mat4x4`.
Extending to cover the "stable" extensions provided by the GLM library adds:
`f32quat`, `f64quat`, `boolvec1`, `i32vec1`, `u32vec1`, `f32vec1`, and `f64vec1`.
Extending to cover the "recommended" extensions provided by GLM adds:
`i32mat2`, `i32mat3`, `i32mat4`, `u32mat2`, `u32mat3`, `u32mat4`, `i32mat2x2`, `i32mat2x3`, `i32mat2x4`, `i32mat3x2`, `i32mat3x3`, `i32mat3x4`, `i32mat4x2`, `i32mat4x3`, `i32mat4x4`, `u32mat2x2`, `u32mat2x3`, `u32mat2x4`, `u32mat3x2`, `u32mat3x3`, `u32mat3x4`, `u32mat4x2`, `u32mat4x3`, and `u32mat4x4`.
For completion, extending to cover the "experimental" extensions adds:
`f32dualquat`, `f64dualquat`, `isizevec1`, `isizevec2`, `isizevec3`, and `isizevec4`.
So, highly game/graphics focused. My point stands: this is too narrow a view for something that is going into std.
Yeah, I think the fundamental issue with this proposal is that Rust is not trying to have "tier-1" gamedev usability, and this would not be super useful to anyone outside of physics (engineering, think of simulations for materials / shape solvers), sound (music editors, think of Adobe Audition), games, graphics (realtime rendering, offline solutions, things like Photoshop and Blender), and possibly asset tooling (think of ffmpeg, image processing).
Which is maybe something Rust should not cover? I was under the impression it tried to when I wrote this pre-RFC, but now I'm not. Those areas (except games) are less about complicated systems, usually don't benefit from aliasing analysis and complex type systems as much as e.g. drivers, and are more about "raw processing power" wrapped nicely. It's sad that there are no compilers as high-quality and polished that try to cover such niches, though.
I believe 12D would be an internally consistent, arbitrarily chosen maximum length for small vectors.
I'm not against solving the issue for gamedev. What I'm worried about is the solution being so narrow that it isn't also more widely useful -- in particular, if the design happens to be such that it can't grow to support more use cases down the line. I don't want this to be a dead end for everyone else. Remember, once something is stable in std it is there forever, warts and all.