I believe this topic has been discussed here before, so in the interest of brevity I'll summarize the suggestion as: integrate 8BitMate's and dtolnay's `reflect` crate into the compiler.
That crate's ambition is the central point of this proposal: a simpler API that lets macro authors write data-generic macros while avoiding edge cases around lifetimes and generic types.
This reflection would happen only at compile time and would compile down to an ordinary proc macro for use within a larger program. As the aforementioned crate shows, this need not come at the cost of runtime speed, nor does it introduce any kind of reflection at runtime.
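To make the "no runtime cost" point concrete: whatever the reflection API looks like, a reflection-based derive would still just emit ordinary Rust code. As a sketch (the `Point` type is invented for illustration), here is the kind of plain impl such a macro would expand to; nothing reflective survives into the compiled program:

```rust
// What a macro-generated impl looks like: plain code, no runtime
// reflection machinery. A compile-time reflection API would only change
// how macro *authors* produce this output, not what gets compiled.
struct Point {
    x: i32,
    y: i32,
}

// The kind of impl a data-generic derive would expand to.
impl std::fmt::Debug for Point {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        f.debug_struct("Point")
            .field("x", &self.x)
            .field("y", &self.y)
            .finish()
    }
}

fn main() {
    let p = Point { x: 1, y: 2 };
    assert_eq!(format!("{:?}", p), "Point { x: 1, y: 2 }");
}
```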
What this would look like, AIUI, would be more API surface in the compiler.
The important questions to answer:
- What would the API look like? Would `proc_macro` still only deal in `TokenStream`s? Would it get minimal AST types?
- What are the benefits of providing this over the `proc_macro` bridge instead of in a library (i.e. `reflect`)?
- Likely: performance, avoiding reparsing, automatically(ish) handling new constructs and features
- How does this avoid stabilizing the syntax tree?
  - i.e., not doing so is why `proc_macro` only deals in the much more stable token trees
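To ground the first question: today a proc macro receives only a `TokenStream` and must reparse it (typically with `syn`). A "minimal AST types" direction might look roughly like the following sketch; every name here is invented for illustration and is not a proposed API:

```rust
// Hypothetical minimal AST that a reflection-flavored proc_macro API
// could hand to macro authors instead of raw tokens. Invented for
// illustration; it loosely mirrors what syn::DeriveInput provides.
#[derive(Debug, PartialEq)]
enum MinimalItem {
    Struct { name: String, fields: Vec<String> },
    Enum { name: String, variants: Vec<String> },
}

// A macro author matches on structure instead of re-parsing tokens.
fn field_names(item: &MinimalItem) -> Vec<String> {
    match item {
        MinimalItem::Struct { fields, .. } => fields.clone(),
        MinimalItem::Enum { variants, .. } => variants.clone(),
    }
}

fn main() {
    let item = MinimalItem::Struct {
        name: "Point".to_string(),
        fields: vec!["x".to_string(), "y".to_string()],
    };
    assert_eq!(field_names(&item), vec!["x", "y"]);
}
```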
As for the API, I'd imagine the `proc_macro` API wouldn't change much at all. I really view them as two separate APIs.
The point of doing this at the compiler level instead of at a library level is precisely about handling new compiler constructs in a better way. For example, the `reflect` library barely has support for associated types, never mind the incoming GATs. I really see no reason to implement a whole new compiler in a library when we've already got one right here.
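To make the associated-types point concrete: a library-level mirror of the type system has to model constructs like the following and keep up as the language grows, while the compiler already understands them natively. A minimal example of a generic associated type (stable since Rust 1.65):

```rust
// A GAT: the associated type itself takes a lifetime parameter.
// Any library re-implementing the compiler's type layer must model
// this shape; the compiler already has it for free.
trait LendingIterator {
    type Item<'a>
    where
        Self: 'a;

    fn next(&mut self) -> Option<Self::Item<'_>>;
}

// A lender that hands out overlapping borrows of its own buffer,
// which ordinary Iterator cannot express.
struct Windows {
    data: Vec<i32>,
    pos: usize,
}

impl LendingIterator for Windows {
    type Item<'a> = &'a [i32];

    fn next(&mut self) -> Option<Self::Item<'_>> {
        if self.pos + 2 <= self.data.len() {
            let w = &self.data[self.pos..self.pos + 2];
            self.pos += 1;
            Some(w)
        } else {
            None
        }
    }
}

fn main() {
    let mut it = Windows { data: vec![1, 2, 3], pos: 0 };
    assert_eq!(it.next(), Some(&[1, 2][..]));
    assert_eq!(it.next(), Some(&[2, 3][..]));
    assert_eq!(it.next(), None);
}
```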
Avoiding stabilizing the syntax tree would be an issue, no doubt. I have no easy solutions for that.
So how do you see reflection being used, if not in proc macros, then?
Oh, I meant to say that the API for the macro developer would be separate; the API for the macro consumer would stay the same as for proc macros today. It's just that I can foresee difficulties handling both reflection types and token trees at the same level of abstraction.
I've come back to think about this, and I think I've got an answer for the third question.
In dtolnay's crate, there actually is a separate AST representation from what the compiler uses internally. So perhaps a compromise between `TokenStream`s and stabilizing the internal AST would be to create a separate AST specifically for this API, which gets translated to and from the internal compiler AST.
I realize this would be a massive amount of work, but it would allow separation of concerns between the internal API and the external API.
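As a hedged sketch of that compromise (every name below is invented, and the "internal" type is just a stand-in): a small stable surface type, converted from the unstable internal representation, lets the two evolve independently, with unknown new internal kinds degrading gracefully rather than breaking the stable API:

```rust
// Stand-in for the compiler's internal, unstable node. In reality this
// would be rustc's own AST/HIR; a u8 discriminant is used here purely
// for illustration.
struct InternalExpr {
    kind: u8,
    value: i64,
}

// The stable, external-facing node exposed to macro authors.
#[derive(Debug, PartialEq)]
enum StableExpr {
    IntLiteral(i64),
    Unsupported, // new internal kinds degrade gracefully
}

impl From<&InternalExpr> for StableExpr {
    fn from(e: &InternalExpr) -> Self {
        match e.kind {
            0 => StableExpr::IntLiteral(e.value),
            _ => StableExpr::Unsupported,
        }
    }
}

fn main() {
    let internal = InternalExpr { kind: 0, value: 42 };
    assert_eq!(StableExpr::from(&internal), StableExpr::IntLiteral(42));

    // A kind the stable API doesn't know about yet maps to Unsupported
    // instead of breaking downstream macros.
    let new_kind = InternalExpr { kind: 9, value: 0 };
    assert_eq!(StableExpr::from(&new_kind), StableExpr::Unsupported);
}
```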
A stable representation is actually being worked on as part of the rust-linting project.
Well that's convenient. Would it be suitable for this purpose?
Dunno; it's not done, nor am I actively working on it. @xFrednet is the one doing the work.
The goal of the API I'm currently designing is to create a stable and driver-independent semantic representation of Rust code. Though I'm creating this representation for analysis, not code generation. This goal has a few implications.
- It tries to hide the internal representation and prevent users from creating nodes themselves. This lets drivers assume all input is valid, with no validation required.
- Analysis most often requires type information. Therefore, it'll attach to rustc in the `LateLintPass`, while macros get expanded way earlier in compilation.
Based on this, I would guess that the representation for rust-linting is not suitable.
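The first implication above (hiding the representation so drivers can trust input validity) is essentially the opaque-constructor pattern. A small sketch with invented names, not the rust-linting API:

```rust
// Opaque-node pattern: users can inspect nodes but cannot construct
// them, so the driver can trust that every node it receives was
// produced, and therefore validated, by itself.
mod api {
    pub struct Node {
        name: String, // private field: no literal construction outside
    }

    impl Node {
        // Only driver-side code creates nodes, validating as it goes.
        pub(crate) fn new_validated(name: &str) -> Option<Node> {
            if name.is_empty() {
                None
            } else {
                Some(Node { name: name.to_string() })
            }
        }

        pub fn name(&self) -> &str {
            &self.name
        }
    }
}

fn main() {
    // Users receive nodes; writing `api::Node { .. }` directly is a
    // compile error because the field is private.
    let node = api::Node::new_validated("my_fn").expect("valid name");
    assert_eq!(node.name(), "my_fn");
    assert!(api::Node::new_validated("").is_none());
}
```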
Understandable. Thank you for your time.
In that situation, I'd guess an API more like dtolnay's would be required: a representation of the AST that abstracts away generics and lifetimes, one that, as dtolnay himself put it, "conceptually represents the monomorphized type".
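As a rough sketch of what "representing the monomorphized type" means: instead of modeling generic parameters and lifetimes, the macro author only ever sees types as if already fully substituted. A toy model with invented types, not `reflect`'s actual API:

```rust
// Toy "monomorphized" type view: generics and lifetimes are
// substituted away, so only concrete shapes remain.
#[derive(Debug, PartialEq, Clone)]
enum MonoType {
    Named(String),
    Reference(Box<MonoType>), // lifetime erased: just "a reference to T"
}

// Substituting a concrete type for a generic parameter yields a fully
// concrete MonoType, the only thing the macro author would handle.
fn substitute(param: &str, concrete: &MonoType, ty: &MonoType) -> MonoType {
    match ty {
        MonoType::Named(n) if n == param => concrete.clone(),
        MonoType::Named(n) => MonoType::Named(n.clone()),
        MonoType::Reference(inner) => {
            MonoType::Reference(Box::new(substitute(param, concrete, inner)))
        }
    }
}

fn main() {
    // `&'a T` with T = String becomes "a reference to String"; the
    // lifetime never appears in the monomorphized view at all.
    let generic = MonoType::Reference(Box::new(MonoType::Named("T".into())));
    let mono = substitute("T", &MonoType::Named("String".into()), &generic);
    assert_eq!(
        mono,
        MonoType::Reference(Box::new(MonoType::Named("String".into())))
    );
}
```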
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.