I believe this topic has been discussed here before, so in the interest of brevity I'll summarize the suggestion as: integrate 8BitMate's and dtolnay's reflect crate into the compiler.
That crate's ambition is the central point of this proposal: a simpler API that lets macro authors write data-generic macros while avoiding the edge cases around lifetimes and generic types.
This reflection would happen only at compile time and compile down to a proc macro for use within a larger program. As the aforementioned crate shows, this need not come at the cost of runtime speed, or introduce any kind of reflection at runtime at all.
As for the API, I'd imagine the proc_macro API wouldn't change much at all. I really view them as two separate APIs.
The point of doing this at the compiler level rather than in a library is precisely to handle new compiler constructs better. For example, the reflect library barely supports associated types, never mind the incoming GATs. I see no reason to implement a whole new compiler in a library when we've already got one right here.
Avoiding stabilizing the syntax tree would be an issue, no doubt. I have no easy solutions for that.
Oh, I meant to say that the API for the macro developer would be separate; the API for the macro consumer would stay the same as for proc macros. It's just that I can see difficulties in handling both reflection types and token trees at the same level of abstraction.
I've come back to think about this, and I think I've got an answer for the third question.
In dtolnay's crate, there is in fact a separate representation of the AST from what the compiler uses internally. So perhaps a compromise between TokenStreams and stabilizing the internal AST would be to create a separate AST specifically for this API, which gets translated to and from the internal compiler AST.
I realize this would be a massive amount of work, but it would allow separation of concerns between the internal API and the external API.
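To make the shape of that compromise concrete, here is a minimal sketch of a "mirror" AST that is lowered to and from a stand-in internal representation. All type and function names here are invented for illustration; the real compiler AST is vastly richer, and the actual translation boundary would be far more involved.

```rust
// A hypothetical stable, external-facing AST exposed to macro authors.
#[derive(Debug, Clone, PartialEq)]
pub enum StableItem {
    Struct { name: String, fields: Vec<String> },
    Enum { name: String, variants: Vec<String> },
}

// A stand-in for the compiler's internal, unstable AST.
#[derive(Debug)]
enum InternalItem {
    StructDef(String, Vec<String>),
    EnumDef(String, Vec<String>),
}

// Lowering from internal to stable: this boundary is where the compiler
// could evolve InternalItem freely while keeping StableItem fixed.
fn to_stable(item: &InternalItem) -> StableItem {
    match item {
        InternalItem::StructDef(name, fields) => StableItem::Struct {
            name: name.clone(),
            fields: fields.clone(),
        },
        InternalItem::EnumDef(name, variants) => StableItem::Enum {
            name: name.clone(),
            variants: variants.clone(),
        },
    }
}

// And back again, for code produced through the stable API.
fn from_stable(item: &StableItem) -> InternalItem {
    match item {
        StableItem::Struct { name, fields } => {
            InternalItem::StructDef(name.clone(), fields.clone())
        }
        StableItem::Enum { name, variants } => {
            InternalItem::EnumDef(name.clone(), variants.clone())
        }
    }
}

fn main() {
    let internal = InternalItem::StructDef(
        "Point".to_string(),
        vec!["x".to_string(), "y".to_string()],
    );
    // Round-tripping through the stable AST preserves the item.
    let stable = to_stable(&internal);
    let back = from_stable(&stable);
    println!("{:?} -> {:?}", stable, back);
}
```

The separation of concerns is exactly this translation layer: only `to_stable`/`from_stable` would need updating when the internal AST changes.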
The goal of the API I'm currently designing is to create a stable, driver-independent semantic representation of Rust code. Though, I'm creating this representation for analysis, not code generation. This goal has a few implications.
It tries to hide the internal representation and prevents users from creating nodes themselves. This lets drivers assume that all input is valid and doesn't require any validation.
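The "no user-constructed nodes" idea can be sketched with ordinary Rust visibility: private fields plus a restricted constructor mean only the driver can build nodes, so consumers can never hand the analysis an invalid one. The names below are invented for this example, and in a real design the boundary would be a crate boundary rather than a module.

```rust
mod repr {
    // Private fields: code outside this module cannot construct an
    // ExprNode directly, so every node in existence is driver-made.
    #[derive(Debug)]
    pub struct ExprNode {
        kind: String,
        span_start: usize,
        span_end: usize,
    }

    impl ExprNode {
        // Only the driver side may create nodes, and it upholds the
        // validity invariants while doing so.
        pub(crate) fn new(kind: &str, span: (usize, usize)) -> ExprNode {
            assert!(span.0 <= span.1, "driver guarantees valid spans");
            ExprNode {
                kind: kind.to_string(),
                span_start: span.0,
                span_end: span.1,
            }
        }

        // Consumers get read-only accessors instead of public fields.
        pub fn kind(&self) -> &str {
            &self.kind
        }

        pub fn span(&self) -> (usize, usize) {
            (self.span_start, self.span_end)
        }
    }
}

fn main() {
    // The driver hands out nodes; analysis code can only inspect them.
    let node = repr::ExprNode::new("call", (10, 24));
    println!("{} at {:?}", node.kind(), node.span());
}
```

Because downstream code can never forge a node, the driver doesn't need to re-validate anything it receives back.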
Analysis most often requires type information. Therefore, it'll attach to rustc in the LateLintPass, while macros get expanded much earlier in compilation.
Based on this, I would guess that the rust-linting representation is not suitable for this use case.