rust/src/libproc_macro/quote.rs
Alex Crichton 553c04d9eb proc_macro: Reorganize public API
This commit is a reorganization of the `proc_macro` crate's public user-facing
API. This is the result of a number of discussions at the recent Rust All-Hands
where we're hoping to get the `proc_macro` crate into ship shape for
stabilization of a subset of its functionality in the Rust 2018 release.

The reorganization here is motivated by experiences from the `proc-macro2`,
`quote`, and `syn` crates on crates.io (and other crates which depend on them).
The main focus is future flexibility along with making a few more operations
consistent and/or fixing bugs. A summary of the changes made from today's
`proc_macro` API is:

* The `TokenNode` enum has been removed, along with the public fields of
  `TokenTree`. Instead the `TokenTree` type is now a public enum (what
  `TokenNode` was) and each variant is an opaque struct which internally
  contains `Span` information. This makes the various tokens a bit more
  consistent, requires fewer wrappers, and otherwise provides good
  future-compatibility, as opaque structs are easy to modify later on.
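
  As a sketch of what consuming the new enum looks like (this only compiles
  inside a proc-macro crate and only runs during macro expansion; the
  `respan` helper name is hypothetical), note that because every variant
  carries its own `Span`, `set_span` works uniformly on any `TokenTree`:

  ```rust
  extern crate proc_macro;
  use proc_macro::{Span, TokenStream, TokenTree};

  // Re-span every token in a stream. Each `TokenTree` variant is an
  // opaque struct with internal `Span` data, so no auxiliary span
  // bookkeeping is needed.
  fn respan(stream: TokenStream, span: Span) -> TokenStream {
      stream.into_iter()
          .map(|mut tree| {
              tree.set_span(span);
              tree
          })
          .collect()
  }
  ```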

* `Literal` integer constructors have been expanded to be unambiguous as to what
  they're doing and also allow for more future flexibility. Previously
  constructors like `Literal::float` and `Literal::integer` were used to create
  unsuffixed literals and the concrete methods like `Literal::i32` would create
  a suffixed token. The suffixed/unsuffixed distinction wasn't immediately
  clear to all users, and having *one* constructor for unsuffixed literals
  required us to pick a largest type, an assumption that may not always hold.
  To fix these issues all constructors are now of the form
  `Literal::i32_unsuffixed` or `Literal::i32_suffixed` (for all integral types).
  This should allow future compatibility as well as making it immediately clear
  what's suffixed and what isn't.
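
  For illustration, the new naming spells out the suffix behavior at the call
  site (a sketch that only compiles in a proc-macro crate and only runs during
  expansion; the `one_tokens` helper name is hypothetical):

  ```rust
  extern crate proc_macro;
  use proc_macro::{Literal, TokenStream, TokenTree};

  fn one_tokens() -> TokenStream {
      // Emits the token `1i32` -- the suffix appears in the output.
      let suffixed = Literal::i32_suffixed(1);
      // Emits the bare token `1` -- the `i32` in the constructor name
      // only fixes the argument type, no suffix is emitted.
      let unsuffixed = Literal::i32_unsuffixed(1);
      vec![TokenTree::from(suffixed), TokenTree::from(unsuffixed)]
          .into_iter()
          .collect()
  }
  ```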

* Each variant of `TokenTree` internally contains a `Span` which can also be
  configured via `set_span`. For example `Literal` and `Term` now both
  internally contain a `Span` rather than having it stored in an auxiliary
  location.

* Constructors of all tokens are called `new` now (aka `Term::intern` is gone)
  and most do not take spans. Manufactured tokens typically don't have a fresh
  span to go with them and the span is purely used for error-reporting
  **except** the span for `Term`, which currently affects hygiene. The default
  span for all these constructed tokens is `Span::call_site()` for now.

  The `Term` type's constructor explicitly requires passing in a `Span` to
  provide future-proofing against possible hygiene changes. It's intended that a
  first pass of stabilization will likely only stabilize `Span::call_site()`
  which is an explicit opt-in for "I would like no hygiene here please". The
  intention here is to make this explicit in procedural macros to be
  forwards-compatible with a hygiene-specifying solution.
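
  A sketch of the explicit opt-in this describes (proc-macro context only;
  the `ident` helper name is hypothetical):

  ```rust
  extern crate proc_macro;
  use proc_macro::{Span, Term, TokenTree};

  // `Term::new` requires a `Span` up front. Passing `Span::call_site()`
  // explicitly says "no hygiene here, please" -- resolve this name at the
  // macro call site -- leaving room for hygiene-specifying spans later.
  fn ident(name: &str) -> TokenTree {
      TokenTree::from(Term::new(name, Span::call_site()))
  }
  ```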

* Some of the conversions for `TokenStream` have been simplified a little.

* The `TokenTreeIter` iterator was renamed to `token_stream::IntoIter`.

Overall the hope is that this is the "final pass" at the API of `TokenStream`
and most of `TokenTree` before stabilization. Explicitly left out here is any
changes to `Span`'s API which will likely need to be re-evaluated before
stabilization.

All changes in this PR have already been reflected in the [`proc-macro2`],
`quote`, and `syn` crates. New versions of all these crates have also been
published to crates.io.

Once this lands in nightly I plan on making an internals post again summarizing
the changes made here and also calling on all macro authors to give the APIs a
spin and see how they work. Barring major issues, we can then have an FCP to
stabilize later this cycle!

[`proc-macro2`]: https://docs.rs/proc-macro2/0.3.1/proc_macro2/
2018-04-02 13:48:34 -07:00


// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! # Quasiquoter
//! This file contains the implementation internals of the quasiquoter provided by `quote!`.
//! This quasiquoter uses macros 2.0 hygiene to reliably access
//! items from `proc_macro`, to build a `proc_macro::TokenStream`.
use {Delimiter, Literal, Spacing, Span, Term, Op, Group, TokenStream, TokenTree};
use syntax::ext::base::{ExtCtxt, ProcMacro};
use syntax::parse::token;
use syntax::tokenstream;

pub struct Quoter;

pub fn unquote<T: Into<TokenStream> + Clone>(tokens: &T) -> TokenStream {
    tokens.clone().into()
}

pub trait Quote {
    fn quote(self) -> TokenStream;
}
macro_rules! tt2ts {
    ($e:expr) => (TokenStream::from(TokenTree::from($e)))
}
macro_rules! quote_tok {
    (,) => { tt2ts!(Op::new(',', Spacing::Alone)) };
    (.) => { tt2ts!(Op::new('.', Spacing::Alone)) };
    (:) => { tt2ts!(Op::new(':', Spacing::Alone)) };
    (|) => { tt2ts!(Op::new('|', Spacing::Alone)) };
    (::) => {
        [
            TokenTree::from(Op::new(':', Spacing::Joint)),
            TokenTree::from(Op::new(':', Spacing::Alone)),
        ].iter()
            .cloned()
            .map(|mut x| {
                x.set_span(Span::def_site());
                x
            })
            .collect::<TokenStream>()
    };
    (!) => { tt2ts!(Op::new('!', Spacing::Alone)) };
    (<) => { tt2ts!(Op::new('<', Spacing::Alone)) };
    (>) => { tt2ts!(Op::new('>', Spacing::Alone)) };
    (_) => { tt2ts!(Op::new('_', Spacing::Alone)) };
    (0) => { tt2ts!(Literal::i8_unsuffixed(0)) };
    (&) => { tt2ts!(Op::new('&', Spacing::Alone)) };
    ($i:ident) => { tt2ts!(Term::new(stringify!($i), Span::def_site())) };
}
macro_rules! quote_tree {
    ((unquote $($t:tt)*)) => { $($t)* };
    ((quote $($t:tt)*)) => { ($($t)*).quote() };
    (($($t:tt)*)) => { tt2ts!(Group::new(Delimiter::Parenthesis, quote!($($t)*))) };
    ([$($t:tt)*]) => { tt2ts!(Group::new(Delimiter::Bracket, quote!($($t)*))) };
    ({$($t:tt)*}) => { tt2ts!(Group::new(Delimiter::Brace, quote!($($t)*))) };
    ($t:tt) => { quote_tok!($t) };
}
macro_rules! quote {
    () => { TokenStream::empty() };
    ($($t:tt)*) => {
        [$(quote_tree!($t),)*].iter()
            .cloned()
            .flat_map(|x| x.into_iter())
            .collect::<TokenStream>()
    };
}
impl ProcMacro for Quoter {
    fn expand<'cx>(&self, cx: &'cx mut ExtCtxt,
                   _: ::syntax_pos::Span,
                   stream: tokenstream::TokenStream)
                   -> tokenstream::TokenStream {
        let mut info = cx.current_expansion.mark.expn_info().unwrap();
        info.callee.allow_internal_unstable = true;
        cx.current_expansion.mark.set_expn_info(info);
        ::__internal::set_sess(cx, || TokenStream(stream).quote().0)
    }
}
impl<T: Quote> Quote for Option<T> {
    fn quote(self) -> TokenStream {
        match self {
            Some(t) => quote!(Some((quote t))),
            None => quote!(None),
        }
    }
}
impl Quote for TokenStream {
    fn quote(self) -> TokenStream {
        if self.is_empty() {
            return quote!(::TokenStream::empty());
        }
        let mut after_dollar = false;
        let tokens = self.into_iter().filter_map(|tree| {
            if after_dollar {
                after_dollar = false;
                match tree {
                    TokenTree::Term(_) => {
                        let tree = TokenStream::from(tree);
                        return Some(quote!(::__internal::unquote(&(unquote tree)),));
                    }
                    TokenTree::Op(ref tt) if tt.op() == '$' => {}
                    _ => panic!("`$` must be followed by an ident or `$` in `quote!`"),
                }
            } else if let TokenTree::Op(tt) = tree {
                if tt.op() == '$' {
                    after_dollar = true;
                    return None;
                }
            }

            Some(quote!(::TokenStream::from((quote tree)),))
        }).flat_map(|t| t.into_iter()).collect::<TokenStream>();

        if after_dollar {
            panic!("unexpected trailing `$` in `quote!`");
        }

        quote!(
            [(unquote tokens)].iter()
                .cloned()
                .flat_map(|x| x.into_iter())
                .collect::<::TokenStream>()
        )
    }
}
impl Quote for TokenTree {
    fn quote(self) -> TokenStream {
        match self {
            TokenTree::Op(tt) => quote!(::TokenTree::Op( (quote tt) )),
            TokenTree::Group(tt) => quote!(::TokenTree::Group( (quote tt) )),
            TokenTree::Term(tt) => quote!(::TokenTree::Term( (quote tt) )),
            TokenTree::Literal(tt) => quote!(::TokenTree::Literal( (quote tt) )),
        }
    }
}

impl Quote for char {
    fn quote(self) -> TokenStream {
        TokenTree::from(Literal::character(self)).into()
    }
}

impl<'a> Quote for &'a str {
    fn quote(self) -> TokenStream {
        TokenTree::from(Literal::string(self)).into()
    }
}

impl Quote for usize {
    fn quote(self) -> TokenStream {
        TokenTree::from(Literal::usize_unsuffixed(self)).into()
    }
}

impl Quote for Group {
    fn quote(self) -> TokenStream {
        quote!(::Group::new((quote self.delimiter()), (quote self.stream())))
    }
}

impl Quote for Op {
    fn quote(self) -> TokenStream {
        quote!(::Op::new((quote self.op()), (quote self.spacing())))
    }
}

impl Quote for Term {
    fn quote(self) -> TokenStream {
        quote!(::Term::new((quote self.as_str()), (quote self.span())))
    }
}

impl Quote for Span {
    fn quote(self) -> TokenStream {
        quote!(::Span::def_site())
    }
}
macro_rules! literals {
    ($($i:ident),*; $($raw:ident),*) => {
        pub enum LiteralKind {
            $($i,)*
            $($raw(usize),)*
        }

        impl LiteralKind {
            pub fn with_contents_and_suffix(self, contents: Term, suffix: Option<Term>)
                -> Literal {
                let sym = contents.sym;
                let suffix = suffix.map(|t| t.sym);
                match self {
                    $(LiteralKind::$i => {
                        Literal {
                            token: token::Literal(token::Lit::$i(sym), suffix),
                            span: contents.span,
                        }
                    })*
                    $(LiteralKind::$raw(n) => {
                        Literal {
                            token: token::Literal(token::Lit::$raw(sym, n), suffix),
                            span: contents.span,
                        }
                    })*
                }
            }
        }

        impl Literal {
            fn kind_contents_and_suffix(self) -> (LiteralKind, Term, Option<Term>) {
                let (lit, suffix) = match self.token {
                    token::Literal(lit, suffix) => (lit, suffix),
                    _ => panic!("unsupported literal {:?}", self.token),
                };

                let (kind, contents) = match lit {
                    $(token::Lit::$i(contents) => (LiteralKind::$i, contents),)*
                    $(token::Lit::$raw(contents, n) => (LiteralKind::$raw(n), contents),)*
                };
                let suffix = suffix.map(|sym| Term::new(&sym.as_str(), self.span()));
                (kind, Term::new(&contents.as_str(), self.span()), suffix)
            }
        }

        impl Quote for LiteralKind {
            fn quote(self) -> TokenStream {
                match self {
                    $(LiteralKind::$i => quote! {
                        ::__internal::LiteralKind::$i
                    },)*
                    $(LiteralKind::$raw(n) => quote! {
                        ::__internal::LiteralKind::$raw((quote n))
                    },)*
                }
            }
        }

        impl Quote for Literal {
            fn quote(self) -> TokenStream {
                let (kind, contents, suffix) = self.kind_contents_and_suffix();
                quote! {
                    (quote kind).with_contents_and_suffix((quote contents), (quote suffix))
                }
            }
        }
    }
}

literals!(Byte, Char, Float, Str_, Integer, ByteStr; StrRaw, ByteStrRaw);
impl Quote for Delimiter {
    fn quote(self) -> TokenStream {
        macro_rules! gen_match {
            ($($i:ident),*) => {
                match self {
                    $(Delimiter::$i => { quote!(::Delimiter::$i) })*
                }
            }
        }

        gen_match!(Parenthesis, Brace, Bracket, None)
    }
}

impl Quote for Spacing {
    fn quote(self) -> TokenStream {
        macro_rules! gen_match {
            ($($i:ident),*) => {
                match self {
                    $(Spacing::$i => { quote!(::Spacing::$i) })*
                }
            }
        }

        gen_match!(Alone, Joint)
    }
}