auto merge of #19118 : jakub-/rust/roll-up, r=jakub-

This commit is contained in:
bors 2014-11-20 00:27:07 +00:00
commit 399ff259e1
104 changed files with 1627 additions and 1099 deletions

configure

@ -1031,6 +1031,7 @@ do
make_dir $h/test/doc-guide-tasks
make_dir $h/test/doc-guide-plugin
make_dir $h/test/doc-guide-crates
make_dir $h/test/doc-guide-error-handling
make_dir $h/test/doc-rust
done


@ -27,7 +27,7 @@
######################################################################
DOCS := index intro tutorial guide guide-ffi guide-macros guide-lifetimes \
guide-tasks guide-container guide-pointers guide-testing \
guide-plugin guide-crates complement-bugreport \
guide-plugin guide-crates complement-bugreport guide-error-handling \
complement-lang-faq complement-design-faq complement-project-faq \
rustdoc guide-unsafe guide-strings reference


@ -30,17 +30,25 @@ endef
$(BG):
$(Q)mkdir -p $(BG)
$(BG)RustLexer.class: $(SG)RustLexer.g4
$(BG)RustLexer.class: $(BG) $(SG)RustLexer.g4
$(Q)$(CFG_ANTLR4) -o $(B)grammar $(SG)RustLexer.g4
$(Q)$(CFG_JAVAC) -d $(BG) $(BG)RustLexer.java
$(BG)verify: $(SG)verify.rs rustc-stage2-H-$(CFG_BUILD) $(LD)stamp.regex_macros $(LD)stamp.rustc
$(Q)$(RUSTC) -O --out-dir $(BG) -L $(L) $(SG)verify.rs
check-build-lexer-verifier: $(BG)verify
ifeq ($(NO_REBUILD),)
VERIFY_DEPS := rustc-stage2-H-$(CFG_BUILD) $(LD)stamp.regex_macros $(LD)stamp.rustc
else
VERIFY_DEPS :=
endif
$(BG)verify: $(BG) $(SG)verify.rs $(VERIFY_DEPS)
$(Q)$(RUSTC) --out-dir $(BG) -L $(L) $(SG)verify.rs
ifdef CFG_JAVAC
ifdef CFG_ANTLR4
ifdef CFG_GRUN
check-lexer: $(BG) $(BG)RustLexer.class $(BG)verify
check-lexer: $(BG) $(BG)RustLexer.class check-build-lexer-verifier
$(info Verifying libsyntax against the reference lexer ...)
$(Q)$(SG)check.sh $(S) "$(BG)" \
"$(CFG_GRUN)" "$(BG)verify" "$(BG)RustLexer.tokens"


@ -199,7 +199,7 @@ check-docs: cleantestlibs cleantmptestlogs check-stage2-docs
# Some less critical tests that are not prone to breakage.
# Not run as part of the normal test suite, but tested by bors on checkin.
check-secondary: check-build-compiletest check-lexer check-pretty
check-secondary: check-build-compiletest check-build-lexer-verifier check-lexer check-pretty
# check + check-secondary.
#


@ -0,0 +1,228 @@
% Error Handling in Rust

> The best-laid plans of mice and men
> Often go awry
>
> "Tae a Moose", Robert Burns

Sometimes, things just go wrong. It's important to have a plan for when the
inevitable happens. Rust has rich support for handling errors that may (let's
be honest: will) occur in your programs.

There are two main kinds of errors that can occur in your programs: failures,
and panics. Let's talk about the difference between the two, and then discuss
how to handle each. Then, we'll discuss upgrading failures to panics.

# Failure vs. Panic

Rust uses two terms to differentiate between two forms of error: failure, and
panic. A **failure** is an error that can be recovered from in some way. A
**panic** is an error that cannot be recovered from.

What do we mean by 'recover'? Well, in most cases, the possibility of an error
is expected. For example, consider the `from_str` function:

```{rust,ignore}
from_str("5");
```

This function takes a string argument and converts it into another type. But
because it's a string, you can't be sure that the conversion actually works.
For example, what should this convert to?

```{rust,ignore}
from_str("hello5world");
```

This won't work. So we know that this function will only work properly for some
inputs. It's expected behavior. We call this kind of error 'failure.'

On the other hand, sometimes there are errors that are unexpected, or which
we cannot recover from. A classic example is an `assert!`:

```{rust,ignore}
assert!(x == 5);
```

We use `assert!` to declare that something is true. If it's not true, something
is very wrong. Wrong enough that we can't continue with things in the current
state. Another example is using the `unreachable!()` macro:

```{rust,ignore}
enum Event {
    NewRelease,
}

fn probability(_: &Event) -> f64 {
    // real implementation would be more complex, of course
    0.95
}

fn descriptive_probability(event: Event) -> &'static str {
    match probability(&event) {
        1.00 => "certain",
        0.00 => "impossible",
        0.00 ... 0.25 => "very unlikely",
        0.25 ... 0.50 => "unlikely",
        0.50 ... 0.75 => "likely",
        0.75 ... 1.00 => "very likely",
    }
}

fn main() {
    std::io::println(descriptive_probability(NewRelease));
}
```

This will give us an error:

```{notrust,ignore}
error: non-exhaustive patterns: `_` not covered [E0004]
```

While we know that we've covered all possible cases, Rust can't tell. It
doesn't know that probability is between 0.0 and 1.0. So we add another case:

```rust
use Event::NewRelease;

enum Event {
    NewRelease,
}

fn probability(_: &Event) -> f64 {
    // real implementation would be more complex, of course
    0.95
}

fn descriptive_probability(event: Event) -> &'static str {
    match probability(&event) {
        1.00 => "certain",
        0.00 => "impossible",
        0.00 ... 0.25 => "very unlikely",
        0.25 ... 0.50 => "unlikely",
        0.50 ... 0.75 => "likely",
        0.75 ... 1.00 => "very likely",
        _ => unreachable!()
    }
}

fn main() {
    println!("{}", descriptive_probability(NewRelease));
}
```

We shouldn't ever hit the `_` case, so we use the `unreachable!()` macro to
indicate this. `unreachable!()` gives a different kind of error than `Result`.
Rust calls these sorts of errors 'panics.'

# Handling errors with `Option` and `Result`

The simplest way to indicate that a function may fail is to use the `Option<T>`
type. Remember our `from_str()` example? Here's its type signature:

```{rust,ignore}
pub fn from_str<A: FromStr>(s: &str) -> Option<A>
```

`from_str()` returns an `Option<A>`. If the conversion succeeds, it will return
`Some(value)`, and if it fails, it will return `None`.
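
Handling the returned `Option` is an ordinary `match`; a minimal sketch in the
era's syntax (the values here are hypothetical):

```{rust,ignore}
match from_str::<int>("5") {
    Some(n) => println!("the number is {}", n), // conversion succeeded
    None => println!("that wasn't a number"),   // conversion failed; recover here
}
```
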

This is appropriate for the simplest of cases, but doesn't give us a lot of
information in the failure case. What if we wanted to know _why_ the conversion
failed? For this, we can use the `Result<T, E>` type. It looks like this:
```rust
enum Result<T, E> {
    Ok(T),
    Err(E)
}
```

This enum is provided by Rust itself, so you don't need to define it to use it
in your code. The `Ok(T)` variant represents a success, and the `Err(E)` variant
represents a failure. Returning a `Result` instead of an `Option` is recommended
for all but the most trivial of situations.

Here's an example of using `Result`:

```rust
#[deriving(Show)]
enum Version { Version1, Version2 }

#[deriving(Show)]
enum ParseError { InvalidHeaderLength, InvalidVersion }

fn parse_version(header: &[u8]) -> Result<Version, ParseError> {
    if header.len() < 1 {
        return Err(ParseError::InvalidHeaderLength);
    }
    match header[0] {
        1 => Ok(Version::Version1),
        2 => Ok(Version::Version2),
        _ => Err(ParseError::InvalidVersion)
    }
}

let version = parse_version(&[1, 2, 3, 4]);
match version {
    Ok(v) => {
        println!("working with version: {}", v);
    }
    Err(e) => {
        println!("error parsing header: {}", e);
    }
}
```

This function makes use of an enum, `ParseError`, to enumerate the various
errors that can occur.

# Non-recoverable errors with `panic!`

In the case of an error that is unexpected and not recoverable, the `panic!`
macro will induce a panic. This will crash the current task, and give an error:
```{rust,ignore}
panic!("boom");
```

gives

```{notrust,ignore}
task '<main>' panicked at 'boom', hello.rs:2
```

when you run it.

Because these kinds of situations are relatively rare, use panics sparingly.

# Upgrading failures to panics

In certain circumstances, even though a function may fail, we may want to treat
it as a panic instead. For example, `io::stdin().read_line()` returns an
`IoResult<String>`, a form of `Result`, when there is an error reading the
line. This allows us to handle and possibly recover from this sort of error.

If we don't want to handle this error, and would rather just abort the program,
we can use the `unwrap()` method:

```{rust,ignore}
io::stdin().read_line().unwrap();
```

`unwrap()` will `panic!` if the `Option` is `None` (or, for a `Result`, if it
is `Err`). This basically says "Give
me the value, and if something goes wrong, just crash." This is less reliable
than matching the error and attempting to recover, but is also significantly
shorter. Sometimes, just crashing is appropriate.

There's another way of doing this that's a bit nicer than `unwrap()`:

```{rust,ignore}
let input = io::stdin().read_line()
                       .ok()
                       .expect("Failed to read line");
```

`ok()` converts the `IoResult` into an `Option`, and `expect()` does the same
thing as `unwrap()`, but takes a message. This message is passed along to the
underlying `panic!`, providing a better error message if the code errors.
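
A `match` on the `IoResult` itself makes the upgrade from failure to panic
explicit; a sketch using the same era APIs (the message text is hypothetical):

```{rust,ignore}
let input = match io::stdin().read_line() {
    Ok(line) => line,                               // the failure case is handled...
    Err(e) => panic!("failed to read line: {}", e), // ...by turning it into a panic
};
```

This is more verbose than `ok().expect()`, but it keeps the original error
value `e` available for the message.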


@ -159,6 +159,8 @@ $ ./main # or main.exe on Windows
Hello, world!
```
You can also run these examples on [play.rust-lang.org](http://play.rust-lang.org/) by clicking on the arrow that appears in the upper right of the example when you mouse over the code.

Success! Let's go over what just happened in detail.

```{rust}


@ -59,6 +59,7 @@ a guide that can help you out:
* [References and Lifetimes](guide-lifetimes.html)
* [Crates and modules](guide-crates.html)
* [Tasks and Communication](guide-tasks.html)
* [Error Handling](guide-error-handling.html)
* [Foreign Function Interface](guide-ffi.html)
* [Writing Unsafe and Low-Level Code](guide-unsafe.html)
* [Macros](guide-macros.html)


@ -20,6 +20,7 @@
[type: text] src/doc/guide-testing.md $lang:doc/l10n/$lang/guide-testing.md
[type: text] src/doc/guide-unsafe.md $lang:doc/l10n/$lang/guide-unsafe.md
[type: text] src/doc/guide-crates.md $lang:doc/l10n/$lang/guide-crates.md
[type: text] src/doc/guide-error-handling.md $lang:doc/l10n/$lang/guide-error-handling.md
[type: text] src/doc/guide.md $lang:doc/l10n/$lang/guide.md
[type: text] src/doc/index.md $lang:doc/l10n/$lang/index.md
[type: text] src/doc/intro.md $lang:doc/l10n/$lang/intro.md


@ -216,9 +216,15 @@ rather than referring to it by name or some other evaluation rule. A literal is
a form of constant expression, so is evaluated (primarily) at compile time.
```{.ebnf .gram}
literal : string_lit | char_lit | byte_string_lit | byte_lit | num_lit ;
lit_suffix : ident;
literal : [ string_lit | char_lit | byte_string_lit | byte_lit | num_lit ] lit_suffix ?;
```

The optional suffix is only used for certain numeric literals, but is
reserved for future extension. That is, the above gives the lexical
grammar, but a Rust parser will reject everything but the 12 special
cases mentioned in [Number literals](#number-literals) below.
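
Concretely, those 12 cases are the ten integer suffixes (`i`, `i8`, `i16`,
`i32`, `i64`, `u`, `u8`, `u16`, `u32`, `u64`) and the two floating-point
suffixes (`f32`, `f64`) described below. A sketch of how they read:

```
let a = 255u8;  // `u8` suffix: unsigned 8-bit integer literal
let b = 5i;     // `i` suffix: `int` literal
let c = 2.5f32; // `f32` suffix: 32-bit floating-point literal
```
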
#### Character and string literals
```{.ebnf .gram}
@ -371,27 +377,20 @@ b"\\x52"; br"\x52"; // \x52
#### Number literals
```{.ebnf .gram}
num_lit : nonzero_dec [ dec_digit | '_' ] * num_suffix ?
| '0' [ [ dec_digit | '_' ] * num_suffix ?
| 'b' [ '1' | '0' | '_' ] + int_suffix ?
| 'o' [ oct_digit | '_' ] + int_suffix ?
| 'x' [ hex_digit | '_' ] + int_suffix ? ] ;
num_lit : nonzero_dec [ dec_digit | '_' ] * float_suffix ?
| '0' [ [ dec_digit | '_' ] * float_suffix ?
| 'b' [ '1' | '0' | '_' ] +
| 'o' [ oct_digit | '_' ] +
| 'x' [ hex_digit | '_' ] + ] ;
num_suffix : int_suffix | float_suffix ;
float_suffix : [ exponent | '.' dec_lit exponent ? ] ? ;
int_suffix : 'u' int_suffix_size ?
| 'i' int_suffix_size ? ;
int_suffix_size : [ '8' | "16" | "32" | "64" ] ;
float_suffix : [ exponent | '.' dec_lit exponent ? ] ? float_suffix_ty ? ;
float_suffix_ty : 'f' [ "32" | "64" ] ;
exponent : ['E' | 'e'] ['-' | '+' ] ? dec_lit ;
dec_lit : [ dec_digit | '_' ] + ;
```
A _number literal_ is either an _integer literal_ or a _floating-point
literal_. The grammar for recognizing the two kinds of literals is mixed, as
they are differentiated by suffixes.
literal_. The grammar for recognizing the two kinds of literals is mixed.
##### Integer literals
@ -406,9 +405,9 @@ An _integer literal_ has one of four forms:
* A _binary literal_ starts with the character sequence `U+0030` `U+0062`
(`0b`) and continues as any mixture of binary digits and underscores.
An integer literal may be followed (immediately, without any spaces) by an
_integer suffix_, which changes the type of the literal. There are two kinds of
integer literal suffix:
Like any literal, an integer literal may be followed (immediately,
without any spaces) by an _integer suffix_, which forcibly sets the
type of the literal. There are 10 valid values for an integer suffix:
* The `i` and `u` suffixes give the literal type `int` or `uint`,
respectively.
@ -443,11 +442,9 @@ A _floating-point literal_ has one of two forms:
* A single _decimal literal_ followed by an _exponent_.
By default, a floating-point literal has a generic type, and, like integer
literals, the type must be uniquely determined from the context. A
floating-point literal may be followed (immediately, without any spaces) by a
_floating-point suffix_, which changes the type of the literal. There are two
floating-point suffixes: `f32`, and `f64` (the 32-bit and 64-bit floating point
types).
literals, the type must be uniquely determined from the context. There are two valid
_floating-point suffixes_, `f32` and `f64` (the 32-bit and 64-bit floating point
types), which explicitly determine the type of the literal.
Examples of floating-point literals of various forms:
@ -3433,7 +3430,7 @@ use to avoid conflicts is simply to name variants with upper-case letters, and
local variables with lower-case letters.
Multiple match patterns may be joined with the `|` operator. A range of values
may be specified with `..`. For example:
may be specified with `...`. For example:
```
# let x = 2i;
@ -4042,19 +4039,19 @@ initialized; this is enforced by the compiler.
### Boxes
An _box_ is a reference to a heap allocation holding another value, which is
A _box_ is a reference to a heap allocation holding another value, which is
constructed by the prefix operator `box`. When the standard library is in use,
the type of an box is `std::owned::Box<T>`.
the type of a box is `std::owned::Box<T>`.
An example of an box type and value:
An example of a box type and value:
```
let x: Box<int> = box 10;
```
Box values exist in 1:1 correspondence with their heap allocation, copying an
Box values exist in 1:1 correspondence with their heap allocation, copying a
box value makes a shallow copy of the pointer. Rust will consider a shallow
copy of an box to move ownership of the value. After a value has been moved,
copy of a box to move ownership of the value. After a value has been moved,
the source location cannot be used unless it is reinitialized.
```


@ -92,49 +92,35 @@ fragment CHAR_ESCAPE
| 'U' HEXIT HEXIT HEXIT HEXIT HEXIT HEXIT HEXIT HEXIT
;
fragment SUFFIX
: IDENT
;
LIT_CHAR
: '\'' ( '\\' CHAR_ESCAPE | ~[\\'\n\t\r] ) '\''
: '\'' ( '\\' CHAR_ESCAPE | ~[\\'\n\t\r] ) '\'' SUFFIX?
;
LIT_BYTE
: 'b\'' ( '\\' ( [xX] HEXIT HEXIT | [nrt\\'"0] ) | ~[\\'\n\t\r] ) '\''
;
fragment INT_SUFFIX
: 'i'
| 'i8'
| 'i16'
| 'i32'
| 'i64'
| 'u'
| 'u8'
| 'u16'
| 'u32'
| 'u64'
: 'b\'' ( '\\' ( [xX] HEXIT HEXIT | [nrt\\'"0] ) | ~[\\'\n\t\r] ) '\'' SUFFIX?
;
LIT_INTEGER
: [0-9][0-9_]* INT_SUFFIX?
| '0b' [01][01_]* INT_SUFFIX?
| '0o' [0-7][0-7_]* INT_SUFFIX?
| '0x' [0-9a-fA-F][0-9a-fA-F_]* INT_SUFFIX?
;
fragment FLOAT_SUFFIX
: 'f32'
| 'f64'
: [0-9][0-9_]* SUFFIX?
| '0b' [01][01_]* SUFFIX?
| '0o' [0-7][0-7_]* SUFFIX?
| '0x' [0-9a-fA-F][0-9a-fA-F_]* SUFFIX?
;
LIT_FLOAT
: [0-9][0-9_]* ('.' | ('.' [0-9][0-9_]*)? ([eE] [-+]? [0-9][0-9_]*)? FLOAT_SUFFIX?)
: [0-9][0-9_]* ('.' | ('.' [0-9][0-9_]*)? ([eE] [-+]? [0-9][0-9_]*)? SUFFIX?)
;
LIT_STR
: '"' ('\\\n' | '\\\r\n' | '\\' CHAR_ESCAPE | .)*? '"'
: '"' ('\\\n' | '\\\r\n' | '\\' CHAR_ESCAPE | .)*? '"' SUFFIX?
;
LIT_BINARY : 'b' LIT_STR ;
LIT_BINARY_RAW : 'rb' LIT_STR_RAW ;
LIT_BINARY : 'b' LIT_STR SUFFIX?;
LIT_BINARY_RAW : 'rb' LIT_STR_RAW SUFFIX?;
/* this is a bit messy */
@ -148,7 +134,7 @@ fragment LIT_STR_RAW_INNER2
;
LIT_STR_RAW
: 'r' LIT_STR_RAW_INNER
: 'r' LIT_STR_RAW_INNER SUFFIX?
;
IDENT : XID_start XID_continue* ;


@ -26,21 +26,21 @@ use std::io::File;
use syntax::parse;
use syntax::parse::lexer;
use rustc::driver::{session, config};
use rustc::session::{mod, config};
use syntax::ast;
use syntax::ast::Name;
use syntax::parse::token;
use syntax::parse::lexer::TokenAndSpan;
fn parse_token_list(file: &str) -> HashMap<String, Token> {
fn id() -> Token {
fn parse_token_list(file: &str) -> HashMap<String, token::Token> {
fn id() -> token::Token {
token::Ident(ast::Ident { name: Name(0), ctxt: 0, }, token::Plain)
}
let mut res = HashMap::new();
res.insert("-1".to_string(), EOF);
res.insert("-1".to_string(), token::Eof);
for line in file.split('\n') {
let eq = match line.trim().rfind('=') {
@ -60,8 +60,8 @@ fn parse_token_list(file: &str) -> HashMap<String, Token> {
"INT_SUFFIX" => id(),
"SHL" => token::BinOp(token::Shl),
"LBRACE" => token::OpenDelim(token::Brace),
"RARROW" => token::Rarrow,
"LIT_STR" => token::LitStr(Name(0)),
"RARROW" => token::RArrow,
"LIT_STR" => token::Literal(token::Str_(Name(0))),
"DOTDOT" => token::DotDot,
"MOD_SEP" => token::ModSep,
"DOTDOTDOT" => token::DotDotDot,
@ -71,17 +71,17 @@ fn parse_token_list(file: &str) -> HashMap<String, Token> {
"ANDAND" => token::AndAnd,
"AT" => token::At,
"LBRACKET" => token::OpenDelim(token::Bracket),
"LIT_STR_RAW" => token::LitStrRaw(Name(0), 0),
"LIT_STR_RAW" => token::Literal(token::StrRaw(Name(0), 0)),
"RPAREN" => token::CloseDelim(token::Paren),
"SLASH" => token::BinOp(token::Slash),
"COMMA" => token::Comma,
"LIFETIME" => token::Lifetime(ast::Ident { name: Name(0), ctxt: 0 }),
"CARET" => token::BinOp(token::Caret),
"TILDE" => token::Tilde,
"IDENT" => token::Id(),
"IDENT" => id(),
"PLUS" => token::BinOp(token::Plus),
"LIT_CHAR" => token::LitChar(Name(0)),
"LIT_BYTE" => token::LitByte(Name(0)),
"LIT_CHAR" => token::Literal(token::Char(Name(0))),
"LIT_BYTE" => token::Literal(token::Byte(Name(0))),
"EQ" => token::Eq,
"RBRACKET" => token::CloseDelim(token::Bracket),
"COMMENT" => token::Comment,
@ -95,9 +95,9 @@ fn parse_token_list(file: &str) -> HashMap<String, Token> {
"BINOP" => token::BinOp(token::Plus),
"POUND" => token::Pound,
"OROR" => token::OrOr,
"LIT_INTEGER" => token::LitInteger(Name(0)),
"LIT_INTEGER" => token::Literal(token::Integer(Name(0))),
"BINOPEQ" => token::BinOpEq(token::Plus),
"LIT_FLOAT" => token::LitFloat(Name(0)),
"LIT_FLOAT" => token::Literal(token::Float(Name(0))),
"WHITESPACE" => token::Whitespace,
"UNDERSCORE" => token::Underscore,
"MINUS" => token::BinOp(token::Minus),
@ -107,8 +107,8 @@ fn parse_token_list(file: &str) -> HashMap<String, Token> {
"OR" => token::BinOp(token::Or),
"GT" => token::Gt,
"LE" => token::Le,
"LIT_BINARY" => token::LitBinary(Name(0)),
"LIT_BINARY_RAW" => token::LitBinaryRaw(Name(0), 0),
"LIT_BINARY" => token::Literal(token::Binary(Name(0))),
"LIT_BINARY_RAW" => token::Literal(token::BinaryRaw(Name(0), 0)),
_ => continue,
};
@ -119,7 +119,7 @@ fn parse_token_list(file: &str) -> HashMap<String, Token> {
res
}
fn str_to_binop(s: &str) -> BinOpToken {
fn str_to_binop(s: &str) -> token::BinOpToken {
match s {
"+" => token::Plus,
"/" => token::Slash,
@ -167,7 +167,7 @@ fn count(lit: &str) -> uint {
lit.chars().take_while(|c| *c == '#').count()
}
fn parse_antlr_token(s: &str, tokens: &HashMap<String, Token>) -> TokenAndSpan {
fn parse_antlr_token(s: &str, tokens: &HashMap<String, token::Token>) -> TokenAndSpan {
let re = regex!(
r"\[@(?P<seq>\d+),(?P<start>\d+):(?P<end>\d+)='(?P<content>.+?)',<(?P<toknum>-?\d+)>,\d+:\d+]"
);
@ -178,7 +178,7 @@ fn parse_antlr_token(s: &str, tokens: &HashMap<String, Token>) -> TokenAndSpan {
let toknum = m.name("toknum");
let content = m.name("content");
let proto_tok = tokens.get(&toknum).expect(format!("didn't find token {} in the map",
let proto_tok = tokens.get(toknum).expect(format!("didn't find token {} in the map",
toknum).as_slice());
let nm = parse::token::intern(content);
@ -189,22 +189,25 @@ fn parse_antlr_token(s: &str, tokens: &HashMap<String, Token>) -> TokenAndSpan {
token::BinOp(..) => token::BinOp(str_to_binop(content)),
token::BinOpEq(..) => token::BinOpEq(str_to_binop(content.slice_to(
content.len() - 1))),
token::LitStr(..) => token::LitStr(fix(content)),
token::LitStrRaw(..) => token::LitStrRaw(fix(content), count(content)),
token::LitChar(..) => token::LitChar(fixchar(content)),
token::LitByte(..) => token::LitByte(fixchar(content)),
token::Literal(token::Str_(..)) => token::Literal(token::Str_(fix(content))),
token::Literal(token::StrRaw(..)) => token::Literal(token::StrRaw(fix(content),
count(content))),
token::Literal(token::Char(..)) => token::Literal(token::Char(fixchar(content))),
token::Literal(token::Byte(..)) => token::Literal(token::Byte(fixchar(content))),
token::DocComment(..) => token::DocComment(nm),
token::LitInteger(..) => token::LitInteger(nm),
token::LitFloat(..) => token::LitFloat(nm),
token::LitBinary(..) => token::LitBinary(nm),
token::LitBinaryRaw(..) => token::LitBinaryRaw(fix(content), count(content)),
token::Literal(token::Integer(..)) => token::Literal(token::Integer(nm)),
token::Literal(token::Float(..)) => token::Literal(token::Float(nm)),
token::Literal(token::Binary(..)) => token::Literal(token::Binary(nm)),
token::Literal(token::BinaryRaw(..)) => token::Literal(token::BinaryRaw(fix(content),
count(content))),
token::Ident(..) => token::Ident(ast::Ident { name: nm, ctxt: 0 },
token::ModName),
token::Lifetime(..) => token::Lifetime(ast::Ident { name: nm, ctxt: 0 }),
ref t => t.clone()
};
let offset = if real_tok == EOF {
let offset = if real_tok == token::Eof
{
1
} else {
0
@ -222,7 +225,7 @@ fn parse_antlr_token(s: &str, tokens: &HashMap<String, Token>) -> TokenAndSpan {
}
}
fn tok_cmp(a: &Token, b: &Token) -> bool {
fn tok_cmp(a: &token::Token, b: &token::Token) -> bool {
match a {
&token::Ident(id, _) => match b {
&token::Ident(id2, _) => id == id2,
@ -240,17 +243,17 @@ fn main() {
let args = std::os::args();
let mut token_file = File::open(&Path::new(args.get(2).as_slice()));
let mut token_file = File::open(&Path::new(args[2].as_slice()));
let token_map = parse_token_list(token_file.read_to_string().unwrap().as_slice());
let mut stdin = std::io::stdin();
let mut antlr_tokens = stdin.lines().map(|l| parse_antlr_token(l.unwrap().as_slice().trim(),
&token_map));
let code = File::open(&Path::new(args.get(1).as_slice())).unwrap().read_to_string().unwrap();
let code = File::open(&Path::new(args[1].as_slice())).unwrap().read_to_string().unwrap();
let options = config::basic_options();
let session = session::build_session(options, None,
syntax::diagnostics::registry::Registry::new([]));
syntax::diagnostics::registry::Registry::new(&[]));
let filemap = parse::string_to_filemap(&session.parse_sess,
code,
String::from_str("<n/a>"));
@ -258,7 +261,7 @@ fn main() {
for antlr_tok in antlr_tokens {
let rustc_tok = next(&mut lexer);
if rustc_tok.tok == EOF && antlr_tok.tok == EOF {
if rustc_tok.tok == token::Eof && antlr_tok.tok == token::Eof {
continue
}
@ -284,19 +287,19 @@ fn main() {
)
matches!(
LitByte(..),
LitChar(..),
LitInteger(..),
LitFloat(..),
LitStr(..),
LitStrRaw(..),
LitBinary(..),
LitBinaryRaw(..),
Ident(..),
Lifetime(..),
Interpolated(..),
DocComment(..),
Shebang(..)
token::Literal(token::Byte(..)),
token::Literal(token::Char(..)),
token::Literal(token::Integer(..)),
token::Literal(token::Float(..)),
token::Literal(token::Str_(..)),
token::Literal(token::StrRaw(..)),
token::Literal(token::Binary(..)),
token::Literal(token::BinaryRaw(..)),
token::Ident(..),
token::Lifetime(..),
token::Interpolated(..),
token::DocComment(..),
token::Shebang(..)
);
}
}


@ -1678,10 +1678,10 @@ impl<'a> Iterator<uint> for TwoBitPositions<'a> {
mod tests {
use std::prelude::*;
use std::iter::range_step;
use std::u32;
use std::rand;
use std::rand::Rng;
use test::Bencher;
use std::u32;
use test::{Bencher, black_box};
use super::{Bitv, BitvSet, from_fn, from_bytes};
use bitv;
@ -2676,8 +2676,8 @@ mod tests {
for _ in range(0u, 100) {
bitv |= 1 << ((r.next_u32() as uint) % u32::BITS);
}
&bitv
})
black_box(&bitv)
});
}
#[bench]
@ -2688,8 +2688,8 @@ mod tests {
for _ in range(0u, 100) {
bitv.set((r.next_u32() as uint) % BENCH_BITS, true);
}
&bitv
})
black_box(&bitv)
});
}
#[bench]
@ -2700,8 +2700,8 @@ mod tests {
for _ in range(0u, 100) {
bitv.set((r.next_u32() as uint) % BENCH_BITS, r.gen());
}
&bitv
})
black_box(&bitv);
});
}
#[bench]
@ -2712,8 +2712,8 @@ mod tests {
for _ in range(0u, 100) {
bitv.set((r.next_u32() as uint) % u32::BITS, true);
}
&bitv
})
black_box(&bitv);
});
}
#[bench]
@ -2724,8 +2724,8 @@ mod tests {
for _ in range(0u, 100) {
bitv.insert((r.next_u32() as uint) % u32::BITS);
}
&bitv
})
black_box(&bitv);
});
}
#[bench]
@ -2736,8 +2736,8 @@ mod tests {
for _ in range(0u, 100) {
bitv.insert((r.next_u32() as uint) % BENCH_BITS);
}
&bitv
})
black_box(&bitv);
});
}
#[bench]


@ -2084,7 +2084,7 @@ mod bench {
use std::rand::{weak_rng, Rng};
use std::mem;
use std::ptr;
use test::Bencher;
use test::{Bencher, black_box};
use vec::Vec;
@ -2140,8 +2140,8 @@ mod bench {
let mut vec: Vec<uint> = vec![];
b.iter(|| {
vec.push(0);
&vec
})
black_box(&vec);
});
}
#[bench]


@ -18,28 +18,28 @@ use intrinsics;
use std::kinds::marker;
use cell::UnsafeCell;
/// An atomic boolean type.
/// A boolean type which can be safely shared between threads.
#[stable]
pub struct AtomicBool {
v: UnsafeCell<uint>,
nocopy: marker::NoCopy
}
/// A signed atomic integer type, supporting basic atomic arithmetic operations
/// A signed integer type which can be safely shared between threads.
#[stable]
pub struct AtomicInt {
v: UnsafeCell<int>,
nocopy: marker::NoCopy
}
/// An unsigned atomic integer type, supporting basic atomic arithmetic operations
/// An unsigned integer type which can be safely shared between threads.
#[stable]
pub struct AtomicUint {
v: UnsafeCell<uint>,
nocopy: marker::NoCopy
}
/// An unsafe atomic pointer. Only supports basic atomic operations
/// A raw pointer type which can be safely shared between threads.
#[stable]
pub struct AtomicPtr<T> {
p: UnsafeCell<uint>,
@ -54,43 +54,42 @@ pub struct AtomicPtr<T> {
/// to be moved either before or after the atomic operation; on the other end
/// "relaxed" atomics allow all reorderings.
///
/// Rust's memory orderings are the same as in C++[1].
///
/// 1: http://gcc.gnu.org/wiki/Atomic/GCCMM/AtomicSync
/// Rust's memory orderings are [the same as
/// C++'s](http://gcc.gnu.org/wiki/Atomic/GCCMM/AtomicSync).
#[stable]
pub enum Ordering {
/// No ordering constraints, only atomic operations
/// No ordering constraints, only atomic operations.
#[stable]
Relaxed,
/// When coupled with a store, all previous writes become visible
/// to another thread that performs a load with `Acquire` ordering
/// on the same value
/// on the same value.
#[stable]
Release,
/// When coupled with a load, all subsequent loads will see data
/// written before a store with `Release` ordering on the same value
/// in another thread
/// in another thread.
#[stable]
Acquire,
/// When coupled with a load, uses `Acquire` ordering, and with a store
/// `Release` ordering
/// `Release` ordering.
#[stable]
AcqRel,
/// Like `AcqRel` with the additional guarantee that all threads see all
/// sequentially consistent operations in the same order.
#[stable]
SeqCst
SeqCst,
}
/// An `AtomicBool` initialized to `false`
/// An `AtomicBool` initialized to `false`.
#[unstable = "may be renamed, pending conventions for static initalizers"]
pub const INIT_ATOMIC_BOOL: AtomicBool =
AtomicBool { v: UnsafeCell { value: 0 }, nocopy: marker::NoCopy };
/// An `AtomicInt` initialized to `0`
/// An `AtomicInt` initialized to `0`.
#[unstable = "may be renamed, pending conventions for static initalizers"]
pub const INIT_ATOMIC_INT: AtomicInt =
AtomicInt { v: UnsafeCell { value: 0 }, nocopy: marker::NoCopy };
/// An `AtomicUint` initialized to `0`
/// An `AtomicUint` initialized to `0`.
#[unstable = "may be renamed, pending conventions for static initalizers"]
pub const INIT_ATOMIC_UINT: AtomicUint =
AtomicUint { v: UnsafeCell { value: 0, }, nocopy: marker::NoCopy };
@ -99,7 +98,16 @@ pub const INIT_ATOMIC_UINT: AtomicUint =
const UINT_TRUE: uint = -1;
impl AtomicBool {
/// Create a new `AtomicBool`
/// Creates a new `AtomicBool`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::AtomicBool;
///
/// let atomic_true = AtomicBool::new(true);
/// let atomic_false = AtomicBool::new(false);
/// ```
#[inline]
#[stable]
pub fn new(v: bool) -> AtomicBool {
@ -107,18 +115,42 @@ impl AtomicBool {
AtomicBool { v: UnsafeCell::new(val), nocopy: marker::NoCopy }
}
/// Load the value
/// Loads a value from the bool.
///
/// `load` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Panics
///
/// Panics if `order` is `Release` or `AcqRel`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicBool, Ordering};
///
/// let some_bool = AtomicBool::new(true);
///
/// let value = some_bool.load(Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn load(&self, order: Ordering) -> bool {
unsafe { atomic_load(self.v.get() as *const uint, order) > 0 }
}
/// Store the value
/// Stores a value into the bool.
///
/// `store` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicBool, Ordering};
///
/// let some_bool = AtomicBool::new(true);
///
/// some_bool.store(false, Ordering::Relaxed);
/// ```
///
/// # Panics
///
@ -131,7 +163,19 @@ impl AtomicBool {
unsafe { atomic_store(self.v.get(), val, order); }
}
/// Store a value, returning the old value
/// Stores a value into the bool, returning the old value.
///
/// `swap` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicBool, Ordering};
///
/// let some_bool = AtomicBool::new(true);
///
/// let value = some_bool.swap(false, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn swap(&self, val: bool, order: Ordering) -> bool {
@ -140,48 +184,21 @@ impl AtomicBool {
unsafe { atomic_swap(self.v.get(), val, order) > 0 }
}
/// If the current value is the same as expected, store a new value
/// Stores a value into the bool if the current value is the same as the expected value.
///
/// Compare the current value with `old`; if they are the same then
/// replace the current value with `new`. Return the previous value.
/// If the return value is equal to `old` then the value was updated.
///
/// `swap` also takes an `Ordering` argument which describes the memory ordering of this
/// operation.
///
/// # Examples
///
/// ```rust
/// use std::sync::Arc;
/// use std::sync::atomic::{AtomicBool, SeqCst};
/// use std::task::deschedule;
/// ```
/// use std::sync::atomic::{AtomicBool, Ordering};
///
/// fn main() {
/// let spinlock = Arc::new(AtomicBool::new(false));
/// let spinlock_clone = spinlock.clone();
/// let some_bool = AtomicBool::new(true);
///
/// spawn(proc() {
/// with_lock(&spinlock, || println!("task 1 in lock"));
/// });
///
/// spawn(proc() {
/// with_lock(&spinlock_clone, || println!("task 2 in lock"));
/// });
/// }
///
/// fn with_lock(spinlock: &Arc<AtomicBool>, f: || -> ()) {
/// // CAS loop until we are able to replace `false` with `true`
/// while spinlock.compare_and_swap(false, true, SeqCst) != false {
/// // Since tasks may not be preemptive (if they are green threads)
/// // yield to the scheduler to let the other task run. Low level
/// // concurrent code needs to take into account Rust's two threading
/// // models.
/// deschedule();
/// }
///
/// // Now we have the spinlock
/// f();
///
/// // Release the lock
/// spinlock.store(false, SeqCst);
/// }
/// let value = some_bool.store(false, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
@ -192,10 +209,11 @@ impl AtomicBool {
unsafe { atomic_compare_and_swap(self.v.get(), old, new, order) > 0 }
}
/// A logical "and" operation
/// Logical "and" with a boolean value.
///
/// Performs a logical "and" operation on the current value and the argument `val`, and sets
/// the new value to the result.
///
/// Performs a logical "and" operation on the current value and the
/// argument `val`, and sets the new value to the result.
/// Returns the previous value.
///
/// # Examples
@ -223,10 +241,11 @@ impl AtomicBool {
unsafe { atomic_and(self.v.get(), val, order) > 0 }
}
/// A logical "nand" operation
/// Logical "nand" with a boolean value.
///
/// Performs a logical "nand" operation on the current value and the argument `val`, and sets
/// the new value to the result.
///
/// Performs a logical "nand" operation on the current value and the
/// argument `val`, and sets the new value to the result.
/// Returns the previous value.
///
/// # Examples
@ -255,10 +274,11 @@ impl AtomicBool {
unsafe { atomic_nand(self.v.get(), val, order) > 0 }
}
/// A logical "or" operation
/// Logical "or" with a boolean value.
///
/// Performs a logical "or" operation on the current value and the argument `val`, and sets the
/// new value to the result.
///
/// Performs a logical "or" operation on the current value and the
/// argument `val`, and sets the new value to the result.
/// Returns the previous value.
///
/// # Examples
@ -286,10 +306,11 @@ impl AtomicBool {
unsafe { atomic_or(self.v.get(), val, order) > 0 }
}
/// A logical "xor" operation
/// Logical "xor" with a boolean value.
///
/// Performs a logical "xor" operation on the current value and the argument `val`, and sets
/// the new value to the result.
///
/// Performs a logical "xor" operation on the current value and the
/// argument `val`, and sets the new value to the result.
/// Returns the previous value.
///
/// # Examples
@ -319,25 +340,57 @@ impl AtomicBool {
}
impl AtomicInt {
/// Create a new `AtomicInt`
/// Creates a new `AtomicInt`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::AtomicInt;
///
/// let atomic_forty_two = AtomicInt::new(42);
/// ```
#[inline]
#[stable]
pub fn new(v: int) -> AtomicInt {
AtomicInt {v: UnsafeCell::new(v), nocopy: marker::NoCopy}
}
/// Load the value
/// Loads a value from the int.
///
/// `load` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Panics
///
/// Panics if `order` is `Release` or `AcqRel`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicInt, Ordering};
///
/// let some_int = AtomicInt::new(5);
///
/// let value = some_int.load(Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn load(&self, order: Ordering) -> int {
unsafe { atomic_load(self.v.get() as *const int, order) }
}
/// Store the value
/// Stores a value into the int.
///
/// `store` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicInt, Ordering};
///
/// let some_int = AtomicInt::new(5);
///
/// some_int.store(10, Ordering::Relaxed);
/// ```
///
/// # Panics
///
@ -348,25 +401,48 @@ impl AtomicInt {
unsafe { atomic_store(self.v.get(), val, order); }
}
/// Store a value, returning the old value
/// Stores a value into the int, returning the old value.
///
/// `swap` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicInt, Ordering};
///
/// let some_int = AtomicInt::new(5);
///
/// let value = some_int.swap(10, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn swap(&self, val: int, order: Ordering) -> int {
unsafe { atomic_swap(self.v.get(), val, order) }
}
/// If the current value is the same as expected, store a new value
/// Stores a value into the int if the current value is the same as the expected value.
///
/// Compare the current value with `old`; if they are the same then
/// replace the current value with `new`. Return the previous value.
/// If the return value is equal to `old` then the value was updated.
///
/// `compare_and_swap` also takes an `Ordering` argument which describes the memory ordering of
/// this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicInt, Ordering};
///
/// let some_int = AtomicInt::new(5);
///
/// let value = some_int.compare_and_swap(5, 10, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn compare_and_swap(&self, old: int, new: int, order: Ordering) -> int {
unsafe { atomic_compare_and_swap(self.v.get(), old, new, order) }
}
/// Add to the current value, returning the previous
    /// Adds an int to the current value, returning the previous value.
///
/// # Examples
///
@ -383,7 +459,7 @@ impl AtomicInt {
unsafe { atomic_add(self.v.get(), val, order) }
}
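The `fetch_add`/`fetch_sub` family documented here returns the value held *before* the operation. A short sketch demonstrating that contract, using `AtomicIsize` — the modern stable counterpart of the `AtomicInt` in this commit:

```rust
use std::sync::atomic::{AtomicIsize, Ordering};

fn main() {
    let n = AtomicIsize::new(5);
    // `fetch_add` returns the value *before* the addition.
    assert_eq!(n.fetch_add(10, Ordering::SeqCst), 5);
    assert_eq!(n.load(Ordering::SeqCst), 15);
    // `fetch_sub` mirrors it for subtraction.
    assert_eq!(n.fetch_sub(3, Ordering::SeqCst), 15);
    assert_eq!(n.load(Ordering::SeqCst), 12);
}
```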
/// Subtract from the current value, returning the previous
    /// Subtracts an int from the current value, returning the previous value.
///
/// # Examples
///
@ -400,7 +476,7 @@ impl AtomicInt {
unsafe { atomic_sub(self.v.get(), val, order) }
}
/// Bitwise and with the current value, returning the previous
/// Bitwise and with the current int, returning the previous value.
///
/// # Examples
///
@ -416,7 +492,7 @@ impl AtomicInt {
unsafe { atomic_and(self.v.get(), val, order) }
}
/// Bitwise or with the current value, returning the previous
/// Bitwise or with the current int, returning the previous value.
///
/// # Examples
///
@ -432,7 +508,7 @@ impl AtomicInt {
unsafe { atomic_or(self.v.get(), val, order) }
}
/// Bitwise xor with the current value, returning the previous
/// Bitwise xor with the current int, returning the previous value.
///
/// # Examples
///
@ -450,25 +526,57 @@ impl AtomicInt {
}
impl AtomicUint {
/// Create a new `AtomicUint`
/// Creates a new `AtomicUint`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::AtomicUint;
///
/// let atomic_forty_two = AtomicUint::new(42u);
/// ```
#[inline]
#[stable]
pub fn new(v: uint) -> AtomicUint {
AtomicUint { v: UnsafeCell::new(v), nocopy: marker::NoCopy }
}
/// Load the value
/// Loads a value from the uint.
///
/// `load` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Panics
///
/// Panics if `order` is `Release` or `AcqRel`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicUint, Ordering};
///
/// let some_uint = AtomicUint::new(5);
///
/// let value = some_uint.load(Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn load(&self, order: Ordering) -> uint {
unsafe { atomic_load(self.v.get() as *const uint, order) }
}
/// Store the value
/// Stores a value into the uint.
///
/// `store` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicUint, Ordering};
///
/// let some_uint = AtomicUint::new(5);
///
/// some_uint.store(10, Ordering::Relaxed);
/// ```
///
/// # Panics
///
@ -479,25 +587,48 @@ impl AtomicUint {
unsafe { atomic_store(self.v.get(), val, order); }
}
/// Store a value, returning the old value
/// Stores a value into the uint, returning the old value.
///
/// `swap` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicUint, Ordering};
///
/// let some_uint = AtomicUint::new(5);
///
/// let value = some_uint.swap(10, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn swap(&self, val: uint, order: Ordering) -> uint {
unsafe { atomic_swap(self.v.get(), val, order) }
}
/// If the current value is the same as expected, store a new value
/// Stores a value into the uint if the current value is the same as the expected value.
///
/// Compare the current value with `old`; if they are the same then
/// replace the current value with `new`. Return the previous value.
/// If the return value is equal to `old` then the value was updated.
///
/// `compare_and_swap` also takes an `Ordering` argument which describes the memory ordering of
/// this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicUint, Ordering};
///
/// let some_uint = AtomicUint::new(5);
///
/// let value = some_uint.compare_and_swap(5, 10, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn compare_and_swap(&self, old: uint, new: uint, order: Ordering) -> uint {
unsafe { atomic_compare_and_swap(self.v.get(), old, new, order) }
}
/// Add to the current value, returning the previous
    /// Adds to the current uint, returning the previous value.
///
/// # Examples
///
@ -514,7 +645,7 @@ impl AtomicUint {
unsafe { atomic_add(self.v.get(), val, order) }
}
/// Subtract from the current value, returning the previous
    /// Subtracts from the current uint, returning the previous value.
///
/// # Examples
///
@ -531,7 +662,7 @@ impl AtomicUint {
unsafe { atomic_sub(self.v.get(), val, order) }
}
/// Bitwise and with the current value, returning the previous
/// Bitwise and with the current uint, returning the previous value.
///
/// # Examples
///
@ -547,7 +678,7 @@ impl AtomicUint {
unsafe { atomic_and(self.v.get(), val, order) }
}
/// Bitwise or with the current value, returning the previous
/// Bitwise or with the current uint, returning the previous value.
///
/// # Examples
///
@ -563,7 +694,7 @@ impl AtomicUint {
unsafe { atomic_or(self.v.get(), val, order) }
}
/// Bitwise xor with the current value, returning the previous
/// Bitwise xor with the current uint, returning the previous value.
///
/// # Examples
///
@ -581,18 +712,40 @@ impl AtomicUint {
}
impl<T> AtomicPtr<T> {
/// Create a new `AtomicPtr`
/// Creates a new `AtomicPtr`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::AtomicPtr;
///
/// let ptr = &mut 5i;
/// let atomic_ptr = AtomicPtr::new(ptr);
/// ```
#[inline]
#[stable]
pub fn new(p: *mut T) -> AtomicPtr<T> {
AtomicPtr { p: UnsafeCell::new(p as uint), nocopy: marker::NoCopy }
}
/// Load the value
/// Loads a value from the pointer.
///
/// `load` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Panics
///
/// Panics if `order` is `Release` or `AcqRel`.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicPtr, Ordering};
///
/// let ptr = &mut 5i;
/// let some_ptr = AtomicPtr::new(ptr);
///
/// let value = some_ptr.load(Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn load(&self, order: Ordering) -> *mut T {
@ -601,7 +754,22 @@ impl<T> AtomicPtr<T> {
}
}
/// Store the value
/// Stores a value into the pointer.
///
/// `store` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicPtr, Ordering};
///
/// let ptr = &mut 5i;
/// let some_ptr = AtomicPtr::new(ptr);
///
/// let other_ptr = &mut 10i;
///
/// some_ptr.store(other_ptr, Ordering::Relaxed);
/// ```
///
/// # Panics
///
@ -612,18 +780,48 @@ impl<T> AtomicPtr<T> {
unsafe { atomic_store(self.p.get(), ptr as uint, order); }
}
/// Store a value, returning the old value
/// Stores a value into the pointer, returning the old value.
///
/// `swap` takes an `Ordering` argument which describes the memory ordering of this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicPtr, Ordering};
///
/// let ptr = &mut 5i;
/// let some_ptr = AtomicPtr::new(ptr);
///
/// let other_ptr = &mut 10i;
///
/// let value = some_ptr.swap(other_ptr, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn swap(&self, ptr: *mut T, order: Ordering) -> *mut T {
unsafe { atomic_swap(self.p.get(), ptr as uint, order) as *mut T }
}
/// If the current value is the same as expected, store a new value
/// Stores a value into the pointer if the current value is the same as the expected value.
///
/// Compare the current value with `old`; if they are the same then
/// replace the current value with `new`. Return the previous value.
/// If the return value is equal to `old` then the value was updated.
///
/// `compare_and_swap` also takes an `Ordering` argument which describes the memory ordering of
/// this operation.
///
/// # Examples
///
/// ```
/// use std::sync::atomic::{AtomicPtr, Ordering};
///
/// let ptr = &mut 5i;
/// let some_ptr = AtomicPtr::new(ptr);
///
/// let other_ptr = &mut 10i;
/// let another_ptr = &mut 10i;
///
/// let value = some_ptr.compare_and_swap(other_ptr, another_ptr, Ordering::Relaxed);
/// ```
#[inline]
#[stable]
pub fn compare_and_swap(&self, old: *mut T, new: *mut T, order: Ordering) -> *mut T {
@ -777,7 +975,7 @@ unsafe fn atomic_xor<T>(dst: *mut T, val: T, order: Ordering) -> T {
///
/// # Panics
///
/// Panics if `order` is `Relaxed`
/// Panics if `order` is `Relaxed`.
#[inline]
#[stable]
pub fn fence(order: Ordering) {

View file

@ -12,8 +12,6 @@
#![allow(unused_variables)]
pub use self::FormatError::*;
use any;
use cell::{Cell, Ref, RefMut};
use iter::{Iterator, range};
@ -23,10 +21,9 @@ use option::{Option, Some, None};
use ops::Deref;
use result::{Ok, Err};
use result;
use slice::{AsSlice, SlicePrelude};
use slice::SlicePrelude;
use slice;
use str::StrPrelude;
use str;
pub use self::num::radix;
pub use self::num::Radix;
@ -36,18 +33,16 @@ mod num;
mod float;
pub mod rt;
pub type Result = result::Result<(), FormatError>;
#[experimental = "core and I/O reconciliation may alter this definition"]
pub type Result = result::Result<(), Error>;
/// The error type which is returned from formatting a message into a stream.
///
/// This type does not support transmission of an error other than that an error
/// occurred. Any extra information must be arranged to be transmitted through
/// some other means.
pub enum FormatError {
/// A generic write error occurred during formatting, no other information
/// is transmitted via this variant.
WriteError,
}
#[experimental = "core and I/O reconciliation may alter this definition"]
pub struct Error;
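Because the new `Error` is a unit struct, a formatting failure carries no detail beyond the fact that it happened. A sketch of how it surfaces through a user impl in current Rust (`Display` is the modern name for the `Show`-style trait in this commit; the `Temperature` type is purely illustrative):

```rust
use std::fmt;

struct Temperature(f64);

impl fmt::Display for Temperature {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        // `fmt::Result` is `Result<(), fmt::Error>`; `write!` propagates
        // the unit-like `Error` on failure.
        write!(f, "{:.1} C", self.0)
    }
}

fn main() {
    // `{:.1}` rounds 21.57 to one decimal place.
    assert_eq!(Temperature(21.57).to_string(), "21.6 C");
}
```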
/// A collection of methods that are required to format a message into a stream.
///
@ -58,6 +53,7 @@ pub enum FormatError {
/// This trait should generally not be implemented by consumers of the standard
/// library. The `write!` macro accepts an instance of `io::Writer`, and the
/// `io::Writer` trait is favored over implementing this trait.
#[experimental = "waiting for core and I/O reconciliation"]
pub trait FormatWriter {
/// Writes a slice of bytes into this writer, returning whether the write
/// succeeded.
@ -81,17 +77,13 @@ pub trait FormatWriter {
/// A struct to represent both where to emit formatting strings to and how they
/// should be formatted. A mutable version of this is passed to all formatting
/// traits.
#[unstable = "name may change and implemented traits are also unstable"]
pub struct Formatter<'a> {
/// Flags for formatting (packed version of rt::Flag)
pub flags: uint,
/// Character used as 'fill' whenever there is alignment
pub fill: char,
/// Boolean indication of whether the output should be left-aligned
pub align: rt::Alignment,
/// Optionally specified integer width that the output should be
pub width: Option<uint>,
/// Optionally specified precision for numeric types
pub precision: Option<uint>,
flags: uint,
fill: char,
align: rt::Alignment,
width: Option<uint>,
precision: Option<uint>,
buf: &'a mut FormatWriter+'a,
curarg: slice::Items<'a, Argument<'a>>,
@ -104,6 +96,7 @@ enum Void {}
/// family of functions. It contains a function to format the given value. At
/// compile time it is ensured that the function and the value have the correct
/// types, and then this struct is used to canonicalize arguments to one type.
#[experimental = "implementation detail of the `format_args!` macro"]
pub struct Argument<'a> {
formatter: extern "Rust" fn(&Void, &mut Formatter) -> Result,
value: &'a Void,
@ -115,6 +108,7 @@ impl<'a> Arguments<'a> {
/// which is valid because the compiler performs all necessary validation to
/// ensure that the resulting call to format/write would be safe.
#[doc(hidden)] #[inline]
#[experimental = "implementation detail of the `format_args!` macro"]
pub unsafe fn new<'a>(pieces: &'static [&'static str],
args: &'a [Argument<'a>]) -> Arguments<'a> {
Arguments {
@ -128,6 +122,7 @@ impl<'a> Arguments<'a> {
/// The `pieces` array must be at least as long as `fmt` to construct
/// a valid Arguments structure.
#[doc(hidden)] #[inline]
#[experimental = "implementation detail of the `format_args!` macro"]
pub unsafe fn with_placeholders<'a>(pieces: &'static [&'static str],
fmt: &'static [rt::Argument<'static>],
args: &'a [Argument<'a>]) -> Arguments<'a> {
@ -148,6 +143,7 @@ impl<'a> Arguments<'a> {
/// and pass it to a function or closure, passed as the first argument. The
/// macro validates the format string at compile-time so usage of the `write`
/// and `format` functions can be safely performed.
#[stable]
pub struct Arguments<'a> {
// Format string pieces to print.
pieces: &'a [&'a str],
@ -169,84 +165,57 @@ impl<'a> Show for Arguments<'a> {
/// When a format is not otherwise specified, types are formatted by ascribing
/// to this trait. There is no explicit way of selecting this trait to be
/// used for formatting; it applies only when no other format is specified.
#[unstable = "I/O and core have yet to be reconciled"]
pub trait Show for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `b` character
pub trait Bool for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `c` character
pub trait Char for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `i` and `d` characters
pub trait Signed for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `u` character
pub trait Unsigned for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `o` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait Octal for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `t` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait Binary for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `x` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait LowerHex for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `X` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait UpperHex for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `s` character
pub trait String for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `p` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait Pointer for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `f` character
pub trait Float for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `e` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait LowerExp for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
}
/// Format trait for the `E` character
#[unstable = "I/O and core have yet to be reconciled"]
pub trait UpperExp for Sized? {
/// Formats the value using the given formatter.
fn fmt(&self, &mut Formatter) -> Result;
@ -271,6 +240,8 @@ static DEFAULT_ARGUMENT: rt::Argument<'static> = rt::Argument {
///
/// * output - the buffer to write output to
/// * args - the precompiled arguments generated by `format_args!`
#[experimental = "libcore and I/O have yet to be reconciled, and this is an \
implementation detail which should not otherwise be exported"]
pub fn write(output: &mut FormatWriter, args: &Arguments) -> Result {
let mut formatter = Formatter {
flags: 0,
@ -368,6 +339,7 @@ impl<'a> Formatter<'a> {
///
/// This function will correctly account for the flags provided as well as
/// the minimum width. It will not take precision into account.
#[unstable = "definition may change slightly over time"]
pub fn pad_integral(&mut self,
is_positive: bool,
prefix: &str,
@ -440,6 +412,7 @@ impl<'a> Formatter<'a> {
/// is longer than this length
///
/// Notably, this function ignores the `flag` parameters.
#[unstable = "definition may change slightly over time"]
pub fn pad(&mut self, s: &str) -> Result {
// Make sure there's a fast path up front
if self.width.is_none() && self.precision.is_none() {
@ -516,19 +489,48 @@ impl<'a> Formatter<'a> {
/// Writes some data to the underlying buffer contained within this
/// formatter.
#[unstable = "reconciling core and I/O may alter this definition"]
pub fn write(&mut self, data: &[u8]) -> Result {
self.buf.write(data)
}
/// Writes some formatted information into this instance
#[unstable = "reconciling core and I/O may alter this definition"]
pub fn write_fmt(&mut self, fmt: &Arguments) -> Result {
write(self.buf, fmt)
}
/// Flags for formatting (packed version of rt::Flag)
#[experimental = "return type may change and method was just created"]
pub fn flags(&self) -> uint { self.flags }
/// Character used as 'fill' whenever there is alignment
#[unstable = "method was just created"]
pub fn fill(&self) -> char { self.fill }
/// Flag indicating what form of alignment was requested
#[unstable = "method was just created"]
pub fn align(&self) -> rt::Alignment { self.align }
/// Optionally specified integer width that the output should be
#[unstable = "method was just created"]
pub fn width(&self) -> Option<uint> { self.width }
/// Optionally specified precision for numeric types
#[unstable = "method was just created"]
pub fn precision(&self) -> Option<uint> { self.precision }
}
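With the `Formatter` fields now private, implementors read layout requests through these new accessors instead. A sketch against the current stable `Formatter` API, where `width()` survives to this day (the `Padded` type is purely illustrative):

```rust
use std::fmt;

struct Padded(&'static str);

impl fmt::Display for Padded {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        // Read the requested layout through an accessor, not a field.
        match f.width() {
            Some(w) => write!(f, "{:>width$}", self.0, width = w),
            None => f.write_str(self.0),
        }
    }
}

fn main() {
    // The `5` in the format spec reaches the impl via `f.width()`.
    assert_eq!(format!("{:5}", Padded("ab")), "   ab");
}
```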
impl Show for Error {
fn fmt(&self, f: &mut Formatter) -> Result {
"an error occurred when formatting an argument".fmt(f)
}
}
/// The compiler itself emits calls to this function in order to
/// create the `Argument` structures that are passed into the `format` function.
#[doc(hidden)] #[inline]
#[experimental = "implementation detail of the `format_args!` macro"]
pub fn argument<'a, T>(f: extern "Rust" fn(&T, &mut Formatter) -> Result,
t: &'a T) -> Argument<'a> {
unsafe {
@ -542,15 +544,17 @@ pub fn argument<'a, T>(f: extern "Rust" fn(&T, &mut Formatter) -> Result,
/// When the compiler determines that the type of an argument *must* be a string
/// (such as for select), then it invokes this method.
#[doc(hidden)] #[inline]
#[experimental = "implementation detail of the `format_args!` macro"]
pub fn argumentstr<'a>(s: &'a &str) -> Argument<'a> {
argument(String::fmt, s)
argument(Show::fmt, s)
}
/// When the compiler determines that the type of an argument *must* be a uint
/// (such as for plural), then it invokes this method.
#[doc(hidden)] #[inline]
#[experimental = "implementation detail of the `format_args!` macro"]
pub fn argumentuint<'a>(s: &'a uint) -> Argument<'a> {
argument(Unsigned::fmt, s)
argument(Show::fmt, s)
}
// Implementations of the core formatting traits
@ -565,32 +569,26 @@ impl<'a> Show for &'a Show+'a {
fn fmt(&self, f: &mut Formatter) -> Result { (*self).fmt(f) }
}
impl Bool for bool {
impl Show for bool {
fn fmt(&self, f: &mut Formatter) -> Result {
String::fmt(if *self { "true" } else { "false" }, f)
Show::fmt(if *self { "true" } else { "false" }, f)
}
}
impl<T: str::Str> String for T {
fn fmt(&self, f: &mut Formatter) -> Result {
f.pad(self.as_slice())
}
}
impl String for str {
impl Show for str {
fn fmt(&self, f: &mut Formatter) -> Result {
f.pad(self)
}
}
impl Char for char {
impl Show for char {
fn fmt(&self, f: &mut Formatter) -> Result {
use char::Char;
let mut utf8 = [0u8, ..4];
let amt = self.encode_utf8(&mut utf8).unwrap_or(0);
let s: &str = unsafe { mem::transmute(utf8[..amt]) };
String::fmt(s, f)
Show::fmt(s, f)
}
}
@ -620,7 +618,7 @@ impl<'a, T> Pointer for &'a mut T {
}
macro_rules! floating(($ty:ident) => {
impl Float for $ty {
impl Show for $ty {
fn fmt(&self, fmt: &mut Formatter) -> Result {
use num::Float;
@ -688,19 +686,6 @@ floating!(f64)
// Implementation of Show for various core types
macro_rules! delegate(($ty:ty to $other:ident) => {
impl Show for $ty {
fn fmt(&self, f: &mut Formatter) -> Result {
$other::fmt(self, f)
}
}
})
delegate!(str to String)
delegate!(bool to Bool)
delegate!(char to Char)
delegate!(f32 to Float)
delegate!(f64 to Float)
impl<T> Show for *const T {
fn fmt(&self, f: &mut Formatter) -> Result { Pointer::fmt(self, f) }
}

View file

@ -109,6 +109,7 @@ radix!(UpperHex, 16, "0x", x @ 0 ... 9 => b'0' + x,
/// A radix within the range of `2..36`.
#[deriving(Clone, PartialEq)]
#[unstable = "may be renamed or move to a different module"]
pub struct Radix {
base: u8,
}
@ -132,6 +133,7 @@ impl GenericRadix for Radix {
}
/// A helper type for formatting radixes.
#[unstable = "may be renamed or move to a different module"]
pub struct RadixFmt<T, R>(T, R);
/// Constructs a radix formatter in the range of `2..36`.
@ -142,6 +144,7 @@ pub struct RadixFmt<T, R>(T, R);
/// use std::fmt::radix;
/// assert_eq!(format!("{}", radix(55i, 36)), "1j".to_string());
/// ```
#[unstable = "may be renamed or move to a different module"]
pub fn radix<T>(x: T, base: u8) -> RadixFmt<T, Radix> {
RadixFmt(x, Radix::new(base))
}
@ -167,7 +170,6 @@ macro_rules! int_base {
macro_rules! integer {
($Int:ident, $Uint:ident) => {
int_base!(Show for $Int as $Int -> Decimal)
int_base!(Signed for $Int as $Int -> Decimal)
int_base!(Binary for $Int as $Uint -> Binary)
int_base!(Octal for $Int as $Uint -> Octal)
int_base!(LowerHex for $Int as $Uint -> LowerHex)
@ -175,7 +177,6 @@ macro_rules! integer {
radix_fmt!($Int as $Int, fmt_int)
int_base!(Show for $Uint as $Uint -> Decimal)
int_base!(Unsigned for $Uint as $Uint -> Decimal)
int_base!(Binary for $Uint as $Uint -> Binary)
int_base!(Octal for $Uint as $Uint -> Octal)
int_base!(LowerHex for $Uint as $Uint -> LowerHex)

View file

@ -14,6 +14,8 @@
//! These definitions are similar to their `ct` equivalents, but differ in that
//! these can be statically allocated and are slightly optimized for the runtime
#![experimental = "implementation detail of the `format_args!` macro"]
pub use self::Alignment::*;
pub use self::Count::*;
pub use self::Position::*;

View file

@ -108,7 +108,10 @@ macro_rules! try(
/// Writing a formatted string into a writer
#[macro_export]
macro_rules! write(
($dst:expr, $($arg:tt)*) => (format_args_method!($dst, write_fmt, $($arg)*))
($dst:expr, $($arg:tt)*) => ({
let dst = &mut *$dst;
format_args!(|args| { dst.write_fmt(args) }, $($arg)*)
})
)
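The rewritten `write!` reborrows `$dst` exactly once and forwards the formatted arguments to its `write_fmt` method. Its observable behavior in today's std, where `String` implements `fmt::Write`:

```rust
use std::fmt::Write; // brings `write_fmt` into scope for `String`

fn main() {
    let mut out = String::new();
    // `write!` reborrows `&mut out` once, then calls `out.write_fmt(...)`
    // with the compiled format arguments.
    write!(&mut out, "{} + {} = {}", 2, 3, 2 + 3).unwrap();
    assert_eq!(out, "2 + 3 = 5");
}
```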
/// Writing a formatted string plus a newline into a writer
@ -119,15 +122,5 @@ macro_rules! writeln(
)
)
/// Write some formatted data into a stream.
///
/// Identical to the macro in `std::macros`
#[macro_export]
macro_rules! write(
($dst:expr, $($arg:tt)*) => ({
format_args_method!($dst, write_fmt, $($arg)*)
})
)
#[macro_export]
macro_rules! unreachable( () => (panic!("unreachable code")) )

View file

@ -227,52 +227,6 @@
//! ```
//!
//! `try!` is imported by the prelude, and is available everywhere.
//!
//! # `Result` and `Option`
//!
//! The `Result` and [`Option`](../option/index.html) types are
//! similar and complementary: they are often employed to indicate a
//! lack of a return value; and they are trivially converted between
//! each other, so `Result`s are often handled by first converting to
//! `Option` with the [`ok`](type.Result.html#method.ok) and
//! [`err`](type.Result.html#method.err) methods.
//!
//! Whereas `Option` only indicates the lack of a value, `Result` is
//! specifically for error reporting, and carries with it an error
//! value. Sometimes `Option` is used for indicating errors, but this
//! is only for simple cases and is generally discouraged. Even when
//! there is no useful error value to return, prefer `Result<T, ()>`.
//!
//! Converting to an `Option` with `ok()` to handle an error:
//!
//! ```
//! use std::io::Timer;
//! let mut t = Timer::new().ok().expect("failed to create timer!");
//! ```
//!
//! # `Result` vs. `panic!`
//!
//! `Result` is for recoverable errors; `panic!` is for unrecoverable
//! errors. Callers should always be able to avoid panics if they
//! take the proper precautions, for example, calling `is_some()`
//! on an `Option` type before calling `unwrap`.
//!
//! The suitability of `panic!` as an error handling mechanism is
//! limited by Rust's lack of any way to "catch" and resume execution
//! from a thrown exception. Therefore using panics for error
//! handling requires encapsulating code that may panic in a task.
//! Calling the `panic!` macro, or invoking `panic!` indirectly should be
//! avoided as an error reporting strategy. Panics is only for
//! unrecoverable errors and a panicking task is typically the sign of
//! a bug.
//!
//! A module that instead returns `Results` is alerting the caller
//! that failure is possible, and providing precise control over how
//! it is handled.
//!
//! Furthermore, panics may not be recoverable at all, depending on
//! the context. The caller of `panic!` should assume that execution
//! will not resume after the panic, that a panic is catastrophic.
#![stable]

View file

@ -21,16 +21,16 @@ fn test_format_int() {
assert!(format!("{}", 1i16).as_slice() == "1");
assert!(format!("{}", 1i32).as_slice() == "1");
assert!(format!("{}", 1i64).as_slice() == "1");
assert!(format!("{:d}", -1i).as_slice() == "-1");
assert!(format!("{:d}", -1i8).as_slice() == "-1");
assert!(format!("{:d}", -1i16).as_slice() == "-1");
assert!(format!("{:d}", -1i32).as_slice() == "-1");
assert!(format!("{:d}", -1i64).as_slice() == "-1");
assert!(format!("{:t}", 1i).as_slice() == "1");
assert!(format!("{:t}", 1i8).as_slice() == "1");
assert!(format!("{:t}", 1i16).as_slice() == "1");
assert!(format!("{:t}", 1i32).as_slice() == "1");
assert!(format!("{:t}", 1i64).as_slice() == "1");
assert!(format!("{}", -1i).as_slice() == "-1");
assert!(format!("{}", -1i8).as_slice() == "-1");
assert!(format!("{}", -1i16).as_slice() == "-1");
assert!(format!("{}", -1i32).as_slice() == "-1");
assert!(format!("{}", -1i64).as_slice() == "-1");
assert!(format!("{:b}", 1i).as_slice() == "1");
assert!(format!("{:b}", 1i8).as_slice() == "1");
assert!(format!("{:b}", 1i16).as_slice() == "1");
assert!(format!("{:b}", 1i32).as_slice() == "1");
assert!(format!("{:b}", 1i64).as_slice() == "1");
assert!(format!("{:x}", 1i).as_slice() == "1");
assert!(format!("{:x}", 1i8).as_slice() == "1");
assert!(format!("{:x}", 1i16).as_slice() == "1");
@ -52,16 +52,11 @@ fn test_format_int() {
assert!(format!("{}", 1u16).as_slice() == "1");
assert!(format!("{}", 1u32).as_slice() == "1");
assert!(format!("{}", 1u64).as_slice() == "1");
assert!(format!("{:u}", 1u).as_slice() == "1");
assert!(format!("{:u}", 1u8).as_slice() == "1");
assert!(format!("{:u}", 1u16).as_slice() == "1");
assert!(format!("{:u}", 1u32).as_slice() == "1");
assert!(format!("{:u}", 1u64).as_slice() == "1");
assert!(format!("{:t}", 1u).as_slice() == "1");
assert!(format!("{:t}", 1u8).as_slice() == "1");
assert!(format!("{:t}", 1u16).as_slice() == "1");
assert!(format!("{:t}", 1u32).as_slice() == "1");
assert!(format!("{:t}", 1u64).as_slice() == "1");
assert!(format!("{:b}", 1u).as_slice() == "1");
assert!(format!("{:b}", 1u8).as_slice() == "1");
assert!(format!("{:b}", 1u16).as_slice() == "1");
assert!(format!("{:b}", 1u32).as_slice() == "1");
assert!(format!("{:b}", 1u64).as_slice() == "1");
assert!(format!("{:x}", 1u).as_slice() == "1");
assert!(format!("{:x}", 1u8).as_slice() == "1");
assert!(format!("{:x}", 1u16).as_slice() == "1");
@ -79,9 +74,9 @@ fn test_format_int() {
assert!(format!("{:o}", 1u64).as_slice() == "1");
// Test a larger number
assert!(format!("{:t}", 55i).as_slice() == "110111");
assert!(format!("{:b}", 55i).as_slice() == "110111");
assert!(format!("{:o}", 55i).as_slice() == "67");
assert!(format!("{:d}", 55i).as_slice() == "55");
assert!(format!("{}", 55i).as_slice() == "55");
assert!(format!("{:x}", 55i).as_slice() == "37");
assert!(format!("{:X}", 55i).as_slice() == "37");
}
@ -89,15 +84,13 @@ fn test_format_int() {
#[test]
fn test_format_int_zero() {
assert!(format!("{}", 0i).as_slice() == "0");
assert!(format!("{:d}", 0i).as_slice() == "0");
assert!(format!("{:t}", 0i).as_slice() == "0");
assert!(format!("{:b}", 0i).as_slice() == "0");
assert!(format!("{:o}", 0i).as_slice() == "0");
assert!(format!("{:x}", 0i).as_slice() == "0");
assert!(format!("{:X}", 0i).as_slice() == "0");
assert!(format!("{}", 0u).as_slice() == "0");
assert!(format!("{:u}", 0u).as_slice() == "0");
assert!(format!("{:t}", 0u).as_slice() == "0");
assert!(format!("{:b}", 0u).as_slice() == "0");
assert!(format!("{:o}", 0u).as_slice() == "0");
assert!(format!("{:x}", 0u).as_slice() == "0");
assert!(format!("{:X}", 0u).as_slice() == "0");
@ -105,11 +98,11 @@ fn test_format_int_zero() {
#[test]
fn test_format_int_flags() {
assert!(format!("{:3d}", 1i).as_slice() == " 1");
assert!(format!("{:>3d}", 1i).as_slice() == " 1");
assert!(format!("{:>+3d}", 1i).as_slice() == " +1");
assert!(format!("{:<3d}", 1i).as_slice() == "1 ");
assert!(format!("{:#d}", 1i).as_slice() == "1");
assert!(format!("{:3}", 1i).as_slice() == " 1");
assert!(format!("{:>3}", 1i).as_slice() == " 1");
assert!(format!("{:>+3}", 1i).as_slice() == " +1");
assert!(format!("{:<3}", 1i).as_slice() == "1 ");
assert!(format!("{:#}", 1i).as_slice() == "1");
assert!(format!("{:#x}", 10i).as_slice() == "0xa");
assert!(format!("{:#X}", 10i).as_slice() == "0xA");
assert!(format!("{:#5x}", 10i).as_slice() == " 0xa");
@ -119,25 +112,25 @@ fn test_format_int_flags() {
assert!(format!("{:<8x}", 10i).as_slice() == "a ");
assert!(format!("{:>8x}", 10i).as_slice() == " a");
assert!(format!("{:#08x}", 10i).as_slice() == "0x00000a");
assert!(format!("{:08d}", -10i).as_slice() == "-0000010");
assert!(format!("{:08}", -10i).as_slice() == "-0000010");
assert!(format!("{:x}", -1u8).as_slice() == "ff");
assert!(format!("{:X}", -1u8).as_slice() == "FF");
assert!(format!("{:t}", -1u8).as_slice() == "11111111");
assert!(format!("{:b}", -1u8).as_slice() == "11111111");
assert!(format!("{:o}", -1u8).as_slice() == "377");
assert!(format!("{:#x}", -1u8).as_slice() == "0xff");
assert!(format!("{:#X}", -1u8).as_slice() == "0xFF");
assert!(format!("{:#t}", -1u8).as_slice() == "0b11111111");
assert!(format!("{:#b}", -1u8).as_slice() == "0b11111111");
assert!(format!("{:#o}", -1u8).as_slice() == "0o377");
}
#[test]
fn test_format_int_sign_padding() {
assert!(format!("{:+5d}", 1i).as_slice() == " +1");
assert!(format!("{:+5d}", -1i).as_slice() == " -1");
assert!(format!("{:05d}", 1i).as_slice() == "00001");
assert!(format!("{:05d}", -1i).as_slice() == "-0001");
assert!(format!("{:+05d}", 1i).as_slice() == "+0001");
assert!(format!("{:+05d}", -1i).as_slice() == "-0001");
assert!(format!("{:+5}", 1i).as_slice() == " +1");
assert!(format!("{:+5}", -1i).as_slice() == " -1");
assert!(format!("{:05}", 1i).as_slice() == "00001");
assert!(format!("{:05}", -1i).as_slice() == "-0001");
assert!(format!("{:+05}", 1i).as_slice() == "+0001");
assert!(format!("{:+05}", -1i).as_slice() == "-0001");
}
#[test]
@ -169,7 +162,7 @@ mod uint {
#[bench]
fn format_bin(b: &mut Bencher) {
let mut rng = weak_rng();
b.iter(|| { format!("{:t}", rng.gen::<uint>()); })
b.iter(|| { format!("{:b}", rng.gen::<uint>()); })
}
#[bench]
@ -181,7 +174,7 @@ mod uint {
#[bench]
fn format_dec(b: &mut Bencher) {
let mut rng = weak_rng();
b.iter(|| { format!("{:u}", rng.gen::<uint>()); })
b.iter(|| { format!("{}", rng.gen::<uint>()); })
}
#[bench]
@ -205,7 +198,7 @@ mod int {
#[bench]
fn format_bin(b: &mut Bencher) {
let mut rng = weak_rng();
b.iter(|| { format!("{:t}", rng.gen::<int>()); })
b.iter(|| { format!("{:b}", rng.gen::<int>()); })
}
#[bench]
@ -217,7 +210,7 @@ mod int {
#[bench]
fn format_dec(b: &mut Bencher) {
let mut rng = weak_rng();
b.iter(|| { format!("{:d}", rng.gen::<int>()); })
b.iter(|| { format!("{}", rng.gen::<int>()); })
}
#[bench]

@ -130,7 +130,7 @@ mod tests {
input.len());
let cmp = deflate_bytes(input.as_slice()).expect("deflation failed");
let out = inflate_bytes(cmp.as_slice()).expect("inflation failed");
debug!("{} bytes deflated to {} ({:.1f}% size)",
debug!("{} bytes deflated to {} ({:.1}% size)",
input.len(), cmp.len(),
100.0 * ((cmp.len() as f64) / (input.len() as f64)));
assert_eq!(input.as_slice(), out.as_slice());

@ -221,7 +221,7 @@ pub fn render_to<W:Writer>(output: &mut W) {
impl<'a> dot::Labeller<'a, Nd<'a>, Ed<'a>> for Graph {
fn graph_id(&'a self) -> dot::Id<'a> { dot::Id::new("example3").unwrap() }
fn node_id(&'a self, n: &Nd<'a>) -> dot::Id<'a> {
dot::Id::new(format!("N{:u}", n.val0())).unwrap()
dot::Id::new(format!("N{}", n.val0())).unwrap()
}
fn node_label<'a>(&'a self, n: &Nd<'a>) -> dot::LabelText<'a> {
let &(i, _) = n;
@ -635,7 +635,7 @@ mod tests {
}
fn id_name<'a>(n: &Node) -> Id<'a> {
Id::new(format!("N{:u}", *n)).unwrap()
Id::new(format!("N{}", *n)).unwrap()
}
impl<'a> Labeller<'a, Node, &'a Edge> for LabelledGraph {

@ -241,13 +241,6 @@ impl fmt::Show for LogLevel {
}
}
impl fmt::Signed for LogLevel {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
let LogLevel(level) = *self;
write!(fmt, "{}", level)
}
}
impl Logger for DefaultLogger {
fn log(&mut self, record: &LogRecord) {
match writeln!(&mut self.handle,

@ -237,7 +237,7 @@ pub trait Rng {
/// use std::rand::{task_rng, Rng};
///
/// let mut rng = task_rng();
/// println!("{:b}", rng.gen_weighted_bool(3));
/// println!("{}", rng.gen_weighted_bool(3));
/// ```
fn gen_weighted_bool(&mut self, n: uint) -> bool {
n == 0 || self.gen_range(0, n) == 0

@ -994,7 +994,7 @@ impl LintPass for NonSnakeCase {
self.check_snake_case(cx, "trait method", t.ident, t.span);
}
fn check_lifetime_decl(&mut self, cx: &Context, t: &ast::LifetimeDef) {
fn check_lifetime_def(&mut self, cx: &Context, t: &ast::LifetimeDef) {
self.check_snake_case(cx, "lifetime", t.lifetime.name.ident(), t.lifetime.span);
}

@ -725,8 +725,8 @@ impl<'a, 'tcx, 'v> Visitor<'v> for Context<'a, 'tcx> {
run_lints!(self, check_lifetime_ref, lt);
}
fn visit_lifetime_decl(&mut self, lt: &ast::LifetimeDef) {
run_lints!(self, check_lifetime_decl, lt);
fn visit_lifetime_def(&mut self, lt: &ast::LifetimeDef) {
run_lints!(self, check_lifetime_def, lt);
}
fn visit_explicit_self(&mut self, es: &ast::ExplicitSelf) {

@ -155,7 +155,7 @@ pub trait LintPass {
fn check_variant_post(&mut self, _: &Context, _: &ast::Variant, _: &ast::Generics) { }
fn check_opt_lifetime_ref(&mut self, _: &Context, _: Span, _: &Option<ast::Lifetime>) { }
fn check_lifetime_ref(&mut self, _: &Context, _: &ast::Lifetime) { }
fn check_lifetime_decl(&mut self, _: &Context, _: &ast::LifetimeDef) { }
fn check_lifetime_def(&mut self, _: &Context, _: &ast::LifetimeDef) { }
fn check_explicit_self(&mut self, _: &Context, _: &ast::ExplicitSelf) { }
fn check_mac(&mut self, _: &Context, _: &ast::Mac) { }
fn check_path(&mut self, _: &Context, _: &ast::Path, _: ast::NodeId) { }

@ -102,7 +102,7 @@ pub fn check_crate(tcx: &ty::ctxt) {
fn make_stat(bccx: &BorrowckCtxt, stat: uint) -> String {
let total = bccx.stats.guaranteed_paths as f64;
let perc = if total == 0.0 { 0.0 } else { stat as f64 * 100.0 / total };
format!("{} ({:.0f}%)", stat, perc)
format!("{} ({:.0}%)", stat, perc)
}
}

@ -53,7 +53,7 @@ impl<'a, 'ast> dot::Labeller<'a, Node<'a>, Edge<'a>> for LabelledCFG<'a, 'ast> {
fn graph_id(&'a self) -> dot::Id<'a> { dot::Id::new(self.name.as_slice()).unwrap() }
fn node_id(&'a self, &(i,_): &Node<'a>) -> dot::Id<'a> {
dot::Id::new(format!("N{:u}", i.node_id())).unwrap()
dot::Id::new(format!("N{}", i.node_id())).unwrap()
}
fn node_label(&'a self, &(i, n): &Node<'a>) -> dot::LabelText<'a> {

@ -81,8 +81,7 @@ impl<'a> fmt::Show for Matrix<'a> {
try!(write!(f, "+"));
for (column, pat_str) in row.into_iter().enumerate() {
try!(write!(f, " "));
f.width = Some(column_widths[column]);
try!(f.pad(pat_str.as_slice()));
try!(write!(f, "{:1$}", pat_str, column_widths[column]));
try!(write!(f, " +"));
}
try!(write!(f, "\n"));

@ -194,7 +194,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
let words_per_id = (bits_per_id + uint::BITS - 1) / uint::BITS;
let num_nodes = cfg.graph.all_nodes().len();
debug!("DataFlowContext::new(analysis_name: {:s}, id_range={}, \
debug!("DataFlowContext::new(analysis_name: {}, id_range={}, \
bits_per_id={}, words_per_id={}) \
num_nodes: {}",
analysis_name, id_range, bits_per_id, words_per_id,
@ -223,7 +223,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
pub fn add_gen(&mut self, id: ast::NodeId, bit: uint) {
//! Indicates that `id` generates `bit`
debug!("{:s} add_gen(id={}, bit={})",
debug!("{} add_gen(id={}, bit={})",
self.analysis_name, id, bit);
assert!(self.nodeid_to_index.contains_key(&id));
assert!(self.bits_per_id > 0);
@ -236,7 +236,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
pub fn add_kill(&mut self, id: ast::NodeId, bit: uint) {
//! Indicates that `id` kills `bit`
debug!("{:s} add_kill(id={}, bit={})",
debug!("{} add_kill(id={}, bit={})",
self.analysis_name, id, bit);
assert!(self.nodeid_to_index.contains_key(&id));
assert!(self.bits_per_id > 0);
@ -249,7 +249,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
fn apply_gen_kill(&self, cfgidx: CFGIndex, bits: &mut [uint]) {
//! Applies the gen and kill sets for `cfgidx` to `bits`
debug!("{:s} apply_gen_kill(cfgidx={}, bits={}) [before]",
debug!("{} apply_gen_kill(cfgidx={}, bits={}) [before]",
self.analysis_name, cfgidx, mut_bits_to_string(bits));
assert!(self.bits_per_id > 0);
@ -259,7 +259,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
let kills = self.kills.slice(start, end);
bitwise(bits, kills, &Subtract);
debug!("{:s} apply_gen_kill(cfgidx={}, bits={}) [after]",
debug!("{} apply_gen_kill(cfgidx={}, bits={}) [after]",
self.analysis_name, cfgidx, mut_bits_to_string(bits));
}
@ -316,7 +316,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
temp_bits.as_slice()
}
};
debug!("{:s} each_bit_for_node({}, cfgidx={}) bits={}",
debug!("{} each_bit_for_node({}, cfgidx={}) bits={}",
self.analysis_name, e, cfgidx, bits_to_string(slice));
self.each_bit(slice, f)
}
@ -337,7 +337,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
let cfgidx = to_cfgidx_or_die(id, &self.nodeid_to_index);
let (start, end) = self.compute_id_range(cfgidx);
let gens = self.gens.slice(start, end);
debug!("{:s} each_gen_bit(id={}, gens={})",
debug!("{} each_gen_bit(id={}, gens={})",
self.analysis_name, id, bits_to_string(gens));
self.each_bit(gens, f)
}
@ -385,7 +385,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
//! This is usually called (if it is called at all), after
//! all add_gen and add_kill calls, but before propagate.
debug!("{:s} add_kills_from_flow_exits", self.analysis_name);
debug!("{} add_kills_from_flow_exits", self.analysis_name);
if self.bits_per_id == 0 {
// Skip the surprisingly common degenerate case. (Note
// compute_id_range requires self.words_per_id > 0.)
@ -408,7 +408,7 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
}
}
None => {
debug!("{:s} add_kills_from_flow_exits flow_exit={} \
debug!("{} add_kills_from_flow_exits flow_exit={} \
no cfg_idx for exiting_scope={}",
self.analysis_name, flow_exit, node_id);
}
@ -417,10 +417,10 @@ impl<'a, 'tcx, O:DataFlowOperator> DataFlowContext<'a, 'tcx, O> {
if changed {
let bits = self.kills.slice_mut(start, end);
debug!("{:s} add_kills_from_flow_exits flow_exit={} bits={} [before]",
debug!("{} add_kills_from_flow_exits flow_exit={} bits={} [before]",
self.analysis_name, flow_exit, mut_bits_to_string(bits));
bits.clone_from_slice(orig_kills.as_slice());
debug!("{:s} add_kills_from_flow_exits flow_exit={} bits={} [after]",
debug!("{} add_kills_from_flow_exits flow_exit={} bits={} [after]",
self.analysis_name, flow_exit, mut_bits_to_string(bits));
}
true
@ -453,7 +453,7 @@ impl<'a, 'tcx, O:DataFlowOperator+Clone+'static> DataFlowContext<'a, 'tcx, O> {
}
}
debug!("Dataflow result for {:s}:", self.analysis_name);
debug!("Dataflow result for {}:", self.analysis_name);
debug!("{}", {
self.pretty_print_to(box io::stderr(), blk).unwrap();
""
@ -474,7 +474,7 @@ impl<'a, 'b, 'tcx, O:DataFlowOperator> PropagationContext<'a, 'b, 'tcx, O> {
fn walk_cfg(&mut self,
cfg: &cfg::CFG,
in_out: &mut [uint]) {
debug!("DataFlowContext::walk_cfg(in_out={}) {:s}",
debug!("DataFlowContext::walk_cfg(in_out={}) {}",
bits_to_string(in_out), self.dfcx.analysis_name);
assert!(self.dfcx.bits_per_id > 0);
@ -519,7 +519,7 @@ impl<'a, 'b, 'tcx, O:DataFlowOperator> PropagationContext<'a, 'b, 'tcx, O> {
edge: &cfg::CFGEdge) {
let source = edge.source();
let cfgidx = edge.target();
debug!("{:s} propagate_bits_into_entry_set_for(pred_bits={}, {} to {})",
debug!("{} propagate_bits_into_entry_set_for(pred_bits={}, {} to {})",
self.dfcx.analysis_name, bits_to_string(pred_bits), source, cfgidx);
assert!(self.dfcx.bits_per_id > 0);
@ -530,7 +530,7 @@ impl<'a, 'b, 'tcx, O:DataFlowOperator> PropagationContext<'a, 'b, 'tcx, O> {
bitwise(on_entry, pred_bits, &self.dfcx.oper)
};
if changed {
debug!("{:s} changed entry set for {} to {}",
debug!("{} changed entry set for {} to {}",
self.dfcx.analysis_name, cfgidx,
bits_to_string(self.dfcx.on_entry.slice(start, end)));
self.changed = true;

@ -36,7 +36,7 @@ be indexed by the direction (see the type `Direction`).
#![allow(dead_code)] // still WIP
use std::fmt::{Formatter, FormatError, Show};
use std::fmt::{Formatter, Error, Show};
use std::uint;
pub struct Graph<N,E> {
@ -57,7 +57,7 @@ pub struct Edge<E> {
}
impl<E: Show> Show for Edge<E> {
fn fmt(&self, f: &mut Formatter) -> Result<(), FormatError> {
fn fmt(&self, f: &mut Formatter) -> Result<(), Error> {
write!(f, "Edge {{ next_edge: [{}, {}], source: {}, target: {}, data: {} }}",
self.next_edge[0], self.next_edge[1], self.source,
self.target, self.data)

@ -633,6 +633,7 @@ enum TraitReferenceType {
TraitDerivation, // trait T : SomeTrait { ... }
TraitBoundingTypeParameter, // fn f<T:SomeTrait>() { ... }
TraitObject, // Box<for<'a> SomeTrait>
TraitQPath, // <T as SomeTrait>::
}
impl NameBindings {
@ -4532,6 +4533,7 @@ impl<'a> Resolver<'a> {
TraitImplementation => "implement",
TraitDerivation => "derive",
TraitObject => "reference",
TraitQPath => "extract an associated type from",
};
let msg = format!("attempt to {} a nonexistent trait `{}`", usage_str, path_str);
@ -4969,65 +4971,8 @@ impl<'a> Resolver<'a> {
}
TyQPath(ref qpath) => {
self.resolve_type(&*qpath.for_type);
let current_module = self.current_module.clone();
let module_path: Vec<_> =
qpath.trait_name
.segments
.iter()
.map(|ps| ps.identifier.name)
.collect();
match self.resolve_module_path(
current_module,
module_path.as_slice(),
UseLexicalScope,
qpath.trait_name.span,
PathSearch) {
Success((ref module, _)) if module.kind.get() ==
TraitModuleKind => {
match self.resolve_definition_of_name_in_module(
(*module).clone(),
qpath.item_name.name,
TypeNS) {
ChildNameDefinition(def, lp) |
ImportNameDefinition(def, lp) => {
match def {
DefAssociatedTy(trait_type_id) => {
let def = DefAssociatedTy(
trait_type_id);
self.record_def(ty.id, (def, lp));
}
_ => {
self.resolve_error(
ty.span,
"not an associated type");
}
}
}
NoNameDefinition => {
self.resolve_error(ty.span,
"unresolved associated \
type");
}
}
}
Success(..) => self.resolve_error(ty.span, "not a trait"),
Indeterminate => {
self.session.span_bug(ty.span,
"indeterminate result when \
resolving associated type")
}
Failed(error) => {
let (span, help) = match error {
Some((span, msg)) => (span, format!("; {}", msg)),
None => (ty.span, String::new()),
};
self.resolve_error(span,
format!("unresolved trait: {}",
help).as_slice())
}
}
self.resolve_type(&*qpath.self_type);
self.resolve_trait_reference(ty.id, &*qpath.trait_ref, TraitQPath);
}
TyClosure(ref c) | TyProc(ref c) => {

@ -226,7 +226,7 @@ impl<'a, 'v> Visitor<'v> for LifetimeContext<'a> {
self.with(LateScope(&trait_ref.bound_lifetimes, self.scope), |this| {
this.check_lifetime_defs(&trait_ref.bound_lifetimes);
for lifetime in trait_ref.bound_lifetimes.iter() {
this.visit_lifetime_decl(lifetime);
this.visit_lifetime_def(lifetime);
}
this.visit_trait_ref(&trait_ref.trait_ref)
})

@ -100,6 +100,10 @@ impl<'tcx> Substs<'tcx> {
regions_is_noop && self.types.is_empty()
}
pub fn type_for_def(&self, ty_param_def: &ty::TypeParameterDef) -> Ty<'tcx> {
*self.types.get(ty_param_def.space, ty_param_def.index)
}
pub fn has_regions_escaping_depth(&self, depth: uint) -> bool {
self.types.iter().any(|&t| ty::type_escapes_depth(t, depth)) || {
match self.regions {

@ -25,6 +25,7 @@ use std::rc::Rc;
use std::slice::Items;
use syntax::ast;
use syntax::codemap::{Span, DUMMY_SP};
use util::common::ErrorReported;
pub use self::fulfill::FulfillmentContext;
pub use self::select::SelectionContext;
@ -95,10 +96,6 @@ pub enum ObligationCauseCode<'tcx> {
FieldSized,
}
// An error has already been reported to the user, so no need to continue checking.
#[deriving(Clone,Show)]
pub struct ErrorReported;
pub type Obligations<'tcx> = subst::VecPerParamSpace<Obligation<'tcx>>;
pub type Selection<'tcx> = Vtable<'tcx, Obligation<'tcx>>;

@ -17,7 +17,6 @@ use self::Candidate::*;
use self::BuiltinBoundConditions::*;
use self::EvaluationResult::*;
use super::{ErrorReported};
use super::{Obligation, ObligationCause};
use super::{SelectionError, Unimplemented, Overflow,
OutputTypeParameterMismatch};
@ -38,6 +37,7 @@ use std::cell::RefCell;
use std::collections::hash_map::HashMap;
use std::rc::Rc;
use syntax::ast;
use util::common::ErrorReported;
use util::ppaux::Repr;
pub struct SelectionContext<'cx, 'tcx:'cx> {

@ -18,9 +18,10 @@ use std::fmt;
use std::rc::Rc;
use syntax::ast;
use syntax::codemap::Span;
use util::common::ErrorReported;
use util::ppaux::Repr;
use super::{ErrorReported, Obligation, ObligationCause, VtableImpl,
use super::{Obligation, ObligationCause, VtableImpl,
VtableParam, VtableParamData, VtableImplData};
///////////////////////////////////////////////////////////////////////////

@ -2635,9 +2635,14 @@ impl ops::Sub<TypeContents,TypeContents> for TypeContents {
}
impl fmt::Show for TypeContents {
#[cfg(stage0)]
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "TypeContents({:t})", self.bits)
}
#[cfg(not(stage0))]
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "TypeContents({:b})", self.bits)
}
}
pub fn type_interior_is_unsafe<'tcx>(cx: &ctxt<'tcx>, ty: Ty<'tcx>) -> bool {

@ -207,7 +207,6 @@ fn ast_path_substs_for_ty<'tcx,AC,RS>(
decl_def_id: ast::DefId,
decl_generics: &ty::Generics<'tcx>,
self_ty: Option<Ty<'tcx>>,
associated_ty: Option<Ty<'tcx>>,
path: &ast::Path)
-> Substs<'tcx>
where AC: AstConv<'tcx>, RS: RegionScope
@ -243,7 +242,7 @@ fn ast_path_substs_for_ty<'tcx,AC,RS>(
};
create_substs_for_ast_path(this, rscope, path.span, decl_def_id,
decl_generics, self_ty, types, regions, associated_ty)
decl_generics, self_ty, types, regions)
}
fn create_substs_for_ast_path<'tcx,AC,RS>(
@ -254,8 +253,7 @@ fn create_substs_for_ast_path<'tcx,AC,RS>(
decl_generics: &ty::Generics<'tcx>,
self_ty: Option<Ty<'tcx>>,
types: Vec<Ty<'tcx>>,
regions: Vec<ty::Region>,
associated_ty: Option<Ty<'tcx>>)
regions: Vec<ty::Region>)
-> Substs<'tcx>
where AC: AstConv<'tcx>, RS: RegionScope
{
@ -366,9 +364,9 @@ fn create_substs_for_ast_path<'tcx,AC,RS>(
substs.types.push(
AssocSpace,
this.associated_type_binding(span,
associated_ty,
self_ty,
decl_def_id,
param.def_id))
param.def_id));
}
return substs;
@ -417,19 +415,17 @@ pub fn instantiate_poly_trait_ref<'tcx,AC,RS>(
this: &AC,
rscope: &RS,
ast_trait_ref: &ast::PolyTraitRef,
self_ty: Option<Ty<'tcx>>,
associated_type: Option<Ty<'tcx>>)
self_ty: Option<Ty<'tcx>>)
-> Rc<ty::TraitRef<'tcx>>
where AC: AstConv<'tcx>, RS: RegionScope
{
instantiate_trait_ref(this, rscope, &ast_trait_ref.trait_ref, self_ty, associated_type)
instantiate_trait_ref(this, rscope, &ast_trait_ref.trait_ref, self_ty)
}
pub fn instantiate_trait_ref<'tcx,AC,RS>(this: &AC,
rscope: &RS,
ast_trait_ref: &ast::TraitRef,
self_ty: Option<Ty<'tcx>>,
associated_type: Option<Ty<'tcx>>)
self_ty: Option<Ty<'tcx>>)
-> Rc<ty::TraitRef<'tcx>>
where AC: AstConv<'tcx>,
RS: RegionScope
@ -444,8 +440,8 @@ pub fn instantiate_trait_ref<'tcx,AC,RS>(this: &AC,
ast_trait_ref.path.span,
ast_trait_ref.ref_id) {
def::DefTrait(trait_def_id) => {
let trait_ref = Rc::new(ast_path_to_trait_ref(this, rscope, trait_def_id, self_ty,
associated_type, &ast_trait_ref.path));
let trait_ref = Rc::new(ast_path_to_trait_ref(this, rscope, trait_def_id,
self_ty, &ast_trait_ref.path));
this.tcx().trait_refs.borrow_mut().insert(ast_trait_ref.ref_id,
trait_ref.clone());
trait_ref
@ -463,7 +459,6 @@ fn ast_path_to_trait_ref<'tcx,AC,RS>(
rscope: &RS,
trait_def_id: ast::DefId,
self_ty: Option<Ty<'tcx>>,
associated_type: Option<Ty<'tcx>>,
path: &ast::Path)
-> ty::TraitRef<'tcx>
where AC: AstConv<'tcx>, RS: RegionScope
@ -493,8 +488,7 @@ fn ast_path_to_trait_ref<'tcx,AC,RS>(
&trait_def.generics,
self_ty,
types,
regions,
associated_type);
regions);
ty::TraitRef::new(trait_def_id, substs)
}
@ -517,7 +511,6 @@ pub fn ast_path_to_ty<'tcx, AC: AstConv<'tcx>, RS: RegionScope>(
did,
&generics,
None,
None,
path);
let ty = decl_ty.subst(tcx, &substs);
TypeAndSubsts { substs: substs, ty: ty }
@ -558,7 +551,7 @@ pub fn ast_path_to_ty_relaxed<'tcx,AC,RS>(
Substs::new(VecPerParamSpace::params_from_type(type_params),
VecPerParamSpace::params_from_type(region_params))
} else {
ast_path_substs_for_ty(this, rscope, did, &generics, None, None, path)
ast_path_substs_for_ty(this, rscope, did, &generics, None, path)
};
let ty = decl_ty.subst(tcx, &substs);
@ -726,7 +719,6 @@ fn mk_pointer<'tcx, AC: AstConv<'tcx>, RS: RegionScope>(
rscope,
trait_def_id,
None,
None,
path);
let empty_vec = [];
let bounds = match *opt_bounds { None => empty_vec.as_slice(),
@ -750,61 +742,37 @@ fn mk_pointer<'tcx, AC: AstConv<'tcx>, RS: RegionScope>(
constr(ast_ty_to_ty(this, rscope, a_seq_ty))
}
fn associated_ty_to_ty<'tcx,AC,RS>(this: &AC,
rscope: &RS,
trait_path: &ast::Path,
for_ast_type: &ast::Ty,
trait_type_id: ast::DefId,
span: Span)
-> Ty<'tcx>
where AC: AstConv<'tcx>, RS: RegionScope
fn qpath_to_ty<'tcx,AC,RS>(this: &AC,
rscope: &RS,
ast_ty: &ast::Ty, // the TyQPath
qpath: &ast::QPath)
-> Ty<'tcx>
where AC: AstConv<'tcx>, RS: RegionScope
{
debug!("associated_ty_to_ty(trait_path={}, for_ast_type={}, trait_type_id={})",
trait_path.repr(this.tcx()),
for_ast_type.repr(this.tcx()),
trait_type_id.repr(this.tcx()));
debug!("qpath_to_ty(ast_ty={})",
ast_ty.repr(this.tcx()));
// Find the trait that this associated type belongs to.
let trait_did = match ty::impl_or_trait_item(this.tcx(),
trait_type_id).container() {
ty::ImplContainer(_) => {
this.tcx().sess.span_bug(span,
"associated_ty_to_ty(): impl associated \
types shouldn't go through this \
function")
}
ty::TraitContainer(trait_id) => trait_id,
};
let self_type = ast_ty_to_ty(this, rscope, &*qpath.self_type);
let for_type = ast_ty_to_ty(this, rscope, for_ast_type);
if !this.associated_types_of_trait_are_valid(for_type, trait_did) {
this.tcx().sess.span_err(span,
"this associated type is not \
allowed in this context");
return ty::mk_err()
}
debug!("qpath_to_ty: self_type={}", self_type.repr(this.tcx()));
let trait_ref = ast_path_to_trait_ref(this,
let trait_ref = instantiate_trait_ref(this,
rscope,
trait_did,
None,
Some(for_type),
trait_path);
&*qpath.trait_ref,
Some(self_type));
debug!("associated_ty_to_ty(trait_ref={})",
trait_ref.repr(this.tcx()));
debug!("qpath_to_ty: trait_ref={}", trait_ref.repr(this.tcx()));
let trait_def = this.get_trait_def(trait_did);
for type_parameter in trait_def.generics.types.iter() {
if type_parameter.def_id == trait_type_id {
debug!("associated_ty_to_ty(type_parameter={} substs={})",
type_parameter.repr(this.tcx()),
trait_ref.substs.repr(this.tcx()));
return *trait_ref.substs.types.get(type_parameter.space,
type_parameter.index)
let trait_def = this.get_trait_def(trait_ref.def_id);
for ty_param_def in trait_def.generics.types.get_slice(AssocSpace).iter() {
if ty_param_def.name == qpath.item_name.name {
debug!("qpath_to_ty: corresponding ty_param_def={}", ty_param_def);
return trait_ref.substs.type_for_def(ty_param_def);
}
}
this.tcx().sess.span_bug(span,
this.tcx().sess.span_bug(ast_ty.span,
"this associated type didn't get added \
as a parameter for some reason")
}
@ -931,7 +899,6 @@ pub fn ast_ty_to_ty<'tcx, AC: AstConv<'tcx>, RS: RegionScope>(
rscope,
trait_def_id,
None,
None,
path);
let empty_bounds: &[ast::TyParamBound] = &[];
let ast_bounds = match *bounds {
@ -996,26 +963,7 @@ pub fn ast_ty_to_ty<'tcx, AC: AstConv<'tcx>, RS: RegionScope>(
}
}
ast::TyQPath(ref qpath) => {
match tcx.def_map.borrow().get(&ast_ty.id) {
None => {
tcx.sess.span_bug(ast_ty.span,
"unbound qualified path")
}
Some(&def::DefAssociatedTy(trait_type_id)) => {
associated_ty_to_ty(this,
rscope,
&qpath.trait_name,
&*qpath.for_type,
trait_type_id,
ast_ty.span)
}
Some(_) => {
tcx.sess.span_err(ast_ty.span,
"this qualified path does not name \
an associated type");
ty::mk_err()
}
}
qpath_to_ty(this, rscope, ast_ty, &**qpath)
}
ast::TyFixedLengthVec(ref ty, ref e) => {
match const_eval::eval_const_expr_partial(tcx, &**e) {
@ -1411,7 +1359,7 @@ fn conv_ty_poly_trait_ref<'tcx, AC, RS>(
let main_trait_bound = match partitioned_bounds.trait_bounds.remove(0) {
Some(trait_bound) => {
Some(instantiate_poly_trait_ref(this, rscope, trait_bound, None, None))
Some(instantiate_poly_trait_ref(this, rscope, trait_bound, None))
}
None => {
this.tcx().sess.span_err(

@ -379,7 +379,8 @@ pub fn check_pat_enum<'a, 'tcx>(pcx: &pat_ctxt<'a, 'tcx>, pat: &ast::Pat,
let real_path_ty = fcx.node_ty(pat.id);
let (arg_tys, kind_name) = match real_path_ty.sty {
ty::ty_enum(enum_def_id, ref expected_substs) => {
ty::ty_enum(enum_def_id, ref expected_substs)
if def == def::DefVariant(enum_def_id, def.def_id(), false) => {
let variant = ty::enum_variant_with_id(tcx, enum_def_id, def.def_id());
(variant.args.iter().map(|t| t.subst(tcx, expected_substs)).collect::<Vec<_>>(),
"variant")
@ -392,7 +393,7 @@ pub fn check_pat_enum<'a, 'tcx>(pcx: &pat_ctxt<'a, 'tcx>, pat: &ast::Pat,
_ => {
let name = pprust::path_to_string(path);
span_err!(tcx.sess, pat.span, E0164,
"`{}` does not name a variant or a tuple struct", name);
"`{}` does not name a non-struct variant or a tuple struct", name);
fcx.write_error(pat.id);
if let Some(ref subpats) = *subpats {

@ -31,6 +31,7 @@ struct ConfirmContext<'a, 'tcx:'a> {
fcx: &'a FnCtxt<'a, 'tcx>,
span: Span,
self_expr: &'a ast::Expr,
call_expr: &'a ast::Expr,
}
struct InstantiatedMethodSig<'tcx> {
@ -56,6 +57,7 @@ struct InstantiatedMethodSig<'tcx> {
pub fn confirm<'a, 'tcx>(fcx: &FnCtxt<'a, 'tcx>,
span: Span,
self_expr: &ast::Expr,
call_expr: &ast::Expr,
unadjusted_self_ty: Ty<'tcx>,
pick: probe::Pick<'tcx>,
supplied_method_types: Vec<Ty<'tcx>>)
@ -66,17 +68,18 @@ pub fn confirm<'a, 'tcx>(fcx: &FnCtxt<'a, 'tcx>,
pick.repr(fcx.tcx()),
supplied_method_types.repr(fcx.tcx()));
let mut confirm_cx = ConfirmContext::new(fcx, span, self_expr);
let mut confirm_cx = ConfirmContext::new(fcx, span, self_expr, call_expr);
confirm_cx.confirm(unadjusted_self_ty, pick, supplied_method_types)
}
impl<'a,'tcx> ConfirmContext<'a,'tcx> {
fn new(fcx: &'a FnCtxt<'a, 'tcx>,
span: Span,
self_expr: &'a ast::Expr)
self_expr: &'a ast::Expr,
call_expr: &'a ast::Expr)
-> ConfirmContext<'a, 'tcx>
{
ConfirmContext { fcx: fcx, span: span, self_expr: self_expr }
ConfirmContext { fcx: fcx, span: span, self_expr: self_expr, call_expr: call_expr }
}
fn confirm(&mut self,
@ -469,6 +472,10 @@ impl<'a,'tcx> ConfirmContext<'a,'tcx> {
traits::ObligationCause::misc(self.span),
method_bounds_substs,
method_bounds);
self.fcx.add_default_region_param_bounds(
method_bounds_substs,
self.call_expr);
}
///////////////////////////////////////////////////////////////////////////

@ -79,7 +79,7 @@ pub fn lookup<'a, 'tcx>(fcx: &FnCtxt<'a, 'tcx>,
method_name: ast::Name,
self_ty: Ty<'tcx>,
supplied_method_types: Vec<Ty<'tcx>>,
call_expr_id: ast::NodeId,
call_expr: &ast::Expr,
self_expr: &ast::Expr)
-> Result<MethodCallee<'tcx>, MethodError>
{
@ -100,14 +100,14 @@ pub fn lookup<'a, 'tcx>(fcx: &FnCtxt<'a, 'tcx>,
* - `self_expr`: the self expression (`foo`)
*/
debug!("lookup(method_name={}, self_ty={}, call_expr_id={}, self_expr={})",
debug!("lookup(method_name={}, self_ty={}, call_expr={}, self_expr={})",
method_name.repr(fcx.tcx()),
self_ty.repr(fcx.tcx()),
call_expr_id,
call_expr.repr(fcx.tcx()),
self_expr.repr(fcx.tcx()));
let pick = try!(probe::probe(fcx, span, method_name, self_ty, call_expr_id));
Ok(confirm::confirm(fcx, span, self_expr, self_ty, pick, supplied_method_types))
let pick = try!(probe::probe(fcx, span, method_name, self_ty, call_expr.id));
Ok(confirm::confirm(fcx, span, self_expr, call_expr, self_ty, pick, supplied_method_types))
}
pub fn lookup_in_trait<'a, 'tcx>(fcx: &'a FnCtxt<'a, 'tcx>,

@ -2050,6 +2050,17 @@ impl<'a, 'tcx> FnCtxt<'a, 'tcx> {
}
}
pub fn add_default_region_param_bounds(&self,
substs: &Substs<'tcx>,
expr: &ast::Expr)
{
for &ty in substs.types.iter() {
let default_bound = ty::ReScope(expr.id);
let origin = infer::RelateDefaultParamBound(expr.span, ty);
self.register_region_obligation(origin, ty, default_bound);
}
}
pub fn add_obligations_for_parameters(&self,
cause: traits::ObligationCause<'tcx>,
substs: &Substs<'tcx>,
@ -3180,7 +3191,7 @@ fn check_expr_with_unifier<'a, 'tcx>(fcx: &FnCtxt<'a, 'tcx>,
method_name.node.name,
expr_t,
tps,
expr.id,
expr,
rcvr) {
Ok(method) => {
let method_ty = method.ty;
@ -4693,11 +4704,7 @@ fn constrain_path_type_parameters(fcx: &FnCtxt,
expr: &ast::Expr)
{
fcx.opt_node_ty_substs(expr.id, |item_substs| {
for &ty in item_substs.substs.types.iter() {
let default_bound = ty::ReScope(expr.id);
let origin = infer::RelateDefaultParamBound(expr.span, ty);
fcx.register_region_obligation(origin, ty, default_bound);
}
fcx.add_default_region_param_bounds(&item_substs.substs, expr);
});
}

@ -684,7 +684,11 @@ fn find_associated_type_in_generics<'tcx>(tcx: &ty::ctxt<'tcx>,
ty: Option<Ty<'tcx>>,
associated_type_id: ast::DefId,
generics: &ty::Generics<'tcx>)
-> Ty<'tcx> {
-> Ty<'tcx>
{
debug!("find_associated_type_in_generics(ty={}, associated_type_id={}, generics={})",
ty.repr(tcx), associated_type_id.repr(tcx), generics.repr(tcx));
let ty = match ty {
None => {
tcx.sess.span_bug(span,
@ -703,20 +707,22 @@ fn find_associated_type_in_generics<'tcx>(tcx: &ty::ctxt<'tcx>,
for type_parameter in generics.types.iter() {
if type_parameter.def_id == associated_type_id
&& type_parameter.associated_with == Some(param_id) {
return ty::mk_param_from_def(tcx, type_parameter)
return ty::mk_param_from_def(tcx, type_parameter);
}
}
tcx.sess.span_bug(span,
"find_associated_type_in_generics(): didn't \
find associated type anywhere in the generics \
list")
tcx.sess.span_err(
span,
format!("no suitable bound on `{}`",
ty.user_string(tcx))[]);
ty::mk_err()
}
_ => {
tcx.sess.span_bug(span,
"find_associated_type_in_generics(): self type \
is not a parameter")
tcx.sess.span_err(
span,
"it is currently unsupported to access associated types except \
through a type parameter; this restriction will be lifted in time");
ty::mk_err()
}
}
}
@ -1155,7 +1161,7 @@ pub fn convert(ccx: &CrateCtxt, it: &ast::Item) {
for trait_ref in opt_trait_ref.iter() {
astconv::instantiate_trait_ref(&icx, &ExplicitRscope, trait_ref,
Some(selfty), None);
Some(selfty));
}
},
ast::ItemTrait(_, _, _, ref trait_methods) => {
@ -1627,7 +1633,7 @@ fn ty_generics_for_trait<'a, 'tcx>(ccx: &CrateCtxt<'a, 'tcx>,
ccx,
subst::AssocSpace,
&associated_type.ty_param,
generics.types.len(subst::TypeSpace),
generics.types.len(subst::AssocSpace),
&ast_generics.where_clause,
Some(local_def(trait_id)));
ccx.tcx.ty_param_defs.borrow_mut().insert(associated_type.ty_param.id,
@ -2019,7 +2025,6 @@ fn conv_param_bounds<'tcx,AC>(this: &AC,
astconv::instantiate_poly_trait_ref(this,
&ExplicitRscope,
bound,
Some(param_ty.to_ty(this.tcx())),
Some(param_ty.to_ty(this.tcx())))
})
.collect();

View file

@ -603,7 +603,7 @@ impl<'a, 'tcx> ConstraintContext<'a, 'tcx> {
let is_inferred;
macro_rules! cannot_happen { () => { {
panic!("invalid parent: {:s} for {:s}",
panic!("invalid parent: {} for {}",
tcx.map.node_to_string(parent_id),
tcx.map.node_to_string(param_id));
} } }

View file

@ -20,6 +20,10 @@ use syntax::ast;
use syntax::visit;
use syntax::visit::Visitor;
// An error has already been reported to the user, so no need to continue checking.
#[deriving(Clone,Show)]
pub struct ErrorReported;
pub fn time<T, U>(do_it: bool, what: &str, u: U, f: |U| -> T) -> T {
local_data_key!(depth: uint);
if !do_it { return f(u); }

View file

@ -181,7 +181,7 @@ mod svh_visitor {
SawStructDef(token::InternedString),
SawLifetimeRef(token::InternedString),
SawLifetimeDecl(token::InternedString),
SawLifetimeDef(token::InternedString),
SawMod,
SawViewItem,
@ -414,8 +414,8 @@ mod svh_visitor {
SawLifetimeRef(content(l.name)).hash(self.st);
}
fn visit_lifetime_decl(&mut self, l: &LifetimeDef) {
SawLifetimeDecl(content(l.lifetime.name)).hash(self.st);
fn visit_lifetime_def(&mut self, l: &LifetimeDef) {
SawLifetimeDef(content(l.lifetime.name)).hash(self.st);
}
// We do recursively walk the bodies of functions/methods

View file

@ -368,7 +368,7 @@ unsafe extern "C" fn diagnostic_handler(info: DiagnosticInfoRef, user: *mut c_vo
if enabled {
let loc = llvm::debug_loc_to_string(llcx, opt.debug_loc);
cgcx.handler.note(format!("optimization {:s} for {:s} at {:s}: {:s}",
cgcx.handler.note(format!("optimization {} for {} at {}: {}",
opt.kind.describe(),
pass_name,
if loc.is_empty() { "[unknown]" } else { loc.as_slice() },

View file

@ -224,13 +224,13 @@ Available lint options:
};
println!("Lint checks provided by rustc:\n");
println!(" {} {:7.7s} {}", padded("name"), "default", "meaning");
println!(" {} {:7.7s} {}", padded("----"), "-------", "-------");
println!(" {} {:7.7} {}", padded("name"), "default", "meaning");
println!(" {} {:7.7} {}", padded("----"), "-------", "-------");
let print_lints = |lints: Vec<&Lint>| {
for lint in lints.into_iter() {
let name = lint.name_lower().replace("_", "-");
println!(" {} {:7.7s} {}",
println!(" {} {:7.7} {}",
padded(name.as_slice()), lint.default_level.as_str(), lint.desc);
}
println!("\n");
@ -293,7 +293,7 @@ fn describe_debug_flags() {
for tuple in r.iter() {
match *tuple {
(ref name, ref desc, _) => {
println!(" -Z {:>20s} -- {}", *name, *desc);
println!(" -Z {:>20} -- {}", *name, *desc);
}
}
}
@ -306,7 +306,7 @@ fn describe_codegen_flags() {
Some(..) => (21, "=val"),
None => (25, "")
};
println!(" -C {:>width$s}{} -- {}", name.replace("_", "-"),
println!(" -C {:>width$}{} -- {}", name.replace("_", "-"),
extra, desc, width=width);
}
}

View file

@ -355,8 +355,8 @@ impl UserIdentifiedItem {
fn to_one_node_id(self, user_option: &str, sess: &Session, map: &ast_map::Map) -> ast::NodeId {
let fail_because = |is_wrong_because| -> ast::NodeId {
let message =
format!("{:s} needs NodeId (int) or unique \
path suffix (b::c::d); got {:s}, which {:s}",
format!("{} needs NodeId (int) or unique \
path suffix (b::c::d); got {}, which {}",
user_option,
self.reconstructed_input(),
is_wrong_because);

View file

@ -3146,7 +3146,7 @@ pub fn trans_crate<'tcx>(analysis: CrateAnalysis<'tcx>)
}
if shared_ccx.sess().count_llvm_insns() {
for (k, v) in shared_ccx.stats().llvm_insns.borrow().iter() {
println!("{:7u} {}", *v, *k);
println!("{:7} {}", *v, *k);
}
}

View file

@ -67,7 +67,7 @@ pub enum CleanupScopeKind<'blk, 'tcx: 'blk> {
}
impl<'blk, 'tcx: 'blk> fmt::Show for CleanupScopeKind<'blk, 'tcx> {
fn fmt(&self, f: &mut fmt::Formatter) -> Result<(), fmt::FormatError> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
CustomScopeKind => write!(f, "CustomScopeKind"),
AstScopeKind(nid) => write!(f, "AstScopeKind({})", nid),

View file

@ -128,12 +128,17 @@ fn doit(sess: &parse::ParseSess, mut lexer: lexer::StringReader,
}
}
// text literals
token::LitByte(..) | token::LitBinary(..) | token::LitBinaryRaw(..) |
token::LitChar(..) | token::LitStr(..) | token::LitStrRaw(..) => "string",
token::Literal(lit, _suf) => {
match lit {
// text literals
token::Byte(..) | token::Char(..) |
token::Binary(..) | token::BinaryRaw(..) |
token::Str_(..) | token::StrRaw(..) => "string",
// number literals
token::LitInteger(..) | token::LitFloat(..) => "number",
// number literals
token::Integer(..) | token::Float(..) => "number",
}
}
// keywords are also included in the identifier set
token::Ident(ident, _is_mod_sep) => {

View file

@ -75,12 +75,6 @@ impl fmt::Show for ItemType {
}
}
impl fmt::Unsigned for ItemType {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
(*self as uint).fmt(f)
}
}
pub fn shortty(item: &clean::Item) -> ItemType {
match item.inner {
clean::ModuleItem(..) => Module,

View file

@ -437,8 +437,8 @@ fn build_index(krate: &clean::Crate, cache: &mut Cache) -> io::IoResult<String>
if i > 0 {
try!(write!(&mut w, ","));
}
try!(write!(&mut w, r#"[{:u},"{}","{}",{}"#,
item.ty, item.name, path,
try!(write!(&mut w, r#"[{},"{}","{}",{}"#,
item.ty as uint, item.name, path,
item.desc.to_json().to_string()));
match item.parent {
Some(nodeid) => {
@ -457,8 +457,8 @@ fn build_index(krate: &clean::Crate, cache: &mut Cache) -> io::IoResult<String>
if i > 0 {
try!(write!(&mut w, ","));
}
try!(write!(&mut w, r#"[{:u},"{}"]"#,
short, *fqp.last().unwrap()));
try!(write!(&mut w, r#"[{},"{}"]"#,
short as uint, *fqp.last().unwrap()));
}
try!(write!(&mut w, "]}};"));
@ -2192,7 +2192,7 @@ impl<'a> fmt::Show for Source<'a> {
}
try!(write!(fmt, "<pre class='line-numbers'>"));
for i in range(1, lines + 1) {
try!(write!(fmt, "<span id='{0:u}'>{0:1$u}</span>\n", i, cols));
try!(write!(fmt, "<span id='{0}'>{0:1$}</span>\n", i, cols));
}
try!(write!(fmt, "</pre>"));
try!(write!(fmt, "{}", highlight::highlight(s.as_slice(), None, None)));

View file

@ -168,11 +168,11 @@ pub fn main_args(args: &[String]) -> int {
if matches.opt_strs("passes").as_slice() == &["list".to_string()] {
println!("Available passes for running rustdoc:");
for &(name, _, description) in PASSES.iter() {
println!("{:>20s} - {}", name, description);
println!("{:>20} - {}", name, description);
}
println!("{}", "\nDefault passes for rustdoc:"); // FIXME: #9970
for &name in DEFAULT_PASSES.iter() {
println!("{:>20s}", name);
println!("{:>20}", name);
}
return 0;
}

View file

@ -283,6 +283,7 @@ mod imp {
#[cfg(any(all(target_os = "linux", target_arch = "x86"), // may not match
all(target_os = "linux", target_arch = "x86_64"),
all(target_os = "linux", target_arch = "arm"), // may not match
all(target_os = "linux", target_arch = "mips"), // may not match
target_os = "android"))] // may not match
mod signal {
use libc;

View file

@ -2403,7 +2403,7 @@ impl<A:ToJson> ToJson for Option<A> {
impl fmt::Show for Json {
/// Encodes a json value into a string
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
self.to_writer(f).map_err(|_| fmt::WriteError)
self.to_writer(f).map_err(|_| fmt::Error)
}
}

View file

@ -38,11 +38,11 @@ Some examples of the `format!` extension are:
```rust
# fn main() {
format!("Hello"); // => "Hello"
format!("Hello, {:s}!", "world"); // => "Hello, world!"
format!("The number is {:d}", 1i); // => "The number is 1"
format!("Hello, {}!", "world"); // => "Hello, world!"
format!("The number is {}", 1i); // => "The number is 1"
format!("{}", (3i, 4i)); // => "(3, 4)"
format!("{value}", value=4i); // => "4"
format!("{} {}", 1i, 2i); // => "1 2"
format!("{} {}", 1i, 2u); // => "1 2"
# }
```
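The hunk above drops the typed `{:s}`/`{:d}` forms in favor of a single `{}`. A quick sketch of the surviving behavior in present-day Rust syntax (note that tuples now require the debug form `{:?}`):

```rust
fn main() {
    // Typed specifiers like `{:s}` and `{:d}` are gone; plain `{}`
    // dispatches through the display trait for every argument type.
    assert_eq!(format!("Hello, {}!", "world"), "Hello, world!");
    assert_eq!(format!("The number is {}", 1), "The number is 1");
    assert_eq!(format!("{value}", value = 4), "4");
    assert_eq!(format!("{} {}", 1, 2), "1 2");
    // Tuples print via the debug formatter in current Rust.
    assert_eq!(format!("{:?}", (3, 4)), "(3, 4)");
}
```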
@ -94,9 +94,9 @@ For example, the following `format!` expressions all use named arguments:
```rust
# fn main() {
format!("{argument}", argument = "test"); // => "test"
format!("{name} {}", 1i, name = 2i); // => "2 1"
format!("{a:s} {c:d} {b}", a="a", b=(), c=3i); // => "a 3 ()"
format!("{argument}", argument = "test"); // => "test"
format!("{name} {}", 1i, name = 2i); // => "2 1"
format!("{a} {c} {b}", a="a", b=(), c=3i); // => "a 3 ()"
# }
```
@ -138,23 +138,16 @@ multiple actual types to be formatted via `{:d}` (like `i8` as well as `int`).
The current mapping of types to traits is:
* *nothing* `Show`
* `d` `Signed`
* `i` `Signed`
* `u` `Unsigned`
* `b` `Bool`
* `c` `Char`
* `o` `Octal`
* `x` `LowerHex`
* `X` `UpperHex`
* `s` `String`
* `p` `Pointer`
* `t` `Binary`
* `f` `Float`
* `b` `Binary`
* `e` `LowerExp`
* `E` `UpperExp`
What this means is that any type of argument which implements the
`std::fmt::Binary` trait can then be formatted with `{:t}`. Implementations are
`std::fmt::Binary` trait can then be formatted with `{:b}`. Implementations are
provided for these traits for a number of primitive types by the standard
library as well. If no format is specified (as in `{}` or `{:6}`), then the
format trait used is the `Show` trait. This is one of the more commonly
@ -216,7 +209,7 @@ impl fmt::Binary for Vector2D {
// Respect the formatting flags by using the helper method
// `pad_integral` on the Formatter object. See the method documentation
// for details, and the function `pad` can be used to pad strings.
let decimals = f.precision.unwrap_or(3);
let decimals = f.precision().unwrap_or(3);
let string = f64::to_str_exact(magnitude, decimals);
f.pad_integral(true, "", string.as_bytes())
}
@ -226,7 +219,7 @@ fn main() {
let myvector = Vector2D { x: 3, y: 4 };
println!("{}", myvector); // => "(3, 4)"
println!("{:10.3t}", myvector); // => " 5.000"
println!("{:10.3b}", myvector); // => " 5.000"
}
```
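The specifier renames in this hunk (`{:t}` becoming `{:b}` for `std::fmt::Binary`) can be checked directly; a minimal sketch in present-day Rust:

```rust
fn main() {
    // `t` was renamed to `b`: `{:b}` now selects std::fmt::Binary.
    assert_eq!(format!("{:b}", 5), "101");
    // The other radix specifiers keep their meaning.
    assert_eq!(format!("{:o}", 9), "11");
    assert_eq!(format!("{:x}", 255), "ff");
    assert_eq!(format!("{:X}", 255), "FF");
}
```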
@ -418,10 +411,10 @@ use string;
use vec::Vec;
pub use core::fmt::{Formatter, Result, FormatWriter, rt};
pub use core::fmt::{Show, Bool, Char, Signed, Unsigned, Octal, Binary};
pub use core::fmt::{LowerHex, UpperHex, String, Pointer};
pub use core::fmt::{Float, LowerExp, UpperExp};
pub use core::fmt::{FormatError, WriteError};
pub use core::fmt::{Show, Octal, Binary};
pub use core::fmt::{LowerHex, UpperHex, Pointer};
pub use core::fmt::{LowerExp, UpperExp};
pub use core::fmt::Error;
pub use core::fmt::{Argument, Arguments, write, radix, Radix, RadixFmt};
#[doc(hidden)]
@ -444,6 +437,8 @@ pub use core::fmt::{argument, argumentstr, argumentuint};
/// let s = format_args!(fmt::format, "Hello, {}!", "world");
/// assert_eq!(s, "Hello, world!".to_string());
/// ```
#[experimental = "this is an implementation detail of format! and should not \
be called directly"]
pub fn format(args: &Arguments) -> string::String {
let mut output = Vec::new();
let _ = write!(&mut output as &mut Writer, "{}", args);
@ -454,7 +449,7 @@ impl<'a> Writer for Formatter<'a> {
fn write(&mut self, b: &[u8]) -> io::IoResult<()> {
match (*self).write(b) {
Ok(()) => Ok(()),
Err(WriteError) => Err(io::standard_error(io::OtherIoError))
Err(Error) => Err(io::standard_error(io::OtherIoError))
}
}
}

View file

@ -1034,7 +1034,7 @@ pub trait Writer {
Ok(()) => Ok(()),
Err(e) => {
self.error = Err(e);
Err(fmt::WriteError)
Err(fmt::Error)
}
}
}
@ -1081,13 +1081,13 @@ pub trait Writer {
/// Write the result of passing n through `int::to_str_bytes`.
#[inline]
fn write_int(&mut self, n: int) -> IoResult<()> {
write!(self, "{:d}", n)
write!(self, "{}", n)
}
/// Write the result of passing n through `uint::to_str_bytes`.
#[inline]
fn write_uint(&mut self, n: uint) -> IoResult<()> {
write!(self, "{:u}", n)
write!(self, "{}", n)
}
/// Write a little-endian uint (number of bytes depends on system).
@ -1896,10 +1896,8 @@ impl Default for FilePermission {
}
impl fmt::Show for FilePermission {
fn fmt(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
formatter.fill = '0';
formatter.width = Some(4);
(&self.bits as &fmt::Octal).fmt(formatter)
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "{:04o}", self.bits)
}
}
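The replacement one-liner leans on the format spec instead of mutating formatter fields by hand; its effect can be sketched as:

```rust
fn main() {
    // `{:04o}` renders in octal, zero-padded to width 4 -- the same
    // result the removed code produced by setting `fill` and `width`.
    assert_eq!(format!("{:04o}", 0o644), "0644");
    assert_eq!(format!("{:04o}", 0o7), "0007");
}
```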

View file

@ -240,6 +240,7 @@ macro_rules! unimplemented(
/// format!("x = {}, y = {y}", 10i, y = 30i);
/// ```
#[macro_export]
#[stable]
macro_rules! format(
($($arg:tt)*) => (
format_args!(::std::fmt::format, $($arg)*)
@ -259,15 +260,18 @@ macro_rules! format(
/// write!(&mut w, "formatted {}", "arguments");
/// ```
#[macro_export]
#[stable]
macro_rules! write(
($dst:expr, $($arg:tt)*) => ({
format_args_method!($dst, write_fmt, $($arg)*)
let dst = &mut *$dst;
format_args!(|args| { dst.write_fmt(args) }, $($arg)*)
})
)
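The rewritten expansion reborrows the destination (`&mut *$dst`) and calls its `write_fmt`, so any writer-style sink works; a sketch in current Rust, where `Vec<u8>` implements `io::Write`:

```rust
use std::io::Write;

fn main() {
    // write! forwards the formatted arguments to the sink's write_fmt.
    let mut buf: Vec<u8> = Vec::new();
    write!(&mut buf, "formatted {}", "arguments").unwrap();
    // writeln! is the same expansion with a trailing newline appended.
    writeln!(&mut buf, "!").unwrap();
    assert_eq!(buf, b"formatted arguments!\n");
}
```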
/// Equivalent to the `write!` macro, except that a newline is appended after
/// the message is written.
#[macro_export]
#[stable]
macro_rules! writeln(
($dst:expr, $fmt:expr $($arg:tt)*) => (
write!($dst, concat!($fmt, "\n") $($arg)*)
@ -277,6 +281,7 @@ macro_rules! writeln(
/// Equivalent to the `println!` macro except that a newline is not printed at
/// the end of the message.
#[macro_export]
#[stable]
macro_rules! print(
($($arg:tt)*) => (format_args!(::std::io::stdio::print_args, $($arg)*))
)
@ -294,6 +299,7 @@ macro_rules! print(
/// println!("format {} arguments", "some");
/// ```
#[macro_export]
#[stable]
macro_rules! println(
($($arg:tt)*) => (format_args!(::std::io::stdio::println_args, $($arg)*))
)

View file

@ -1108,6 +1108,10 @@ extern "system" {
/// Returns the arguments which this program was started with (normally passed
/// via the command line).
///
/// The first element is traditionally the path to the executable, but it can be
/// set to arbitrary text, and it may not even exist, so this property should not
/// be relied upon for security purposes.
///
/// The arguments are interpreted as utf-8, with invalid bytes replaced with \uFFFD.
/// See `String::from_utf8_lossy` for details.
/// # Example
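A minimal sketch of the caveat above, using the present-day equivalent of this accessor, `std::env::args`:

```rust
fn main() {
    // The first element is only conventionally the executable path;
    // callers may set it to anything, so never trust it for security.
    let first = std::env::args().next();
    assert!(first.is_some());
}
```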

View file

@ -212,12 +212,12 @@ mod imp {
impl Rng for OsRng {
fn next_u32(&mut self) -> u32 {
let mut v = [0u8, .. 4];
self.fill_bytes(v);
self.fill_bytes(&mut v);
unsafe { mem::transmute(v) }
}
fn next_u64(&mut self) -> u64 {
let mut v = [0u8, .. 8];
self.fill_bytes(v);
self.fill_bytes(&mut v);
unsafe { mem::transmute(v) }
}
fn fill_bytes(&mut self, v: &mut [u8]) {

View file

@ -95,7 +95,7 @@ impl Ident {
}
pub fn encode_with_hygiene(&self) -> String {
format!("\x00name_{:u},ctxt_{:u}\x00",
format!("\x00name_{},ctxt_{}\x00",
self.name.uint(),
self.ctxt)
}
@ -706,11 +706,11 @@ pub enum Expr_ {
///
/// <Vec<T> as SomeTrait>::SomeAssociatedItem
/// ^~~~~ ^~~~~~~~~ ^~~~~~~~~~~~~~~~~~
/// for_type trait_name item_name
/// self_type trait_name item_name
#[deriving(Clone, PartialEq, Eq, Encodable, Decodable, Hash, Show)]
pub struct QPath {
pub for_type: P<Ty>,
pub trait_name: Path,
pub self_type: P<Ty>,
pub trait_ref: P<TraitRef>,
pub item_name: Ident,
}
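To ground the renamed fields, here is a qualified-path expression in today's surface syntax (the `Widget`/`HasId` names are hypothetical, chosen for illustration):

```rust
trait HasId {
    fn id() -> u32;
}

struct Widget;

impl HasId for Widget {
    fn id() -> u32 { 42 }
}

fn main() {
    // `<Widget as HasId>::id` is the shape QPath models:
    // self_type = Widget, trait_ref = HasId, item_name = id.
    assert_eq!(<Widget as HasId>::id(), 42);
}
```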
@ -838,7 +838,7 @@ impl TokenTree {
tts: vec![TtToken(sp, token::Ident(token::str_to_ident("doc"),
token::Plain)),
TtToken(sp, token::Eq),
TtToken(sp, token::LitStr(name))],
TtToken(sp, token::Literal(token::Str_(name), None))],
close_span: sp,
}))
}

View file

@ -866,7 +866,7 @@ impl<'ast> Visitor<'ast> for NodeCollector<'ast> {
self.insert(lifetime.id, NodeLifetime(lifetime));
}
fn visit_lifetime_decl(&mut self, def: &'ast LifetimeDef) {
fn visit_lifetime_def(&mut self, def: &'ast LifetimeDef) {
self.visit_lifetime_ref(&def.lifetime);
}
}

View file

@ -535,7 +535,7 @@ impl<'a, 'v, O: IdVisitingOperation> Visitor<'v> for IdVisitor<'a, O> {
self.operation.visit_id(lifetime.id);
}
fn visit_lifetime_decl(&mut self, def: &'v LifetimeDef) {
fn visit_lifetime_def(&mut self, def: &'v LifetimeDef) {
self.visit_lifetime_ref(&def.lifetime);
}
}

View file

@ -87,7 +87,7 @@ pub fn expand_register_diagnostic<'cx>(ecx: &'cx mut ExtCtxt,
},
[ast::TtToken(_, token::Ident(ref code, _)),
ast::TtToken(_, token::Comma),
ast::TtToken(_, token::LitStrRaw(description, _))] => {
ast::TtToken(_, token::Literal(token::StrRaw(description, _), None))] => {
(code, Some(description))
}
_ => unreachable!()

View file

@ -361,9 +361,6 @@ fn initial_syntax_expander_table(ecfg: &expand::ExpansionConfig) -> SyntaxEnv {
syntax_expanders.insert(intern("format_args"),
builtin_normal_expander(
ext::format::expand_format_args));
syntax_expanders.insert(intern("format_args_method"),
builtin_normal_expander(
ext::format::expand_format_args_method));
syntax_expanders.insert(intern("env"),
builtin_normal_expander(
ext::env::expand_env));

View file

@ -335,7 +335,7 @@ pub fn combine_substructure<'a>(f: CombineSubstructureFunc<'a>)
impl<'a> TraitDef<'a> {
pub fn expand(&self,
cx: &mut ExtCtxt,
_mitem: &ast::MetaItem,
mitem: &ast::MetaItem,
item: &ast::Item,
push: |P<ast::Item>|) {
let newitem = match item.node {
@ -351,7 +351,10 @@ impl<'a> TraitDef<'a> {
item.ident,
generics)
}
_ => return
_ => {
cx.span_err(mitem.span, "`deriving` may only be applied to structs and enums");
return;
}
};
// Keep the lint attributes of the previous item to control how the
// generated implementations are linted
@ -887,7 +890,7 @@ impl<'a> MethodDef<'a> {
// a series of let statements mapping each self_arg to a uint
// corresponding to its variant index.
let vi_idents: Vec<ast::Ident> = self_arg_names.iter()
.map(|name| { let vi_suffix = format!("{:s}_vi", name.as_slice());
.map(|name| { let vi_suffix = format!("{}_vi", name.as_slice());
cx.ident_of(vi_suffix.as_slice()) })
.collect::<Vec<ast::Ident>>();

View file

@ -237,7 +237,7 @@ impl<'a, 'b> Context<'a, 'b> {
match arg {
Exact(arg) => {
if self.args.len() <= arg {
let msg = format!("invalid reference to argument `{}` ({:s})",
let msg = format!("invalid reference to argument `{}` ({})",
arg, self.describe_num_args());
self.ecx.span_err(self.fmtsp, msg.as_slice());
@ -670,17 +670,11 @@ impl<'a, 'b> Context<'a, 'b> {
Known(ref tyname) => {
match tyname.as_slice() {
"" => "Show",
"b" => "Bool",
"c" => "Char",
"d" | "i" => "Signed",
"e" => "LowerExp",
"E" => "UpperExp",
"f" => "Float",
"o" => "Octal",
"p" => "Pointer",
"s" => "String",
"t" => "Binary",
"u" => "Unsigned",
"b" => "Binary",
"x" => "LowerHex",
"X" => "UpperHex",
_ => {
@ -724,18 +718,6 @@ pub fn expand_format_args<'cx>(ecx: &'cx mut ExtCtxt, sp: Span,
}
}
pub fn expand_format_args_method<'cx>(ecx: &'cx mut ExtCtxt, sp: Span,
tts: &[ast::TokenTree]) -> Box<base::MacResult+'cx> {
match parse_args(ecx, sp, true, tts) {
(invocation, Some((efmt, args, order, names))) => {
MacExpr::new(expand_preparsed_format_args(ecx, sp, invocation, efmt,
args, order, names))
}
(_, None) => MacExpr::new(ecx.expr_uint(sp, 2))
}
}
/// Take the various parts of `format_args!(extra, efmt, args...,
/// name=names...)` and construct the appropriate formatting
/// expression.

View file

@ -131,7 +131,7 @@ fn new_sctable_internal() -> SCTable {
pub fn display_sctable(table: &SCTable) {
error!("SC table:");
for (idx,val) in table.table.borrow().iter().enumerate() {
error!("{:4u} : {}",idx,val);
error!("{:4} : {}",idx,val);
}
}

View file

@ -542,6 +542,16 @@ fn mk_delim(cx: &ExtCtxt, sp: Span, delim: token::DelimToken) -> P<ast::Expr> {
#[allow(non_upper_case_globals)]
fn mk_token(cx: &ExtCtxt, sp: Span, tok: &token::Token) -> P<ast::Expr> {
macro_rules! mk_lit {
($name: expr, $suffix: expr, $($args: expr),*) => {{
let inner = cx.expr_call(sp, mk_token_path(cx, sp, $name), vec![$($args),*]);
let suffix = match $suffix {
Some(name) => cx.expr_some(sp, mk_name(cx, sp, ast::Ident::new(name))),
None => cx.expr_none(sp)
};
cx.expr_call(sp, mk_token_path(cx, sp, "Literal"), vec![inner, suffix])
}}
}
match *tok {
token::BinOp(binop) => {
return cx.expr_call(sp, mk_token_path(cx, sp, "BinOp"), vec!(mk_binop(cx, sp, binop)));
@ -560,38 +570,32 @@ fn mk_token(cx: &ExtCtxt, sp: Span, tok: &token::Token) -> P<ast::Expr> {
vec![mk_delim(cx, sp, delim)]);
}
token::LitByte(i) => {
token::Literal(token::Byte(i), suf) => {
let e_byte = mk_name(cx, sp, i.ident());
return cx.expr_call(sp, mk_token_path(cx, sp, "LitByte"), vec!(e_byte));
return mk_lit!("Byte", suf, e_byte);
}
token::LitChar(i) => {
token::Literal(token::Char(i), suf) => {
let e_char = mk_name(cx, sp, i.ident());
return cx.expr_call(sp, mk_token_path(cx, sp, "LitChar"), vec!(e_char));
return mk_lit!("Char", suf, e_char);
}
token::LitInteger(i) => {
token::Literal(token::Integer(i), suf) => {
let e_int = mk_name(cx, sp, i.ident());
return cx.expr_call(sp, mk_token_path(cx, sp, "LitInteger"), vec!(e_int));
return mk_lit!("Integer", suf, e_int);
}
token::LitFloat(fident) => {
token::Literal(token::Float(fident), suf) => {
let e_fident = mk_name(cx, sp, fident.ident());
return cx.expr_call(sp, mk_token_path(cx, sp, "LitFloat"), vec!(e_fident));
return mk_lit!("Float", suf, e_fident);
}
token::LitStr(ident) => {
return cx.expr_call(sp,
mk_token_path(cx, sp, "LitStr"),
vec!(mk_name(cx, sp, ident.ident())));
token::Literal(token::Str_(ident), suf) => {
return mk_lit!("Str_", suf, mk_name(cx, sp, ident.ident()))
}
token::LitStrRaw(ident, n) => {
return cx.expr_call(sp,
mk_token_path(cx, sp, "LitStrRaw"),
vec!(mk_name(cx, sp, ident.ident()), cx.expr_uint(sp, n)));
token::Literal(token::StrRaw(ident, n), suf) => {
return mk_lit!("StrRaw", suf, mk_name(cx, sp, ident.ident()), cx.expr_uint(sp, n))
}
token::Ident(ident, style) => {

View file

@ -141,8 +141,8 @@ impl<'a> Context<'a> {
}
impl<'a, 'v> Visitor<'v> for Context<'a> {
fn visit_ident(&mut self, sp: Span, id: ast::Ident) {
if !token::get_ident(id).get().is_ascii() {
fn visit_name(&mut self, sp: Span, name: ast::Name) {
if !token::get_name(name).get().is_ascii() {
self.gate_feature("non_ascii_idents", sp,
"non-ascii idents are not fully supported.");
}

View file

@ -142,6 +142,10 @@ pub trait Folder {
noop_fold_ty(t, self)
}
fn fold_qpath(&mut self, t: P<QPath>) -> P<QPath> {
noop_fold_qpath(t, self)
}
fn fold_mod(&mut self, m: Mod) -> Mod {
noop_fold_mod(m, self)
}
@ -435,12 +439,8 @@ pub fn noop_fold_ty<T: Folder>(t: P<Ty>, fld: &mut T) -> P<Ty> {
fld.fold_opt_bounds(bounds),
id)
}
TyQPath(ref qpath) => {
TyQPath(P(QPath {
for_type: fld.fold_ty(qpath.for_type.clone()),
trait_name: fld.fold_path(qpath.trait_name.clone()),
item_name: fld.fold_ident(qpath.item_name.clone()),
}))
TyQPath(qpath) => {
TyQPath(fld.fold_qpath(qpath))
}
TyFixedLengthVec(ty, e) => {
TyFixedLengthVec(fld.fold_ty(ty), fld.fold_expr(e))
@ -456,6 +456,16 @@ pub fn noop_fold_ty<T: Folder>(t: P<Ty>, fld: &mut T) -> P<Ty> {
})
}
pub fn noop_fold_qpath<T: Folder>(qpath: P<QPath>, fld: &mut T) -> P<QPath> {
qpath.map(|qpath| {
QPath {
self_type: fld.fold_ty(qpath.self_type),
trait_ref: qpath.trait_ref.map(|tr| fld.fold_trait_ref(tr)),
item_name: fld.fold_ident(qpath.item_name),
}
})
}
pub fn noop_fold_foreign_mod<T: Folder>(ForeignMod {abi, view_items, items}: ForeignMod,
fld: &mut T) -> ForeignMod {
ForeignMod {

View file

@ -369,6 +369,25 @@ impl<'a> StringReader<'a> {
self.nextnextch() == Some(c)
}
/// Eats <XID_start><XID_continue>*, if possible.
fn scan_optional_raw_name(&mut self) -> Option<ast::Name> {
if !ident_start(self.curr) {
return None
}
let start = self.last_pos;
while ident_continue(self.curr) {
self.bump();
}
self.with_str_from(start, |string| {
if string == "_" {
None
} else {
Some(token::intern(string))
}
})
}
/// PRECONDITION: self.curr is not whitespace
/// Eats any kind of comment.
fn scan_comment(&mut self) -> Option<TokenAndSpan> {
@ -638,7 +657,7 @@ impl<'a> StringReader<'a> {
}
/// Lex a LIT_INTEGER or a LIT_FLOAT
fn scan_number(&mut self, c: char) -> token::Token {
fn scan_number(&mut self, c: char) -> token::Lit {
let mut num_digits;
let mut base = 10;
let start_bpos = self.last_pos;
@ -653,19 +672,9 @@ impl<'a> StringReader<'a> {
'0'...'9' | '_' | '.' => {
num_digits = self.scan_digits(10) + 1;
}
'u' | 'i' => {
self.scan_int_suffix();
return token::LitInteger(self.name_from(start_bpos));
},
'f' => {
let last_pos = self.last_pos;
self.scan_float_suffix();
self.check_float_base(start_bpos, last_pos, base);
return token::LitFloat(self.name_from(start_bpos));
}
_ => {
// just a 0
return token::LitInteger(self.name_from(start_bpos));
return token::Integer(self.name_from(start_bpos));
}
}
} else if c.is_digit_radix(10) {
@ -676,9 +685,7 @@ impl<'a> StringReader<'a> {
if num_digits == 0 {
self.err_span_(start_bpos, self.last_pos, "no valid digits found for number");
// eat any suffix
self.scan_int_suffix();
return token::LitInteger(token::intern("0"));
return token::Integer(token::intern("0"));
}
// might be a float, but don't be greedy if this is actually an
@ -692,29 +699,20 @@ impl<'a> StringReader<'a> {
if self.curr.unwrap_or('\0').is_digit_radix(10) {
self.scan_digits(10);
self.scan_float_exponent();
self.scan_float_suffix();
}
let last_pos = self.last_pos;
self.check_float_base(start_bpos, last_pos, base);
return token::LitFloat(self.name_from(start_bpos));
} else if self.curr_is('f') {
// or it might be an integer literal suffixed as a float
self.scan_float_suffix();
let last_pos = self.last_pos;
self.check_float_base(start_bpos, last_pos, base);
return token::LitFloat(self.name_from(start_bpos));
return token::Float(self.name_from(start_bpos));
} else {
// it might be a float if it has an exponent
if self.curr_is('e') || self.curr_is('E') {
self.scan_float_exponent();
self.scan_float_suffix();
let last_pos = self.last_pos;
self.check_float_base(start_bpos, last_pos, base);
return token::LitFloat(self.name_from(start_bpos));
return token::Float(self.name_from(start_bpos));
}
// but we certainly have an integer!
self.scan_int_suffix();
return token::LitInteger(self.name_from(start_bpos));
return token::Integer(self.name_from(start_bpos));
}
}
@ -850,55 +848,6 @@ impl<'a> StringReader<'a> {
true
}
/// Scan over an int literal suffix.
fn scan_int_suffix(&mut self) {
match self.curr {
Some('i') | Some('u') => {
self.bump();
if self.curr_is('8') {
self.bump();
} else if self.curr_is('1') {
if !self.nextch_is('6') {
self.err_span_(self.last_pos, self.pos,
"illegal int suffix");
} else {
self.bump(); self.bump();
}
} else if self.curr_is('3') {
if !self.nextch_is('2') {
self.err_span_(self.last_pos, self.pos,
"illegal int suffix");
} else {
self.bump(); self.bump();
}
} else if self.curr_is('6') {
if !self.nextch_is('4') {
self.err_span_(self.last_pos, self.pos,
"illegal int suffix");
} else {
self.bump(); self.bump();
}
}
},
_ => { }
}
}
/// Scan over a float literal suffix
fn scan_float_suffix(&mut self) {
if self.curr_is('f') {
if (self.nextch_is('3') && self.nextnextch_is('2'))
|| (self.nextch_is('6') && self.nextnextch_is('4')) {
self.bump();
self.bump();
self.bump();
} else {
self.err_span_(self.last_pos, self.pos, "illegal float suffix");
}
}
}
/// Scan over a float exponent.
fn scan_float_exponent(&mut self) {
if self.curr_is('e') || self.curr_is('E') {
@ -967,7 +916,10 @@ impl<'a> StringReader<'a> {
}
if is_dec_digit(c) {
return self.scan_number(c.unwrap());
let num = self.scan_number(c.unwrap());
let suffix = self.scan_optional_raw_name();
debug!("next_token_inner: scanned number {}, {}", num, suffix);
return token::Literal(num, suffix)
}
if self.read_embedded_ident {
@ -1126,17 +1078,19 @@ impl<'a> StringReader<'a> {
}
let id = if valid { self.name_from(start) } else { token::intern("0") };
self.bump(); // advance curr past token
return token::LitChar(id);
let suffix = self.scan_optional_raw_name();
return token::Literal(token::Char(id), suffix);
}
'b' => {
self.bump();
return match self.curr {
let lit = match self.curr {
Some('\'') => self.scan_byte(),
Some('"') => self.scan_byte_string(),
Some('r') => self.scan_raw_byte_string(),
_ => unreachable!() // Should have been a token::Ident above.
};
let suffix = self.scan_optional_raw_name();
return token::Literal(lit, suffix);
}
'"' => {
let start_bpos = self.last_pos;
@ -1157,7 +1111,8 @@ impl<'a> StringReader<'a> {
let id = if valid { self.name_from(start_bpos + BytePos(1)) }
else { token::intern("??") };
self.bump();
return token::LitStr(id);
let suffix = self.scan_optional_raw_name();
return token::Literal(token::Str_(id), suffix);
}
'r' => {
let start_bpos = self.last_pos;
@ -1224,7 +1179,8 @@ impl<'a> StringReader<'a> {
} else {
token::intern("??")
};
return token::LitStrRaw(id, hash_count);
let suffix = self.scan_optional_raw_name();
return token::Literal(token::StrRaw(id, hash_count), suffix);
}
'-' => {
if self.nextch_is('>') {
@ -1293,7 +1249,7 @@ impl<'a> StringReader<'a> {
|| (self.curr_is('#') && self.nextch_is('!') && !self.nextnextch_is('['))
}
fn scan_byte(&mut self) -> token::Token {
fn scan_byte(&mut self) -> token::Lit {
self.bump();
let start = self.last_pos;
@ -1314,10 +1270,10 @@ impl<'a> StringReader<'a> {
let id = if valid { self.name_from(start) } else { token::intern("??") };
self.bump(); // advance curr past token
return token::LitByte(id);
return token::Byte(id);
}
fn scan_byte_string(&mut self) -> token::Token {
fn scan_byte_string(&mut self) -> token::Lit {
self.bump();
let start = self.last_pos;
let mut valid = true;
@ -1336,10 +1292,10 @@ impl<'a> StringReader<'a> {
}
let id = if valid { self.name_from(start) } else { token::intern("??") };
self.bump();
return token::LitBinary(id);
return token::Binary(id);
}
fn scan_raw_byte_string(&mut self) -> token::Token {
fn scan_raw_byte_string(&mut self) -> token::Lit {
let start_bpos = self.last_pos;
self.bump();
let mut hash_count = 0u;
@ -1387,8 +1343,9 @@ impl<'a> StringReader<'a> {
self.bump();
}
self.bump();
return token::LitBinaryRaw(self.name_from_to(content_start_bpos, content_end_bpos),
hash_count);
return token::BinaryRaw(self.name_from_to(content_start_bpos,
content_end_bpos),
hash_count);
}
}
@ -1535,17 +1492,17 @@ mod test {
#[test] fn character_a() {
assert_eq!(setup(&mk_sh(), "'a'".to_string()).next_token().tok,
token::LitChar(token::intern("a")));
token::Literal(token::Char(token::intern("a")), None));
}
#[test] fn character_space() {
assert_eq!(setup(&mk_sh(), "' '".to_string()).next_token().tok,
token::LitChar(token::intern(" ")));
token::Literal(token::Char(token::intern(" ")), None));
}
#[test] fn character_escaped() {
assert_eq!(setup(&mk_sh(), "'\\n'".to_string()).next_token().tok,
token::LitChar(token::intern("\\n")));
token::Literal(token::Char(token::intern("\\n")), None));
}
#[test] fn lifetime_name() {
@ -1557,7 +1514,41 @@ mod test {
assert_eq!(setup(&mk_sh(),
"r###\"\"#a\\b\x00c\"\"###".to_string()).next_token()
.tok,
token::LitStrRaw(token::intern("\"#a\\b\x00c\""), 3));
token::Literal(token::StrRaw(token::intern("\"#a\\b\x00c\""), 3), None));
}
#[test] fn literal_suffixes() {
macro_rules! test {
($input: expr, $tok_type: ident, $tok_contents: expr) => {{
assert_eq!(setup(&mk_sh(), format!("{}suffix", $input)).next_token().tok,
token::Literal(token::$tok_type(token::intern($tok_contents)),
Some(token::intern("suffix"))));
// with a whitespace separator:
assert_eq!(setup(&mk_sh(), format!("{} suffix", $input)).next_token().tok,
token::Literal(token::$tok_type(token::intern($tok_contents)),
None));
}}
}
test!("'a'", Char, "a");
test!("b'a'", Byte, "a");
test!("\"a\"", Str_, "a");
test!("b\"a\"", Binary, "a");
test!("1234", Integer, "1234");
test!("0b101", Integer, "0b101");
test!("0xABC", Integer, "0xABC");
test!("1.0", Float, "1.0");
test!("1.0e10", Float, "1.0e10");
assert_eq!(setup(&mk_sh(), "2u".to_string()).next_token().tok,
token::Literal(token::Integer(token::intern("2")),
Some(token::intern("u"))));
assert_eq!(setup(&mk_sh(), "r###\"raw\"###suffix".to_string()).next_token().tok,
token::Literal(token::StrRaw(token::intern("raw"), 3),
Some(token::intern("suffix"))));
assert_eq!(setup(&mk_sh(), "br###\"raw\"###suffix".to_string()).next_token().tok,
token::Literal(token::BinaryRaw(token::intern("raw"), 3),
Some(token::intern("suffix"))));
}
#[test] fn line_doc_comments() {
@ -1573,7 +1564,7 @@ mod test {
token::Comment => { },
_ => panic!("expected a comment!")
}
assert_eq!(lexer.next_token().tok, token::LitChar(token::intern("a")));
assert_eq!(lexer.next_token().tok, token::Literal(token::Char(token::intern("a")), None));
}
}

View file

@@ -511,27 +511,40 @@ pub fn raw_str_lit(lit: &str) -> String {
res
}
pub fn float_lit(s: &str) -> ast::Lit_ {
debug!("float_lit: {}", s);
// check if `s` looks like i32 or u1234 etc.
fn looks_like_width_suffix(first_chars: &[char], s: &str) -> bool {
s.len() > 1 &&
first_chars.contains(&s.char_at(0)) &&
s.slice_from(1).chars().all(|c| '0' <= c && c <= '9')
}
fn filtered_float_lit(data: token::InternedString, suffix: Option<&str>,
sd: &SpanHandler, sp: Span) -> ast::Lit_ {
debug!("filtered_float_lit: {}, {}", data, suffix);
match suffix {
Some("f32") => ast::LitFloat(data, ast::TyF32),
Some("f64") => ast::LitFloat(data, ast::TyF64),
Some(suf) => {
if suf.len() >= 2 && looks_like_width_suffix(&['f'], suf) {
// if it looks like a width, lets try to be helpful.
sd.span_err(sp, &*format!("illegal width `{}` for float literal, \
valid widths are 32 and 64", suf.slice_from(1)));
} else {
sd.span_err(sp, &*format!("illegal suffix `{}` for float literal, \
valid suffixes are `f32` and `f64`", suf));
}
ast::LitFloatUnsuffixed(data)
}
None => ast::LitFloatUnsuffixed(data)
}
}
pub fn float_lit(s: &str, suffix: Option<&str>, sd: &SpanHandler, sp: Span) -> ast::Lit_ {
debug!("float_lit: {}, {}", s, suffix);
// FIXME #2252: bounds checking float literals is deferred until trans
let s2 = s.chars().filter(|&c| c != '_').collect::<String>();
let s = s2.as_slice();
let mut ty = None;
if s.ends_with("f32") {
ty = Some(ast::TyF32);
} else if s.ends_with("f64") {
ty = Some(ast::TyF64);
}
match ty {
Some(t) => {
ast::LitFloat(token::intern_and_get_ident(s.slice_to(s.len() - t.suffix_len())), t)
},
None => ast::LitFloatUnsuffixed(token::intern_and_get_ident(s))
}
let s = s.chars().filter(|&c| c != '_').collect::<String>();
let data = token::intern_and_get_ident(&*s);
filtered_float_lit(data, suffix, sd, sp)
}
/// Parse a string representing a byte literal into its final form. Similar to `char_lit`
@@ -626,24 +639,19 @@ pub fn binary_lit(lit: &str) -> Rc<Vec<u8>> {
Rc::new(res)
}
pub fn integer_lit(s: &str, sd: &SpanHandler, sp: Span) -> ast::Lit_ {
pub fn integer_lit(s: &str, suffix: Option<&str>, sd: &SpanHandler, sp: Span) -> ast::Lit_ {
// s can only be ascii, byte indexing is fine
let s2 = s.chars().filter(|&c| c != '_').collect::<String>();
let mut s = s2.as_slice();
debug!("parse_integer_lit: {}", s);
if s.len() == 1 {
let n = (s.char_at(0)).to_digit(10).unwrap();
return ast::LitInt(n as u64, ast::UnsuffixedIntLit(ast::Sign::new(n)));
}
debug!("integer_lit: {}, {}", s, suffix);
let mut base = 10;
let orig = s;
let mut ty = ast::UnsuffixedIntLit(ast::Plus);
if s.char_at(0) == '0' {
if s.char_at(0) == '0' && s.len() > 1 {
match s.char_at(1) {
'x' => base = 16,
'o' => base = 8,
@@ -652,57 +660,56 @@ pub fn integer_lit(s: &str, sd: &SpanHandler, sp: Span) -> ast::Lit_ {
}
}
// 1f64 and 2f32 etc. are valid float literals.
match suffix {
Some(suf) if looks_like_width_suffix(&['f'], suf) => {
match base {
16u => sd.span_err(sp, "hexadecimal float literal is not supported"),
8u => sd.span_err(sp, "octal float literal is not supported"),
2u => sd.span_err(sp, "binary float literal is not supported"),
_ => ()
}
let ident = token::intern_and_get_ident(&*s);
return filtered_float_lit(ident, suffix, sd, sp)
}
_ => {}
}
if base != 10 {
s = s.slice_from(2);
}
let last = s.len() - 1;
match s.char_at(last) {
'i' => ty = ast::SignedIntLit(ast::TyI, ast::Plus),
'u' => ty = ast::UnsignedIntLit(ast::TyU),
'8' => {
if s.len() > 2 {
match s.char_at(last - 1) {
'i' => ty = ast::SignedIntLit(ast::TyI8, ast::Plus),
'u' => ty = ast::UnsignedIntLit(ast::TyU8),
_ => { }
if let Some(suf) = suffix {
if suf.is_empty() { sd.span_bug(sp, "found empty literal suffix in Some")}
ty = match suf {
"i" => ast::SignedIntLit(ast::TyI, ast::Plus),
"i8" => ast::SignedIntLit(ast::TyI8, ast::Plus),
"i16" => ast::SignedIntLit(ast::TyI16, ast::Plus),
"i32" => ast::SignedIntLit(ast::TyI32, ast::Plus),
"i64" => ast::SignedIntLit(ast::TyI64, ast::Plus),
"u" => ast::UnsignedIntLit(ast::TyU),
"u8" => ast::UnsignedIntLit(ast::TyU8),
"u16" => ast::UnsignedIntLit(ast::TyU16),
"u32" => ast::UnsignedIntLit(ast::TyU32),
"u64" => ast::UnsignedIntLit(ast::TyU64),
_ => {
// i<digits> and u<digits> look like widths, so lets
// give an error message along those lines
if looks_like_width_suffix(&['i', 'u'], suf) {
sd.span_err(sp, &*format!("illegal width `{}` for integer literal; \
valid widths are 8, 16, 32 and 64",
suf.slice_from(1)));
} else {
sd.span_err(sp, &*format!("illegal suffix `{}` for numeric literal", suf));
}
ty
}
},
'6' => {
if s.len() > 3 && s.char_at(last - 1) == '1' {
match s.char_at(last - 2) {
'i' => ty = ast::SignedIntLit(ast::TyI16, ast::Plus),
'u' => ty = ast::UnsignedIntLit(ast::TyU16),
_ => { }
}
}
},
'2' => {
if s.len() > 3 && s.char_at(last - 1) == '3' {
match s.char_at(last - 2) {
'i' => ty = ast::SignedIntLit(ast::TyI32, ast::Plus),
'u' => ty = ast::UnsignedIntLit(ast::TyU32),
_ => { }
}
}
},
'4' => {
if s.len() > 3 && s.char_at(last - 1) == '6' {
match s.char_at(last - 2) {
'i' => ty = ast::SignedIntLit(ast::TyI64, ast::Plus),
'u' => ty = ast::UnsignedIntLit(ast::TyU64),
_ => { }
}
}
},
_ => { }
}
}
debug!("The suffix is {}, base {}, the new string is {}, the original \
string was {}", ty, base, s, orig);
s = s.slice_to(s.len() - ty.suffix_len());
debug!("integer_lit: the type is {}, base {}, the new string is {}, the original \
string was {}, the original suffix was {}", ty, base, s, orig, suffix);
let res: u64 = match ::std::num::from_str_radix(s, base) {
Some(r) => r,
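
Both the float and integer paths in the hunks above lean on the `looks_like_width_suffix` predicate to decide between the "illegal width" and "illegal suffix" diagnostics. A sketch of the same check in modern Rust (the diff's version uses the since-removed `char_at`/`slice_from` APIs):

```rust
// Does `s` look like a width suffix such as "i1024", "u8", or "f16": a
// recognized leading letter followed by one or more digits? Mirrors the
// predicate in the diff, rewritten without the old `char_at`/`slice_from`.
fn looks_like_width_suffix(first_chars: &[char], s: &str) -> bool {
    let mut chars = s.chars();
    matches!(chars.next(), Some(c) if first_chars.contains(&c))
        && !chars.as_str().is_empty()
        && chars.as_str().chars().all(|c| c.is_ascii_digit())
}

fn main() {
    assert!(looks_like_width_suffix(&['i', 'u'], "u1024"));
    assert!(looks_like_width_suffix(&['f'], "f32"));
    assert!(!looks_like_width_suffix(&['i', 'u'], "i"));      // no digits
    assert!(!looks_like_width_suffix(&['i', 'u'], "suffix")); // wrong letter
}
```

This is why `1234u1024` reports "illegal width `1024`" while `1234suffix` reports "illegal suffix `suffix`" in the error-message tests further down.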


@@ -646,6 +646,20 @@ impl<'a> Parser<'a> {
}
}
pub fn expect_no_suffix(&mut self, sp: Span, kind: &str, suffix: Option<ast::Name>) {
match suffix {
None => {/* everything ok */}
Some(suf) => {
let text = suf.as_str();
if text.is_empty() {
self.span_bug(sp, "found empty literal suffix in Some")
}
self.span_err(sp, &*format!("{} with a suffix is illegal", kind));
}
}
}
/// Attempt to consume a `<`. If `<<` is seen, replace it with a single
/// `<` and continue. If a `<` is not seen, return false.
///
@@ -968,6 +982,9 @@ impl<'a> Parser<'a> {
pub fn span_err(&mut self, sp: Span, m: &str) {
self.sess.span_diagnostic.span_err(sp, m)
}
pub fn span_bug(&mut self, sp: Span, m: &str) -> ! {
self.sess.span_diagnostic.span_bug(sp, m)
}
pub fn abort_if_errors(&mut self) {
self.sess.span_diagnostic.handler().abort_if_errors();
}
@@ -1502,17 +1519,17 @@ impl<'a> Parser<'a> {
} else if self.eat_keyword(keywords::Proc) {
self.parse_proc_type(Vec::new())
} else if self.token == token::Lt {
// QUALIFIED PATH
// QUALIFIED PATH `<TYPE as TRAIT_REF>::item`
self.bump();
let for_type = self.parse_ty(true);
let self_type = self.parse_ty(true);
self.expect_keyword(keywords::As);
let trait_name = self.parse_path(LifetimeAndTypesWithoutColons);
let trait_ref = self.parse_trait_ref();
self.expect(&token::Gt);
self.expect(&token::ModSep);
let item_name = self.parse_ident();
TyQPath(P(QPath {
for_type: for_type,
trait_name: trait_name.path,
self_type: self_type,
trait_ref: P(trait_ref),
item_name: item_name,
}))
} else if self.token == token::ModSep ||
@@ -1640,24 +1657,53 @@ impl<'a> Parser<'a> {
/// Matches token_lit = LIT_INTEGER | ...
pub fn lit_from_token(&mut self, tok: &token::Token) -> Lit_ {
match *tok {
token::LitByte(i) => LitByte(parse::byte_lit(i.as_str()).val0()),
token::LitChar(i) => LitChar(parse::char_lit(i.as_str()).val0()),
token::LitInteger(s) => parse::integer_lit(s.as_str(),
&self.sess.span_diagnostic,
self.last_span),
token::LitFloat(s) => parse::float_lit(s.as_str()),
token::LitStr(s) => {
LitStr(token::intern_and_get_ident(parse::str_lit(s.as_str()).as_slice()),
ast::CookedStr)
token::Literal(lit, suf) => {
let (suffix_illegal, out) = match lit {
token::Byte(i) => (true, LitByte(parse::byte_lit(i.as_str()).val0())),
token::Char(i) => (true, LitChar(parse::char_lit(i.as_str()).val0())),
// there are some valid suffixes for integer and
// float literals, so all the handling is done
// internally.
token::Integer(s) => {
(false, parse::integer_lit(s.as_str(),
suf.as_ref().map(|s| s.as_str()),
&self.sess.span_diagnostic,
self.last_span))
}
token::Float(s) => {
(false, parse::float_lit(s.as_str(),
suf.as_ref().map(|s| s.as_str()),
&self.sess.span_diagnostic,
self.last_span))
}
token::Str_(s) => {
(true,
LitStr(token::intern_and_get_ident(parse::str_lit(s.as_str()).as_slice()),
ast::CookedStr))
}
token::StrRaw(s, n) => {
(true,
LitStr(
token::intern_and_get_ident(
parse::raw_str_lit(s.as_str()).as_slice()),
ast::RawStr(n)))
}
token::Binary(i) =>
(true, LitBinary(parse::binary_lit(i.as_str()))),
token::BinaryRaw(i, _) =>
(true,
LitBinary(Rc::new(i.as_str().as_bytes().iter().map(|&x| x).collect()))),
};
if suffix_illegal {
let sp = self.last_span;
self.expect_no_suffix(sp, &*format!("{} literal", lit.short_name()), suf)
}
out
}
token::LitStrRaw(s, n) => {
LitStr(token::intern_and_get_ident(parse::raw_str_lit(s.as_str()).as_slice()),
ast::RawStr(n))
}
token::LitBinary(i) =>
LitBinary(parse::binary_lit(i.as_str())),
token::LitBinaryRaw(i, _) =>
LitBinary(Rc::new(i.as_str().as_bytes().iter().map(|&x| x).collect())),
_ => { self.unexpected_last(tok); }
}
}
@@ -2424,7 +2470,10 @@ impl<'a> Parser<'a> {
}
}
}
token::LitInteger(n) => {
token::Literal(token::Integer(n), suf) => {
let sp = self.span;
self.expect_no_suffix(sp, "tuple index", suf);
let index = n.as_str();
let dot = self.last_span.hi;
hi = self.span.hi;
@@ -2449,7 +2498,7 @@ impl<'a> Parser<'a> {
}
}
}
token::LitFloat(n) => {
token::Literal(token::Float(n), _suf) => {
self.bump();
let last_span = self.last_span;
let fstr = n.as_str();
@@ -5085,12 +5134,17 @@ impl<'a> Parser<'a> {
self.expect(&token::Semi);
(path, the_ident)
},
token::LitStr(..) | token::LitStrRaw(..) => {
let path = self.parse_str();
token::Literal(token::Str_(..), suf) | token::Literal(token::StrRaw(..), suf) => {
let sp = self.span;
self.expect_no_suffix(sp, "extern crate name", suf);
// forgo the internal suffix check of `parse_str` to
// avoid repeats (this unwrap will always succeed due
// to the restriction of the `match`)
let (s, style, _) = self.parse_optional_str().unwrap();
self.expect_keyword(keywords::As);
let the_ident = self.parse_ident();
self.expect(&token::Semi);
(Some(path), the_ident)
(Some((s, style)), the_ident)
},
_ => {
let span = self.span;
@@ -5267,7 +5321,9 @@ impl<'a> Parser<'a> {
/// the `extern` keyword, if one is found.
fn parse_opt_abi(&mut self) -> Option<abi::Abi> {
match self.token {
token::LitStr(s) | token::LitStrRaw(s, _) => {
token::Literal(token::Str_(s), suf) | token::Literal(token::StrRaw(s, _), suf) => {
let sp = self.span;
self.expect_no_suffix(sp, "ABI spec", suf);
self.bump();
let the_string = s.as_str();
match abi::lookup(the_string) {
@@ -5910,21 +5966,27 @@ impl<'a> Parser<'a> {
}
pub fn parse_optional_str(&mut self)
-> Option<(InternedString, ast::StrStyle)> {
let (s, style) = match self.token {
token::LitStr(s) => (self.id_to_interned_str(s.ident()), ast::CookedStr),
token::LitStrRaw(s, n) => {
(self.id_to_interned_str(s.ident()), ast::RawStr(n))
-> Option<(InternedString, ast::StrStyle, Option<ast::Name>)> {
let ret = match self.token {
token::Literal(token::Str_(s), suf) => {
(self.id_to_interned_str(s.ident()), ast::CookedStr, suf)
}
token::Literal(token::StrRaw(s, n), suf) => {
(self.id_to_interned_str(s.ident()), ast::RawStr(n), suf)
}
_ => return None
};
self.bump();
Some((s, style))
Some(ret)
}
pub fn parse_str(&mut self) -> (InternedString, StrStyle) {
match self.parse_optional_str() {
Some(s) => { s }
Some((s, style, suf)) => {
let sp = self.last_span;
self.expect_no_suffix(sp, "str literal", suf);
(s, style)
}
_ => self.fatal("expected string literal")
}
}


@@ -12,6 +12,7 @@ pub use self::BinOpToken::*;
pub use self::Nonterminal::*;
pub use self::DelimToken::*;
pub use self::IdentStyle::*;
pub use self::Lit::*;
pub use self::Token::*;
use ast;
@@ -59,6 +60,31 @@ pub enum IdentStyle {
Plain,
}
#[deriving(Clone, Encodable, Decodable, PartialEq, Eq, Hash, Show)]
pub enum Lit {
Byte(ast::Name),
Char(ast::Name),
Integer(ast::Name),
Float(ast::Name),
Str_(ast::Name),
StrRaw(ast::Name, uint), /* raw str delimited by n hash symbols */
Binary(ast::Name),
BinaryRaw(ast::Name, uint), /* raw binary str delimited by n hash symbols */
}
impl Lit {
pub fn short_name(&self) -> &'static str {
match *self {
Byte(_) => "byte",
Char(_) => "char",
Integer(_) => "integer",
Float(_) => "float",
Str_(_) | StrRaw(..) => "str",
Binary(_) | BinaryRaw(..) => "binary str"
}
}
}
#[allow(non_camel_case_types)]
#[deriving(Clone, Encodable, Decodable, PartialEq, Eq, Hash, Show)]
pub enum Token {
@@ -98,14 +124,7 @@ pub enum Token {
CloseDelim(DelimToken),
/* Literals */
LitByte(ast::Name),
LitChar(ast::Name),
LitInteger(ast::Name),
LitFloat(ast::Name),
LitStr(ast::Name),
LitStrRaw(ast::Name, uint), /* raw str delimited by n hash symbols */
LitBinary(ast::Name),
LitBinaryRaw(ast::Name, uint), /* raw binary str delimited by n hash symbols */
Literal(Lit, Option<ast::Name>),
/* Name components */
Ident(ast::Ident, IdentStyle),
@@ -145,14 +164,7 @@ impl Token {
Ident(_, _) => true,
Underscore => true,
Tilde => true,
LitByte(_) => true,
LitChar(_) => true,
LitInteger(_) => true,
LitFloat(_) => true,
LitStr(_) => true,
LitStrRaw(_, _) => true,
LitBinary(_) => true,
LitBinaryRaw(_, _) => true,
Literal(_, _) => true,
Pound => true,
At => true,
Not => true,
@@ -173,15 +185,8 @@ impl Token {
/// Returns `true` if the token is any literal
pub fn is_lit(&self) -> bool {
match *self {
LitByte(_) => true,
LitChar(_) => true,
LitInteger(_) => true,
LitFloat(_) => true,
LitStr(_) => true,
LitStrRaw(_, _) => true,
LitBinary(_) => true,
LitBinaryRaw(_, _) => true,
_ => false,
Literal(_, _) => true,
_ => false,
}
}
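
The replacement of the eight `Lit*` token variants with a single `Literal(Lit, Option<ast::Name>)` can be illustrated with a self-contained modern-Rust sketch (`ast::Name` simplified to `String` here; variant and method names follow the diff):

```rust
// Sketch of the consolidated literal token from the diff, with `ast::Name`
// simplified to `String`. One `Literal` variant replaces the eight old
// `LitByte`/`LitChar`/... variants, and the optional second field carries
// the suffix, e.g. the "u8" in `1u8`.
#[derive(Clone, PartialEq, Debug)]
enum Lit {
    Byte(String),
    Char(String),
    Integer(String),
    Float(String),
    Str_(String),
    StrRaw(String, usize),    // raw str delimited by n hash symbols
    Binary(String),
    BinaryRaw(String, usize), // raw binary str delimited by n hash symbols
}

impl Lit {
    // Feeds the "<short_name> literal with a suffix is illegal" diagnostics.
    fn short_name(&self) -> &'static str {
        match self {
            Lit::Byte(_) => "byte",
            Lit::Char(_) => "char",
            Lit::Integer(_) => "integer",
            Lit::Float(_) => "float",
            Lit::Str_(_) | Lit::StrRaw(..) => "str",
            Lit::Binary(_) | Lit::BinaryRaw(..) => "binary str",
        }
    }
}

fn main() {
    // `1u8` as the new token would carry it: kind plus suffix.
    let tok = (Lit::Integer("1".to_string()), Some("u8".to_string()));
    assert_eq!(tok.0.short_name(), "integer");
    assert_eq!(tok.1.as_deref(), Some("u8"));
}
```

Collapsing the variants lets every literal-handling `match` in the parser and pretty-printer treat suffixes uniformly, as the hunks before and after this file show.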


@@ -236,18 +236,28 @@ pub fn token_to_string(tok: &Token) -> String {
token::Question => "?".into_string(),
/* Literals */
token::LitByte(b) => format!("b'{}'", b.as_str()),
token::LitChar(c) => format!("'{}'", c.as_str()),
token::LitFloat(c) => c.as_str().into_string(),
token::LitInteger(c) => c.as_str().into_string(),
token::LitStr(s) => format!("\"{}\"", s.as_str()),
token::LitStrRaw(s, n) => format!("r{delim}\"{string}\"{delim}",
delim="#".repeat(n),
string=s.as_str()),
token::LitBinary(v) => format!("b\"{}\"", v.as_str()),
token::LitBinaryRaw(s, n) => format!("br{delim}\"{string}\"{delim}",
delim="#".repeat(n),
string=s.as_str()),
token::Literal(lit, suf) => {
let mut out = match lit {
token::Byte(b) => format!("b'{}'", b.as_str()),
token::Char(c) => format!("'{}'", c.as_str()),
token::Float(c) => c.as_str().into_string(),
token::Integer(c) => c.as_str().into_string(),
token::Str_(s) => format!("\"{}\"", s.as_str()),
token::StrRaw(s, n) => format!("r{delim}\"{string}\"{delim}",
delim="#".repeat(n),
string=s.as_str()),
token::Binary(v) => format!("b\"{}\"", v.as_str()),
token::BinaryRaw(s, n) => format!("br{delim}\"{string}\"{delim}",
delim="#".repeat(n),
string=s.as_str()),
};
if let Some(s) = suf {
out.push_str(s.as_str())
}
out
}
/* Name components */
token::Ident(s, _) => token::get_ident(s).get().into_string(),
@@ -744,10 +754,10 @@ impl<'a> State<'a> {
}
ast::TyQPath(ref qpath) => {
try!(word(&mut self.s, "<"));
try!(self.print_type(&*qpath.for_type));
try!(self.print_type(&*qpath.self_type));
try!(space(&mut self.s));
try!(self.word_space("as"));
try!(self.print_path(&qpath.trait_name, false));
try!(self.print_trait_ref(&*qpath.trait_ref));
try!(word(&mut self.s, ">"));
try!(word(&mut self.s, "::"));
try!(self.print_ident(qpath.item_name));


@@ -55,8 +55,11 @@ pub enum FnKind<'a> {
/// new default implementation gets introduced.)
pub trait Visitor<'v> {
fn visit_ident(&mut self, _sp: Span, _ident: Ident) {
/*! Visit the idents */
fn visit_name(&mut self, _span: Span, _name: Name) {
// Nothing to do.
}
fn visit_ident(&mut self, span: Span, ident: Ident) {
self.visit_name(span, ident.name);
}
fn visit_mod(&mut self, m: &'v Mod, _s: Span, _n: NodeId) { walk_mod(self, m) }
fn visit_view_item(&mut self, i: &'v ViewItem) { walk_view_item(self, i) }
@@ -102,11 +105,11 @@ pub trait Visitor<'v> {
None => ()
}
}
fn visit_lifetime_ref(&mut self, _lifetime: &'v Lifetime) {
/*! Visits a reference to a lifetime */
fn visit_lifetime_ref(&mut self, lifetime: &'v Lifetime) {
self.visit_name(lifetime.span, lifetime.name)
}
fn visit_lifetime_decl(&mut self, _lifetime: &'v LifetimeDef) {
/*! Visits a declaration of a lifetime */
fn visit_lifetime_def(&mut self, lifetime: &'v LifetimeDef) {
walk_lifetime_def(self, lifetime)
}
fn visit_explicit_self(&mut self, es: &'v ExplicitSelf) {
walk_explicit_self(self, es)
@@ -207,6 +210,14 @@ pub fn walk_local<'v, V: Visitor<'v>>(visitor: &mut V, local: &'v Local) {
walk_expr_opt(visitor, &local.init);
}
pub fn walk_lifetime_def<'v, V: Visitor<'v>>(visitor: &mut V,
lifetime_def: &'v LifetimeDef) {
visitor.visit_name(lifetime_def.lifetime.span, lifetime_def.lifetime.name);
for bound in lifetime_def.bounds.iter() {
visitor.visit_lifetime_ref(bound);
}
}
pub fn walk_explicit_self<'v, V: Visitor<'v>>(visitor: &mut V,
explicit_self: &'v ExplicitSelf) {
match explicit_self.node {
@@ -403,8 +414,8 @@ pub fn walk_ty<'v, V: Visitor<'v>>(visitor: &mut V, typ: &'v Ty) {
}
}
TyQPath(ref qpath) => {
visitor.visit_ty(&*qpath.for_type);
visitor.visit_path(&qpath.trait_name, typ.id);
visitor.visit_ty(&*qpath.self_type);
visitor.visit_trait_ref(&*qpath.trait_ref);
visitor.visit_ident(typ.span, qpath.item_name);
}
TyFixedLengthVec(ref ty, ref expression) => {
@@ -424,7 +435,7 @@ pub fn walk_ty<'v, V: Visitor<'v>>(visitor: &mut V, typ: &'v Ty) {
pub fn walk_lifetime_decls_helper<'v, V: Visitor<'v>>(visitor: &mut V,
lifetimes: &'v Vec<LifetimeDef>) {
for l in lifetimes.iter() {
visitor.visit_lifetime_decl(l);
visitor.visit_lifetime_def(l);
}
}
@@ -555,6 +566,7 @@ pub fn walk_ty_param_bound<'v, V: Visitor<'v>>(visitor: &mut V,
pub fn walk_generics<'v, V: Visitor<'v>>(visitor: &mut V, generics: &'v Generics) {
for type_parameter in generics.ty_params.iter() {
visitor.visit_ident(type_parameter.span, type_parameter.ident);
walk_ty_param_bounds_helper(visitor, &type_parameter.bounds);
match type_parameter.default {
Some(ref ty) => visitor.visit_ty(&**ty),


@@ -497,8 +497,8 @@ fn format(val: Param, op: FormatOp, flags: Flags) -> Result<Vec<u8> ,String> {
let mut s = match val {
Number(d) => {
let s = match (op, flags.sign) {
(FormatDigit, true) => format!("{:+d}", d).into_bytes(),
(FormatDigit, false) => format!("{:d}", d).into_bytes(),
(FormatDigit, true) => format!("{:+}", d).into_bytes(),
(FormatDigit, false) => format!("{}", d).into_bytes(),
(FormatOctal, _) => format!("{:o}", d).into_bytes(),
(FormatHex, _) => format!("{:x}", d).into_bytes(),
(FormatHEX, _) => format!("{:X}", d).into_bytes(),


@@ -687,14 +687,14 @@ impl<T: Writer> ConsoleTestState<T> {
improved += 1;
try!(self.write_plain(format!(": {}", *k).as_slice()));
try!(self.write_improved());
try!(self.write_plain(format!(" by {:.2f}%\n",
try!(self.write_plain(format!(" by {:.2}%\n",
pct as f64).as_slice()));
}
Regression(pct) => {
regressed += 1;
try!(self.write_plain(format!(": {}", *k).as_slice()));
try!(self.write_regressed());
try!(self.write_plain(format!(" by {:.2f}%\n",
try!(self.write_plain(format!(" by {:.2}%\n",
pct as f64).as_slice()));
}
}


@@ -602,8 +602,8 @@ impl<'a> fmt::Show for TmFmt<'a> {
match ch {
'G' => write!(fmt, "{}", year),
'g' => write!(fmt, "{:02d}", (year % 100 + 100) % 100),
'V' => write!(fmt, "{:02d}", days / 7 + 1),
'g' => write!(fmt, "{:02}", (year % 100 + 100) % 100),
'V' => write!(fmt, "{:02}", days / 7 + 1),
_ => Ok(())
}
}
@@ -663,7 +663,7 @@ impl<'a> fmt::Show for TmFmt<'a> {
11 => "Dec",
_ => return die()
},
'C' => return write!(fmt, "{:02d}", (tm.tm_year as int + 1900) / 100),
'C' => return write!(fmt, "{:02}", (tm.tm_year as int + 1900) / 100),
'c' => {
try!(parse_type(fmt, 'a', tm));
try!(' '.fmt(fmt));
@@ -682,9 +682,9 @@ impl<'a> fmt::Show for TmFmt<'a> {
try!('/'.fmt(fmt));
return parse_type(fmt, 'y', tm);
}
'd' => return write!(fmt, "{:02d}", tm.tm_mday),
'e' => return write!(fmt, "{:2d}", tm.tm_mday),
'f' => return write!(fmt, "{:09d}", tm.tm_nsec),
'd' => return write!(fmt, "{:02}", tm.tm_mday),
'e' => return write!(fmt, "{:2}", tm.tm_mday),
'f' => return write!(fmt, "{:09}", tm.tm_nsec),
'F' => {
try!(parse_type(fmt, 'Y', tm));
try!('-'.fmt(fmt));
@@ -694,23 +694,23 @@ impl<'a> fmt::Show for TmFmt<'a> {
}
'G' => return iso_week(fmt, 'G', tm),
'g' => return iso_week(fmt, 'g', tm),
'H' => return write!(fmt, "{:02d}", tm.tm_hour),
'H' => return write!(fmt, "{:02}", tm.tm_hour),
'I' => {
let mut h = tm.tm_hour;
if h == 0 { h = 12 }
if h > 12 { h -= 12 }
return write!(fmt, "{:02d}", h)
return write!(fmt, "{:02}", h)
}
'j' => return write!(fmt, "{:03d}", tm.tm_yday + 1),
'k' => return write!(fmt, "{:2d}", tm.tm_hour),
'j' => return write!(fmt, "{:03}", tm.tm_yday + 1),
'k' => return write!(fmt, "{:2}", tm.tm_hour),
'l' => {
let mut h = tm.tm_hour;
if h == 0 { h = 12 }
if h > 12 { h -= 12 }
return write!(fmt, "{:2d}", h)
return write!(fmt, "{:2}", h)
}
'M' => return write!(fmt, "{:02d}", tm.tm_min),
'm' => return write!(fmt, "{:02d}", tm.tm_mon + 1),
'M' => return write!(fmt, "{:02}", tm.tm_min),
'm' => return write!(fmt, "{:02}", tm.tm_mon + 1),
'n' => "\n",
'P' => if (tm.tm_hour as int) < 12 { "am" } else { "pm" },
'p' => if (tm.tm_hour as int) < 12 { "AM" } else { "PM" },
@@ -728,7 +728,7 @@ impl<'a> fmt::Show for TmFmt<'a> {
try!(' '.fmt(fmt));
return parse_type(fmt, 'p', tm);
}
'S' => return write!(fmt, "{:02d}", tm.tm_sec),
'S' => return write!(fmt, "{:02}", tm.tm_sec),
's' => return write!(fmt, "{}", tm.to_timespec().sec),
'T' | 'X' => {
try!(parse_type(fmt, 'H', tm));
@@ -738,7 +738,7 @@ impl<'a> fmt::Show for TmFmt<'a> {
return parse_type(fmt, 'S', tm);
}
't' => "\t",
'U' => return write!(fmt, "{:02d}", (tm.tm_yday - tm.tm_wday + 7) / 7),
'U' => return write!(fmt, "{:02}", (tm.tm_yday - tm.tm_wday + 7) / 7),
'u' => {
let i = tm.tm_wday as int;
return (if i == 0 { 7 } else { i }).fmt(fmt);
@@ -752,19 +752,19 @@ impl<'a> fmt::Show for TmFmt<'a> {
return parse_type(fmt, 'Y', tm);
}
'W' => {
return write!(fmt, "{:02d}",
return write!(fmt, "{:02}",
(tm.tm_yday - (tm.tm_wday - 1 + 7) % 7 + 7) / 7)
}
'w' => return (tm.tm_wday as int).fmt(fmt),
'Y' => return (tm.tm_year as int + 1900).fmt(fmt),
'y' => return write!(fmt, "{:02d}", (tm.tm_year as int + 1900) % 100),
'y' => return write!(fmt, "{:02}", (tm.tm_year as int + 1900) % 100),
'Z' => if tm.tm_gmtoff == 0_i32 { "GMT"} else { "" }, // FIXME (#2350): support locale
'z' => {
let sign = if tm.tm_gmtoff > 0_i32 { '+' } else { '-' };
let mut m = tm.tm_gmtoff.abs() / 60_i32;
let h = m / 60_i32;
m -= h * 60_i32;
return write!(fmt, "{}{:02d}{:02d}", sign, h, m);
return write!(fmt, "{}{:02}{:02}", sign, h, m);
}
'+' => return tm.rfc3339().fmt(fmt),
'%' => "%",
@@ -806,7 +806,7 @@ impl<'a> fmt::Show for TmFmt<'a> {
let mut m = self.tm.tm_gmtoff.abs() / 60_i32;
let h = m / 60_i32;
m -= h * 60_i32;
write!(fmt, "{}{}{:02d}:{:02d}", s, sign, h as int, m as int)
write!(fmt, "{}{}{:02}:{:02}", s, sign, h as int, m as int)
}
}
}


@@ -148,7 +148,7 @@ fn write_header(header: &str) {
}
fn write_row(label: &str, value: Duration) {
println!("{:30s} {} s\n", label, value);
println!("{:30} {} s\n", label, value);
}
fn write_results(label: &str, results: &Results) {


@@ -115,7 +115,7 @@ fn main() {
for y in range(0u, 256) {
for x in range(0u, 256) {
let idx = (pixels[y*256+x] / 0.2) as uint;
print!("{:c}", symbols[idx]);
print!("{}", symbols[idx]);
}
print!("\n");
}


@@ -63,7 +63,7 @@ fn sort_and_fmt(mm: &HashMap<Vec<u8> , uint>, total: uint) -> String {
let mut buffer = String::new();
for &(ref k, v) in pairs_sorted.iter() {
buffer.push_str(format!("{} {:0.3f}\n",
buffer.push_str(format!("{} {:0.3}\n",
k.as_slice()
.to_ascii()
.to_uppercase()


@@ -266,7 +266,7 @@ fn print_frequencies(frequencies: &Table, frame: uint) {
}
for &(count, key) in vector.iter().rev() {
println!("{} {:.3f}",
println!("{} {:.3}",
key.unpack(frame).as_slice(),
(count as f32 * 100.0) / (total_count as f32));
}


@@ -179,11 +179,11 @@ fn main() {
let mut bodies = BODIES;
offset_momentum(&mut bodies);
println!("{:.9f}", energy(&bodies));
println!("{:.9}", energy(&bodies));
advance(&mut bodies, 0.01, n);
println!("{:.9f}", energy(&bodies));
println!("{:.9}", energy(&bodies));
}
/// Pop a mutable reference off the head of a slice, mutating the slice to no


@@ -59,7 +59,7 @@ fn main() {
} else {
from_str(args[1].as_slice()).unwrap()
});
println!("{:.9f}", answer);
println!("{:.9}", answer);
}
fn spectralnorm(n: uint) -> f64 {


@@ -20,12 +20,12 @@ fn get<T:Get,U:Get>(x: T, y: U) -> Get::Value {}
trait Other {
fn uhoh<U:Get>(&self, foo: U, bar: <Self as Get>::Value) {}
//~^ ERROR this associated type is not allowed in this context
//~^ ERROR no suitable bound on `Self`
}
impl<T:Get> Other for T {
fn uhoh<U:Get>(&self, foo: U, bar: <(T, U) as Get>::Value) {}
//~^ ERROR this associated type is not allowed in this context
//~^ ERROR currently unsupported
}
trait Grab {


@@ -16,7 +16,7 @@ trait Get {
}
fn get(x: int) -> <int as Get>::Value {}
//~^ ERROR this associated type is not allowed in this context
//~^ ERROR unsupported
struct Struct {
x: int,
@@ -24,7 +24,7 @@ struct Struct {
impl Struct {
fn uhoh<T>(foo: <T as Get>::Value) {}
//~^ ERROR this associated type is not allowed in this context
//~^ ERROR no suitable bound on `T`
}
fn main() {


@@ -0,0 +1,41 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
extern crate
"foo"suffix //~ ERROR extern crate name with a suffix is illegal
as foo;
extern
"C"suffix //~ ERROR ABI spec with a suffix is illegal
fn foo() {}
extern
"C"suffix //~ ERROR ABI spec with a suffix is illegal
{}
fn main() {
""suffix; //~ ERROR str literal with a suffix is illegal
b""suffix; //~ ERROR binary str literal with a suffix is illegal
r#""#suffix; //~ ERROR str literal with a suffix is illegal
br#""#suffix; //~ ERROR binary str literal with a suffix is illegal
'a'suffix; //~ ERROR char literal with a suffix is illegal
b'a'suffix; //~ ERROR byte literal with a suffix is illegal
1234u1024; //~ ERROR illegal width `1024` for integer literal
1234i1024; //~ ERROR illegal width `1024` for integer literal
1234f1024; //~ ERROR illegal width `1024` for float literal
1234.5f1024; //~ ERROR illegal width `1024` for float literal
1234suffix; //~ ERROR illegal suffix `suffix` for numeric literal
0b101suffix; //~ ERROR illegal suffix `suffix` for numeric literal
1.0suffix; //~ ERROR illegal suffix `suffix` for float literal
1.0e10suffix; //~ ERROR illegal suffix `suffix` for float literal
}


@@ -0,0 +1,40 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#![allow(dead_code)]
struct S;
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
trait T { }
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
impl S { }
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
impl T for S { }
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
static s: uint = 0u;
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
const c: uint = 0u;
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
mod m { }
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
extern "C" { }
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
type A = uint;
#[deriving(PartialEq)] //~ ERROR: `deriving` may only be applied to structs and enums
fn main() { }


@@ -23,8 +23,8 @@ fn main() {
format!("{foo}", 1, foo=2); //~ ERROR: argument never used
format!("", foo=2); //~ ERROR: named argument never used
format!("{0:d} {0:s}", 1); //~ ERROR: redeclared with type `s`
format!("{foo:d} {foo:s}", foo=1); //~ ERROR: redeclared with type `s`
format!("{0:x} {0:X}", 1); //~ ERROR: redeclared with type `X`
format!("{foo:x} {foo:X}", foo=1); //~ ERROR: redeclared with type `X`
format!("{foo}", foo=1, foo=2); //~ ERROR: duplicate argument
format!("", foo=1, 2); //~ ERROR: positional arguments cannot follow


@@ -9,6 +9,6 @@
// except according to those terms.
fn main() {
format!("{:d}", "3");
//~^ ERROR: the trait `core::fmt::Signed` is not implemented
format!("{:X}", "3");
//~^ ERROR: the trait `core::fmt::UpperHex` is not implemented
}


@@ -13,5 +13,5 @@
fn foo(a: uint) -> uint { a }
fn main() {
println!("{:u}", foo(10i)); //~ ERROR mismatched types
println!("{}", foo(10i)); //~ ERROR mismatched types
}


@@ -10,17 +10,18 @@
use std::fmt::Show;
trait Str {}
trait Something {
fn yay<T: Show>(_: Option<Self>, thing: &[T]) -> String {
}
fn yay<T: Show>(_: Option<Self>, thing: &[T]);
}
struct X { data: u32 }
impl Something for X {
fn yay<T: Str>(_:Option<X>, thing: &[T]) -> String {
//~^ ERROR in method `yay`, type parameter 0 requires bound `core::str::Str`, which is not required
format!("{:s}", thing[0])
fn yay<T: Str>(_:Option<X>, thing: &[T]) {
//~^ ERROR in method `yay`, type parameter 0 requires bound `Str`, which is not required
}
}


@@ -0,0 +1,23 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use Foo::FooB;
enum Foo {
FooB { x: i32, y: i32 }
}
fn main() {
let f = FooB { x: 3, y: 4 };
match f {
FooB(a, b) => println!("{} {}", a, b),
//~^ ERROR `FooB` does not name a non-struct variant or a tuple struct
}
}


@@ -0,0 +1,26 @@
// Copyright 2012 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// Test a method call where the parameter `B` would (illegally) be
// inferred to a region bound in the method argument. If this program
// were accepted, then the closure passed to `s.f` could escape its
// argument.
struct S;
impl S {
fn f<B>(&self, _: |&i32| -> B) {
}
}
fn main() {
let s = S;
s.f(|p| p) //~ ERROR cannot infer
}

Some files were not shown because too many files have changed in this diff Show more