Auto merge of #33816 - nikomatsakis:projection-cache-2, r=arielb1
Projection cache and better warnings for #32330

This PR does three things:

- it lays the groundwork for the more precise subtyping rules discussed in #32330, but does not enable them;
- it issues warnings when the result of a leak-check or subtyping check relies on a late-bound region that will later become early-bound once #32330 is fixed;
- it introduces a cache for projection in the inference context.

I'm not 100% happy with the approach taken by the cache here, but it seems like a step in the right direction. It results in big wins on some test cases, though not as big as previous versions -- I think because it now caches the `Vec<Obligation>` (whereas before I just returned the normalized type with an empty vector). However, that change was needed to fix an ICE in @alexcrichton's future-rs module (I haven't fully tracked down the cause of that ICE yet). Also, because trans/the collector use a fresh inference context for every call to `fulfill_obligation`, they don't profit nearly as much from this cache as they ought to. Still, here are the results from the future-rs `retry.rs`:

```
06:26 <nmatsakis> time: 6.246; rss: 44MB    item-bodies checking
06:26 <nmatsakis> time: 54.783; rss: 63MB   translation item collection
06:26 <nmatsakis> time: 140.086; rss: 86MB  translation
06:26 <nmatsakis> time: 0.361; rss: 46MB    item-bodies checking
06:26 <nmatsakis> time: 5.299; rss: 63MB    translation item collection
06:26 <nmatsakis> time: 12.140; rss: 86MB   translation
```

~~Another example is the example from #31849. For that, I get 34s to run item-bodies without any cache. The version of the cache included here takes 2s to run item-bodies type-checking. An alternative version which doesn't track nested obligations takes 0.2s, but that version ICEs on @alexcrichton's future-rs (and may well be incorrect; I've not fully convinced myself of that). So, a definite win, but I think there's definitely room for further progress.~~

Pushed a modified version which improves performance of the case from #31849:

```
lunch-box. time rustc --stage0 ~/tmp/issue-31849.rs -Z no-trans
real    0m33.539s
user    0m32.932s
sys     0m0.570s

lunch-box. time rustc --stage2 ~/tmp/issue-31849.rs -Z no-trans
real    0m0.195s
user    0m0.154s
sys     0m0.042s
```

Some sort of cache is also needed to unblock further work on lazy normalization, since that will lean even more heavily on the cache, and will also require cycle detection.

r? @arielb1
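To illustrate the snapshot interaction that makes this cache delicate, here is a minimal, self-contained sketch of a cache whose entries roll back along with an inference snapshot. This is not the compiler's actual `ProjectionCache` API; all names and types are illustrative stand-ins, and "first insertion wins" is a simplification of the real logic.

```rust
use std::collections::HashMap;

// Illustrative stand-ins for the compiler's interned types.
type ProjectionKey = String;  // e.g. "<T as Iterator>::Item"
type NormalizedTy = String;   // the type the projection normalizes to

/// A cache that is snapshotted along with the inference context:
/// entries added inside a rolled-back snapshot must not leak out,
/// since they may mention inference variables that no longer exist.
struct ProjectionCache {
    map: HashMap<ProjectionKey, NormalizedTy>,
    undo_log: Vec<ProjectionKey>, // keys inserted, in order
}

struct Snapshot {
    undo_len: usize,
}

impl ProjectionCache {
    fn new() -> Self {
        ProjectionCache { map: HashMap::new(), undo_log: Vec::new() }
    }

    fn insert(&mut self, key: ProjectionKey, ty: NormalizedTy) {
        // Simplification: first insertion wins, so rollback only
        // ever needs to remove keys, never restore old values.
        if !self.map.contains_key(&key) {
            self.map.insert(key.clone(), ty);
            self.undo_log.push(key);
        }
    }

    fn get(&self, key: &str) -> Option<&NormalizedTy> {
        self.map.get(key)
    }

    fn snapshot(&self) -> Snapshot {
        Snapshot { undo_len: self.undo_log.len() }
    }

    /// Discard entries added since `snapshot` was taken.
    fn rollback_to(&mut self, snapshot: Snapshot) {
        while self.undo_log.len() > snapshot.undo_len {
            let key = self.undo_log.pop().unwrap();
            self.map.remove(&key);
        }
    }

    /// Keep entries added since `snapshot`; nothing to undo.
    fn commit(&mut self, _snapshot: Snapshot) {}
}

fn main() {
    let mut cache = ProjectionCache::new();
    cache.insert("<T as Iterator>::Item".into(), "u32".into());

    let snap = cache.snapshot();
    cache.insert("<U as Iterator>::Item".into(), "String".into());
    cache.rollback_to(snap);

    assert!(cache.get("<T as Iterator>::Item").is_some());
    assert!(cache.get("<U as Iterator>::Item").is_none());
}
```

This mirrors why `CombinedSnapshot` below gains a `projection_cache_snapshot` field and why `rollback_to`/`commit` each forward to the cache.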
This commit is contained in:

commit 12238b984a

49 changed files with 2156 additions and 640 deletions
@@ -1647,5 +1647,5 @@ register_diagnostics! {
     E0490, // a value of type `..` is borrowed for too long
     E0491, // in type `..`, reference has a longer lifetime than the data it...
     E0495, // cannot infer an appropriate lifetime due to conflicting requirements
-    E0525, // expected a closure that implements `..` but this closure only implements `..`
+    E0525  // expected a closure that implements `..` but this closure only implements `..`
 }
@@ -132,6 +132,9 @@ pub trait Visitor<'v> : Sized {
     fn visit_generics(&mut self, g: &'v Generics) {
         walk_generics(self, g)
     }
+    fn visit_where_predicate(&mut self, predicate: &'v WherePredicate) {
+        walk_where_predicate(self, predicate)
+    }
     fn visit_fn(&mut self, fk: FnKind<'v>, fd: &'v FnDecl, b: &'v Block, s: Span, _: NodeId) {
         walk_fn(self, fk, fd, b, s)
     }
@@ -529,29 +532,34 @@ pub fn walk_generics<'v, V: Visitor<'v>>(visitor: &mut V, generics: &'v Generics
         walk_list!(visitor, visit_ty, &param.default);
     }
     walk_list!(visitor, visit_lifetime_def, &generics.lifetimes);
-    for predicate in &generics.where_clause.predicates {
-        match predicate {
-            &WherePredicate::BoundPredicate(WhereBoundPredicate{ref bounded_ty,
-                                                                ref bounds,
-                                                                ref bound_lifetimes,
-                                                                ..}) => {
-                visitor.visit_ty(bounded_ty);
-                walk_list!(visitor, visit_ty_param_bound, bounds);
-                walk_list!(visitor, visit_lifetime_def, bound_lifetimes);
-            }
-            &WherePredicate::RegionPredicate(WhereRegionPredicate{ref lifetime,
-                                                                  ref bounds,
-                                                                  ..}) => {
-                visitor.visit_lifetime(lifetime);
-                walk_list!(visitor, visit_lifetime, bounds);
-            }
-            &WherePredicate::EqPredicate(WhereEqPredicate{id,
-                                                          ref path,
-                                                          ref ty,
-                                                          ..}) => {
-                visitor.visit_path(path, id);
-                visitor.visit_ty(ty);
-            }
-        }
-    }
+    walk_list!(visitor, visit_where_predicate, &generics.where_clause.predicates);
 }

+pub fn walk_where_predicate<'v, V: Visitor<'v>>(
+    visitor: &mut V,
+    predicate: &'v WherePredicate)
+{
+    match predicate {
+        &WherePredicate::BoundPredicate(WhereBoundPredicate{ref bounded_ty,
+                                                            ref bounds,
+                                                            ref bound_lifetimes,
+                                                            ..}) => {
+            visitor.visit_ty(bounded_ty);
+            walk_list!(visitor, visit_ty_param_bound, bounds);
+            walk_list!(visitor, visit_lifetime_def, bound_lifetimes);
+        }
+        &WherePredicate::RegionPredicate(WhereRegionPredicate{ref lifetime,
+                                                              ref bounds,
+                                                              ..}) => {
+            visitor.visit_lifetime(lifetime);
+            walk_list!(visitor, visit_lifetime, bounds);
+        }
+        &WherePredicate::EqPredicate(WhereEqPredicate{id,
+                                                      ref path,
+                                                      ref ty,
+                                                      ..}) => {
+            visitor.visit_path(path, id);
+            visitor.visit_ty(ty);
+        }
+    }
+}
@@ -77,6 +77,7 @@ use hir::map as ast_map;
 use hir;
 use hir::print as pprust;

+use lint;
 use hir::def::Def;
 use hir::def_id::DefId;
 use infer::{self, TypeOrigin};
@@ -1017,6 +1018,27 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
         let (fn_decl, generics) = rebuilder.rebuild();
         self.give_expl_lifetime_param(err, &fn_decl, unsafety, constness, name, &generics, span);
     }

+    pub fn issue_32330_warnings(&self, span: Span, issue32330s: &[ty::Issue32330]) {
+        for issue32330 in issue32330s {
+            match *issue32330 {
+                ty::Issue32330::WontChange => { }
+                ty::Issue32330::WillChange { fn_def_id, region_name } => {
+                    self.tcx.sess.add_lint(
+                        lint::builtin::HR_LIFETIME_IN_ASSOC_TYPE,
+                        ast::CRATE_NODE_ID,
+                        span,
+                        format!("lifetime parameter `{0}` declared on fn `{1}` \
+                                 appears only in the return type, \
+                                 but here is required to be higher-ranked, \
+                                 which means that `{0}` must appear in both \
+                                 argument and return types",
+                                region_name,
+                                self.tcx.item_path_str(fn_def_id)));
+                }
+            }
+        }
+    }
+
 struct RebuildPathInfo<'a> {
@@ -1129,7 +1151,7 @@ impl<'a, 'gcx, 'tcx> Rebuilder<'a, 'gcx, 'tcx> {
             ty::BrAnon(i) => {
                 anon_nums.insert(i);
             }
-            ty::BrNamed(_, name) => {
+            ty::BrNamed(_, name, _) => {
                 region_names.insert(name);
             }
             _ => ()
@@ -1143,7 +1165,7 @@ impl<'a, 'gcx, 'tcx> Rebuilder<'a, 'gcx, 'tcx> {
         for sr in self.same_regions {
             for br in &sr.regions {
                 match *br {
-                    ty::BrNamed(_, name) => {
+                    ty::BrNamed(_, name, _) => {
                         all_region_names.insert(name);
                     }
                     _ => ()
@@ -1923,3 +1945,4 @@ fn name_to_dummy_lifetime(name: ast::Name) -> hir::Lifetime {
         span: codemap::DUMMY_SP,
         name: name }
 }
@@ -11,8 +11,14 @@
 //! Helper routines for higher-ranked things. See the `doc` module at
 //! the end of the file for details.

-use super::{CombinedSnapshot, InferCtxt, HigherRankedType, SkolemizationMap};
+use super::{CombinedSnapshot,
+            InferCtxt,
+            LateBoundRegion,
+            HigherRankedType,
+            SubregionOrigin,
+            SkolemizationMap};
 use super::combine::CombineFields;
+use super::region_inference::{TaintDirections};

 use ty::{self, TyCtxt, Binder, TypeFoldable};
 use ty::error::TypeError;
@@ -20,6 +26,19 @@ use ty::relate::{Relate, RelateResult, TypeRelation};
 use syntax::codemap::Span;
 use util::nodemap::{FnvHashMap, FnvHashSet};

+pub struct HrMatchResult<U> {
+    pub value: U,
+
+    /// Normally, when we do a higher-ranked match operation, we
+    /// expect all higher-ranked regions to be constrained as part of
+    /// the match operation. However, in the transition period for
+    /// #32330, it can happen that we sometimes have unconstrained
+    /// regions that get instantiated with fresh variables. In that
+    /// case, we collect the set of unconstrained bound regions here
+    /// and replace them with fresh variables.
+    pub unconstrained_regions: Vec<ty::BoundRegion>,
+}
+
 impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
     pub fn higher_ranked_sub<T>(&self, a: &Binder<T>, b: &Binder<T>)
                                 -> RelateResult<'tcx, Binder<T>>
@@ -39,11 +58,13 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
         // Start a snapshot so we can examine "all bindings that were
         // created as part of this type comparison".
         return self.infcx.commit_if_ok(|snapshot| {
+            let span = self.trace.origin.span();
+
             // First, we instantiate each bound region in the subtype with a fresh
             // region variable.
             let (a_prime, _) =
                 self.infcx.replace_late_bound_regions_with_fresh_var(
-                    self.trace.origin.span(),
+                    span,
                     HigherRankedType,
                     a);
@@ -60,7 +81,11 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {

             // Presuming type comparison succeeds, we need to check
             // that the skolemized regions do not "leak".
-            self.infcx.leak_check(!self.a_is_expected, &skol_map, snapshot)?;
+            self.infcx.leak_check(!self.a_is_expected, span, &skol_map, snapshot)?;
+
+            // We are finished with the skolemized regions now so pop
+            // them off.
+            self.infcx.pop_skolemized(skol_map, snapshot);

             debug!("higher_ranked_sub: OK result={:?}", result);
@@ -68,6 +93,134 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
         });
     }

+    /// The value consists of a pair `(t, u)` where `t` is the
+    /// *matcher* and `u` is a *value*. The idea is to find a
+    /// substitution `S` such that `S(t) == b`, and then return
+    /// `S(u)`. In other words, find values for the late-bound regions
+    /// in `a` that can make `t == b` and then replace the LBR in `u`
+    /// with those values.
+    ///
+    /// This routine is (as of this writing) used in trait matching,
+    /// particularly projection.
+    ///
+    /// NB. It should not happen that there are LBR appearing in `U`
+    /// that do not appear in `T`. If that happens, those regions are
+    /// unconstrained, and this routine replaces them with `'static`.
+    pub fn higher_ranked_match<T, U>(&self,
+                                     span: Span,
+                                     a_pair: &Binder<(T, U)>,
+                                     b_match: &T)
+                                     -> RelateResult<'tcx, HrMatchResult<U>>
+        where T: Relate<'tcx>,
+              U: TypeFoldable<'tcx>
+    {
+        debug!("higher_ranked_match(a={:?}, b={:?})",
+               a_pair, b_match);
+
+        // Start a snapshot so we can examine "all bindings that were
+        // created as part of this type comparison".
+        return self.infcx.commit_if_ok(|snapshot| {
+            // First, we instantiate each bound region in the matcher
+            // with a skolemized region.
+            let ((a_match, a_value), skol_map) =
+                self.infcx.skolemize_late_bound_regions(a_pair, snapshot);
+
+            debug!("higher_ranked_match: a_match={:?}", a_match);
+            debug!("higher_ranked_match: skol_map={:?}", skol_map);
+
+            // Equate types now that bound regions have been replaced.
+            try!(self.equate().relate(&a_match, &b_match));
+
+            // Map each skolemized region to a vector of other regions that it
+            // must be equated with. (Note that this vector may include other
+            // skolemized regions from `skol_map`.)
+            let skol_resolution_map: FnvHashMap<_, _> =
+                skol_map
+                .iter()
+                .map(|(&br, &skol)| {
+                    let tainted_regions =
+                        self.infcx.tainted_regions(snapshot,
+                                                   skol,
+                                                   TaintDirections::incoming()); // [1]
+
+                    // [1] this routine executes after the skolemized
+                    // regions have been *equated* with something
+                    // else, so examining the incoming edges ought to
+                    // be enough to collect all constraints
+
+                    (skol, (br, tainted_regions))
+                })
+                .collect();
+
+            // For each skolemized region, pick a representative -- which can
+            // be any region from the sets above, except for other members of
+            // `skol_map`. There should always be a representative if things
+            // are properly well-formed.
+            let mut unconstrained_regions = vec![];
+            let skol_representatives: FnvHashMap<_, _> =
+                skol_resolution_map
+                .iter()
+                .map(|(&skol, &(br, ref regions))| {
+                    let representative =
+                        regions.iter()
+                               .filter(|r| !skol_resolution_map.contains_key(r))
+                               .cloned()
+                               .next()
+                               .unwrap_or_else(|| { // [1]
+                                   unconstrained_regions.push(br);
+                                   self.infcx.next_region_var(
+                                       LateBoundRegion(span, br, HigherRankedType))
+                               });
+
+                    // [1] There should always be a representative,
+                    // unless the higher-ranked region did not appear
+                    // in the values being matched. We should reject
+                    // as ill-formed cases that can lead to this, but
+                    // right now we sometimes issue warnings (see
+                    // #32330).
+
+                    (skol, representative)
+                })
+                .collect();
+
+            // Equate all the members of each skolemization set with the
+            // representative.
+            for (skol, &(_br, ref regions)) in &skol_resolution_map {
+                let representative = &skol_representatives[skol];
+                debug!("higher_ranked_match: \
+                        skol={:?} representative={:?} regions={:?}",
+                       skol, representative, regions);
+                for region in regions.iter()
+                                     .filter(|&r| !skol_resolution_map.contains_key(r))
+                                     .filter(|&r| r != representative)
+                {
+                    let origin = SubregionOrigin::Subtype(self.trace.clone());
+                    self.infcx.region_vars.make_eqregion(origin,
+                                                         *representative,
+                                                         *region);
+                }
+            }
+
+            // Replace the skolemized regions appearing in value with
+            // their representatives
+            let a_value =
+                fold_regions_in(
+                    self.tcx(),
+                    &a_value,
+                    |r, _| skol_representatives.get(&r).cloned().unwrap_or(r));
+
+            debug!("higher_ranked_match: value={:?}", a_value);
+
+            // We are now done with these skolemized variables.
+            self.infcx.pop_skolemized(skol_map, snapshot);
+
+            Ok(HrMatchResult {
+                value: a_value,
+                unconstrained_regions: unconstrained_regions,
+            })
+        });
+    }
+
     pub fn higher_ranked_lub<T>(&self, a: &Binder<T>, b: &Binder<T>)
                                 -> RelateResult<'tcx, Binder<T>>
         where T: Relate<'tcx>
@@ -124,7 +277,7 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
             return r0;
         }

-        let tainted = infcx.tainted_regions(snapshot, r0);
+        let tainted = infcx.tainted_regions(snapshot, r0, TaintDirections::both());

         // Variables created during LUB computation which are
         // *related* to regions that pre-date the LUB computation
@@ -219,7 +372,7 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
             return r0;
         }

-        let tainted = infcx.tainted_regions(snapshot, r0);
+        let tainted = infcx.tainted_regions(snapshot, r0, TaintDirections::both());

         let mut a_r = None;
         let mut b_r = None;
@@ -341,8 +494,12 @@ fn fold_regions_in<'a, 'gcx, 'tcx, T, F>(tcx: TyCtxt<'a, 'gcx, 'tcx>,
 }

 impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
-    fn tainted_regions(&self, snapshot: &CombinedSnapshot, r: ty::Region) -> Vec<ty::Region> {
-        self.region_vars.tainted(&snapshot.region_vars_snapshot, r)
+    fn tainted_regions(&self,
+                       snapshot: &CombinedSnapshot,
+                       r: ty::Region,
+                       directions: TaintDirections)
+                       -> FnvHashSet<ty::Region> {
+        self.region_vars.tainted(&snapshot.region_vars_snapshot, r, directions)
     }

     fn region_vars_confined_to_snapshot(&self,
@@ -422,22 +579,27 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
         region_vars
     }

+    /// Replace all regions bound by `binder` with skolemized regions and
+    /// return a map indicating which bound-region was replaced with what
+    /// skolemized region. This is the first step of checking subtyping
+    /// when higher-ranked things are involved.
+    ///
+    /// **Important:** you must call this function from within a snapshot.
+    /// Moreover, before committing the snapshot, you must eventually call
+    /// either `plug_leaks` or `pop_skolemized` to remove the skolemized
+    /// regions. If you rollback the snapshot (or are using a probe), then
+    /// the pop occurs as part of the rollback, so an explicit call is not
+    /// needed (but is also permitted).
+    ///
+    /// See `README.md` for more details.
     pub fn skolemize_late_bound_regions<T>(&self,
                                            binder: &ty::Binder<T>,
                                            snapshot: &CombinedSnapshot)
                                            -> (T, SkolemizationMap)
         where T : TypeFoldable<'tcx>
     {
-        /*!
-         * Replace all regions bound by `binder` with skolemized regions and
-         * return a map indicating which bound-region was replaced with what
-         * skolemized region. This is the first step of checking subtyping
-         * when higher-ranked things are involved. See `README.md` for more
-         * details.
-         */
-
         let (result, map) = self.tcx.replace_late_bound_regions(binder, |br| {
-            self.region_vars.new_skolemized(br, &snapshot.region_vars_snapshot)
+            self.region_vars.push_skolemized(br, &snapshot.region_vars_snapshot)
         });

         debug!("skolemize_bound_regions(binder={:?}, result={:?}, map={:?})",
@@ -448,32 +610,80 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
         (result, map)
     }

+    /// Searches the region constraints created since `snapshot` was started
+    /// and checks to determine whether any of the skolemized regions created
+    /// in `skol_map` would "escape" -- meaning that they are related to
+    /// other regions in some way. If so, the higher-ranked subtyping doesn't
+    /// hold. See `README.md` for more details.
     pub fn leak_check(&self,
                       overly_polymorphic: bool,
+                      span: Span,
                       skol_map: &SkolemizationMap,
                       snapshot: &CombinedSnapshot)
                       -> RelateResult<'tcx, ()>
     {
-        /*!
-         * Searches the region constraints created since `snapshot` was started
-         * and checks to determine whether any of the skolemized regions created
-         * in `skol_map` would "escape" -- meaning that they are related to
-         * other regions in some way. If so, the higher-ranked subtyping doesn't
-         * hold. See `README.md` for more details.
-         */
-
         debug!("leak_check: skol_map={:?}",
                skol_map);

+        // ## Issue #32330 warnings
+        //
+        // When Issue #32330 is fixed, a certain number of late-bound
+        // regions (LBR) will become early-bound. We wish to issue
+        // warnings when the result of `leak_check` relies on such LBR, as
+        // that means that compilation will likely start to fail.
+        //
+        // Recall that when we do a "HR subtype" check, we replace all
+        // late-bound regions (LBR) in the subtype with fresh variables,
+        // and skolemize the late-bound regions in the supertype. If those
+        // skolemized regions from the supertype wind up being
+        // super-regions (directly or indirectly) of either
+        //
+        // - another skolemized region; or,
+        // - some region that pre-exists the HR subtype check
+        //   - e.g., a region variable that is not one of those created
+        //     to represent bound regions in the subtype
+        //
+        // then leak-check (and hence the subtype check) fails.
+        //
+        // What will change when we fix #32330 is that some of the LBR in the
+        // subtype may become early-bound. In that case, they would no longer be in
+        // the "permitted set" of variables that can be related to a skolemized
+        // type.
+        //
+        // So the foundation for this warning is to collect variables that we found
+        // to be related to a skolemized type. For each of them, we have an
+        // `Issue32330` flag on the `BoundRegion`. We check whether any of
+        // those flags indicate that this variable was created from a lifetime
+        // that will change from late- to early-bound. If so, we issue a warning
+        // indicating that the results of compilation may change.
+        //
+        // This is imperfect, since there are other kinds of code that will not
+        // compile once #32330 is fixed. However, it fixes the errors observed in
+        // practice on crater runs.
+        let mut warnings = vec![];
+
         let new_vars = self.region_vars_confined_to_snapshot(snapshot);
         for (&skol_br, &skol) in skol_map {
-            let tainted = self.tainted_regions(snapshot, skol);
-            for &tainted_region in &tainted {
+            // The inputs to a skolemized variable can only
+            // be itself or other new variables.
+            let incoming_taints = self.tainted_regions(snapshot,
+                                                       skol,
+                                                       TaintDirections::both());
+            for &tainted_region in &incoming_taints {
+                // Each skolemized should only be relatable to itself
+                // or new variables:
                 match tainted_region {
                     ty::ReVar(vid) => {
-                        if new_vars.iter().any(|&x| x == vid) { continue; }
+                        if new_vars.contains(&vid) {
+                            warnings.extend(
+                                match self.region_vars.var_origin(vid) {
+                                    LateBoundRegion(_,
+                                                    ty::BrNamed(_, _, wc),
+                                                    _) => Some(wc),
+                                    _ => None,
+                                });
+                            continue;
+                        }
+                    }
                     _ => {
                         if tainted_region == skol { continue; }
@@ -496,6 +706,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                 }
             }
         }

+        self.issue_32330_warnings(span, &warnings);
+
         Ok(())
     }
@@ -533,8 +746,6 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                        value: &T) -> T
         where T : TypeFoldable<'tcx>
     {
-        debug_assert!(self.leak_check(false, &skol_map, snapshot).is_ok());
-
         debug!("plug_leaks(skol_map={:?}, value={:?})",
                skol_map,
                value);
@@ -545,9 +756,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
         // these taint sets are mutually disjoint.
         let inv_skol_map: FnvHashMap<ty::Region, ty::BoundRegion> =
             skol_map
-            .into_iter()
-            .flat_map(|(skol_br, skol)| {
-                self.tainted_regions(snapshot, skol)
-                    .iter()
+            .iter()
+            .flat_map(|(&skol_br, &skol)| {
+                self.tainted_regions(snapshot, skol, TaintDirections::both())
+                    .into_iter()
                     .map(move |tainted_region| (tainted_region, skol_br))
             })
@@ -577,6 +788,19 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                 // binders, so this assert is satisfied.
                 assert!(current_depth > 1);

+                // since leak-check passed, this skolemized region
+                // should only have incoming edges from variables
+                // (which ought not to escape the snapshot, but we
+                // don't check that) or itself
+                assert!(
+                    match r {
+                        ty::ReVar(_) => true,
+                        ty::ReSkolemized(_, ref br1) => br == br1,
+                        _ => false,
+                    },
+                    "leak-check would have us replace {:?} with {:?}",
+                    r, br);
+
                 ty::ReLateBound(ty::DebruijnIndex::new(current_depth - 1), br.clone())
             }
         }
@@ -585,6 +809,27 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
-        debug!("plug_leaks: result={:?}",
-               result);
+        self.pop_skolemized(skol_map, snapshot);
+
+        debug!("plug_leaks: result={:?}", result);

         result
     }
+
+    /// Pops the skolemized regions found in `skol_map` from the region
+    /// inference context. Whenever you create skolemized regions via
+    /// `skolemize_late_bound_regions`, they must be popped before you
+    /// commit the enclosing snapshot (if you do not commit, e.g. within a
+    /// probe or as a result of an error, then this is not necessary, as
+    /// popping happens as part of the rollback).
+    ///
+    /// Note: popping also occurs implicitly as part of `leak_check`.
+    pub fn pop_skolemized(&self,
+                          skol_map: SkolemizationMap,
+                          snapshot: &CombinedSnapshot)
+    {
+        debug!("pop_skolemized({:?})", skol_map);
+        let skol_regions: FnvHashSet<_> = skol_map.values().cloned().collect();
+        self.region_vars.pop_skolemized(&skol_regions, &snapshot.region_vars_snapshot);
+    }
 }
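To make the `higher_ranked_match` doc comment above concrete: the routine looks for a substitution `S` over the bound regions such that `S(t) == b`, then returns `S(u)`. Below is a toy, string-based sketch of that matching idea; it is purely illustrative (the real code operates on `ty::Region`, binders, and the region-inference graph, not strings), and all names here are made up for the example.

```rust
use std::collections::HashMap;

/// Toy model of higher-ranked matching: `t` is the matcher (a sequence of
/// region slots), `b` is the target it must equal, and `u` is the value in
/// which the discovered substitution is applied. Returns `S(u)` if a
/// consistent substitution `S` with `S(t) == b` exists, else `None`.
fn higher_ranked_match(t: &[&str], u: &str, b: &[&str]) -> Option<String> {
    if t.len() != b.len() {
        return None;
    }
    let mut subst: HashMap<&str, &str> = HashMap::new();
    for (&slot, &target) in t.iter().zip(b) {
        if slot.starts_with('\'') && slot != "'static" {
            // A bound region: record its mapping, or detect a conflict
            // (the analogue of two taint sets demanding different
            // representatives).
            match subst.get(slot) {
                Some(&prev) if prev != target => return None,
                _ => { subst.insert(slot, target); }
            }
        } else if slot != target {
            // Concrete parts of the matcher must match exactly.
            return None;
        }
    }
    // Apply S to the value, as fold_regions_in does with representatives.
    let mut out = u.to_string();
    for (from, to) in &subst {
        out = out.replace(*from, *to);
    }
    Some(out)
}

fn main() {
    // Matching for<'a> <T as Fn<&'a u32>>::Output = &'a u32 against
    // <T as Fn<&'x u32>> yields ['a => 'x], so the value becomes &'x u32.
    let matched = higher_ranked_match(&["'a"], "&'a u32", &["'x"]);
    assert_eq!(matched.unwrap(), "&'x u32");
}
```

The conflict case (one bound region forced to equal two different targets) corresponds to the leak-check-style failures the real routine reports through region equality constraints.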
@@ -45,6 +45,7 @@ use syntax::errors::DiagnosticBuilder;
 use util::nodemap::{FnvHashMap, FnvHashSet, NodeMap};

 use self::combine::CombineFields;
+use self::higher_ranked::HrMatchResult;
 use self::region_inference::{RegionVarBindings, RegionSnapshot};
 use self::unify_key::ToType;
@@ -63,6 +64,7 @@ pub mod sub;
 pub mod type_variable;
 pub mod unify_key;

+#[must_use]
 pub struct InferOk<'tcx, T> {
     pub value: T,
     pub obligations: PredicateObligations<'tcx>,
@@ -104,6 +106,12 @@ pub struct InferCtxt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {

     pub tables: InferTables<'a, 'gcx, 'tcx>,

+    // Cache for projections. This cache is snapshotted along with the
+    // infcx.
+    //
+    // Public so that `traits::project` can use it.
+    pub projection_cache: RefCell<traits::ProjectionCache<'tcx>>,
+
     // We instantiate UnificationTable with bounds<Ty> because the
     // types that might instantiate a general type variable have an
     // order, represented by its upper and lower bounds.
@@ -477,6 +485,7 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'gcx> {
             parameter_environment: param_env,
             selection_cache: traits::SelectionCache::new(),
             evaluation_cache: traits::EvaluationCache::new(),
+            projection_cache: RefCell::new(traits::ProjectionCache::new()),
             reported_trait_errors: RefCell::new(FnvHashSet()),
             normalize: false,
             projection_mode: ProjectionMode::AnyFinal,
@@ -510,6 +519,7 @@ impl<'a, 'gcx, 'tcx> InferCtxtBuilder<'a, 'gcx, 'tcx> {
         global_tcx.enter_local(arenas, |tcx| f(InferCtxt {
             tcx: tcx,
             tables: tables,
+            projection_cache: RefCell::new(traits::ProjectionCache::new()),
             type_variables: RefCell::new(type_variable::TypeVariableTable::new()),
             int_unification_table: RefCell::new(UnificationTable::new()),
             float_unification_table: RefCell::new(UnificationTable::new()),
@@ -538,13 +548,14 @@ impl<T> ExpectedFound<T> {
 }

 impl<'tcx, T> InferOk<'tcx, T> {
-    fn unit(self) -> InferOk<'tcx, ()> {
+    pub fn unit(self) -> InferOk<'tcx, ()> {
         InferOk { value: (), obligations: self.obligations }
     }
 }

 #[must_use = "once you start a snapshot, you should always consume it"]
 pub struct CombinedSnapshot {
+    projection_cache_snapshot: traits::ProjectionCacheSnapshot,
     type_snapshot: type_variable::Snapshot,
     int_snapshot: unify::Snapshot<ty::IntVid>,
     float_snapshot: unify::Snapshot<ty::FloatVid>,
@@ -643,6 +654,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                                         -> T::Lifted
         where T: TypeFoldable<'tcx> + ty::Lift<'gcx>
     {
+        debug!("drain_fulfillment_cx_or_panic()");
+
         let when = "resolving bounds after type-checking";
         let v = match self.drain_fulfillment_cx(fulfill_cx, result) {
             Ok(v) => v,
@@ -817,10 +830,13 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
     }

     fn start_snapshot(&self) -> CombinedSnapshot {
         debug!("start_snapshot()");

+        let obligations_in_snapshot = self.obligations_in_snapshot.get();
+        self.obligations_in_snapshot.set(false);

         CombinedSnapshot {
+            projection_cache_snapshot: self.projection_cache.borrow_mut().snapshot(),
             type_snapshot: self.type_variables.borrow_mut().snapshot(),
             int_snapshot: self.int_unification_table.borrow_mut().snapshot(),
             float_snapshot: self.float_unification_table.borrow_mut().snapshot(),
@@ -831,7 +847,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {

     fn rollback_to(&self, cause: &str, snapshot: CombinedSnapshot) {
         debug!("rollback_to(cause={})", cause);
-        let CombinedSnapshot { type_snapshot,
+        let CombinedSnapshot { projection_cache_snapshot,
+                               type_snapshot,
                                int_snapshot,
                                float_snapshot,
                                region_vars_snapshot,
@@ -840,6 +857,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
         assert!(!self.obligations_in_snapshot.get());
         self.obligations_in_snapshot.set(obligations_in_snapshot);

+        self.projection_cache
+            .borrow_mut()
+            .rollback_to(projection_cache_snapshot);
         self.type_variables
             .borrow_mut()
             .rollback_to(type_snapshot);
@@ -854,8 +874,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
     }

     fn commit_from(&self, snapshot: CombinedSnapshot) {
-        debug!("commit_from!");
-        let CombinedSnapshot { type_snapshot,
+        debug!("commit_from()");
+        let CombinedSnapshot { projection_cache_snapshot,
+                               type_snapshot,
                                int_snapshot,
                                float_snapshot,
                                region_vars_snapshot,
@@ -863,6 +884,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {

         self.obligations_in_snapshot.set(obligations_in_snapshot);

+        self.projection_cache
+            .borrow_mut()
+            .commit(projection_cache_snapshot);
         self.type_variables
             .borrow_mut()
             .commit(type_snapshot);
@@ -920,7 +944,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
         F: FnOnce() -> Result<T, E>
     {
         debug!("commit_regions_if_ok()");
-        let CombinedSnapshot { type_snapshot,
+        let CombinedSnapshot { projection_cache_snapshot,
+                               type_snapshot,
                                int_snapshot,
                                float_snapshot,
                                region_vars_snapshot,
@@ -935,6 +960,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {

         // Roll back any non-region bindings - they should be resolved
         // inside `f`, with, e.g. `resolve_type_vars_if_possible`.
+        self.projection_cache
+            .borrow_mut()
+            .rollback_to(projection_cache_snapshot);
         self.type_variables
             .borrow_mut()
             .rollback_to(type_snapshot);
@@ -1076,7 +1104,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                 self.skolemize_late_bound_regions(predicate, snapshot);
             let origin = TypeOrigin::EquatePredicate(span);
             let eqty_ok = self.eq_types(false, origin, a, b)?;
-            self.leak_check(false, &skol_map, snapshot).map(|_| eqty_ok.unit())
+            self.leak_check(false, span, &skol_map, snapshot)?;
+            self.pop_skolemized(skol_map, snapshot);
+            Ok(eqty_ok.unit())
         })
     }
@@ -1090,7 +1120,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                 self.skolemize_late_bound_regions(predicate, snapshot);
             let origin = RelateRegionParamBound(span);
             self.sub_regions(origin, r_b, r_a); // `b : a` ==> `a <= b`
-            self.leak_check(false, &skol_map, snapshot)
+            self.leak_check(false, span, &skol_map, snapshot)?;
+            Ok(self.pop_skolemized(skol_map, snapshot))
         })
     }
@@ -1569,6 +1600,40 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
                              |br| self.next_region_var(LateBoundRegion(span, br, lbrct)))
     }

+    /// Given a higher-ranked projection predicate like:
+    ///
+    ///     for<'a> <T as Fn<&'a u32>>::Output = &'a u32
+    ///
+    /// and a target trait-ref like:
+    ///
+    ///     <T as Fn<&'x u32>>
+    ///
+    /// find a substitution `S` for the higher-ranked regions (here,
+    /// `['a => 'x]`) such that the predicate matches the trait-ref,
+    /// and then return the value (here, `&'a u32`) but with the
+    /// substitution applied (hence, `&'x u32`).
+    ///
+    /// See `higher_ranked_match` in `higher_ranked/mod.rs` for more
+    /// details.
+    pub fn match_poly_projection_predicate(&self,
+                                           origin: TypeOrigin,
+                                           match_a: ty::PolyProjectionPredicate<'tcx>,
+                                           match_b: ty::TraitRef<'tcx>)
+                                           -> InferResult<'tcx, HrMatchResult<Ty<'tcx>>>
+    {
+        let span = origin.span();
+        let match_trait_ref = match_a.skip_binder().projection_ty.trait_ref;
+        let trace = TypeTrace {
+            origin: origin,
+            values: TraitRefs(ExpectedFound::new(true, match_trait_ref, match_b))
+        };
+
+        let match_pair = match_a.map_bound(|p| (p.projection_ty.trait_ref, p.ty));
+        let combine = self.combine_fields(true, trace);
+        let result = combine.higher_ranked_match(span, &match_pair, &match_b)?;
+        Ok(InferOk { value: result, obligations: combine.obligations })
+    }
+
     /// See `verify_generic_bound` method in `region_inference`
     pub fn verify_generic_bound(&self,
                                 origin: SubregionOrigin<'tcx>,
@@ -213,8 +213,12 @@ fn constraint_to_nodes(c: &Constraint) -> (Node, Node) {
     match *c {
         Constraint::ConstrainVarSubVar(rv_1, rv_2) =>
            (Node::RegionVid(rv_1), Node::RegionVid(rv_2)),
-        Constraint::ConstrainRegSubVar(r_1, rv_2) => (Node::Region(r_1), Node::RegionVid(rv_2)),
-        Constraint::ConstrainVarSubReg(rv_1, r_2) => (Node::RegionVid(rv_1), Node::Region(r_2)),
+        Constraint::ConstrainRegSubVar(r_1, rv_2) =>
+            (Node::Region(r_1), Node::RegionVid(rv_2)),
+        Constraint::ConstrainVarSubReg(rv_1, r_2) =>
+            (Node::RegionVid(rv_1), Node::Region(r_2)),
+        Constraint::ConstrainRegSubReg(r_1, r_2) =>
+            (Node::Region(r_1), Node::Region(r_2)),
     }
 }
@@ -11,7 +11,6 @@
 //! See README.md

 pub use self::Constraint::*;
-pub use self::Verify::*;
 pub use self::UndoLogEntry::*;
 pub use self::CombineMapType::*;
 pub use self::RegionResolutionError::*;
@@ -20,6 +19,7 @@ pub use self::VarValue::*;
 use super::{RegionVariableOrigin, SubregionOrigin, MiscVariable};
 use super::unify_key;

+use rustc_data_structures::fnv::{FnvHashMap, FnvHashSet};
 use rustc_data_structures::graph::{self, Direction, NodeIndex, OUTGOING};
 use rustc_data_structures::unify::{self, UnificationTable};
 use middle::free_region::FreeRegionMap;
@@ -27,12 +27,11 @@ use ty::{self, Ty, TyCtxt};
 use ty::{BoundRegion, Region, RegionVid};
 use ty::{ReEmpty, ReStatic, ReFree, ReEarlyBound};
 use ty::{ReLateBound, ReScope, ReVar, ReSkolemized, BrFresh};
-use util::common::indenter;
-use util::nodemap::{FnvHashMap, FnvHashSet};

 use std::cell::{Cell, RefCell};
 use std::cmp::Ordering::{self, Less, Greater, Equal};
 use std::fmt;
+use std::mem;
 use std::u32;
 use syntax::ast;
@@ -47,25 +46,28 @@ pub enum Constraint {
     // Concrete region is subregion of region variable
     ConstrainRegSubVar(Region, RegionVid),

-    // Region variable is subregion of concrete region
-    //
-    // FIXME(#29436) -- should be remove in favor of a Verify
+    // Region variable is subregion of concrete region. This does not
+    // directly affect inference, but instead is checked after
+    // inference is complete.
     ConstrainVarSubReg(RegionVid, Region),
+
+    // A constraint where neither side is a variable. This does not
+    // directly affect inference, but instead is checked after
+    // inference is complete.
+    ConstrainRegSubReg(Region, Region),
 }

 // Something we have to verify after region inference is done, but
 // which does not directly influence the inference process
-pub enum Verify<'tcx> {
-    // VerifyRegSubReg(a, b): Verify that `a <= b`. Neither `a` nor
-    // `b` are inference variables.
-    VerifyRegSubReg(SubregionOrigin<'tcx>, Region, Region),
-
-    // VerifyGenericBound(T, _, R, RS): The parameter type `T` (or
-    // associated type) must outlive the region `R`. `T` is known to
-    // outlive `RS`. Therefore verify that `R <= RS[i]` for some
-    // `i`. Inference variables may be involved (but this verification
-    // step doesn't influence inference).
-    VerifyGenericBound(GenericKind<'tcx>, SubregionOrigin<'tcx>, Region, VerifyBound),
+// VerifyGenericBound(T, _, R, RS): The parameter type `T` (or
+// associated type) must outlive the region `R`. `T` is known to
+// outlive `RS`. Therefore verify that `R <= RS[i]` for some
+// `i`. Inference variables may be involved (but this verification
+// step doesn't influence inference).
+#[derive(Debug)]
+pub struct Verify<'tcx> {
+    kind: GenericKind<'tcx>,
+    origin: SubregionOrigin<'tcx>,
+    region: Region,
+    bound: VerifyBound,
 }

 #[derive(Copy, Clone, PartialEq, Eq)]
@@ -108,13 +110,36 @@ pub struct TwoRegions {

 #[derive(Copy, Clone, PartialEq)]
 pub enum UndoLogEntry {
+    /// Pushed when we start a snapshot.
     OpenSnapshot,
+
+    /// Replaces an `OpenSnapshot` when a snapshot is committed, but
+    /// that snapshot is not the root. If the root snapshot is
+    /// unrolled, all nested snapshots must be committed.
     CommitedSnapshot,
+
+    /// We added `RegionVid`
     AddVar(RegionVid),
+
+    /// We added the given `constraint`
     AddConstraint(Constraint),
+
+    /// We added the given `verify`
     AddVerify(usize),
+
+    /// We added the given `given`
     AddGiven(ty::FreeRegion, ty::RegionVid),
+
+    /// We added a GLB/LUB "combination variable"
     AddCombination(CombineMapType, TwoRegions),
+
+    /// During skolemization, we sometimes purge entries from the undo
+    /// log in a kind of minisnapshot (unlike other snapshots, this
+    /// purging actually takes place *on success*). In that case, we
+    /// replace the corresponding entry with `Purged` so as to avoid the
+    /// need to do a bunch of swapping. (We can't use `swap_remove` as
+    /// the order of the vector is important.)
+    Purged,
 }

 #[derive(Copy, Clone, PartialEq)]
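The `Purged` variant lets entries in the middle of the undo log be undone out of order: the slot is replaced in place with a no-op marker, because `swap_remove` would reorder later entries whose positions matter. A small standalone sketch of that in-place purge, with a hypothetical `Entry` type:

```rust
use std::mem;

// Sketch of the `Purged` trick: undo selected entries by swapping a
// no-op marker into their slot, preserving the log's length and order.
#[derive(Debug, PartialEq)]
enum Entry {
    Add(u32),
    Purged,
}

// Returns the entries that were purged, in log order, so the caller
// can run their individual undo actions.
fn purge_matching(log: &mut Vec<Entry>, pred: impl Fn(&Entry) -> bool) -> Vec<Entry> {
    let mut undone = Vec::new();
    for slot in log.iter_mut() {
        if pred(slot) {
            // Replace in place; surrounding entries keep their indices.
            undone.push(mem::replace(slot, Entry::Purged));
        }
    }
    undone
}

fn main() {
    let mut log = vec![Entry::Add(1), Entry::Add(2), Entry::Add(3)];
    let undone = purge_matching(&mut log, |e| matches!(e, Entry::Add(2)));
    assert_eq!(undone, vec![Entry::Add(2)]);
    assert_eq!(log, vec![Entry::Add(1), Entry::Purged, Entry::Add(3)]);
}
```

Keeping indices stable matters here because other undo entries (e.g. `AddVerify(usize)`) refer to positions recorded while the log was being built.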
@@ -253,6 +278,112 @@ pub struct RegionSnapshot {
     skolemization_count: u32,
 }

+/// When working with skolemized regions, we often wish to find all of
+/// the regions that are either reachable from a skolemized region, or
+/// which can reach a skolemized region, or both. We call such regions
+/// *tainted* regions. This struct allows you to decide what set of
+/// tainted regions you want.
+#[derive(Debug)]
+pub struct TaintDirections {
+    incoming: bool,
+    outgoing: bool,
+}
+
+impl TaintDirections {
+    pub fn incoming() -> Self {
+        TaintDirections { incoming: true, outgoing: false }
+    }
+
+    pub fn outgoing() -> Self {
+        TaintDirections { incoming: false, outgoing: true }
+    }
+
+    pub fn both() -> Self {
+        TaintDirections { incoming: true, outgoing: true }
+    }
+}
+
+struct TaintSet {
+    directions: TaintDirections,
+    regions: FnvHashSet<ty::Region>
+}
+
+impl TaintSet {
+    fn new(directions: TaintDirections,
+           initial_region: ty::Region)
+           -> Self {
+        let mut regions = FnvHashSet();
+        regions.insert(initial_region);
+        TaintSet { directions: directions, regions: regions }
+    }
+
+    fn fixed_point(&mut self,
+                   undo_log: &[UndoLogEntry],
+                   verifys: &[Verify]) {
+        let mut prev_len = 0;
+        while prev_len < self.len() {
+            debug!("tainted: prev_len = {:?} new_len = {:?}",
+                   prev_len, self.len());
+
+            prev_len = self.len();
+
+            for undo_entry in undo_log {
+                match undo_entry {
+                    &AddConstraint(ConstrainVarSubVar(a, b)) => {
+                        self.add_edge(ReVar(a), ReVar(b));
+                    }
+                    &AddConstraint(ConstrainRegSubVar(a, b)) => {
+                        self.add_edge(a, ReVar(b));
+                    }
+                    &AddConstraint(ConstrainVarSubReg(a, b)) => {
+                        self.add_edge(ReVar(a), b);
+                    }
+                    &AddConstraint(ConstrainRegSubReg(a, b)) => {
+                        self.add_edge(a, b);
+                    }
+                    &AddGiven(a, b) => {
+                        self.add_edge(ReFree(a), ReVar(b));
+                    }
+                    &AddVerify(i) => {
+                        verifys[i].bound.for_each_region(&mut |b| {
+                            self.add_edge(verifys[i].region, b);
+                        });
+                    }
+                    &Purged |
+                    &AddCombination(..) |
+                    &AddVar(..) |
+                    &OpenSnapshot |
+                    &CommitedSnapshot => {}
+                }
+            }
+        }
+    }
+
+    fn into_set(self) -> FnvHashSet<ty::Region> {
+        self.regions
+    }
+
+    fn len(&self) -> usize {
+        self.regions.len()
+    }
+
+    fn add_edge(&mut self,
+                source: ty::Region,
+                target: ty::Region) {
+        if self.directions.incoming {
+            if self.regions.contains(&target) {
+                self.regions.insert(source);
+            }
+        }
+
+        if self.directions.outgoing {
+            if self.regions.contains(&source) {
+                self.regions.insert(target);
+            }
+        }
+    }
+}
+
 impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
     pub fn new(tcx: TyCtxt<'a, 'gcx, 'tcx>) -> RegionVarBindings<'a, 'gcx, 'tcx> {
         RegionVarBindings {
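`TaintSet::fixed_point` is a plain fixed-point loop: it rescans every recorded edge until the set stops growing, adding sources of edges into the set (incoming) and/or targets of edges out of it (outgoing). A standalone sketch of the same computation using integer region ids instead of rustc's types:

```rust
use std::collections::HashSet;

// Sketch of the taint-set fixed point: starting from one region,
// repeatedly absorb the source of any edge pointing into the set
// (incoming) and the target of any edge leaving it (outgoing),
// until the set stops growing.
fn tainted(edges: &[(u32, u32)], start: u32, incoming: bool, outgoing: bool) -> HashSet<u32> {
    let mut set: HashSet<u32> = HashSet::new();
    set.insert(start);
    let mut prev_len = 0;
    while prev_len < set.len() {
        prev_len = set.len();
        for &(source, target) in edges {
            if incoming && set.contains(&target) {
                set.insert(source);
            }
            if outgoing && set.contains(&source) {
                set.insert(target);
            }
        }
    }
    set
}

fn main() {
    // Edges: 1 -> 2 -> 3, and 4 -> 2.
    let edges = [(1, 2), (2, 3), (4, 2)];
    // Outgoing only from 1: everything reachable from 1.
    let out = tainted(&edges, 1, false, true);
    assert_eq!(out, [1, 2, 3].iter().cloned().collect());
    // Both directions from 2: the whole component.
    let both = tainted(&edges, 2, true, true);
    assert_eq!(both.len(), 4);
}
```

As the doc comment notes, this is not terribly efficient (every pass rescans all edges), but the edge lists involved are the slice of the undo log since one snapshot, so they stay small.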
@@ -290,6 +421,10 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         debug!("RegionVarBindings: commit({})", snapshot.length);
         assert!(self.undo_log.borrow().len() > snapshot.length);
         assert!((*self.undo_log.borrow())[snapshot.length] == OpenSnapshot);
+        assert!(self.skolemization_count.get() == snapshot.skolemization_count,
+                "failed to pop skolemized regions: {} now vs {} at start",
+                self.skolemization_count.get(),
+                snapshot.skolemization_count);

         let mut undo_log = self.undo_log.borrow_mut();
         if snapshot.length == 0 {
@@ -297,7 +432,6 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         } else {
             (*undo_log)[snapshot.length] = CommitedSnapshot;
         }
-        self.skolemization_count.set(snapshot.skolemization_count);
         self.unification_table.borrow_mut().commit(snapshot.region_snapshot);
     }
@@ -307,33 +441,7 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         assert!(undo_log.len() > snapshot.length);
         assert!((*undo_log)[snapshot.length] == OpenSnapshot);
         while undo_log.len() > snapshot.length + 1 {
-            match undo_log.pop().unwrap() {
-                OpenSnapshot => {
-                    bug!("Failure to observe stack discipline");
-                }
-                CommitedSnapshot => {}
-                AddVar(vid) => {
-                    let mut var_origins = self.var_origins.borrow_mut();
-                    var_origins.pop().unwrap();
-                    assert_eq!(var_origins.len(), vid.index as usize);
-                }
-                AddConstraint(ref constraint) => {
-                    self.constraints.borrow_mut().remove(constraint);
-                }
-                AddVerify(index) => {
-                    self.verifys.borrow_mut().pop();
-                    assert_eq!(self.verifys.borrow().len(), index);
-                }
-                AddGiven(sub, sup) => {
-                    self.givens.borrow_mut().remove(&(sub, sup));
-                }
-                AddCombination(Glb, ref regions) => {
-                    self.glbs.borrow_mut().remove(regions);
-                }
-                AddCombination(Lub, ref regions) => {
-                    self.lubs.borrow_mut().remove(regions);
-                }
-            }
+            self.rollback_undo_entry(undo_log.pop().unwrap());
         }
         let c = undo_log.pop().unwrap();
         assert!(c == OpenSnapshot);
@@ -342,6 +450,38 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
             .rollback_to(snapshot.region_snapshot);
     }

+    pub fn rollback_undo_entry(&self, undo_entry: UndoLogEntry) {
+        match undo_entry {
+            OpenSnapshot => {
+                panic!("Failure to observe stack discipline");
+            }
+            Purged | CommitedSnapshot => {
+                // nothing to do here
+            }
+            AddVar(vid) => {
+                let mut var_origins = self.var_origins.borrow_mut();
+                var_origins.pop().unwrap();
+                assert_eq!(var_origins.len(), vid.index as usize);
+            }
+            AddConstraint(ref constraint) => {
+                self.constraints.borrow_mut().remove(constraint);
+            }
+            AddVerify(index) => {
+                self.verifys.borrow_mut().pop();
+                assert_eq!(self.verifys.borrow().len(), index);
+            }
+            AddGiven(sub, sup) => {
+                self.givens.borrow_mut().remove(&(sub, sup));
+            }
+            AddCombination(Glb, ref regions) => {
+                self.glbs.borrow_mut().remove(regions);
+            }
+            AddCombination(Lub, ref regions) => {
+                self.lubs.borrow_mut().remove(regions);
+            }
+        }
+    }
+
     pub fn num_vars(&self) -> u32 {
         let len = self.var_origins.borrow().len();
         // enforce no overflow
@@ -366,22 +506,30 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         return vid;
     }

     pub fn var_origin(&self, vid: RegionVid) -> RegionVariableOrigin {
         self.var_origins.borrow()[vid.index as usize].clone()
     }

     /// Creates a new skolemized region. Skolemized regions are fresh
     /// regions used when performing higher-ranked computations. They
     /// must be used in a very particular way and are never supposed
     /// to "escape" out into error messages or the code at large.
     ///
     /// The idea is to always create a snapshot. Skolemized regions
-    /// can be created in the context of this snapshot, but once the
-    /// snapshot is committed or rolled back, their numbers will be
-    /// recycled, so you must be finished with them. See the extensive
-    /// comments in `higher_ranked.rs` to see how it works (in
-    /// particular, the subtyping comparison).
+    /// can be created in the context of this snapshot, but before the
+    /// snapshot is committed or rolled back, they must be popped
+    /// (using `pop_skolemized_regions`), so that their numbers can be
+    /// recycled. Normally you don't have to think about this: you use
+    /// the APIs in `higher_ranked/mod.rs`, such as
+    /// `skolemize_late_bound_regions` and `plug_leaks`, which will
+    /// guide you on this path (ensure that the `SkolemizationMap` is
+    /// consumed and you are good). There are also somewhat extensive
+    /// comments in `higher_ranked/README.md`.
     ///
     /// The `snapshot` argument to this function is not really used;
     /// it's just there to make it explicit which snapshot bounds the
-    /// skolemized region that results.
-    pub fn new_skolemized(&self, br: ty::BoundRegion, snapshot: &RegionSnapshot) -> Region {
+    /// skolemized region that results. It should always be the top-most snapshot.
+    pub fn push_skolemized(&self, br: ty::BoundRegion, snapshot: &RegionSnapshot) -> Region {
+        assert!(self.in_snapshot());
+        assert!(self.undo_log.borrow()[snapshot.length] == OpenSnapshot);
@@ -390,6 +538,94 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         ReSkolemized(ty::SkolemizedRegionVid { index: sc }, br)
     }

+    /// Removes all the edges to/from the skolemized regions that are
+    /// in `skols`. This is used after a higher-ranked operation
+    /// completes to remove all trace of the skolemized regions
+    /// created in that time.
+    pub fn pop_skolemized(&self,
+                          skols: &FnvHashSet<ty::Region>,
+                          snapshot: &RegionSnapshot) {
+        debug!("pop_skolemized_regions(skols={:?})", skols);
+
+        assert!(self.in_snapshot());
+        assert!(self.undo_log.borrow()[snapshot.length] == OpenSnapshot);
+        assert!(self.skolemization_count.get() as usize >= skols.len(),
+                "popping more skolemized variables than actually exist, \
+                 sc now = {}, skols.len = {}",
+                self.skolemization_count.get(),
+                skols.len());
+
+        let last_to_pop = self.skolemization_count.get();
+        let first_to_pop = last_to_pop - (skols.len() as u32);
+
+        assert!(first_to_pop >= snapshot.skolemization_count,
+                "popping more regions than snapshot contains, \
+                 sc now = {}, sc then = {}, skols.len = {}",
+                self.skolemization_count.get(),
+                snapshot.skolemization_count,
+                skols.len());
+        debug_assert! {
+            skols.iter()
+                 .all(|k| match *k {
+                     ty::ReSkolemized(index, _) =>
+                         index.index >= first_to_pop &&
+                         index.index < last_to_pop,
+                     _ =>
+                         false
+                 }),
+            "invalid skolemization keys or keys out of range ({}..{}): {:?}",
+            snapshot.skolemization_count,
+            self.skolemization_count.get(),
+            skols
+        }
+
+        let mut undo_log = self.undo_log.borrow_mut();
+
+        let constraints_to_kill: Vec<usize> =
+            undo_log.iter()
+                    .enumerate()
+                    .rev()
+                    .filter(|&(_, undo_entry)| kill_constraint(skols, undo_entry))
+                    .map(|(index, _)| index)
+                    .collect();
+
+        for index in constraints_to_kill {
+            let undo_entry = mem::replace(&mut undo_log[index], Purged);
+            self.rollback_undo_entry(undo_entry);
+        }
+
+        self.skolemization_count.set(snapshot.skolemization_count);
+        return;
+
+        fn kill_constraint(skols: &FnvHashSet<ty::Region>,
+                           undo_entry: &UndoLogEntry)
+                           -> bool {
+            match undo_entry {
+                &AddConstraint(ConstrainVarSubVar(_, _)) =>
+                    false,
+                &AddConstraint(ConstrainRegSubVar(a, _)) =>
+                    skols.contains(&a),
+                &AddConstraint(ConstrainVarSubReg(_, b)) =>
+                    skols.contains(&b),
+                &AddConstraint(ConstrainRegSubReg(a, b)) =>
+                    skols.contains(&a) || skols.contains(&b),
+                &AddGiven(_, _) =>
+                    false,
+                &AddVerify(_) =>
+                    false,
+                &AddCombination(_, ref two_regions) =>
+                    skols.contains(&two_regions.a) ||
+                    skols.contains(&two_regions.b),
+                &AddVar(..) |
+                &OpenSnapshot |
+                &Purged |
+                &CommitedSnapshot =>
+                    false,
+            }
+        }
+    }
+
     pub fn new_bound(&self, debruijn: ty::DebruijnIndex) -> Region {
         // Creates a fresh bound variable for use in GLB computations.
         // See discussion of GLB computation in the large comment at
@@ -443,11 +679,9 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         debug!("RegionVarBindings: add_verify({:?})", verify);

         // skip no-op cases known to be satisfied
-        match verify {
-            VerifyGenericBound(_, _, _, VerifyBound::AllBounds(ref bs)) if bs.len() == 0 => {
-                return;
-            }
-            _ => {}
+        match verify.bound {
+            VerifyBound::AllBounds(ref bs) if bs.len() == 0 => { return; }
+            _ => { }
         }

         let mut verifys = self.verifys.borrow_mut();
@@ -515,7 +749,7 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
                 self.add_constraint(ConstrainVarSubReg(sub_id, r), origin);
             }
             _ => {
-                self.add_verify(VerifyRegSubReg(origin, sub, sup));
+                self.add_constraint(ConstrainRegSubReg(sub, sup), origin);
             }
         }
     }
@@ -526,7 +760,12 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
                                 kind: GenericKind<'tcx>,
                                 sub: Region,
                                 bound: VerifyBound) {
-        self.add_verify(VerifyGenericBound(kind, origin, sub, bound));
+        self.add_verify(Verify {
+            kind: kind,
+            origin: origin,
+            region: sub,
+            bound: bound
+        });
     }

     pub fn lub_regions(&self, origin: SubregionOrigin<'tcx>, a: Region, b: Region) -> Region {
@@ -632,83 +871,30 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
             .collect()
     }

-    /// Computes all regions that have been related to `r0` in any way since the mark `mark` was
-    /// made---`r0` itself will be the first entry. This is used when checking whether skolemized
-    /// regions are being improperly related to other regions.
-    pub fn tainted(&self, mark: &RegionSnapshot, r0: Region) -> Vec<Region> {
-        debug!("tainted(mark={:?}, r0={:?})", mark, r0);
-        let _indenter = indenter();
-
-        // `result_set` acts as a worklist: we explore all outgoing
-        // edges and add any new regions we find to result_set. This
-        // is not a terribly efficient implementation.
-        let mut result_set = vec![r0];
-        let mut result_index = 0;
-        while result_index < result_set.len() {
-            // nb: can't use usize::range() here because result_set grows
-            let r = result_set[result_index];
-            debug!("result_index={}, r={:?}", result_index, r);
-
-            for undo_entry in self.undo_log.borrow()[mark.length..].iter() {
-                match undo_entry {
-                    &AddConstraint(ConstrainVarSubVar(a, b)) => {
-                        consider_adding_bidirectional_edges(&mut result_set, r, ReVar(a), ReVar(b));
-                    }
-                    &AddConstraint(ConstrainRegSubVar(a, b)) => {
-                        consider_adding_bidirectional_edges(&mut result_set, r, a, ReVar(b));
-                    }
-                    &AddConstraint(ConstrainVarSubReg(a, b)) => {
-                        consider_adding_bidirectional_edges(&mut result_set, r, ReVar(a), b);
-                    }
-                    &AddGiven(a, b) => {
-                        consider_adding_bidirectional_edges(&mut result_set,
-                                                            r,
-                                                            ReFree(a),
-                                                            ReVar(b));
-                    }
-                    &AddVerify(i) => {
-                        match (*self.verifys.borrow())[i] {
-                            VerifyRegSubReg(_, a, b) => {
-                                consider_adding_bidirectional_edges(&mut result_set, r, a, b);
-                            }
-                            VerifyGenericBound(_, _, a, ref bound) => {
-                                bound.for_each_region(&mut |b| {
-                                    consider_adding_bidirectional_edges(&mut result_set, r, a, b)
-                                });
-                            }
-                        }
-                    }
-                    &AddCombination(..) |
-                    &AddVar(..) |
-                    &OpenSnapshot |
-                    &CommitedSnapshot => {}
-                }
-            }
-
-            result_index += 1;
-        }
-
-        return result_set;
-
-        fn consider_adding_bidirectional_edges(result_set: &mut Vec<Region>,
-                                               r: Region,
-                                               r1: Region,
-                                               r2: Region) {
-            consider_adding_directed_edge(result_set, r, r1, r2);
-            consider_adding_directed_edge(result_set, r, r2, r1);
-        }
-
-        fn consider_adding_directed_edge(result_set: &mut Vec<Region>,
-                                         r: Region,
-                                         r1: Region,
-                                         r2: Region) {
-            if r == r1 {
-                // Clearly, this is potentially inefficient.
-                if !result_set.iter().any(|x| *x == r2) {
-                    result_set.push(r2);
-                }
-            }
-        }
+    /// Computes all regions that have been related to `r0` since the
+    /// mark `mark` was made---`r0` itself will be the first
+    /// entry. The `directions` parameter controls what kind of
+    /// relations are considered. For example, one can say that only
+    /// "incoming" edges to `r0` are desired, in which case one will
+    /// get the set of regions `{r | r <= r0}`. This is used when
+    /// checking whether skolemized regions are being improperly
+    /// related to other regions.
+    pub fn tainted(&self,
+                   mark: &RegionSnapshot,
+                   r0: Region,
+                   directions: TaintDirections)
+                   -> FnvHashSet<ty::Region> {
+        debug!("tainted(mark={:?}, r0={:?}, directions={:?})",
+               mark, r0, directions);
+
+        let mut taint_set = TaintSet::new(directions, r0);
+        taint_set.fixed_point(&self.undo_log.borrow()[mark.length..],
+                              &self.verifys.borrow());
+        debug!("tainted: result={:?}", taint_set.regions);
+        return taint_set.into_set();
     }

     /// This function performs the actual region resolution. It must be
@@ -805,10 +991,6 @@ pub enum VarValue {
     ErrorValue,
 }

-struct VarData {
-    value: VarValue,
-}
-
 struct RegionAndOrigin<'tcx> {
     region: Region,
     origin: SubregionOrigin<'tcx>,
@@ -834,18 +1016,14 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         let graph = self.construct_graph();
         self.expand_givens(&graph);
         self.expansion(free_regions, &mut var_data);
-        self.contraction(free_regions, &mut var_data);
-        let values = self.extract_values_and_collect_conflicts(free_regions,
-                                                               &var_data,
-                                                               &graph,
-                                                               errors);
-        self.collect_concrete_region_errors(free_regions, &values, errors);
-        values
+        self.collect_errors(free_regions, &mut var_data, errors);
+        self.collect_var_errors(free_regions, &var_data, &graph, errors);
+        var_data
     }

-    fn construct_var_data(&self) -> Vec<VarData> {
+    fn construct_var_data(&self) -> Vec<VarValue> {
         (0..self.num_vars() as usize)
-            .map(|_| VarData { value: Value(ty::ReEmpty) })
+            .map(|_| Value(ty::ReEmpty))
             .collect()
     }
@@ -882,30 +1060,28 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         }
     }

-    fn expansion(&self, free_regions: &FreeRegionMap, var_data: &mut [VarData]) {
-        self.iterate_until_fixed_point("Expansion", |constraint| {
+    fn expansion(&self, free_regions: &FreeRegionMap, var_values: &mut [VarValue]) {
+        self.iterate_until_fixed_point("Expansion", |constraint, origin| {
             debug!("expansion: constraint={:?} origin={:?}",
-                   constraint,
-                   self.constraints
-                       .borrow()
-                       .get(constraint)
-                       .unwrap());
+                   constraint, origin);
             match *constraint {
                 ConstrainRegSubVar(a_region, b_vid) => {
-                    let b_data = &mut var_data[b_vid.index as usize];
+                    let b_data = &mut var_values[b_vid.index as usize];
                     self.expand_node(free_regions, a_region, b_vid, b_data)
                 }
                 ConstrainVarSubVar(a_vid, b_vid) => {
-                    match var_data[a_vid.index as usize].value {
+                    match var_values[a_vid.index as usize] {
                         ErrorValue => false,
                         Value(a_region) => {
-                            let b_node = &mut var_data[b_vid.index as usize];
+                            let b_node = &mut var_values[b_vid.index as usize];
                             self.expand_node(free_regions, a_region, b_vid, b_node)
                         }
                     }
                 }
+                ConstrainRegSubReg(..) |
                 ConstrainVarSubReg(..) => {
-                    // This is a contraction constraint. Ignore it.
+                    // These constraints are checked after expansion
+                    // is done, in `collect_errors`.
                     false
                 }
             }
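The expansion phase above is a fixed-point loop in which each variable's value only ever grows via `lub`, so iteration must terminate. A toy model of the same shape, where regions are sets of scope ids and `lub` is set union (hypothetical, for illustration only):

```rust
use std::collections::BTreeSet;

// Toy model of the expansion phase: variable values grow monotonically
// via least-upper-bound (here, set union) until nothing changes.
type Region = BTreeSet<u32>;

// A constraint (a, b) means "var a is a subregion of var b", so b
// expands to lub(b, a).
fn expand(values: &mut Vec<Region>, constraints: &[(usize, usize)]) {
    let mut changed = true;
    while changed {
        changed = false;
        for &(a, b) in constraints {
            let lub: Region = values[a].union(&values[b]).cloned().collect();
            if lub != values[b] {
                values[b] = lub;
                changed = true;
            }
        }
    }
}

fn main() {
    let mut values = vec![
        [1].iter().cloned().collect::<Region>(),
        [2].iter().cloned().collect::<Region>(),
        Region::new(),
    ];
    // var0 <= var2 and var1 <= var2, so var2 must grow to cover both.
    expand(&mut values, &[(0, 2), (1, 2)]);
    assert_eq!(values[2], [1, 2].iter().cloned().collect::<Region>());
}
```

Upper-bound constraints (`ConstrainVarSubReg`, `ConstrainRegSubReg`) are deliberately left out of this loop; they cannot make values grow, so checking them is deferred until after the fixed point, mirroring `collect_errors` in the diff.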
@@ -916,12 +1092,12 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
                    free_regions: &FreeRegionMap,
                    a_region: Region,
                    b_vid: RegionVid,
-                   b_data: &mut VarData)
+                   b_data: &mut VarValue)
                    -> bool {
         debug!("expand_node({:?}, {:?} == {:?})",
                a_region,
                b_vid,
-               b_data.value);
+               b_data);

         // Check if this relationship is implied by a given.
         match a_region {
@@ -934,7 +1110,7 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
             _ => {}
         }

-        match b_data.value {
+        match *b_data {
             Value(cur_region) => {
                 let lub = self.lub_concrete_regions(free_regions, a_region, cur_region);
                 if lub == cur_region {
|
|||
cur_region,
|
||||
lub);
|
||||
|
||||
b_data.value = Value(lub);
|
||||
*b_data = Value(lub);
|
||||
return true;
|
||||
}
|
||||
|
||||
|
|
@@ -956,63 +1132,30 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
         }
     }

-    // FIXME(#29436) -- this fn would just go away if we removed ConstrainVarSubReg
-    fn contraction(&self, free_regions: &FreeRegionMap, var_data: &mut [VarData]) {
-        self.iterate_until_fixed_point("Contraction", |constraint| {
-            debug!("contraction: constraint={:?} origin={:?}",
-                   constraint,
-                   self.constraints
-                       .borrow()
-                       .get(constraint)
-                       .unwrap());
-            match *constraint {
-                ConstrainRegSubVar(..) |
-                ConstrainVarSubVar(..) => {
-                    // Expansion will ensure that these constraints hold. Ignore.
-                }
-                ConstrainVarSubReg(a_vid, b_region) => {
-                    let a_data = &mut var_data[a_vid.index as usize];
-                    debug!("contraction: {:?} == {:?}, {:?}",
-                           a_vid,
-                           a_data.value,
-                           b_region);
-
-                    let a_region = match a_data.value {
-                        ErrorValue => return false,
-                        Value(a_region) => a_region,
-                    };
-
-                    if !free_regions.is_subregion_of(self.tcx, a_region, b_region) {
-                        debug!("Setting {:?} to ErrorValue: {:?} not subregion of {:?}",
-                               a_vid,
-                               a_region,
-                               b_region);
-                        a_data.value = ErrorValue;
-                    }
-                }
-            }
-
-            false
-        })
-    }
-
-    fn collect_concrete_region_errors(&self,
-                                      free_regions: &FreeRegionMap,
-                                      values: &Vec<VarValue>,
-                                      errors: &mut Vec<RegionResolutionError<'tcx>>) {
+    /// After expansion is complete, go and check upper bounds (i.e.,
+    /// cases where the region cannot grow larger than a fixed point)
+    /// and check that they are satisfied.
+    fn collect_errors(&self,
+                      free_regions: &FreeRegionMap,
+                      var_data: &mut Vec<VarValue>,
+                      errors: &mut Vec<RegionResolutionError<'tcx>>) {
+        let constraints = self.constraints.borrow();
         let mut reg_reg_dups = FnvHashSet();
-        for verify in self.verifys.borrow().iter() {
-            match *verify {
-                VerifyRegSubReg(ref origin, sub, sup) => {
+        for (constraint, origin) in constraints.iter() {
+            debug!("collect_errors: constraint={:?} origin={:?}",
+                   constraint, origin);
+            match *constraint {
+                ConstrainRegSubVar(..) |
+                ConstrainVarSubVar(..) => {
+                    // Expansion will ensure that these constraints hold. Ignore.
+                }
+
+                ConstrainRegSubReg(sub, sup) => {
                     if free_regions.is_subregion_of(self.tcx, sub, sup) {
                         continue;
                     }

                     if !reg_reg_dups.insert((sub, sup)) {
                         continue;
                     }

-                    debug!("region inference error at {:?}: {:?} <= {:?} is not true",
+                    debug!("collect_errors: region error at {:?}: \
+                            cannot verify that {:?} <= {:?}",
                            origin,
                            sub,
                            sup);
@@ -1020,30 +1163,61 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
                     errors.push(ConcreteFailure((*origin).clone(), sub, sup));
                 }

-                VerifyGenericBound(ref kind, ref origin, sub, ref bound) => {
-                    let sub = normalize(values, sub);
-                    if bound.is_met(self.tcx, free_regions, values, sub) {
-                        continue;
+                ConstrainVarSubReg(a_vid, b_region) => {
+                    let a_data = &mut var_data[a_vid.index as usize];
+                    debug!("contraction: {:?} == {:?}, {:?}",
+                           a_vid,
+                           a_data,
+                           b_region);
+
+                    let a_region = match *a_data {
+                        ErrorValue => continue,
+                        Value(a_region) => a_region,
+                    };
+
+                    // Do not report these errors immediately:
+                    // instead, set the variable value to error and
+                    // collect them later.
+                    if !free_regions.is_subregion_of(self.tcx, a_region, b_region) {
+                        debug!("collect_errors: region error at {:?}: \
+                                cannot verify that {:?}={:?} <= {:?}",
+                               origin,
+                               a_vid,
+                               a_region,
+                               b_region);
+                        *a_data = ErrorValue;
                     }
-
-                    debug!("region inference error at {:?}: verifying {:?} <= {:?}",
-                           origin,
-                           sub,
-                           bound);
-
-                    errors.push(GenericBoundFailure((*origin).clone(), kind.clone(), sub));
                 }
             }
         }
+
+        for verify in self.verifys.borrow().iter() {
+            debug!("collect_errors: verify={:?}", verify);
+            let sub = normalize(var_data, verify.region);
+            if verify.bound.is_met(self.tcx, free_regions, var_data, sub) {
+                continue;
+            }
+
+            debug!("collect_errors: region error at {:?}: \
+                    cannot verify that {:?} <= {:?}",
+                   verify.origin,
+                   verify.region,
+                   verify.bound);
+
+            errors.push(GenericBoundFailure(verify.origin.clone(),
+                                            verify.kind.clone(),
+                                            sub));
+        }
     }

-    fn extract_values_and_collect_conflicts(&self,
-                                            free_regions: &FreeRegionMap,
-                                            var_data: &[VarData],
-                                            graph: &RegionGraph,
-                                            errors: &mut Vec<RegionResolutionError<'tcx>>)
-                                            -> Vec<VarValue> {
-        debug!("extract_values_and_collect_conflicts()");
+    /// Go over the variables that were declared to be error variables
+    /// and create a `RegionResolutionError` for each of them.
+    fn collect_var_errors(&self,
+                          free_regions: &FreeRegionMap,
+                          var_data: &[VarValue],
+                          graph: &RegionGraph,
+                          errors: &mut Vec<RegionResolutionError<'tcx>>) {
+        debug!("collect_var_errors");

         // This is the best way that I have found to suppress
         // duplicate and related errors. Basically we keep a set of
@ -1059,7 +1233,7 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
|
|||
let mut dup_vec = vec![u32::MAX; self.num_vars() as usize];
|
||||
|
||||
for idx in 0..self.num_vars() as usize {
|
||||
match var_data[idx].value {
|
||||
match var_data[idx] {
|
||||
Value(_) => {
|
||||
/* Inference successful */
|
||||
}
|
||||
|
|
@ -1096,8 +1270,6 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
|
|||
}
|
||||
}
|
||||
}
|
||||
|
||||
(0..self.num_vars() as usize).map(|idx| var_data[idx].value).collect()
|
||||
}
|
||||
|
||||
fn construct_graph(&self) -> RegionGraph {
|
||||
|
|
@ -1132,6 +1304,10 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
|
|||
ConstrainVarSubReg(a_id, _) => {
|
||||
graph.add_edge(NodeIndex(a_id.index as usize), dummy_sink, *constraint);
|
||||
}
|
||||
ConstrainRegSubReg(..) => {
|
||||
// this would be an edge from `dummy_source` to
|
||||
// `dummy_sink`; just ignore it.
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -1274,13 +1450,18 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
|
|||
origin: this.constraints.borrow().get(&edge.data).unwrap().clone(),
|
||||
});
|
||||
}
|
||||
|
||||
ConstrainRegSubReg(..) => {
|
||||
panic!("cannot reach reg-sub-reg edge in region inference \
|
||||
post-processing")
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn iterate_until_fixed_point<F>(&self, tag: &str, mut body: F)
|
||||
where F: FnMut(&Constraint) -> bool
|
||||
where F: FnMut(&Constraint, &SubregionOrigin<'tcx>) -> bool
|
||||
{
|
||||
let mut iteration = 0;
|
||||
let mut changed = true;
|
||||
|
|
@ -1288,8 +1469,8 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
|
|||
changed = false;
|
||||
iteration += 1;
|
||||
debug!("---- {} Iteration {}{}", "#", tag, iteration);
|
||||
for (constraint, _) in self.constraints.borrow().iter() {
|
||||
let edge_changed = body(constraint);
|
||||
for (constraint, origin) in self.constraints.borrow().iter() {
|
||||
let edge_changed = body(constraint, origin);
|
||||
if edge_changed {
|
||||
debug!("Updated due to constraint {:?}", constraint);
|
||||
changed = true;
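The `iterate_until_fixed_point` driver above simply re-runs `body` over every constraint until a full pass changes nothing. A standalone sketch of that pattern, using a made-up `Constraint` type and `solve` helper rather than the compiler's, might look like:

```rust
/// Illustrative fixed-point driver in the spirit of
/// `iterate_until_fixed_point`: re-run `body` over every constraint
/// until an entire pass reports no change.
#[derive(Debug)]
pub struct Constraint {
    pub var: usize, // index of the variable being constrained
    pub lower: u32, // lower bound contributed by this constraint
}

pub fn iterate_until_fixed_point<F>(constraints: &[Constraint], mut body: F)
where
    F: FnMut(&Constraint) -> bool, // true = this edge changed something
{
    let mut changed = true;
    while changed {
        changed = false;
        for constraint in constraints {
            if body(constraint) {
                changed = true;
            }
        }
    }
}

/// Least-upper-bound solving with the driver: grow each variable's
/// value until every constraint is satisfied.
pub fn solve(constraints: &[Constraint], num_vars: usize) -> Vec<u32> {
    let mut values = vec![0u32; num_vars];
    iterate_until_fixed_point(constraints, |c| {
        if values[c.var] < c.lower {
            values[c.var] = c.lower; // expand toward the fixed point
            true
        } else {
            false
        }
    });
    values
}
```

Termination follows because each variable's value only grows and is bounded by the largest constraint, so eventually a full pass makes no change.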
@@ -1301,19 +1482,6 @@ impl<'a, 'gcx, 'tcx> RegionVarBindings<'a, 'gcx, 'tcx> {
    }

impl<'tcx> fmt::Debug for Verify<'tcx> {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match *self {
            VerifyRegSubReg(_, ref a, ref b) => {
                write!(f, "VerifyRegSubReg({:?}, {:?})", a, b)
            }
            VerifyGenericBound(_, ref p, ref a, ref bs) => {
                write!(f, "VerifyGenericBound({:?}, {:?}, {:?})", p, a, bs)
            }
        }
    }
}

fn normalize(values: &Vec<VarValue>, r: ty::Region) -> ty::Region {
    match r {
        ty::ReVar(rid) => lookup(values, rid),

@@ -178,7 +178,9 @@ impl<'tcx> TypeVariableTable<'tcx> {
            value: Bounded { relations: vec![], default: default },
            diverging: diverging
        });
        ty::TyVid { index: index as u32 }
        let v = ty::TyVid { index: index as u32 };
        debug!("new_var() -> {:?}", v);
        v
    }

    pub fn root_var(&mut self, vid: ty::TyVid) -> ty::TyVid {

@@ -219,6 +221,17 @@ impl<'tcx> TypeVariableTable<'tcx> {
    }

    pub fn rollback_to(&mut self, s: Snapshot) {
        debug!("rollback_to{:?}", {
            for action in self.values.actions_since_snapshot(&s.snapshot) {
                match *action {
                    sv::UndoLog::NewElem(index) => {
                        debug!("inference variable _#{}t popped", index)
                    }
                    _ => { }
                }
            }
        });

        self.values.rollback_to(s.snapshot);
        self.eq_relations.rollback_to(s.eq_snapshot);
    }
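`rollback_to` above unwinds an undo log recorded since a snapshot was taken. A minimal sketch of that snapshot/undo-log pattern follows; the `SnapshotVec`, `UndoLog`, and `Snapshot` types here are simplified inventions for illustration, not the real `snapshot_vec` in rustc_data_structures:

```rust
/// Every mutation is recorded in an undo log so that a failed probe
/// can be unwound to the state at the snapshot.
pub enum UndoLog<T> {
    NewElem,            // a value was pushed
    SetElem(usize, T),  // values[i] was overwritten; T is the old value
}

pub struct SnapshotVec<T: Clone> {
    pub values: Vec<T>,
    pub undo_log: Vec<UndoLog<T>>,
}

pub struct Snapshot {
    pub len: usize, // length of the undo log when the snapshot was taken
}

impl<T: Clone> SnapshotVec<T> {
    pub fn new() -> Self {
        SnapshotVec { values: vec![], undo_log: vec![] }
    }
    pub fn push(&mut self, v: T) {
        self.values.push(v);
        self.undo_log.push(UndoLog::NewElem);
    }
    pub fn set(&mut self, i: usize, v: T) {
        let old = self.values[i].clone();
        self.undo_log.push(UndoLog::SetElem(i, old));
        self.values[i] = v;
    }
    pub fn snapshot(&self) -> Snapshot {
        Snapshot { len: self.undo_log.len() }
    }
    /// Replay the undo log in reverse back to the snapshot point.
    pub fn rollback_to(&mut self, s: Snapshot) {
        while self.undo_log.len() > s.len {
            match self.undo_log.pop().unwrap() {
                UndoLog::NewElem => { self.values.pop(); }
                UndoLog::SetElem(i, old) => { self.values[i] = old; }
            }
        }
    }
}
```

The key design choice is that undoing is O(number of actions since the snapshot), so nested probes that fail quickly are cheap regardless of how large the table has grown.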
@@ -22,6 +22,7 @@ use dep_graph::DepNode;
use hir::map::Map;
use session::Session;
use hir::def::{Def, DefMap};
use hir::def_id::DefId;
use middle::region;
use ty::subst;
use ty;

@@ -32,6 +33,7 @@ use syntax::codemap::Span;
use syntax::parse::token::keywords;
use util::nodemap::NodeMap;

use rustc_data_structures::fnv::FnvHashSet;
use hir;
use hir::print::lifetime_to_string;
use hir::intravisit::{self, Visitor, FnKind};

@@ -50,11 +52,21 @@ pub enum DefRegion {

// Maps the id of each lifetime reference to the lifetime decl
// that it corresponds to.
pub type NamedRegionMap = NodeMap<DefRegion>;
pub struct NamedRegionMap {
    // maps from every use of a named (not anonymous) lifetime to a
    // `DefRegion` describing how that region is bound
    pub defs: NodeMap<DefRegion>,

struct LifetimeContext<'a> {
    // the set of lifetime def ids that are late-bound; late-bound ids
    // are named regions appearing in fn arguments that do not appear
    // in where-clauses
    pub late_bound: NodeMap<ty::Issue32330>,
}

struct LifetimeContext<'a, 'tcx: 'a> {
    sess: &'a Session,
    named_region_map: &'a mut NamedRegionMap,
    hir_map: &'a Map<'tcx>,
    map: &'a mut NamedRegionMap,
    scope: Scope<'a>,
    def_map: &'a DefMap,
    // Deep breath. Our representation for poly trait refs contains a single

@@ -101,21 +113,25 @@ pub fn krate(sess: &Session,
             -> Result<NamedRegionMap, usize> {
    let _task = hir_map.dep_graph.in_task(DepNode::ResolveLifetimes);
    let krate = hir_map.krate();
    let mut named_region_map = NodeMap();
    let mut map = NamedRegionMap {
        defs: NodeMap(),
        late_bound: NodeMap(),
    };
    sess.track_errors(|| {
        krate.visit_all_items(&mut LifetimeContext {
            sess: sess,
            named_region_map: &mut named_region_map,
            hir_map: hir_map,
            map: &mut map,
            scope: &ROOT_SCOPE,
            def_map: def_map,
            trait_ref_hack: false,
            labels_in_fn: vec![],
        });
    })?;
    Ok(named_region_map)
    Ok(map)
}

impl<'a, 'v> Visitor<'v> for LifetimeContext<'a> {
impl<'a, 'tcx, 'v> Visitor<'v> for LifetimeContext<'a, 'tcx> {
    fn visit_item(&mut self, item: &hir::Item) {
        assert!(self.labels_in_fn.is_empty());

@@ -164,8 +180,12 @@ impl<'a, 'v> Visitor<'v> for LifetimeContext<'a> {
        // Items always introduce a new root scope
        self.with(RootScope, |_, this| {
            match item.node {
                hir::ForeignItemFn(_, ref generics) => {
                    this.visit_early_late(subst::FnSpace, generics, |this| {
                hir::ForeignItemFn(ref decl, ref generics) => {
                    this.visit_early_late(item.id,
                                          subst::FnSpace,
                                          decl,
                                          generics,
                                          |this| {
                        intravisit::walk_foreign_item(this, item);
                    })
                }

@@ -179,24 +199,27 @@ impl<'a, 'v> Visitor<'v> for LifetimeContext<'a> {
        replace(&mut self.labels_in_fn, saved);
    }

    fn visit_fn(&mut self, fk: FnKind<'v>, fd: &'v hir::FnDecl,
    fn visit_fn(&mut self, fk: FnKind<'v>, decl: &'v hir::FnDecl,
                b: &'v hir::Block, s: Span, fn_id: ast::NodeId) {
        match fk {
            FnKind::ItemFn(_, generics, _, _, _, _, _) => {
                self.visit_early_late(subst::FnSpace, generics, |this| {
                    this.add_scope_and_walk_fn(fk, fd, b, s, fn_id)
                self.visit_early_late(fn_id, subst::FnSpace, decl, generics, |this| {
                    this.add_scope_and_walk_fn(fk, decl, b, s, fn_id)
                })
            }
            FnKind::Method(_, sig, _, _) => {
                self.visit_early_late(subst::FnSpace, &sig.generics, |this| {
                    this.add_scope_and_walk_fn(fk, fd, b, s, fn_id)
                })
                self.visit_early_late(
                    fn_id,
                    subst::FnSpace,
                    decl,
                    &sig.generics,
                    |this| this.add_scope_and_walk_fn(fk, decl, b, s, fn_id));
            }
            FnKind::Closure(_) => {
                // Closures have their own set of labels, save labels just
                // like for foreign items above.
                let saved = replace(&mut self.labels_in_fn, vec![]);
                let result = self.add_scope_and_walk_fn(fk, fd, b, s, fn_id);
                let result = self.add_scope_and_walk_fn(fk, decl, b, s, fn_id);
                replace(&mut self.labels_in_fn, saved);
                result
            }

@@ -240,7 +263,8 @@ impl<'a, 'v> Visitor<'v> for LifetimeContext<'a> {

        if let hir::MethodTraitItem(ref sig, None) = trait_item.node {
            self.visit_early_late(
                subst::FnSpace, &sig.generics,
                trait_item.id, subst::FnSpace,
                &sig.decl, &sig.generics,
                |this| intravisit::walk_trait_item(this, trait_item))
        } else {
            intravisit::walk_trait_item(self, trait_item);

@@ -380,8 +404,7 @@ fn signal_shadowing_problem(sess: &Session, name: ast::Name, orig: Original, sha

// Adds all labels in `b` to `ctxt.labels_in_fn`, signalling a warning
// if one of the labels shadows a lifetime or another label.
fn extract_labels<'v, 'a>(ctxt: &mut LifetimeContext<'a>, b: &'v hir::Block) {
fn extract_labels(ctxt: &mut LifetimeContext, b: &hir::Block) {
    struct GatherLabels<'a> {
        sess: &'a Session,
        scope: Scope<'a>,

@@ -468,7 +491,7 @@ fn extract_labels<'v, 'a>(ctxt: &mut LifetimeContext<'a>, b: &'v hir::Block) {
    }
}

impl<'a> LifetimeContext<'a> {
impl<'a, 'tcx> LifetimeContext<'a, 'tcx> {
    fn add_scope_and_walk_fn<'b>(&mut self,
                                 fk: FnKind,
                                 fd: &hir::FnDecl,

@@ -501,10 +524,11 @@ impl<'a> LifetimeContext<'a> {
    fn with<F>(&mut self, wrap_scope: ScopeChain, f: F) where
        F: FnOnce(Scope, &mut LifetimeContext),
    {
        let LifetimeContext {sess, ref mut named_region_map, ..} = *self;
        let LifetimeContext {sess, hir_map, ref mut map, ..} = *self;
        let mut this = LifetimeContext {
            sess: sess,
            named_region_map: *named_region_map,
            hir_map: hir_map,
            map: *map,
            scope: &wrap_scope,
            def_map: self.def_map,
            trait_ref_hack: self.trait_ref_hack,

@@ -534,20 +558,27 @@ impl<'a> LifetimeContext<'a> {
    /// bound lifetimes are resolved by name and associated with a binder id (`binder_id`), so the
    /// ordering is not important there.
    fn visit_early_late<F>(&mut self,
                           fn_id: ast::NodeId,
                           early_space: subst::ParamSpace,
                           decl: &hir::FnDecl,
                           generics: &hir::Generics,
                           walk: F) where
        F: FnOnce(&mut LifetimeContext),
    {
        let referenced_idents = early_bound_lifetime_names(generics);
        let fn_def_id = self.hir_map.local_def_id(fn_id);
        insert_late_bound_lifetimes(self.map,
                                    fn_def_id,
                                    decl,
                                    generics);

        debug!("visit_early_late: referenced_idents={:?}",
               referenced_idents);
        let (late, early): (Vec<_>, _) =
            generics.lifetimes
                    .iter()
                    .cloned()
                    .partition(|l| self.map.late_bound.contains_key(&l.lifetime.id));

        let (early, late): (Vec<_>, _) = generics.lifetimes.iter().cloned().partition(
            |l| referenced_idents.iter().any(|&i| i == l.lifetime.name));

        self.with(EarlyScope(early_space, &early, self.scope), move |old_scope, this| {
        let this = self;
        this.with(EarlyScope(early_space, &early, this.scope), move |old_scope, this| {
            this.with(LateScope(&late, this.scope), move |_, this| {
                this.check_lifetime_defs(old_scope, &generics.lifetimes);
                walk(this);

@@ -756,11 +787,12 @@ impl<'a> LifetimeContext<'a> {
                       probably a bug in syntax::fold");
        }

        debug!("lifetime_ref={:?} id={:?} resolved to {:?}",
               lifetime_to_string(lifetime_ref),
               lifetime_ref.id,
               def);
        self.named_region_map.insert(lifetime_ref.id, def);
        debug!("lifetime_ref={:?} id={:?} resolved to {:?} span={:?}",
               lifetime_to_string(lifetime_ref),
               lifetime_ref.id,
               def,
               self.sess.codemap().span_to_string(lifetime_ref.span));
        self.map.defs.insert(lifetime_ref.id, def);
    }
}

@@ -777,95 +809,132 @@ fn search_lifetimes<'a>(lifetimes: &'a [hir::LifetimeDef],

///////////////////////////////////////////////////////////////////////////

pub fn early_bound_lifetimes<'a>(generics: &'a hir::Generics) -> Vec<hir::LifetimeDef> {
    let referenced_idents = early_bound_lifetime_names(generics);
    if referenced_idents.is_empty() {
        return Vec::new();
/// Detects late-bound lifetimes and inserts them into
/// `map.late_bound`.
///
/// A region declared on a fn is **late-bound** if:
/// - it is constrained by an argument type;
/// - it does not appear in a where-clause.
///
/// "Constrained" basically means that it appears in any type but
/// not amongst the inputs to a projection. In other words, `<&'a
/// T as Trait<'b>>::Foo` does not constrain `'a` or `'b`.
fn insert_late_bound_lifetimes(map: &mut NamedRegionMap,
                               fn_def_id: DefId,
                               decl: &hir::FnDecl,
                               generics: &hir::Generics) {
    debug!("insert_late_bound_lifetimes(decl={:?}, generics={:?})", decl, generics);

    let mut constrained_by_input = ConstrainedCollector { regions: FnvHashSet() };
    for arg in &decl.inputs {
        constrained_by_input.visit_ty(&arg.ty);
    }

    generics.lifetimes.iter()
        .filter(|l| referenced_idents.iter().any(|&i| i == l.lifetime.name))
        .cloned()
        .collect()
}
    let mut appears_in_output = AllCollector { regions: FnvHashSet() };
    intravisit::walk_fn_ret_ty(&mut appears_in_output, &decl.output);

/// Given a set of generic declarations, returns a list of names containing all early bound
/// lifetime names for those generics. (In fact, this list may also contain other names.)
fn early_bound_lifetime_names(generics: &hir::Generics) -> Vec<ast::Name> {
    // Create two lists, dividing the lifetimes into early/late bound.
    // Initially, all of them are considered late, but we will move
    // things from late into early as we go if we find references to
    // them.
    let mut early_bound = Vec::new();
    let mut late_bound = generics.lifetimes.iter()
                                           .map(|l| l.lifetime.name)
                                           .collect();
    debug!("insert_late_bound_lifetimes: constrained_by_input={:?}",
           constrained_by_input.regions);

    // Any lifetime that appears in a type bound is early.
    {
        let mut collector =
            FreeLifetimeCollector { early_bound: &mut early_bound,
                                    late_bound: &mut late_bound };
        for ty_param in generics.ty_params.iter() {
            walk_list!(&mut collector, visit_ty_param_bound, &ty_param.bounds);
        }
        for predicate in &generics.where_clause.predicates {
            match predicate {
                &hir::WherePredicate::BoundPredicate(hir::WhereBoundPredicate{ref bounds,
                                                                              ref bounded_ty,
                                                                              ..}) => {
                    collector.visit_ty(&bounded_ty);
                    walk_list!(&mut collector, visit_ty_param_bound, bounds);
                }
                &hir::WherePredicate::RegionPredicate(hir::WhereRegionPredicate{ref lifetime,
                                                                                ref bounds,
                                                                                ..}) => {
                    collector.visit_lifetime(lifetime);

                    for bound in bounds {
                        collector.visit_lifetime(bound);
                    }
                }
                &hir::WherePredicate::EqPredicate(_) => bug!("unimplemented")
            }
        }
    // Walk the lifetimes that appear in where clauses.
    //
    // Subtle point: because we disallow nested bindings, we can just
    // ignore binders here and scrape up all names we see.
    let mut appears_in_where_clause = AllCollector { regions: FnvHashSet() };
    for ty_param in generics.ty_params.iter() {
        walk_list!(&mut appears_in_where_clause,
                   visit_ty_param_bound,
                   &ty_param.bounds);
    }

    // Any lifetime that either has a bound or is referenced by a
    // bound is early.
    walk_list!(&mut appears_in_where_clause,
               visit_where_predicate,
               &generics.where_clause.predicates);
    for lifetime_def in &generics.lifetimes {
        if !lifetime_def.bounds.is_empty() {
            shuffle(&mut early_bound, &mut late_bound,
                    lifetime_def.lifetime.name);
            for bound in &lifetime_def.bounds {
                shuffle(&mut early_bound, &mut late_bound,
                        bound.name);
            }
        }
    }
    return early_bound;

    struct FreeLifetimeCollector<'a> {
        early_bound: &'a mut Vec<ast::Name>,
        late_bound: &'a mut Vec<ast::Name>,
    }

    impl<'a, 'v> Visitor<'v> for FreeLifetimeCollector<'a> {
        fn visit_lifetime(&mut self, lifetime_ref: &hir::Lifetime) {
            shuffle(self.early_bound, self.late_bound,
                    lifetime_ref.name);
        // `'a: 'b` means both `'a` and `'b` are referenced
        appears_in_where_clause.visit_lifetime_def(lifetime_def);
        }
    }

fn shuffle(early_bound: &mut Vec<ast::Name>,
           late_bound: &mut Vec<ast::Name>,
           name: ast::Name) {
    match late_bound.iter().position(|n| *n == name) {
        Some(index) => {
            late_bound.swap_remove(index);
            early_bound.push(name);
    debug!("insert_late_bound_lifetimes: appears_in_where_clause={:?}",
           appears_in_where_clause.regions);

    // Late bound regions are those that:
    // - appear in the inputs
    // - do not appear in the where-clauses
    for lifetime in &generics.lifetimes {
        let name = lifetime.lifetime.name;

        // appears in the where clauses? early-bound.
        if appears_in_where_clause.regions.contains(&name) { continue; }

        // does not appear in the inputs, but appears in the return
        // type? eventually this will be early-bound, but for now we
        // just mark it so we can issue warnings.
        let constrained_by_input = constrained_by_input.regions.contains(&name);
        let appears_in_output = appears_in_output.regions.contains(&name);
        let will_change = !constrained_by_input && appears_in_output;
        let issue_32330 = if will_change {
            ty::Issue32330::WillChange {
                fn_def_id: fn_def_id,
                region_name: name,
            }
        None => { }
        } else {
            ty::Issue32330::WontChange
        };

        debug!("insert_late_bound_lifetimes: \
                lifetime {:?} with id {:?} is late-bound ({:?})",
               lifetime.lifetime.name, lifetime.lifetime.id, issue_32330);

        let prev = map.late_bound.insert(lifetime.lifetime.id, issue_32330);
        assert!(prev.is_none(), "visited lifetime {:?} twice", lifetime.lifetime.id);
    }

    return;

    struct ConstrainedCollector {
        regions: FnvHashSet<ast::Name>,
    }

    impl<'v> Visitor<'v> for ConstrainedCollector {
        fn visit_ty(&mut self, ty: &'v hir::Ty) {
            match ty.node {
                hir::TyPath(Some(_), _) => {
                    // ignore lifetimes appearing in associated type
                    // projections, as they are not *constrained*
                    // (defined above)
                }

                hir::TyPath(None, ref path) => {
                    // consider only the lifetimes on the final
                    // segment; I am not sure it's even currently
                    // valid to have them elsewhere, but even if it
                    // is, those would be potentially inputs to
                    // projections
                    if let Some(last_segment) = path.segments.last() {
                        self.visit_path_segment(path.span, last_segment);
                    }
                }

                _ => {
                    intravisit::walk_ty(self, ty);
                }
            }
        }

        fn visit_lifetime(&mut self, lifetime_ref: &'v hir::Lifetime) {
            self.regions.insert(lifetime_ref.name);
        }
    }

    struct AllCollector {
        regions: FnvHashSet<ast::Name>,
    }

    impl<'v> Visitor<'v> for AllCollector {
        fn visit_lifetime(&mut self, lifetime_ref: &'v hir::Lifetime) {
            self.regions.insert(lifetime_ref.name);
        }
    }
}
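Stripped of the HIR visitors, the classification rule documented for `insert_late_bound_lifetimes` reduces to set operations over lifetime names. A hypothetical sketch (the `late_bound_lifetimes` helper and its string-based name sets are inventions for illustration, not the compiler's API):

```rust
use std::collections::HashSet;

/// A declared lifetime is late-bound iff it does not appear in any
/// where-clause and is constrained by an argument type. Everything
/// else stays early-bound.
fn late_bound_lifetimes<'a>(
    declared: &[&'a str],
    constrained_by_input: &HashSet<&str>,
    appears_in_where_clause: &HashSet<&str>,
) -> Vec<&'a str> {
    declared
        .iter()
        .copied()
        // appears in the where clauses? early-bound.
        .filter(|name| !appears_in_where_clause.contains(name))
        // only lifetimes constrained by an argument type are late-bound.
        .filter(|name| constrained_by_input.contains(name))
        .collect()
}
```

The interesting case for #32330 is a lifetime that falls through both filters because it appears only in the return type: it is currently late-bound but will become early-bound, which is what the `Issue32330::WillChange` marker records.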
@@ -171,10 +171,12 @@ impl<'a, 'gcx, 'tcx> FulfillmentContext<'tcx> {
        // debug output much nicer to read and so on.
        let obligation = infcx.resolve_type_vars_if_possible(&obligation);

        debug!("register_predicate_obligation(obligation={:?})", obligation);

        infcx.obligations_in_snapshot.set(true);

        if infcx.tcx.fulfilled_predicates.borrow().check_duplicate(&obligation.predicate)
        {
        if infcx.tcx.fulfilled_predicates.borrow().check_duplicate(&obligation.predicate) {
            debug!("register_predicate_obligation: duplicate");
            return
        }

@@ -406,6 +408,8 @@ fn process_predicate<'a, 'gcx, 'tcx>(
                // also includes references to its upvars as part
                // of its type, and those types are resolved at
                // the same time.
                //
                // FIXME(#32286) logic seems false if no upvars
                pending_obligation.stalled_on =
                    trait_ref_type_vars(selcx, data.to_poly_trait_ref());
@@ -8,7 +8,7 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! Trait Resolution. See the Book for more.
//! Trait Resolution. See README.md for an overview of how this works.

pub use self::SelectionError::*;
pub use self::FulfillmentErrorCode::*;

@@ -30,8 +30,9 @@ pub use self::coherence::orphan_check;
pub use self::coherence::overlapping_impls;
pub use self::coherence::OrphanCheckErr;
pub use self::fulfill::{FulfillmentContext, GlobalFulfilledPredicates, RegionObligation};
pub use self::project::{MismatchedProjectionTypes, ProjectionMode};
pub use self::project::MismatchedProjectionTypes;
pub use self::project::{normalize, normalize_projection_type, Normalized};
pub use self::project::{ProjectionCache, ProjectionCacheSnapshot, ProjectionMode};
pub use self::object_safety::ObjectSafetyViolation;
pub use self::object_safety::MethodViolationCode;
pub use self::select::{EvaluationCache, SelectionContext, SelectionCache};
@@ -24,12 +24,13 @@ use super::VtableImplData;
use super::util;

use hir::def_id::DefId;
use infer::{self, InferOk, TypeOrigin};
use infer::{InferOk, TypeOrigin};
use rustc_data_structures::snapshot_map::{Snapshot, SnapshotMap};
use syntax::parse::token;
use syntax::ast;
use ty::subst::Subst;
use ty::{self, ToPredicate, ToPolyTraitRef, Ty, TyCtxt};
use ty::fold::{TypeFoldable, TypeFolder};
use syntax::parse::token;
use syntax::ast;
use util::common::FN_OUTPUT_NAME;

use std::rc::Rc;

@@ -182,7 +183,8 @@ pub fn poly_project_and_unify_type<'cx, 'gcx, 'tcx>(
    let skol_obligation = obligation.with(skol_predicate);
    match project_and_unify_type(selcx, &skol_obligation) {
        Ok(result) => {
            match infcx.leak_check(false, &skol_map, snapshot) {
            let span = obligation.cause.span;
            match infcx.leak_check(false, span, &skol_map, snapshot) {
                Ok(()) => Ok(infcx.plug_leaks(skol_map, snapshot, &result)),
                Err(e) => Err(MismatchedProjectionTypes { err: e }),
            }

@@ -256,9 +258,13 @@ pub fn normalize_with_depth<'a, 'b, 'gcx, 'tcx, T>(
    where T : TypeFoldable<'tcx>
{
    debug!("normalize_with_depth(depth={}, value={:?})", depth, value);
    let mut normalizer = AssociatedTypeNormalizer::new(selcx, cause, depth);
    let result = normalizer.fold(value);

    debug!("normalize_with_depth: depth={} result={:?} with {} obligations",
           depth, result, normalizer.obligations.len());
    debug!("normalize_with_depth: depth={} obligations={:?}",
           depth, normalizer.obligations);
    Normalized {
        value: result,
        obligations: normalizer.obligations,

@@ -330,13 +336,16 @@ impl<'a, 'b, 'gcx, 'tcx> TypeFolder<'gcx, 'tcx> for AssociatedTypeNormalizer<'a,
                // binder). It would be better to normalize in a
                // binding-aware fashion.

                let Normalized { value: ty, obligations } =
                let Normalized { value: normalized_ty, obligations } =
                    normalize_projection_type(self.selcx,
                                              data.clone(),
                                              self.cause.clone(),
                                              self.depth);
                debug!("AssociatedTypeNormalizer: depth={} normalized {:?} to {:?} \
                        with {} add'l obligations",
                       self.depth, ty, normalized_ty, obligations.len());
                self.obligations.extend(obligations);
                ty
                normalized_ty
            }

            _ => {

@@ -404,64 +413,161 @@ fn opt_normalize_projection_type<'a, 'b, 'gcx, 'tcx>(
    depth: usize)
    -> Option<NormalizedTy<'tcx>>
{
    debug!("normalize_projection_type(\
    let infcx = selcx.infcx();

    let projection_ty = infcx.resolve_type_vars_if_possible(&projection_ty);

    debug!("opt_normalize_projection_type(\
            projection_ty={:?}, \
            depth={})",
           projection_ty,
           depth);

    // FIXME(#20304) For now, I am caching here, which is good, but it
    // means we don't capture the type variables that are created in
    // the case of ambiguity. Which means we may create a large stream
    // of such variables. OTOH, if we move the caching up a level, we
    // would not benefit from caching when proving `T: Trait<U=Foo>`
    // bounds. It might be the case that we want two distinct caches,
    // or else another kind of cache entry.

    match infcx.projection_cache.borrow_mut().try_start(projection_ty) {
        Ok(()) => { }
        Err(ProjectionCacheEntry::Ambiguous) => {
            // If we found ambiguity the last time, that generally
            // means we will continue to do so until some type in the
            // key changes (and we know it hasn't, because we just
            // fully resolved it). One exception though is closure
            // types, which can transition from having a fixed kind to
            // no kind with no visible change in the key.
            //
            // FIXME(#32286) refactor this so that closure type
            // changes
            debug!("opt_normalize_projection_type: \
                    found cache entry: ambiguous");
            if !projection_ty.has_closure_types() {
                return None;
            }
        }
        Err(ProjectionCacheEntry::InProgress) => {
            // If, while normalizing A::B, we are asked to normalize
            // A::B, just return A::B itself. This is a conservative
            // answer, in the sense that A::B *is* clearly equivalent
            // to A::B, though there may be a better value we can
            // find.

            // Under lazy normalization, this can arise when
            // bootstrapping. That is, imagine an environment with a
            // where-clause like `A::B == u32`. Now, if we are asked
            // to normalize `A::B`, we will want to check the
            // where-clauses in scope. So we will try to unify `A::B`
            // with `A::B`, which can trigger a recursive
            // normalization. In that case, I think we will want this code:
            //
            // ```
            // let ty = selcx.tcx().mk_projection(projection_ty.trait_ref,
            //                                    projection_ty.item_name);
            // return Some(NormalizedTy { value: v, obligations: vec![] });
            // ```

            debug!("opt_normalize_projection_type: \
                    found cache entry: in-progress");

            // But for now, let's classify this as an overflow:
            let recursion_limit = selcx.tcx().sess.recursion_limit.get();
            let obligation = Obligation::with_depth(cause.clone(),
                                                    recursion_limit,
                                                    projection_ty);
            selcx.infcx().report_overflow_error(&obligation, false);
        }
        Err(ProjectionCacheEntry::NormalizedTy(ty)) => {
            // If we find the value in the cache, then the obligations
            // have already been returned from the previous entry (and
            // should therefore have been honored).
            debug!("opt_normalize_projection_type: \
                    found normalized ty `{:?}`",
                   ty);
            return Some(NormalizedTy { value: ty, obligations: vec![] });
        }
        Err(ProjectionCacheEntry::Error) => {
            debug!("opt_normalize_projection_type: \
                    found error");
            return Some(normalize_to_error(selcx, projection_ty, cause, depth));
        }
    }
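The `try_start`/`complete` protocol above is essentially a memo table whose entries form a small state machine: a key is claimed as in-progress, then later resolved to a normalized type, an ambiguity, or an error. A toy model of that protocol, with string keys standing in for `ProjectionTy` and names that are illustrative rather than the compiler's API:

```rust
use std::collections::HashMap;

#[derive(Clone, Debug, PartialEq)]
pub enum CacheEntry {
    InProgress,           // a normalization of this key is underway
    Ambiguous,            // last attempt could not pick a unique answer
    Error,                // last attempt failed trait selection
    NormalizedTy(String), // stand-in for the cached normalized type
}

#[derive(Default)]
pub struct ProjectionCache {
    map: HashMap<String, CacheEntry>,
}

impl ProjectionCache {
    /// Ok(()) means "you are the first; go compute". Err returns the
    /// previously recorded state for this key.
    pub fn try_start(&mut self, key: &str) -> Result<(), CacheEntry> {
        if let Some(entry) = self.map.get(key) {
            return Err(entry.clone());
        }
        self.map.insert(key.to_string(), CacheEntry::InProgress);
        Ok(())
    }
    /// Record a successful normalization; a non-cacheable result just
    /// releases the in-progress claim so a later attempt can retry.
    pub fn complete(&mut self, key: &str, ty: &str, cacheable: bool) {
        if cacheable {
            self.map.insert(key.to_string(), CacheEntry::NormalizedTy(ty.to_string()));
        } else {
            self.map.remove(key);
        }
    }
    pub fn ambiguous(&mut self, key: &str) {
        self.map.insert(key.to_string(), CacheEntry::Ambiguous);
    }
    pub fn error(&mut self, key: &str) {
        self.map.insert(key.to_string(), CacheEntry::Error);
    }
}
```

The `InProgress` state is what makes cycle detection possible: a second `try_start` for a key already being normalized signals recursion, which the code above currently reports as overflow.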
|
||||
|
||||
let obligation = Obligation::with_depth(cause.clone(), depth, projection_ty.clone());
|
||||
match project_type(selcx, &obligation) {
|
||||
Ok(ProjectedTy::Progress(projected_ty, mut obligations)) => {
|
||||
Ok(ProjectedTy::Progress(Progress { ty: projected_ty,
|
||||
mut obligations,
|
||||
cacheable })) => {
|
||||
// if projection succeeded, then what we get out of this
|
||||
// is also non-normalized (consider: it was derived from
|
||||
// an impl, where-clause etc) and hence we must
|
||||
// re-normalize it
|
||||
|
||||
debug!("normalize_projection_type: projected_ty={:?} depth={} obligations={:?}",
|
||||
debug!("opt_normalize_projection_type: \
|
||||
projected_ty={:?} \
|
||||
depth={} \
|
||||
obligations={:?} \
|
||||
cacheable={:?}",
|
||||
projected_ty,
|
||||
depth,
|
||||
obligations);
|
||||
obligations,
|
||||
cacheable);
|
||||
|
||||
if projected_ty.has_projection_types() {
|
||||
let result = if projected_ty.has_projection_types() {
|
||||
let mut normalizer = AssociatedTypeNormalizer::new(selcx, cause, depth+1);
|
||||
let normalized_ty = normalizer.fold(&projected_ty);
|
||||
|
||||
debug!("normalize_projection_type: normalized_ty={:?} depth={}",
|
||||
debug!("opt_normalize_projection_type: \
|
||||
normalized_ty={:?} depth={}",
|
||||
normalized_ty,
|
||||
depth);
|
||||
|
||||
obligations.extend(normalizer.obligations);
|
||||
Some(Normalized {
|
||||
Normalized {
|
||||
value: normalized_ty,
|
||||
obligations: obligations,
|
||||
})
|
||||
}
|
||||
} else {
|
||||
Some(Normalized {
|
||||
Normalized {
|
||||
value: projected_ty,
|
||||
obligations: obligations,
|
||||
})
|
||||
}
|
||||
}
|
||||
};
|
||||
infcx.projection_cache.borrow_mut()
|
||||
.complete(projection_ty, &result, cacheable);
|
||||
Some(result)
|
||||
}
|
||||
Ok(ProjectedTy::NoProgress(projected_ty)) => {
|
||||
debug!("normalize_projection_type: projected_ty={:?} no progress",
|
||||
debug!("opt_normalize_projection_type: \
|
||||
projected_ty={:?} no progress",
|
||||
projected_ty);
|
||||
Some(Normalized {
|
||||
let result = Normalized {
|
||||
value: projected_ty,
|
||||
obligations: vec!()
|
||||
})
|
||||
};
|
||||
infcx.projection_cache.borrow_mut()
|
||||
.complete(projection_ty, &result, true);
|
||||
Some(result)
|
||||
}
|
||||
Err(ProjectionTyError::TooManyCandidates) => {
|
||||
debug!("normalize_projection_type: too many candidates");
|
||||
debug!("opt_normalize_projection_type: \
|
||||
too many candidates");
|
||||
infcx.projection_cache.borrow_mut()
|
||||
.ambiguous(projection_ty);
|
||||
None
|
||||
}
|
||||
Err(ProjectionTyError::TraitSelectionError(_)) => {
|
||||
debug!("normalize_projection_type: ERROR");
|
||||
debug!("opt_normalize_projection_type: ERROR");
|
||||
// if we got an error processing the `T as Trait` part,
|
||||
// just return `ty::err` but add the obligation `T :
|
||||
// Trait`, which when processed will cause the error to be
|
||||
// reported later
|
||||
|
||||
infcx.projection_cache.borrow_mut()
|
||||
.error(projection_ty);
|
||||
Some(normalize_to_error(selcx, projection_ty, cause, depth))
|
||||
}
|
||||
}
@@ -504,11 +610,43 @@ fn normalize_to_error<'a, 'gcx, 'tcx>(selcx: &mut SelectionContext<'a, 'gcx, 'tc
 }

 enum ProjectedTy<'tcx> {
-Progress(Ty<'tcx>, Vec<PredicateObligation<'tcx>>),
+Progress(Progress<'tcx>),
 NoProgress(Ty<'tcx>),
 }

+struct Progress<'tcx> {
+ty: Ty<'tcx>,
+obligations: Vec<PredicateObligation<'tcx>>,
+cacheable: bool,
+}
+
+impl<'tcx> Progress<'tcx> {
+fn error<'a,'gcx>(tcx: TyCtxt<'a,'gcx,'tcx>) -> Self {
+Progress {
+ty: tcx.types.err,
+obligations: vec![],
+cacheable: true
+}
+}
+
+fn with_addl_obligations(mut self,
+mut obligations: Vec<PredicateObligation<'tcx>>)
+-> Self {
+debug!("with_addl_obligations: self.obligations.len={} obligations.len={}",
+self.obligations.len(), obligations.len());
+
+debug!("with_addl_obligations: self.obligations={:?} obligations={:?}",
+self.obligations, obligations);
+
+self.obligations.append(&mut obligations);
+self
+}
+}
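The `with_addl_obligations` builder above just concatenates nested obligations onto the projection result. A minimal standalone sketch of the same pattern, using a `String`-based `Obligation` as a stand-in for rustc's `PredicateObligation` (names and types here are illustrative, not rustc's):

```rust
// Stand-in for rustc's PredicateObligation<'tcx>.
#[derive(Debug, Clone, PartialEq)]
struct Obligation(String);

#[derive(Debug)]
struct Progress {
    ty: String, // stand-in for Ty<'tcx>
    obligations: Vec<Obligation>,
    cacheable: bool,
}

impl Progress {
    // Consume self, append the extra obligations, and return the
    // updated value, so calls can be chained builder-style.
    fn with_addl_obligations(mut self, mut obligations: Vec<Obligation>) -> Self {
        self.obligations.append(&mut obligations);
        self
    }
}
```

This is why `confirm_closure_candidate` below can chain `.with_addl_obligations(obligations).with_addl_obligations(vtable.nested)` instead of threading a mutable `Vec` through each confirmation function.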

 /// Compute the result of a projection type (if we can).
 ///
 /// IMPORTANT:
 /// - `obligation` must be fully normalized
 fn project_type<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>)

@@ -523,13 +661,12 @@ fn project_type<'cx, 'gcx, 'tcx>(
 selcx.infcx().report_overflow_error(&obligation, true);
 }

-let obligation_trait_ref =
-selcx.infcx().resolve_type_vars_if_possible(&obligation.predicate.trait_ref);
+let obligation_trait_ref = &obligation.predicate.trait_ref;

 debug!("project: obligation_trait_ref={:?}", obligation_trait_ref);

 if obligation_trait_ref.references_error() {
-return Ok(ProjectedTy::Progress(selcx.tcx().types.err, vec!()));
+return Ok(ProjectedTy::Progress(Progress::error(selcx.tcx())));
 }

 let mut candidates = ProjectionTyCandidateSet {
@@ -607,16 +744,17 @@ fn project_type<'cx, 'gcx, 'tcx>(

 match candidates.vec.pop() {
 Some(candidate) => {
-let (ty, obligations) = confirm_candidate(selcx,
-obligation,
-&obligation_trait_ref,
-candidate);
-Ok(ProjectedTy::Progress(ty, obligations))
+Ok(ProjectedTy::Progress(
+confirm_candidate(selcx,
+obligation,
+&obligation_trait_ref,
+candidate)))
 }
 None => {
-Ok(ProjectedTy::NoProgress(selcx.tcx().mk_projection(
-obligation.predicate.trait_ref.clone(),
-obligation.predicate.item_name)))
+Ok(ProjectedTy::NoProgress(
+selcx.tcx().mk_projection(
+obligation.predicate.trait_ref.clone(),
+obligation.predicate.item_name)))
 }
 }
 }
@@ -919,7 +1057,7 @@ fn confirm_candidate<'cx, 'gcx, 'tcx>(
 obligation: &ProjectionTyObligation<'tcx>,
 obligation_trait_ref: &ty::TraitRef<'tcx>,
 candidate: ProjectionTyCandidate<'tcx>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 debug!("confirm_candidate(candidate={:?}, obligation={:?})",
 candidate,

@@ -941,7 +1079,7 @@ fn confirm_select_candidate<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>,
 obligation_trait_ref: &ty::TraitRef<'tcx>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 let poly_trait_ref = obligation_trait_ref.to_poly_trait_ref();
 let trait_obligation = obligation.with(poly_trait_ref.to_poly_trait_predicate());
@@ -979,7 +1117,7 @@ fn confirm_object_candidate<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>,
 obligation_trait_ref: &ty::TraitRef<'tcx>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 let self_ty = obligation_trait_ref.self_ty();
 let object_ty = selcx.infcx().shallow_resolve(self_ty);

@@ -991,7 +1129,7 @@ fn confirm_object_candidate<'cx, 'gcx, 'tcx>(
 span_bug!(
 obligation.cause.span,
 "confirm_object_candidate called with non-object: {:?}",
-object_ty);
+object_ty)
 }
 };
 let projection_bounds = data.projection_bounds_with_self_ty(selcx.tcx(), object_ty);

@@ -1035,7 +1173,7 @@ fn confirm_object_candidate<'cx, 'gcx, 'tcx>(
 debug!("confirm_object_candidate: no env-predicate \
 found in object type `{:?}`; ill-formed",
 object_ty);
-return (selcx.tcx().types.err, vec!());
+return Progress::error(selcx.tcx());
 }
 }
 };
@@ -1047,7 +1185,7 @@ fn confirm_fn_pointer_candidate<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>,
 fn_pointer_vtable: VtableFnPointerData<'tcx, PredicateObligation<'tcx>>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 // FIXME(#32730) drop this assertion once obligations are propagated from inference (fn pointer
 // vtable nested obligations ONLY come from unification in inference)
@@ -1061,23 +1199,29 @@ fn confirm_closure_candidate<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>,
 vtable: VtableClosureData<'tcx, PredicateObligation<'tcx>>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 let closure_typer = selcx.closure_typer();
 let closure_type = closure_typer.closure_type(vtable.closure_def_id, vtable.substs);
 let Normalized {
 value: closure_type,
-mut obligations
+obligations
 } = normalize_with_depth(selcx,
 obligation.cause.clone(),
 obligation.recursion_depth+1,
 &closure_type);
-let (ty, mut cc_obligations) = confirm_callable_candidate(selcx,
-obligation,
-&closure_type.sig,
-util::TupleArgumentsFlag::No);
-obligations.append(&mut cc_obligations);
-(ty, obligations)
+
+debug!("confirm_closure_candidate: obligation={:?},closure_type={:?},obligations={:?}",
+obligation,
+closure_type,
+obligations);
+
+confirm_callable_candidate(selcx,
+obligation,
+&closure_type.sig,
+util::TupleArgumentsFlag::No)
+.with_addl_obligations(obligations)
+.with_addl_obligations(vtable.nested)
 }

 fn confirm_callable_candidate<'cx, 'gcx, 'tcx>(
@@ -1085,7 +1229,7 @@ fn confirm_callable_candidate<'cx, 'gcx, 'tcx>(
 obligation: &ProjectionTyObligation<'tcx>,
 fn_sig: &ty::PolyFnSig<'tcx>,
 flag: util::TupleArgumentsFlag)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 let tcx = selcx.tcx();

@@ -1118,47 +1262,38 @@ fn confirm_param_env_candidate<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>,
 poly_projection: ty::PolyProjectionPredicate<'tcx>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 let infcx = selcx.infcx();

-let projection =
-infcx.replace_late_bound_regions_with_fresh_var(
-obligation.cause.span,
-infer::LateBoundRegionConversionTime::HigherRankedType,
-&poly_projection).0;
-
-assert_eq!(projection.projection_ty.item_name,
-obligation.predicate.item_name);
-
 let origin = TypeOrigin::RelateOutputImplTypes(obligation.cause.span);
-let obligations = match infcx.eq_trait_refs(false,
-origin,
-obligation.predicate.trait_ref.clone(),
-projection.projection_ty.trait_ref.clone()) {
-Ok(InferOk { obligations, .. }) => {
-// FIXME(#32730) once obligations are generated in inference, remove this assertion
+let trait_ref = obligation.predicate.trait_ref;
+match infcx.match_poly_projection_predicate(origin, poly_projection, trait_ref) {
+Ok(InferOk { value: ty_match, obligations }) => {
+// FIXME(#32730) once obligations are generated in inference, drop this assertion
 assert!(obligations.is_empty());
-obligations
+Progress {
+ty: ty_match.value,
+obligations: obligations,
+cacheable: ty_match.unconstrained_regions.is_empty(),
+}
 }
 Err(e) => {
 span_bug!(
 obligation.cause.span,
-"Failed to unify `{:?}` and `{:?}` in projection: {}",
+"Failed to unify obligation `{:?}` \
+with poly_projection `{:?}`: {:?}",
 obligation,
-projection,
+poly_projection,
 e);
 }
-};
-
-(projection.ty, obligations)
+}
 }
 }

 fn confirm_impl_candidate<'cx, 'gcx, 'tcx>(
 selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
 obligation: &ProjectionTyObligation<'tcx>,
 impl_vtable: VtableImplData<'tcx, PredicateObligation<'tcx>>)
--> (Ty<'tcx>, Vec<PredicateObligation<'tcx>>)
+-> Progress<'tcx>
 {
 let VtableImplData { substs, nested, impl_def_id } = impl_vtable;

@@ -1179,7 +1314,11 @@ fn confirm_impl_candidate<'cx, 'gcx, 'tcx>(
 tcx.types.err
 });
 let substs = translate_substs(selcx.infcx(), impl_def_id, substs, node_item.node);
-(ty.subst(tcx, substs), nested)
+Progress {
+ty: ty.subst(tcx, substs),
+obligations: nested,
+cacheable: true
+}
 }
 None => {
 span_bug!(obligation.cause.span,
@@ -1222,3 +1361,91 @@ fn assoc_ty_def<'cx, 'gcx, 'tcx>(
 .next()
 }
 }
+
+// # Cache
+
+pub struct ProjectionCache<'tcx> {
+map: SnapshotMap<ty::ProjectionTy<'tcx>, ProjectionCacheEntry<'tcx>>,
+}
+
+#[derive(Clone, Debug)]
+enum ProjectionCacheEntry<'tcx> {
+InProgress,
+Ambiguous,
+Error,
+NormalizedTy(Ty<'tcx>),
+}
+
+// NB: intentionally not Clone
+pub struct ProjectionCacheSnapshot {
+snapshot: Snapshot
+}
+
+impl<'tcx> ProjectionCache<'tcx> {
+pub fn new() -> Self {
+ProjectionCache {
+map: SnapshotMap::new()
+}
+}
+
+pub fn snapshot(&mut self) -> ProjectionCacheSnapshot {
+ProjectionCacheSnapshot { snapshot: self.map.snapshot() }
+}
+
+pub fn rollback_to(&mut self, snapshot: ProjectionCacheSnapshot) {
+self.map.rollback_to(snapshot.snapshot);
+}
+
+pub fn commit(&mut self, snapshot: ProjectionCacheSnapshot) {
+self.map.commit(snapshot.snapshot);
+}
+
+/// Try to start normalizing `key`; returns an error if
+/// normalization already occurred (this error corresponds to a
+/// cache hit, so it's actually a good thing).
+fn try_start(&mut self, key: ty::ProjectionTy<'tcx>)
+-> Result<(), ProjectionCacheEntry<'tcx>> {
+match self.map.get(&key) {
+Some(entry) => return Err(entry.clone()),
+None => { }
+}
+
+self.map.insert(key, ProjectionCacheEntry::InProgress);
+Ok(())
+}
+
+/// Indicates that `key` was normalized to `value`. If `cacheable` is false,
+/// then this result is sadly not cacheable.
+fn complete(&mut self,
+key: ty::ProjectionTy<'tcx>,
+value: &NormalizedTy<'tcx>,
+cacheable: bool) {
+let fresh_key = if cacheable {
+debug!("ProjectionCacheEntry::complete: adding cache entry: key={:?}, value={:?}",
+key, value);
+self.map.insert(key, ProjectionCacheEntry::NormalizedTy(value.value))
+} else {
+debug!("ProjectionCacheEntry::complete: cannot cache: key={:?}, value={:?}",
+key, value);
+!self.map.remove(key)
+};
+
+assert!(!fresh_key, "never started projecting `{:?}`", key);
+}
+
+/// Indicates that trying to normalize `key` resulted in
+/// ambiguity. No point in trying it again then until we gain more
+/// type information (in which case, the "fully resolved" key will
+/// be different).
+fn ambiguous(&mut self, key: ty::ProjectionTy<'tcx>) {
+let fresh = self.map.insert(key, ProjectionCacheEntry::Ambiguous);
+assert!(!fresh, "never started projecting `{:?}`", key);
+}
+
+/// Indicates that trying to normalize `key` resulted in
+/// error.
+fn error(&mut self, key: ty::ProjectionTy<'tcx>) {
+let fresh = self.map.insert(key, ProjectionCacheEntry::Error);
+assert!(!fresh, "never started projecting `{:?}`", key);
+}
+}
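The cache protocol above is: mark a key `InProgress` via `try_start` before normalizing, then `complete` it with either a cached result or (if not cacheable) removal of the in-progress marker; a hit is reported as an `Err`. A simplified, self-contained sketch of that protocol, using `String` keys and a plain `HashMap` in place of `ProjectionTy` and `SnapshotMap` (so no snapshot/rollback support here):

```rust
use std::collections::HashMap;

// Simplified mirror of ProjectionCacheEntry; Ty<'tcx> becomes String.
#[derive(Clone, Debug, PartialEq)]
enum Entry {
    InProgress,
    Ambiguous,
    Error,
    NormalizedTy(String),
}

#[derive(Default)]
struct Cache {
    map: HashMap<String, Entry>,
}

impl Cache {
    // Err(entry) is the "good" case: either a finished cache hit or a
    // cycle (InProgress) that the caller must handle.
    fn try_start(&mut self, key: &str) -> Result<(), Entry> {
        if let Some(entry) = self.map.get(key) {
            return Err(entry.clone());
        }
        self.map.insert(key.to_string(), Entry::InProgress);
        Ok(())
    }

    fn complete(&mut self, key: &str, value: &str, cacheable: bool) {
        // `fresh` would mean we never called try_start for this key.
        let fresh = if cacheable {
            self.map
                .insert(key.to_string(), Entry::NormalizedTy(value.to_string()))
                .is_none()
        } else {
            // uncacheable result: just clear the InProgress marker
            self.map.remove(key).is_none()
        };
        assert!(!fresh, "never started projecting `{}`", key);
    }
}
```

Note how a non-cacheable completion leaves no entry behind, so the next `try_start` on that key succeeds and normalization runs again, matching the `!self.map.remove(key)` branch in the real `complete`.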

@@ -46,6 +46,7 @@ use rustc_data_structures::snapshot_vec::{SnapshotVecDelegate, SnapshotVec};
 use std::cell::RefCell;
 use std::fmt;
 use std::marker::PhantomData;
+use std::mem;
 use std::rc::Rc;
 use syntax::abi::Abi;
 use hir;

@@ -1237,6 +1238,9 @@ impl<'cx, 'gcx, 'tcx> SelectionContext<'cx, 'gcx, 'tcx> {
 skol_trait_predicate.trait_ref.clone(),
 &skol_map,
 snapshot);
+
+self.infcx.pop_skolemized(skol_map, snapshot);
+
 assert!(result);
 true
 }
@@ -1263,7 +1267,7 @@ impl<'cx, 'gcx, 'tcx> SelectionContext<'cx, 'gcx, 'tcx> {
 Err(_) => { return false; }
 }

-self.infcx.leak_check(false, skol_map, snapshot).is_ok()
+self.infcx.leak_check(false, obligation.cause.span, skol_map, snapshot).is_ok()
 }

 /// Given an obligation like `<SomeTrait for T>`, search the obligations that the caller

@@ -1422,9 +1426,16 @@ impl<'cx, 'gcx, 'tcx> SelectionContext<'cx, 'gcx, 'tcx> {
 self.tcx(),
 obligation.predicate.0.trait_ref.self_ty(),
 |impl_def_id| {
-self.probe(|this, snapshot| {
-if let Ok(_) = this.match_impl(impl_def_id, obligation, snapshot) {
-candidates.vec.push(ImplCandidate(impl_def_id));
+self.probe(|this, snapshot| { /* [1] */
+match this.match_impl(impl_def_id, obligation, snapshot) {
+Ok(skol_map) => {
+candidates.vec.push(ImplCandidate(impl_def_id));
+
+// NB: we can safely drop the skol map
+// since we are in a probe [1]
+mem::drop(skol_map);
+}
+Err(_) => { }
 }
 });
 }
@@ -1509,9 +1520,11 @@ impl<'cx, 'gcx, 'tcx> SelectionContext<'cx, 'gcx, 'tcx> {
 return;
 }

-self.probe(|this, snapshot| {
-let (self_ty, _) =
-this.infcx().skolemize_late_bound_regions(&obligation.self_ty(), snapshot);
+self.probe(|this, _snapshot| {
+// the code below doesn't care about regions, and the
+// self-ty here doesn't escape this probe, so just erase
+// any LBR.
+let self_ty = this.tcx().erase_late_bound_regions(&obligation.self_ty());
 let poly_trait_ref = match self_ty.sty {
 ty::TyTrait(ref data) => {
 match this.tcx().lang_items.to_builtin_kind(obligation.predicate.def_id()) {
@@ -2710,7 +2723,10 @@ impl<'cx, 'gcx, 'tcx> SelectionContext<'cx, 'gcx, 'tcx> {
 })?;
 self.inferred_obligations.extend(obligations);

-if let Err(e) = self.infcx.leak_check(false, &skol_map, snapshot) {
+if let Err(e) = self.infcx.leak_check(false,
+obligation.cause.span,
+&skol_map,
+snapshot) {
 debug!("match_impl: failed leak check due to `{}`", e);
 return Err(());
 }
@@ -176,9 +176,13 @@ impl FlagComputation {

 fn add_region(&mut self, r: ty::Region) {
 match r {
-ty::ReVar(..) |
+ty::ReVar(..) => {
+self.add_flags(TypeFlags::HAS_RE_INFER);
+self.add_flags(TypeFlags::KEEP_IN_LOCAL_TCX);
+}
 ty::ReSkolemized(..) => {
 self.add_flags(TypeFlags::HAS_RE_INFER);
+self.add_flags(TypeFlags::HAS_RE_SKOL);
 self.add_flags(TypeFlags::KEEP_IN_LOCAL_TCX);
 }
 ty::ReLateBound(debruijn, _) => { self.add_depth(debruijn.depth); }
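The hunk above splits the `ReVar | ReSkolemized` arm so that skolemized regions can additionally set the new `HAS_RE_SKOL` flag. A hypothetical sketch of that accumulation, with illustrative bit values (not rustc's actual `TypeFlags` assignments, which the `bitflags!` hunk below renumbers):

```rust
// Illustrative flag bits; the real values live in ty::TypeFlags.
const HAS_RE_INFER: u32 = 1 << 3;
const HAS_RE_SKOL: u32 = 1 << 4;
const KEEP_IN_LOCAL_TCX: u32 = 1 << 11;

// Simplified stand-in for ty::Region.
enum Region {
    Var,        // region inference variable
    Skolemized, // skolemized region from a leak check
    LateBound,  // bound region; contributes no flags here
}

fn region_flags(r: &Region) -> u32 {
    match r {
        // inference variables must stay in the local inference context
        Region::Var => HAS_RE_INFER | KEEP_IN_LOCAL_TCX,
        // skolemized regions are also local, and tracked separately
        Region::Skolemized => HAS_RE_INFER | HAS_RE_SKOL | KEEP_IN_LOCAL_TCX,
        Region::LateBound => 0,
    }
}
```

Because flags are OR-ed together like this, both region kinds still report `HAS_RE_INFER`, but only skolemized regions can now be detected via `HAS_RE_SKOL`.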

@@ -60,6 +60,7 @@ pub use self::sty::{ClosureTy, InferTy, ParamTy, ProjectionTy, TraitTy};
 pub use self::sty::{ClosureSubsts, TypeAndMut};
 pub use self::sty::{TraitRef, TypeVariants, PolyTraitRef};
 pub use self::sty::{BoundRegion, EarlyBoundRegion, FreeRegion, Region};
+pub use self::sty::Issue32330;
 pub use self::sty::{TyVid, IntVid, FloatVid, RegionVid, SkolemizedRegionVid};
 pub use self::sty::BoundRegion::*;
 pub use self::sty::FnOutput::*;
@@ -514,19 +515,20 @@ bitflags! {
 const HAS_SELF = 1 << 1,
 const HAS_TY_INFER = 1 << 2,
 const HAS_RE_INFER = 1 << 3,
-const HAS_RE_EARLY_BOUND = 1 << 4,
-const HAS_FREE_REGIONS = 1 << 5,
-const HAS_TY_ERR = 1 << 6,
-const HAS_PROJECTION = 1 << 7,
-const HAS_TY_CLOSURE = 1 << 8,
+const HAS_RE_SKOL = 1 << 4,
+const HAS_RE_EARLY_BOUND = 1 << 5,
+const HAS_FREE_REGIONS = 1 << 6,
+const HAS_TY_ERR = 1 << 7,
+const HAS_PROJECTION = 1 << 8,
+const HAS_TY_CLOSURE = 1 << 9,

 // true if there are "names" of types and regions and so forth
 // that are local to a particular fn
-const HAS_LOCAL_NAMES = 1 << 9,
+const HAS_LOCAL_NAMES = 1 << 10,

 // Present if the type belongs in a local type context.
 // Only set for TyInfer other than Fresh.
-const KEEP_IN_LOCAL_TCX = 1 << 10,
+const KEEP_IN_LOCAL_TCX = 1 << 11,

 const NEEDS_SUBST = TypeFlags::HAS_PARAMS.bits |
 TypeFlags::HAS_SELF.bits |
@@ -739,7 +741,8 @@ impl RegionParameterDef {
 })
 }
 pub fn to_bound_region(&self) -> ty::BoundRegion {
-ty::BoundRegion::BrNamed(self.def_id, self.name)
+// this is an early bound region, so unaffected by #32330
+ty::BoundRegion::BrNamed(self.def_id, self.name, Issue32330::WontChange)
 }
 }
@@ -1013,7 +1016,7 @@ pub type PolyTypeOutlivesPredicate<'tcx> = PolyOutlivesPredicate<Ty<'tcx>, ty::R
 /// equality between arbitrary types. Processing an instance of Form
 /// #2 eventually yields one of these `ProjectionPredicate`
 /// instances to normalize the LHS.
-#[derive(Clone, PartialEq, Eq, Hash)]
+#[derive(Copy, Clone, PartialEq, Eq, Hash)]
 pub struct ProjectionPredicate<'tcx> {
 pub projection_ty: ProjectionTy<'tcx>,
 pub ty: Ty<'tcx>,
@@ -2855,7 +2858,7 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
 for def in generics.regions.as_slice() {
 let region =
 ReFree(FreeRegion { scope: free_id_outlive,
-bound_region: BrNamed(def.def_id, def.name) });
+bound_region: def.to_bound_region() });
 debug!("push_region_params {:?}", region);
 regions.push(def.space, region);
 }

@@ -827,7 +827,7 @@ impl<'tcx> TypeFoldable<'tcx> for ty::RegionParameterDef {
 def_id: self.def_id,
 space: self.space,
 index: self.index,
-bounds: self.bounds.fold_with(folder)
+bounds: self.bounds.fold_with(folder),
 }
 }
@@ -58,7 +58,7 @@ pub enum BoundRegion {
 ///
 /// The def-id is needed to distinguish free regions in
 /// the event of shadowing.
-BrNamed(DefId, Name),
+BrNamed(DefId, Name, Issue32330),

 /// Fresh bound identifiers created during GLB computations.
 BrFresh(u32),

@@ -68,6 +68,25 @@ pub enum BoundRegion {
 BrEnv
 }

+/// True if this late-bound region is unconstrained, and hence will
+/// become early-bound once #32330 is fixed.
+#[derive(Copy, Clone, Debug, PartialEq, PartialOrd, Eq, Ord, Hash,
+RustcEncodable, RustcDecodable)]
+pub enum Issue32330 {
+WontChange,
+
+/// this region will change from late-bound to early-bound once
+/// #32330 is fixed.
+WillChange {
+/// fn where the region is declared
+fn_def_id: DefId,
+
+/// name of region; duplicates the info in BrNamed but convenient
+/// to have it here, and this code is only temporary
+region_name: ast::Name,
+}
+}
+
 // NB: If you change this, you'll probably want to change the corresponding
 // AST structure in libsyntax/ast.rs as well.
 #[derive(Clone, PartialEq, Eq, Hash, Debug)]

@@ -697,7 +716,7 @@ pub struct EarlyBoundRegion {

 #[derive(Clone, Copy, PartialEq, Eq, Hash)]
 pub struct TyVid {
-pub index: u32
+pub index: u32,
 }

 #[derive(Clone, Copy, PartialEq, Eq, Hash)]
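The `Issue32330` marker above lets later code distinguish late-bound regions that will keep their status from those that will become early-bound, and the `WillChange` variant carries enough data to point a warning at the declaring fn. A standalone sketch of how such a marker might drive a future-compatibility note (the `u32` def-id and the message text are illustrative, not rustc's actual diagnostics):

```rust
// Simplified mirror of ty::Issue32330; DefId and ast::Name are
// replaced by a plain u32 and String for the sketch.
#[derive(Debug, Clone, PartialEq)]
enum Issue32330 {
    WontChange,
    WillChange { fn_def_id: u32, region_name: String },
}

// Produce a warning message only for regions whose binding status
// will change once #32330 is fixed.
fn future_compat_note(i: &Issue32330) -> Option<String> {
    match i {
        Issue32330::WontChange => None,
        Issue32330::WillChange { fn_def_id, region_name } => Some(format!(
            "region `{}` declared on fn #{} will become early-bound",
            region_name, fn_def_id
        )),
    }
}
```

Carrying the region name alongside the def-id is redundant with `BrNamed`, as the doc comment in the diff admits, but it keeps the temporary warning code self-contained.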

@@ -261,7 +261,7 @@ fn in_binder<'a, 'gcx, 'tcx, T, U>(f: &mut fmt::Formatter,
 let new_value = tcx.replace_late_bound_regions(&value, |br| {
 let _ = start_or_continue(f, "for<", ", ");
 ty::ReLateBound(ty::DebruijnIndex::new(1), match br {
-ty::BrNamed(_, name) => {
+ty::BrNamed(_, name, _) => {
 let _ = write!(f, "{}", name);
 br
 }

@@ -270,7 +270,9 @@ fn in_binder<'a, 'gcx, 'tcx, T, U>(f: &mut fmt::Formatter,
 ty::BrEnv => {
 let name = token::intern("'r");
 let _ = write!(f, "{}", name);
-ty::BrNamed(tcx.map.local_def_id(CRATE_NODE_ID), name)
+ty::BrNamed(tcx.map.local_def_id(CRATE_NODE_ID),
+name,
+ty::Issue32330::WontChange)
 }
 })
 }).0;

@@ -485,7 +487,7 @@ impl fmt::Display for ty::BoundRegion {
 }

 match *self {
-BrNamed(_, name) => write!(f, "{}", name),
+BrNamed(_, name, _) => write!(f, "{}", name),
 BrAnon(_) | BrFresh(_) | BrEnv => Ok(())
 }
 }

@@ -496,8 +498,9 @@ impl fmt::Debug for ty::BoundRegion {
 match *self {
 BrAnon(n) => write!(f, "BrAnon({:?})", n),
 BrFresh(n) => write!(f, "BrFresh({:?})", n),
-BrNamed(did, name) => {
-write!(f, "BrNamed({:?}:{:?}, {:?})", did.krate, did.index, name)
+BrNamed(did, name, issue32330) => {
+write!(f, "BrNamed({:?}:{:?}, {:?}, {:?})",
+did.krate, did.index, name, issue32330)
 }
 BrEnv => "BrEnv".fmt(f),
 }
@@ -42,6 +42,7 @@ pub mod bitvec;
 pub mod graph;
 pub mod ivar;
 pub mod obligation_forest;
+pub mod snapshot_map;
 pub mod snapshot_vec;
 pub mod transitive_relation;
 pub mod unify;
src/librustc_data_structures/snapshot_map/mod.rs (new file, 138 lines)
@@ -0,0 +1,138 @@
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use fnv::FnvHashMap;
+use std::hash::Hash;
+use std::ops;
+
+#[cfg(test)]
+mod test;
+
+pub struct SnapshotMap<K, V>
+where K: Hash + Clone + Eq
+{
+map: FnvHashMap<K, V>,
+undo_log: Vec<UndoLog<K, V>>,
+}
+
+pub struct Snapshot {
+len: usize
+}
+
+enum UndoLog<K, V> {
+OpenSnapshot,
+CommittedSnapshot,
+Inserted(K),
+Overwrite(K, V),
+}
+
+impl<K, V> SnapshotMap<K, V>
+where K: Hash + Clone + Eq
+{
+pub fn new() -> Self {
+SnapshotMap {
+map: FnvHashMap(),
+undo_log: vec![]
+}
+}
+
+pub fn insert(&mut self, key: K, value: V) -> bool {
+match self.map.insert(key.clone(), value) {
+None => {
+if !self.undo_log.is_empty() {
+self.undo_log.push(UndoLog::Inserted(key));
+}
+true
+}
+Some(old_value) => {
+if !self.undo_log.is_empty() {
+self.undo_log.push(UndoLog::Overwrite(key, old_value));
+}
+false
+}
+}
+}
+
+pub fn remove(&mut self, key: K) -> bool {
+match self.map.remove(&key) {
+Some(old_value) => {
+if !self.undo_log.is_empty() {
+self.undo_log.push(UndoLog::Overwrite(key, old_value));
+}
+true
+}
+None => {
+false
+}
+}
+}
+
+pub fn get(&self, key: &K) -> Option<&V> {
+self.map.get(key)
+}
+
+pub fn snapshot(&mut self) -> Snapshot {
+self.undo_log.push(UndoLog::OpenSnapshot);
+let len = self.undo_log.len() - 1;
+Snapshot { len: len }
+}
+
+fn assert_open_snapshot(&self, snapshot: &Snapshot) {
+assert!(snapshot.len < self.undo_log.len());
+assert!(match self.undo_log[snapshot.len] {
+UndoLog::OpenSnapshot => true,
+_ => false
+});
+}
+
+pub fn commit(&mut self, snapshot: Snapshot) {
+self.assert_open_snapshot(&snapshot);
+if snapshot.len == 0 {
+// The root snapshot.
+self.undo_log.truncate(0);
+} else {
+self.undo_log[snapshot.len] = UndoLog::CommittedSnapshot;
+}
+}
+
+pub fn rollback_to(&mut self, snapshot: Snapshot) {
+self.assert_open_snapshot(&snapshot);
+while self.undo_log.len() > snapshot.len + 1 {
+match self.undo_log.pop().unwrap() {
+UndoLog::OpenSnapshot => {
+panic!("cannot rollback an uncommitted snapshot");
+}
+
+UndoLog::CommittedSnapshot => { }
+
+UndoLog::Inserted(key) => {
+self.map.remove(&key);
+}
+
+UndoLog::Overwrite(key, old_value) => {
+self.map.insert(key, old_value);
+}
+}
+}
+
+let v = self.undo_log.pop().unwrap();
+assert!(match v { UndoLog::OpenSnapshot => true, _ => false });
+assert!(self.undo_log.len() == snapshot.len);
+}
+}
+
+impl<'k, K, V> ops::Index<&'k K> for SnapshotMap<K, V>
+where K: Hash + Clone + Eq
+{
+type Output = V;
+fn index(&self, key: &'k K) -> &V {
+&self.map[key]
+}
+}

src/librustc_data_structures/snapshot_map/test.rs (new file, 50 lines)
@@ -0,0 +1,50 @@
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use super::SnapshotMap;
+
+#[test]
+fn basic() {
+let mut map = SnapshotMap::new();
+map.insert(22, "twenty-two");
+let snapshot = map.snapshot();
+map.insert(22, "thirty-three");
+assert_eq!(map[&22], "thirty-three");
+map.insert(44, "fourty-four");
+assert_eq!(map[&44], "fourty-four");
+assert_eq!(map.get(&33), None);
+map.rollback_to(snapshot);
+assert_eq!(map[&22], "twenty-two");
+assert_eq!(map.get(&33), None);
+assert_eq!(map.get(&44), None);
+}
+
+#[test]
+#[should_panic]
+fn out_of_order() {
+let mut map = SnapshotMap::new();
+map.insert(22, "twenty-two");
+let snapshot1 = map.snapshot();
+let _snapshot2 = map.snapshot();
+map.rollback_to(snapshot1);
+}
+
+#[test]
+fn nested_commit_then_rollback() {
+let mut map = SnapshotMap::new();
+map.insert(22, "twenty-two");
+let snapshot1 = map.snapshot();
+let snapshot2 = map.snapshot();
+map.insert(22, "thirty-three");
+map.commit(snapshot2);
+assert_eq!(map[&22], "thirty-three");
+map.rollback_to(snapshot1);
+assert_eq!(map[&22], "twenty-two");
+}
@@ -1662,31 +1662,12 @@ fn doc_generics<'a, 'tcx>(base_doc: rbml::Doc,
 }

 let mut regions = subst::VecPerParamSpace::empty();
-for rp_doc in reader::tagged_docs(doc, tag_region_param_def) {
-let ident_str_doc = reader::get_doc(rp_doc,
-tag_region_param_def_ident);
-let name = item_name(&token::get_ident_interner(), ident_str_doc);
-let def_id_doc = reader::get_doc(rp_doc,
-tag_region_param_def_def_id);
-let def_id = translated_def_id(cdata, def_id_doc);
-
-let doc = reader::get_doc(rp_doc, tag_region_param_def_space);
-let space = subst::ParamSpace::from_uint(reader::doc_as_u64(doc) as usize);
-
-let doc = reader::get_doc(rp_doc, tag_region_param_def_index);
-let index = reader::doc_as_u64(doc) as u32;
-
-let bounds = reader::tagged_docs(rp_doc, tag_items_data_region).map(|p| {
+for p in reader::tagged_docs(doc, tag_region_param_def) {
 let bd =
 TyDecoder::with_doc(tcx, cdata.cnum, p,
 &mut |did| translate_def_id(cdata, did))
-.parse_region()
-}).collect();
-
-regions.push(space, ty::RegionParameterDef { name: name,
-def_id: def_id,
-space: space,
-index: index,
-bounds: bounds });
+.parse_region_param_def();
+regions.push(bd.space, bd);
 }

 ty::Generics { types: types, regions: regions }
@@ -203,15 +203,6 @@ fn encode_type<'a, 'tcx>(ecx: &EncodeContext<'a, 'tcx>,
 rbml_w.end_tag();
 }

-fn encode_region(ecx: &EncodeContext,
-rbml_w: &mut Encoder,
-r: ty::Region) {
-rbml_w.start_tag(tag_items_data_region);
-tyencode::enc_region(rbml_w.writer, &ecx.ty_str_ctxt(), r);
-rbml_w.mark_stable_position();
-rbml_w.end_tag();
-}
-
 fn encode_disr_val(_: &EncodeContext,
 rbml_w: &mut Encoder,
 disr_val: ty::Disr) {

@@ -535,24 +526,8 @@ fn encode_generics<'a, 'tcx>(rbml_w: &mut Encoder,
 // Region parameters
 for param in &generics.regions {
 rbml_w.start_tag(tag_region_param_def);
-
-rbml_w.start_tag(tag_region_param_def_ident);
-encode_name(rbml_w, param.name);
-rbml_w.end_tag();
-
-rbml_w.wr_tagged_u64(tag_region_param_def_def_id,
-def_to_u64(param.def_id));
-
-rbml_w.wr_tagged_u64(tag_region_param_def_space,
-param.space.to_uint() as u64);
-
-rbml_w.wr_tagged_u64(tag_region_param_def_index,
-param.index as u64);
-
-for &bound_region in &param.bounds {
-encode_region(ecx, rbml_w, bound_region);
-}
-
+tyencode::enc_region_param_def(rbml_w.writer, &ecx.ty_str_ctxt(), param);
 rbml_w.mark_stable_position();
 rbml_w.end_tag();
 }
@@ -158,8 +158,21 @@ impl<'a,'tcx> TyDecoder<'a,'tcx> {
         }
         '[' => {
             let def = self.parse_def();
-            let name = token::intern(&self.parse_str(']'));
-            ty::BrNamed(def, name)
+            let name = token::intern(&self.parse_str('|'));
+            let issue32330 = match self.next() {
+                'n' => {
+                    assert_eq!(self.next(), ']');
+                    ty::Issue32330::WontChange
+                }
+                'y' => {
+                    ty::Issue32330::WillChange {
+                        fn_def_id: self.parse_def(),
+                        region_name: token::intern(&self.parse_str(']')),
+                    }
+                }
+                c => panic!("expected n or y not {}", c)
+            };
+            ty::BrNamed(def, name, issue32330)
         }
         'f' => {
             let id = self.parse_u32();

@@ -623,7 +636,7 @@ impl<'a,'tcx> TyDecoder<'a,'tcx> {
             def_id: def_id,
             space: space,
             index: index,
-            bounds: bounds
+            bounds: bounds,
         }
     }
@@ -308,10 +308,17 @@ fn enc_bound_region(w: &mut Cursor<Vec<u8>>, cx: &ctxt, br: ty::BoundRegion) {
         ty::BrAnon(idx) => {
            write!(w, "a{}|", idx);
        }
-        ty::BrNamed(d, name) => {
-            write!(w, "[{}|{}]",
-                   (cx.ds)(cx.tcx, d),
-                   name);
+        ty::BrNamed(d, name, issue32330) => {
+            write!(w, "[{}|{}|",
+                   (cx.ds)(cx.tcx, d),
+                   name);
+
+            match issue32330 {
+                ty::Issue32330::WontChange =>
+                    write!(w, "n]"),
+                ty::Issue32330::WillChange { fn_def_id, region_name } =>
+                    write!(w, "y{}|{}]", (cx.ds)(cx.tcx, fn_def_id), region_name),
+            };
        }
        ty::BrFresh(id) => {
            write!(w, "f{}|", id);
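A standalone sketch of the wire format this encoder emits for `BrNamed`: `[<def>|<name>|n]` when the region won't change under #32330, and `[<def>|<name>|y<fn_def>|<region_name>]` when it will. The `Issue32330` enum and `enc_br_named` helper below are local stand-ins for the compiler's types, kept only to make the format concrete.

```rust
use std::io::{Cursor, Write};

// Local stand-in for the compiler's ty::Issue32330.
enum Issue32330 {
    WontChange,
    WillChange { fn_def_id: u32, region_name: String },
}

// Encode a named bound region in the `[def|name|n]` / `[def|name|y...]` shape.
fn enc_br_named(w: &mut Cursor<Vec<u8>>, def_id: u32, name: &str, issue32330: &Issue32330) {
    write!(w, "[{}|{}|", def_id, name).unwrap();
    match issue32330 {
        Issue32330::WontChange => write!(w, "n]").unwrap(),
        Issue32330::WillChange { fn_def_id, region_name } =>
            write!(w, "y{}|{}]", fn_def_id, region_name).unwrap(),
    }
}

fn main() {
    let mut w = Cursor::new(Vec::new());
    enc_br_named(&mut w, 7, "'a", &Issue32330::WontChange);
    assert_eq!(String::from_utf8(w.into_inner()).unwrap(), "[7|'a|n]");
}
```

The trailing `n`/`y` discriminant is what the decoder hunk above dispatches on, which is why the old format's closing `]` after the name had to become a `|`.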
@@ -1061,7 +1061,7 @@ pub fn fulfill_obligation<'a, 'tcx>(scx: &SharedCrateContext<'a, 'tcx>,
     let trait_ref = tcx.erase_regions(&trait_ref);

     scx.trait_cache().memoize(trait_ref, || {
-        debug!("trans fulfill_obligation: trait_ref={:?} def_id={:?}",
+        debug!("trans::fulfill_obligation(trait_ref={:?}, def_id={:?})",
               trait_ref, trait_ref.def_id());

         // Do the initial selection for the obligation. This yields the

@@ -1096,11 +1096,14 @@ pub fn fulfill_obligation<'a, 'tcx>(scx: &SharedCrateContext<'a, 'tcx>,
             }
         };

+        debug!("fulfill_obligation: selection={:?}", selection);
+
         // Currently, we use a fulfillment context to completely resolve
         // all nested obligations. This is because they can inform the
         // inference of the impl's type parameters.
         let mut fulfill_cx = traits::FulfillmentContext::new();
         let vtable = selection.map(|predicate| {
+            debug!("fulfill_obligation: register_predicate_obligation {:?}", predicate);
             fulfill_cx.register_predicate_obligation(&infcx, predicate);
         });
         let vtable = infcx.drain_fulfillment_cx_or_panic(span, &mut fulfill_cx, &vtable);
@@ -170,7 +170,7 @@ type TraitAndProjections<'tcx> = (ty::PolyTraitRef<'tcx>, Vec<ty::PolyProjection

 pub fn ast_region_to_region(tcx: TyCtxt, lifetime: &hir::Lifetime)
                             -> ty::Region {
-    let r = match tcx.named_region_map.get(&lifetime.id) {
+    let r = match tcx.named_region_map.defs.get(&lifetime.id) {
         None => {
             // should have been recorded by the `resolve_lifetime` pass
             span_bug!(lifetime.span, "unresolved lifetime");

@@ -181,7 +181,20 @@ pub fn ast_region_to_region(tcx: TyCtxt, lifetime: &hir::Lifetime)
         }

         Some(&rl::DefLateBoundRegion(debruijn, id)) => {
-            ty::ReLateBound(debruijn, ty::BrNamed(tcx.map.local_def_id(id), lifetime.name))
+            // If this region is declared on a function, it will have
+            // an entry in `late_bound`, but if it comes from
+            // `for<'a>` in some type or something, it won't
+            // necessarily have one. In that case though, we won't be
+            // changed from late to early bound, so we can just
+            // substitute false.
+            let issue_32330 = tcx.named_region_map
+                                 .late_bound
+                                 .get(&id)
+                                 .cloned()
+                                 .unwrap_or(ty::Issue32330::WontChange);
+            ty::ReLateBound(debruijn, ty::BrNamed(tcx.map.local_def_id(id),
+                                                  lifetime.name,
+                                                  issue_32330))
         }

         Some(&rl::DefEarlyBoundRegion(space, index, _)) => {

@@ -193,11 +206,21 @@ pub fn ast_region_to_region(tcx: TyCtxt, lifetime: &hir::Lifetime)
         }

         Some(&rl::DefFreeRegion(scope, id)) => {
+            // As in DefLateBoundRegion above, could be missing for some late-bound
+            // regions, but also for early-bound regions.
+            let issue_32330 = tcx.named_region_map
+                                 .late_bound
+                                 .get(&id)
+                                 .cloned()
+                                 .unwrap_or(ty::Issue32330::WontChange);
             ty::ReFree(ty::FreeRegion {
                 scope: scope.to_code_extent(&tcx.region_maps),
                 bound_region: ty::BrNamed(tcx.map.local_def_id(id),
-                                          lifetime.name)
+                                          lifetime.name,
+                                          issue_32330)
             })

             // (*) -- not late-bound, won't change
         }
     };
@@ -911,7 +934,7 @@ impl<'o, 'gcx: 'tcx, 'tcx> AstConv<'gcx, 'tcx>+'o {
     debug!("late_bound_in_ty = {:?}", late_bound_in_ty);
     for br in late_bound_in_ty.difference(&late_bound_in_trait_ref) {
         let br_name = match *br {
-            ty::BrNamed(_, name) => name,
+            ty::BrNamed(_, name, _) => name,
             _ => {
                 span_bug!(
                     binding.span,

@@ -1675,7 +1698,7 @@ impl<'o, 'gcx: 'tcx, 'tcx> AstConv<'gcx, 'tcx>+'o {
     let late_bound_in_ret = tcx.collect_referenced_late_bound_regions(&output);
     for br in late_bound_in_ret.difference(&late_bound_in_args) {
         let br_name = match *br {
-            ty::BrNamed(_, name) => name,
+            ty::BrNamed(_, name, _) => name,
             _ => {
                 span_bug!(
                     bf.decl.output.span(),
@@ -64,7 +64,6 @@ use hir::def::Def;
 use hir::def_id::DefId;
 use constrained_type_params as ctp;
 use middle::lang_items::SizedTraitLangItem;
-use middle::resolve_lifetime;
 use middle::const_val::ConstVal;
 use rustc_const_eval::EvalHint::UncheckedExprHint;
 use rustc_const_eval::{eval_const_expr_partial, ConstEvalErr};

@@ -1745,14 +1744,16 @@ fn add_unsized_bound<'tcx>(astconv: &AstConv<'tcx, 'tcx>,
 /// the lifetimes that are declared. For fns or methods, we have to
 /// screen out those that do not appear in any where-clauses etc using
 /// `resolve_lifetime::early_bound_lifetimes`.
-fn early_bound_lifetimes_from_generics(space: ParamSpace,
-                                       ast_generics: &hir::Generics)
-                                       -> Vec<hir::LifetimeDef>
+fn early_bound_lifetimes_from_generics<'a, 'tcx, 'hir>(
+    ccx: &CrateCtxt<'a, 'tcx>,
+    ast_generics: &'hir hir::Generics)
+    -> Vec<&'hir hir::LifetimeDef>
 {
-    match space {
-        SelfSpace | TypeSpace => ast_generics.lifetimes.to_vec(),
-        FnSpace => resolve_lifetime::early_bound_lifetimes(ast_generics),
-    }
+    ast_generics
+        .lifetimes
+        .iter()
+        .filter(|l| !ccx.tcx.named_region_map.late_bound.contains_key(&l.lifetime.id))
+        .collect()
 }

 fn ty_generic_predicates<'a,'tcx>(ccx: &CrateCtxt<'a,'tcx>,

@@ -1781,7 +1782,7 @@ fn ty_generic_predicates<'a,'tcx>(ccx: &CrateCtxt<'a,'tcx>,
     // Collect the region predicates that were declared inline as
     // well. In the case of parameters declared on a fn or method, we
     // have to be careful to only iterate over early-bound regions.
-    let early_lifetimes = early_bound_lifetimes_from_generics(space, ast_generics);
+    let early_lifetimes = early_bound_lifetimes_from_generics(ccx, ast_generics);
     for (index, param) in early_lifetimes.iter().enumerate() {
         let index = index as u32;
         let region =

@@ -1864,7 +1865,7 @@ fn ty_generics<'a,'tcx>(ccx: &CrateCtxt<'a,'tcx>,
     let tcx = ccx.tcx;
     let mut result = base_generics.clone();

-    let early_lifetimes = early_bound_lifetimes_from_generics(space, ast_generics);
+    let early_lifetimes = early_bound_lifetimes_from_generics(ccx, ast_generics);
     for (i, l) in early_lifetimes.iter().enumerate() {
         let bounds = l.bounds.iter()
                              .map(|l| ast_region_to_region(tcx, l))
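The new `early_bound_lifetimes_from_generics` above no longer dispatches on `ParamSpace`; it keeps exactly the declared lifetimes that have no entry in `named_region_map.late_bound`. That filter can be modeled in isolation; `LifetimeDef` and the plain `HashMap` below are invented stand-ins for the HIR and compiler-side types.

```rust
use std::collections::HashMap;

// Invented stand-in for hir::LifetimeDef.
struct LifetimeDef { id: u32, name: &'static str }

// Keep only the lifetimes with no `late_bound` entry: those are early-bound.
fn early_bound<'hir>(late_bound: &HashMap<u32, ()>,
                     lifetimes: &'hir [LifetimeDef]) -> Vec<&'hir LifetimeDef> {
    lifetimes.iter().filter(|l| !late_bound.contains_key(&l.id)).collect()
}

fn main() {
    let defs = [LifetimeDef { id: 0, name: "'a" },
                LifetimeDef { id: 1, name: "'b" }];
    let mut late = HashMap::new();
    late.insert(1, ()); // pretend 'b was classified as late-bound
    let early = early_bound(&late, &defs);
    assert_eq!(early.len(), 1);
    assert_eq!(early[0].name, "'a");
}
```

Returning `Vec<&'hir hir::LifetimeDef>` instead of cloned values is mirrored here by the borrowed return type: the callers only iterate.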
@@ -144,7 +144,7 @@ impl<'a, 'tcx> ConstraintContext<'a, 'tcx> {
     fn find_binding_for_lifetime(&self, param_id: ast::NodeId) -> ast::NodeId {
         let tcx = self.terms_cx.tcx;
         assert!(is_lifetime(&tcx.map, param_id));
-        match tcx.named_region_map.get(&param_id) {
+        match tcx.named_region_map.defs.get(&param_id) {
             Some(&rl::DefEarlyBoundRegion(_, _, lifetime_decl_id))
                 => lifetime_decl_id,
             Some(_) => bug!("should not encounter non early-bound cases"),
@@ -819,7 +819,7 @@ impl Clean<Option<Lifetime>> for ty::Region {
     fn clean(&self, cx: &DocContext) -> Option<Lifetime> {
         match *self {
             ty::ReStatic => Some(Lifetime::statik()),
-            ty::ReLateBound(_, ty::BrNamed(_, name)) => Some(Lifetime(name.to_string())),
+            ty::ReLateBound(_, ty::BrNamed(_, name, _)) => Some(Lifetime(name.to_string())),
             ty::ReEarlyBound(ref data) => Some(Lifetime(data.name.clean(cx))),

             ty::ReLateBound(..) |
@@ -422,7 +422,7 @@ pub fn expand_quote_expr<'cx>(cx: &'cx mut ExtCtxt,
     base::MacEager::expr(expanded)
 }

-pub fn expand_quote_item<'cx>(cx: &mut ExtCtxt,
+pub fn expand_quote_item<'cx>(cx: &'cx mut ExtCtxt,
                               sp: Span,
                               tts: &[TokenTree])
                               -> Box<base::MacResult+'cx> {

@@ -17,7 +17,7 @@ use syntax::parse::token;
 use syntax::parse::token::str_to_ident;
 use syntax::ptr::P;

-pub fn expand_syntax_ext<'cx>(cx: &mut ExtCtxt, sp: Span, tts: &[TokenTree])
+pub fn expand_syntax_ext<'cx>(cx: &'cx mut ExtCtxt, sp: Span, tts: &[TokenTree])
                               -> Box<base::MacResult+'cx> {
     if !cx.ecfg.enable_concat_idents() {
         feature_gate::emit_feature_err(&cx.parse_sess.span_diagnostic,
src/test/compile-fail/associated-types/cache/chrono-scan.rs (new file, 39 lines)
@@ -0,0 +1,39 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![feature(rustc_attrs)]
#![allow(warnings)]

pub type ParseResult<T> = Result<T, ()>;

pub enum Item<'a> { Literal(&'a str),
}

pub fn colon_or_space(s: &str) -> ParseResult<&str> {
    unimplemented!()
}

pub fn timezone_offset_zulu<F>(s: &str, colon: F) -> ParseResult<(&str, i32)>
    where F: FnMut(&str) -> ParseResult<&str> {
    unimplemented!()
}

pub fn parse<'a, I>(mut s: &str, items: I) -> ParseResult<()>
    where I: Iterator<Item=Item<'a>> {
    macro_rules! try_consume {
        ($e:expr) => ({ let (s_, v) = try!($e); s = s_; v })
    }
    let offset = try_consume!(timezone_offset_zulu(s.trim_left(), colon_or_space));
    let offset = try_consume!(timezone_offset_zulu(s.trim_left(), colon_or_space));
    Ok(())
}

#[rustc_error]
fn main() { } //~ ERROR compilation successful
src/test/compile-fail/associated-types/cache/elision.rs (new file, 34 lines)
@@ -0,0 +1,34 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![feature(rustc_attrs)]
#![allow(warnings)]

// Check that you are allowed to implement using elision but write
// trait without elision (a bug in this cropped up during
// bootstrapping, so this is a regression test).

pub struct SplitWhitespace<'a> {
    x: &'a u8
}

pub trait UnicodeStr {
    fn split_whitespace<'a>(&'a self) -> SplitWhitespace<'a>;
}

impl UnicodeStr for str {
    #[inline]
    fn split_whitespace(&self) -> SplitWhitespace {
        unimplemented!()
    }
}

#[rustc_error]
fn main() { } //~ ERROR compilation successful
src/test/compile-fail/associated-types/cache/project-fn-ret-contravariant.rs (new file, 65 lines)
@@ -0,0 +1,65 @@
// Copyright 2012 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![feature(unboxed_closures)]
#![feature(rustc_attrs)]

// Test for projection cache. We should be able to project distinct
// lifetimes from `foo` as we reinstantiate it multiple times, but not
// if we do it just once. In this variant, the region `'a` is used in
// a contravariant position, which affects the results.

// revisions: ok oneuse transmute krisskross

#![allow(dead_code, unused_variables)]

fn foo<'a>() -> &'a u32 { loop { } }

fn bar<T>(t: T, x: T::Output) -> T::Output
    where T: FnOnce<()>
{
    t()
}

#[cfg(ok)] // two instantiations: OK
fn baz<'a,'b>(x: &'a u32, y: &'b u32) -> (&'a u32, &'b u32) {
    let a = bar(foo, x);
    let b = bar(foo, y);
    (a, b)
}

#[cfg(oneuse)] // one instantiation: OK (surprisingly)
fn baz<'a,'b>(x: &'a u32, y: &'b u32) -> (&'a u32, &'b u32) {
    let f /* : fn() -> &'static u32 */ = foo; // <-- inferred type annotated
    let a = bar(f, x); // this is considered ok because fn args are contravariant...
    let b = bar(f, y); // ...and hence we infer T to distinct values in each call.
    (a, b)
}

// FIXME(#32330)
//#[cfg(transmute)] // one instantiation: BAD
//fn baz<'a,'b>(x: &'a u32) -> &'static u32 {
//   bar(foo, x) //[transmute] ERROR E0495
//}

// FIXME(#32330)
//#[cfg(krisskross)] // two instantiations, mixing and matching: BAD
//fn transmute<'a,'b>(x: &'a u32, y: &'b u32) -> (&'a u32, &'b u32) {
//   let a = bar(foo, y); //[krisskross] ERROR E0495
//   let b = bar(foo, x); //[krisskross] ERROR E0495
//   (a, b)
//}

#[rustc_error]
fn main() { }
//[ok]~^ ERROR compilation successful
//[oneuse]~^^ ERROR compilation successful
//[transmute]~^^^ ERROR compilation successful
//[krisskross]~^^^^ ERROR compilation successful
src/test/compile-fail/associated-types/cache/project-fn-ret-invariant.rs (new file, 76 lines)
@@ -0,0 +1,76 @@
// Copyright 2012 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![feature(unboxed_closures)]
#![feature(rustc_attrs)]

// Test for projection cache. We should be able to project distinct
// lifetimes from `foo` as we reinstantiate it multiple times, but not
// if we do it just once. In this variant, the region `'a` is used in
// an invariant position, which affects the results.

// revisions: ok oneuse transmute krisskross

#![allow(dead_code, unused_variables)]

use std::marker::PhantomData;

struct Type<'a> {
    // Invariant
    data: PhantomData<fn(&'a u32) -> &'a u32>
}

fn foo<'a>() -> Type<'a> { loop { } }

fn bar<T>(t: T, x: T::Output) -> T::Output
    where T: FnOnce<()>
{
    t()
}

#[cfg(ok)] // two instantiations: OK
fn baz<'a,'b>(x: Type<'a>, y: Type<'b>) -> (Type<'a>, Type<'b>) {
    let a = bar(foo, x);
    let b = bar(foo, y);
    (a, b)
}

// FIXME(#32330)
//#[cfg(oneuse)] // one instantiation: BAD
//fn baz<'a,'b>(x: Type<'a>, y: Type<'b>) -> (Type<'a>, Type<'b>) {
//    let f = foo; // <-- No consistent type can be inferred for `f` here.
//    let a = bar(f, x); //[oneuse] ERROR E0495
//    let b = bar(f, y);
//    (a, b)
//}

// FIXME(#32330)
//#[cfg(transmute)] // one instantiation: BAD
//fn baz<'a,'b>(x: Type<'a>) -> Type<'static> {
//    // Cannot instantiate `foo` with any lifetime other than `'a`,
//    // since it is provided as input.
//
//    bar(foo, x) //[transmute] ERROR E0495
//}

// FIXME(#32330)
//#[cfg(krisskross)] // two instantiations, mixing and matching: BAD
//fn transmute<'a,'b>(x: Type<'a>, y: Type<'b>) -> (Type<'a>, Type<'b>) {
//    let a = bar(foo, y); //[krisskross] ERROR E0495
//    let b = bar(foo, x); //[krisskross] ERROR E0495
//    (a, b)
//}

#[rustc_error]
fn main() { }
//[ok]~^ ERROR compilation successful
//[oneuse]~^^ ERROR compilation successful
//[transmute]~^^^ ERROR compilation successful
//[krisskross]~^^^^ ERROR compilation successful
src/test/compile-fail/associated-types/cache/wasm-issue-32330.rs (new file, 49 lines)
@@ -0,0 +1,49 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// This test was derived from the wasm and parsell crates. They
// stopped compiling when #32330 is fixed.

#![allow(dead_code, unused_variables)]
#![deny(hr_lifetime_in_assoc_type)]
#![feature(unboxed_closures)]

use std::str::Chars;

pub trait HasOutput<Ch, Str> {
    type Output;
}

#[derive(Clone, PartialEq, Eq, Hash, Ord, PartialOrd, Debug)]
pub enum Token<'a> {
    Begin(&'a str)
}

fn mk_unexpected_char_err<'a>() -> Option<&'a i32> {
    unimplemented!()
}

fn foo<'a>(data: &mut Chars<'a>) {
    bar(mk_unexpected_char_err)
    //~^ ERROR lifetime parameter `'a` declared on fn `mk_unexpected_char_err`
    //~| WARNING hard error in a future release
}

fn bar<F>(t: F)
    // No type can satisfy this requirement, since `'a` does not
    // appear in any of the input types:
    where F: for<'a> Fn() -> Option<&'a i32>
    //~^ ERROR associated type `Output` references lifetime `'a`, which does not
    //~| WARNING hard error in a future release
{
}

fn main() {
}
src/test/compile-fail/hr-subtype.rs (new file, 119 lines)
@@ -0,0 +1,119 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// Targeted tests for the higher-ranked subtyping code.

#![feature(rustc_attrs)]
#![allow(dead_code)]

// revisions: bound_a_vs_bound_a
// revisions: bound_a_vs_bound_b
// revisions: bound_inv_a_vs_bound_inv_b
// revisions: bound_co_a_vs_bound_co_b
// revisions: bound_a_vs_free_x
// revisions: free_x_vs_free_x
// revisions: free_x_vs_free_y
// revisions: free_inv_x_vs_free_inv_y
// revisions: bound_a_b_vs_bound_a
// revisions: bound_co_a_b_vs_bound_co_a
// revisions: bound_contra_a_contra_b_ret_co_a
// revisions: bound_co_a_co_b_ret_contra_a
// revisions: bound_inv_a_b_vs_bound_inv_a
// revisions: bound_a_b_ret_a_vs_bound_a_ret_a

fn gimme<T>(_: Option<T>) { }

struct Inv<'a> { x: *mut &'a u32 }

struct Co<'a> { x: fn(&'a u32) }

struct Contra<'a> { x: &'a u32 }

macro_rules! check {
    ($rev:ident: ($t1:ty, $t2:ty)) => {
        #[cfg($rev)]
        fn subtype<'x,'y:'x,'z:'y>() {
            gimme::<$t2>(None::<$t1>);
            //[free_inv_x_vs_free_inv_y]~^ ERROR mismatched types
        }

        #[cfg($rev)]
        fn supertype<'x,'y:'x,'z:'y>() {
            gimme::<$t1>(None::<$t2>);
            //[bound_a_vs_free_x]~^ ERROR mismatched types
            //[free_x_vs_free_y]~^^ ERROR mismatched types
            //[bound_inv_a_b_vs_bound_inv_a]~^^^ ERROR mismatched types
            //[bound_a_b_ret_a_vs_bound_a_ret_a]~^^^^ ERROR mismatched types
            //[free_inv_x_vs_free_inv_y]~^^^^^ ERROR mismatched types
            //[bound_a_b_vs_bound_a]~^^^^^^ ERROR mismatched types
            //[bound_co_a_b_vs_bound_co_a]~^^^^^^^ ERROR mismatched types
            //[bound_contra_a_contra_b_ret_co_a]~^^^^^^^^ ERROR mismatched types
            //[bound_co_a_co_b_ret_contra_a]~^^^^^^^^^ ERROR mismatched types
        }
    }
}

// If both have bound regions, they are equivalent, regardless of
// variance.
check! { bound_a_vs_bound_a: (for<'a> fn(&'a u32),
                              for<'a> fn(&'a u32)) }
check! { bound_a_vs_bound_b: (for<'a> fn(&'a u32),
                              for<'b> fn(&'b u32)) }
check! { bound_inv_a_vs_bound_inv_b: (for<'a> fn(Inv<'a>),
                                      for<'b> fn(Inv<'b>)) }
check! { bound_co_a_vs_bound_co_b: (for<'a> fn(Co<'a>),
                                    for<'b> fn(Co<'b>)) }

// Bound is a subtype of free.
check! { bound_a_vs_free_x: (for<'a> fn(&'a u32),
                             fn(&'x u32)) }

// Two free regions are relatable if subtyping holds.
check! { free_x_vs_free_x: (fn(&'x u32),
                            fn(&'x u32)) }
check! { free_x_vs_free_y: (fn(&'x u32),
                            fn(&'y u32)) }
check! { free_inv_x_vs_free_inv_y: (fn(Inv<'x>),
                                    fn(Inv<'y>)) }

// Somewhat surprisingly, a fn taking two distinct bound lifetimes and
// a fn taking one bound lifetime can be interchangeable, but only if
// we are co- or contra-variant with respect to both lifetimes.
//
// The reason is:
// - if we are covariant, then 'a and 'b can be set to the call-site
//   intersection;
// - if we are contravariant, then 'a can be inferred to 'static.
//
// FIXME(#32330) this is true, but we are not currently impl'ing this
// full semantics
check! { bound_a_b_vs_bound_a: (for<'a,'b> fn(&'a u32, &'b u32),
                                for<'a> fn(&'a u32, &'a u32)) }
check! { bound_co_a_b_vs_bound_co_a: (for<'a,'b> fn(Co<'a>, Co<'b>),
                                      for<'a> fn(Co<'a>, Co<'a>)) }
check! { bound_contra_a_contra_b_ret_co_a: (for<'a,'b> fn(Contra<'a>, Contra<'b>) -> Co<'a>,
                                            for<'a> fn(Contra<'a>, Contra<'a>) -> Co<'a>) }
check! { bound_co_a_co_b_ret_contra_a: (for<'a,'b> fn(Co<'a>, Co<'b>) -> Contra<'a>,
                                        for<'a> fn(Co<'a>, Co<'a>) -> Contra<'a>) }

// If we make those lifetimes invariant, then the two types are not interchangeable.
check! { bound_inv_a_b_vs_bound_inv_a: (for<'a,'b> fn(Inv<'a>, Inv<'b>),
                                        for<'a> fn(Inv<'a>, Inv<'a>)) }
check! { bound_a_b_ret_a_vs_bound_a_ret_a: (for<'a,'b> fn(&'a u32, &'b u32) -> &'a u32,
                                            for<'a> fn(&'a u32, &'a u32) -> &'a u32) }

#[rustc_error]
fn main() {
//[bound_a_vs_bound_a]~^ ERROR compilation successful
//[bound_a_vs_bound_b]~^^ ERROR compilation successful
//[bound_inv_a_vs_bound_inv_b]~^^^ ERROR compilation successful
//[bound_co_a_vs_bound_co_b]~^^^^ ERROR compilation successful
//[free_x_vs_free_x]~^^^^^ ERROR compilation successful
}
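The `bound_a_vs_free_x` case above ("bound is a subtype of free") can be demonstrated with a self-contained snippet: a fully higher-ranked fn pointer coerces to a fn over one specific lifetime, because it can be called at any lifetime. `through_free` is an illustrative name, not part of the test suite.

```rust
// A `for<'a> fn(&'a u32) -> u32` works for every 'a, so in particular it
// works for the caller's 'x: the assignment below is a subtyping coercion.
fn through_free<'x>(v: &'x u32) -> u32 {
    let bound: for<'a> fn(&'a u32) -> u32 = |r| *r;
    let free: fn(&'x u32) -> u32 = bound; // bound region -> free region: OK
    free(v)
}

fn main() {
    assert_eq!(through_free(&42), 42);
}
```

The reverse direction is exactly what the `supertype` arm of the `check!` macro rejects for this revision: a fn usable only at one free lifetime cannot stand in for one that must accept all lifetimes.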
@ -28,7 +28,7 @@ impl<'a> Test<'a> for Foo<'a> {
|
|||
|
||||
impl<'a> NoLifetime for Foo<'a> {
|
||||
fn get<'p, T : Test<'a>>(&self) -> T {
|
||||
//~^ ERROR lifetime parameters or bounds on method `get` do not match the trait declaration
|
||||
//~^ ERROR E0195
|
||||
return *self as T;
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -39,7 +39,6 @@ impl<'a> Publisher<'a> for MyStruct<'a> {
|
|||
// Not obvious, but there is an implicit lifetime here -------^
|
||||
//~^^ ERROR cannot infer
|
||||
//~| ERROR cannot infer
|
||||
//~| ERROR cannot infer
|
||||
//
|
||||
// The fact that `Publisher` is using an implicit lifetime is
|
||||
// what was causing the debruijn accounting to be off, so
|
||||
|
|
|
|||
|
|
@ -14,7 +14,7 @@ use std::marker::PhantomData;
|
|||
|
||||
struct Bar<'x, 'y, 'z> { bar: &'y i32, baz: i32, marker: PhantomData<(&'x(),&'y(),&'z())> }
|
||||
fn bar1<'a>(x: &Bar) -> (&'a i32, &'a i32, &'a i32) {
|
||||
//~^ HELP: consider using an explicit lifetime parameter as shown: fn bar1<'a>(x: &'a Bar) -> (&'a i32, &'a i32, &'a i32)
|
||||
//~^ HELP consider using an explicit lifetime parameter as shown: fn bar1<'b, 'c, 'a>(x: &'a Bar<'b, 'a, 'c>) -> (&'a i32, &'a i32, &'a i32)
|
||||
(x.bar, &x.baz, &x.baz)
|
||||
//~^ ERROR E0312
|
||||
//~| ERROR cannot infer
|
||||
|
|
|
|||
|
|
@ -49,7 +49,7 @@ struct Baz<'x> {
|
|||
|
||||
impl<'a> Baz<'a> {
|
||||
fn baz2<'b>(&self, x: &isize) -> (&'b isize, &'b isize) {
|
||||
//~^ HELP consider using an explicit lifetime parameter as shown: fn baz2<'b>(&self, x: &'b isize) -> (&'a isize, &'a isize)
|
||||
//~^ HELP consider using an explicit lifetime parameter as shown: fn baz2<'b>(&self, x: &'a isize) -> (&'a isize, &'a isize)
|
||||
(self.bar, x) //~ ERROR E0312
|
||||
//~^ ERROR E0312
|
||||
}
|
||||
|
|
|
|||
|
|
@ -19,7 +19,7 @@ trait SomeTrait { fn get(&self) -> isize; }
|
|||
fn make_object1<A:SomeTrait>(v: A) -> Box<SomeTrait+'static> {
|
||||
box v as Box<SomeTrait+'static>
|
||||
//~^ ERROR the parameter type `A` may not live long enough
|
||||
//~^^ ERROR the parameter type `A` may not live long enough
|
||||
//~| ERROR the parameter type `A` may not live long enough
|
||||
}
|
||||
|
||||
fn make_object2<'a,A:SomeTrait+'a>(v: A) -> Box<SomeTrait+'a> {
|
||||
|
|
@ -29,7 +29,7 @@ fn make_object2<'a,A:SomeTrait+'a>(v: A) -> Box<SomeTrait+'a> {
|
|||
fn make_object3<'a,'b,A:SomeTrait+'a>(v: A) -> Box<SomeTrait+'b> {
|
||||
box v as Box<SomeTrait+'b>
|
||||
//~^ ERROR the parameter type `A` may not live long enough
|
||||
//~^^ ERROR the parameter type `A` may not live long enough
|
||||
//~| ERROR the parameter type `A` may not live long enough
|
||||
}
|
||||
|
||||
fn main() { }
|
||||
|
|
|
|||
|
|
@ -28,11 +28,7 @@ impl<'a> GetRef<'a> for Box<'a> {
|
|||
impl<'a> Box<'a> {
|
||||
fn or<'b,G:GetRef<'b>>(&self, g2: G) -> &'a isize {
|
||||
g2.get()
|
||||
//~^ ERROR mismatched types
|
||||
//~| expected type `&'a isize`
|
||||
//~| found type `&'b isize`
|
||||
//~| lifetime mismatch
|
||||
|
||||
//~^ ERROR E0312
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -27,7 +27,7 @@ impl<'a,T:Clone> GetRef<'a,T> for Box<'a,T> {
|
|||
|
||||
fn get<'a,'b,G:GetRef<'a, isize>>(g1: G, b: &'b isize) -> &'b isize {
|
||||
g1.get()
|
||||
//~^ ERROR mismatched types
|
||||
//~^ ERROR E0312
|
||||
}
|
||||
|
||||
fn main() {
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@
|
|||
|
||||
|
||||
struct Invariant<'a> {
|
||||
f: Box<for<'b> FnOnce() -> &'b mut &'a isize + 'static>,
|
||||
f: Box<FnOnce() -> *mut &'a isize + 'static>,
|
||||
}
|
||||
|
||||
fn to_same_lifetime<'r>(b_isize: Invariant<'r>) {
|
||||
|
|
|
|||
|
|
@ -15,10 +15,10 @@ trait Contravariant {
|
|||
fn foo(&self) { }
|
||||
}
|
||||
|
||||
impl Contravariant for for<'a,'b> fn(&'a u8, &'b u8) {
|
||||
impl Contravariant for for<'a,'b> fn(&'a u8, &'b u8) -> &'a u8 {
|
||||
}
|
||||
|
||||
impl Contravariant for for<'a> fn(&'a u8, &'a u8) {
|
||||
impl Contravariant for for<'a> fn(&'a u8, &'a u8) -> &'a u8 {
|
||||
}
|
||||
|
||||
///////////////////////////////////////////////////////////////////////////
|
||||
|
|
@ -27,10 +27,10 @@ trait Covariant {
|
|||
fn foo(&self) { }
|
||||
}
|
||||
|
||||
impl Covariant for for<'a,'b> fn(&'a u8, &'b u8) {
|
||||
impl Covariant for for<'a,'b> fn(&'a u8, &'b u8) -> &'a u8 {
|
||||
}
|
||||
|
||||
impl Covariant for for<'a> fn(&'a u8, &'a u8) {
|
||||
impl Covariant for for<'a> fn(&'a u8, &'a u8) -> &'a u8 {
|
||||
}
|
||||
|
||||
///////////////////////////////////////////////////////////////////////////
|
||||
|
|
```diff
@@ -39,10 +39,10 @@ trait Invariant {
     fn foo(&self) { }
 }

-impl Invariant for for<'a,'b> fn(&'a u8, &'b u8) {
+impl Invariant for for<'a,'b> fn(&'a u8, &'b u8) -> &'a u8 {
 }

-impl Invariant for for<'a> fn(&'a u8, &'a u8) {
+impl Invariant for for<'a> fn(&'a u8, &'a u8) -> &'a u8 {
 }

 fn main() { }
```
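These variance-test impls exercise how higher-ranked `fn` pointer types relate once a bound return lifetime is involved. A minimal standalone sketch of the subtyping fact at play (not part of the PR's test files; `general` is a made-up name): a signature with two independent bound lifetimes can be used wherever the version demanding equal lifetimes is expected.

```rust
// The more general higher-ranked signature: the result borrows only
// from the first argument, so 'a and 'b may differ.
fn general<'a, 'b>(x: &'a u8, _y: &'b u8) -> &'a u8 { x }

fn main() {
    // Coerce the fn item to the *less* general higher-ranked fn
    // pointer type, which insists both arguments share one lifetime.
    // This is accepted because `for<'a,'b> fn(&'a u8, &'b u8) -> &'a u8`
    // is a subtype of `for<'a> fn(&'a u8, &'a u8) -> &'a u8`.
    let f: for<'a> fn(&'a u8, &'a u8) -> &'a u8 = general;
    let v = 7u8;
    assert_eq!(*f(&v, &v), 7);
}
```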
```diff
@@ -458,7 +458,7 @@ struct S<'a> {
 }

 impl<'a> Named for S<'a> {
-    fn new<'b>(name: &'static str) -> S<'b> {
+    fn new(name: &'static str) -> S<'a> {
         S { name: name, mark: Cell::new(0), next: Cell::new(None) }
     }
     fn name(&self) -> &str { self.name }
```
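The repeated `fn new<'b>` → `fn new` edits in these dropck tests replace a late-bound method lifetime with the impl's early-bound `'a`, which is exactly the late-bound/early-bound distinction that the #32330 warnings target. A hedged, self-contained sketch of the corrected shape (the trait and `PhantomData` stand-in for the struct's borrowed fields are assumptions for illustration, not copied from the tests):

```rust
use std::marker::PhantomData;

trait Named {
    fn new(name: &'static str) -> Self;
    fn name(&self) -> &str;
}

struct S<'a> {
    name: &'static str,
    // Stand-in so that `'a` is used, as the real tests' fields do.
    marker: PhantomData<&'a ()>,
}

impl<'a> Named for S<'a> {
    // Tied to the impl's `'a` (early-bound) rather than declaring a
    // fresh late-bound `'b` and promising an `S<'b>` for any caller.
    fn new(name: &'static str) -> S<'a> {
        S { name: name, marker: PhantomData }
    }
    fn name(&self) -> &str { self.name }
}

fn main() {
    let s: S = Named::new("hello");
    assert_eq!(s.name(), "hello");
}
```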
```diff
@@ -476,7 +476,7 @@ struct S2<'a> {
 }

 impl<'a> Named for S2<'a> {
-    fn new<'b>(name: &'static str) -> S2<'b> {
+    fn new(name: &'static str) -> S2<'a> {
         S2 { name: name, mark: Cell::new(0), next: Cell::new((None, None)) }
     }
     fn name(&self) -> &str { self.name }
```
```diff
@@ -496,7 +496,7 @@ struct V<'a> {
 }

 impl<'a> Named for V<'a> {
-    fn new<'b>(name: &'static str) -> V<'b> {
+    fn new(name: &'static str) -> V<'a> {
         V { name: name,
             mark: Cell::new(0),
             contents: vec![Cell::new(None), Cell::new(None)]
```
```diff
@@ -518,7 +518,7 @@ struct H<'a> {
 }

 impl<'a> Named for H<'a> {
-    fn new<'b>(name: &'static str) -> H<'b> {
+    fn new(name: &'static str) -> H<'a> {
         H { name: name, mark: Cell::new(0), next: Cell::new(None) }
     }
     fn name(&self) -> &str { self.name }
```
```diff
@@ -549,7 +549,7 @@ struct HM<'a> {
 }

 impl<'a> Named for HM<'a> {
-    fn new<'b>(name: &'static str) -> HM<'b> {
+    fn new(name: &'static str) -> HM<'a> {
         HM { name: name,
              mark: Cell::new(0),
              contents: Cell::new(None)
```
```diff
@@ -583,7 +583,7 @@ struct VD<'a> {
 }

 impl<'a> Named for VD<'a> {
-    fn new<'b>(name: &'static str) -> VD<'b> {
+    fn new(name: &'static str) -> VD<'a> {
         VD { name: name,
              mark: Cell::new(0),
              contents: Cell::new(None)
```
```diff
@@ -604,7 +604,7 @@ struct VM<'a> {
 }

 impl<'a> Named for VM<'a> {
-    fn new<'b>(name: &'static str) -> VM<'b> {
+    fn new(name: &'static str) -> VM<'a> {
         VM { name: name,
              mark: Cell::new(0),
              contents: Cell::new(None)
```
```diff
@@ -625,7 +625,7 @@ struct LL<'a> {
 }

 impl<'a> Named for LL<'a> {
-    fn new<'b>(name: &'static str) -> LL<'b> {
+    fn new(name: &'static str) -> LL<'a> {
         LL { name: name,
              mark: Cell::new(0),
              contents: Cell::new(None)
```
```diff
@@ -646,7 +646,7 @@ struct BH<'a> {
 }

 impl<'a> Named for BH<'a> {
-    fn new<'b>(name: &'static str) -> BH<'b> {
+    fn new(name: &'static str) -> BH<'a> {
         BH { name: name,
              mark: Cell::new(0),
              contents: Cell::new(None)
```
```diff
@@ -687,7 +687,7 @@ struct BTM<'a> {
 }

 impl<'a> Named for BTM<'a> {
-    fn new<'b>(name: &'static str) -> BTM<'b> {
+    fn new(name: &'static str) -> BTM<'a> {
         BTM { name: name,
               mark: Cell::new(0),
               contents: Cell::new(None)
```
```diff
@@ -728,7 +728,7 @@ struct BTS<'a> {
 }

 impl<'a> Named for BTS<'a> {
-    fn new<'b>(name: &'static str) -> BTS<'b> {
+    fn new(name: &'static str) -> BTS<'a> {
         BTS { name: name,
               mark: Cell::new(0),
               contents: Cell::new(None)
```
src/test/run-pass/project-cache-issue-31849.rs (new file, 75 lines)
```diff
@@ -0,0 +1,75 @@
+// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// Regression test for #31849: the problem here was actually a performance
+// cliff, but I'm adding the test for reference.
+
+pub trait Upcast<T> {
+    fn upcast(self) -> T;
+}
+
+impl<S1, S2, T1, T2> Upcast<(T1, T2)> for (S1,S2)
+    where S1: Upcast<T1>,
+          S2: Upcast<T2>,
+{
+    fn upcast(self) -> (T1, T2) { (self.0.upcast(), self.1.upcast()) }
+}
+
+impl Upcast<()> for ()
+{
+    fn upcast(self) -> () { () }
+}
+
+pub trait ToStatic {
+    type Static: 'static;
+    fn to_static(self) -> Self::Static where Self: Sized;
+}
+
+impl<T, U> ToStatic for (T, U)
+    where T: ToStatic,
+          U: ToStatic
+{
+    type Static = (T::Static, U::Static);
+    fn to_static(self) -> Self::Static { (self.0.to_static(), self.1.to_static()) }
+}
+
+impl ToStatic for ()
+{
+    type Static = ();
+    fn to_static(self) -> () { () }
+}
+
+trait Factory {
+    type Output;
+    fn build(&self) -> Self::Output;
+}
+
+impl<S,T> Factory for (S, T)
+    where S: Factory,
+          T: Factory,
+          S::Output: ToStatic,
+          <S::Output as ToStatic>::Static: Upcast<S::Output>,
+{
+    type Output = (S::Output, T::Output);
+    fn build(&self) -> Self::Output { (self.0.build().to_static().upcast(), self.1.build()) }
+}
+
+impl Factory for () {
+    type Output = ();
+    fn build(&self) -> Self::Output { () }
+}
+
+fn main() {
+    // More parens, more time.
+    let it = ((((((((((),()),()),()),()),()),()),()),()),());
+    it.build();
+}
```
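The performance cliff this regression test guards against is easy to reproduce in isolation. A minimal sketch using only the `ToStatic` half of the test (the `check` helper and the size assertion are illustrative additions, not from the file): without a projection cache, normalizing `<T as ToStatic>::Static` at each nesting level re-derives all the inner projections, so type-checking cost grows sharply with tuple depth.

```rust
trait ToStatic {
    type Static: 'static;
}

impl ToStatic for () {
    type Static = ();
}

// Each tuple layer adds one projection whose normalization depends on
// the projections of both components below it.
impl<T: ToStatic, U: ToStatic> ToStatic for (T, U) {
    type Static = (T::Static, U::Static);
}

// Forces the compiler to prove `T: ToStatic` (and thus normalize
// `T::Static`) for the whole nested tuple.
fn check<T: ToStatic>(_: &T) {}

fn main() {
    // More parens, more normalization work without a cache.
    let it = ((((((), ()), ()), ()), ()), ());
    check(&it);
    // Nested unit tuples are zero-sized.
    assert_eq!(std::mem::size_of_val(&it), 0);
}
```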