commit fe2fd8a2d3
1190 changed files with 30471 additions and 13281 deletions
54
.github/ISSUE_TEMPLATE/rustdoc.md
vendored
Normal file
@@ -0,0 +1,54 @@
---
name: Problem with rustdoc
about: Report an issue with how docs get generated.
labels: C-bug, T-rustdoc
---
<!--
Thank you for filing a rustdoc issue! Rustdoc is the tool that handles the generation of docs. It is usually invoked via `cargo doc`, but can also be used directly.

If you have an issue with the actual content of the docs, use the "Documentation problem" template instead.
-->

# Code
<!-- problematic snippet and/or link to repo and/or full path of standard library function -->

```rust
<code>
```

# Reproduction Steps
<!--
* command(s) to run, if any
* permalink to hosted documentation, if any
* search query, if any
-->

# Expected Outcome
<!--
What did you want to happen?

For GUI issues, feel free to provide a mockup image of what you want it to look like.

For diagnostics, please provide a mockup of the desired output in a code block.
-->

# Actual Output
<!--
* rustdoc console output
* browser screenshot of generated HTML
* rustdoc JSON (prettify by running it through `jq` or through an online formatter)
-->
```console
<code>
```

# Version
<!--
Available via `rustdoc --version` or under the "Help" menu.

If the issue involves opening the documentation in a browser, please also provide the name and version of the browser used.
-->

# Additional Details
<!-- Anything else you think is relevant -->
5
.mailmap
@@ -162,8 +162,10 @@ David Carlier <devnexen@gmail.com>
David Klein <david.klein@baesystemsdetica.com>
David Manescu <david.manescu@gmail.com> <dman2626@uni.sydney.edu.au>
David Ross <daboross@daboross.net>
David Wood <david@davidtw.co> <david.wood@huawei.com>
David Wood <david@davidtw.co> <Q0KPU0H1YOEPHRY1R2SN5B5RL@david.davidtw.co>
David Wood <david@davidtw.co> <agile.lion3441@fuligin.ink>
David Wood <david@davidtw.co> <david.wood2@arm.com>
David Wood <david@davidtw.co> <david.wood@huawei.com>
Deadbeef <ent3rm4n@gmail.com>
Deadbeef <ent3rm4n@gmail.com> <fee1-dead-beef@protonmail.com>
dependabot[bot] <dependabot[bot]@users.noreply.github.com> <27856297+dependabot-preview[bot]@users.noreply.github.com>
@@ -698,3 +700,4 @@ Zach Pomerantz <zmp@umich.edu>
Zack Corr <zack@z0w0.me> <zackcorr95@gmail.com>
Zack Slayton <zack.slayton@gmail.com>
Zbigniew Siciarz <zbigniew@siciarz.net> Zbigniew Siciarz <antyqjon@gmail.com>
y21 <30553356+y21@users.noreply.github.com>
336
Cargo.lock
File diff suppressed because it is too large
@@ -60,7 +60,7 @@ exclude = [
"obj",
]

[profile.release.package.rustc-rayon-core]
[profile.release.package.rustc_thread_pool]
# The rustc fork of Rayon has deadlock detection code which intermittently
# causes overflows in the CI (see https://github.com/rust-lang/rust/issues/90227)
# so we turn overflow checks off for now.

@@ -89,3 +89,8 @@ codegen-units = 1
# FIXME: LTO cannot be enabled for binaries in a workspace
# <https://github.com/rust-lang/cargo/issues/9330>
# lto = true

# If you want to use a crate with local modifications, you can set a path or git dependency here.
# For git dependencies, also add your source to ALLOWED_SOURCES in src/tools/tidy/src/extdeps.rs.
#[patch.crates-io]
100
RELEASES.md
@@ -1,3 +1,103 @@
Version 1.88.0 (2025-06-26)
===========================

<a id="1.88.0-Language"></a>

Language
--------
- [Stabilize `#![feature(let_chains)]` in the 2024 edition.](https://github.com/rust-lang/rust/pull/132833)
  This feature allows `&&`-chaining `let` statements inside `if` and `while`, allowing them to be intermixed with boolean expressions. The patterns inside the `let` sub-expressions can be irrefutable or refutable.
- [Stabilize `#![feature(naked_functions)]`.](https://github.com/rust-lang/rust/pull/134213)
  Naked functions allow writing functions with no compiler-generated prologue and epilogue, allowing full control over the generated assembly for a particular function.
- [Stabilize `#![feature(cfg_boolean_literals)]`.](https://github.com/rust-lang/rust/pull/138632)
  This allows using boolean literals as `cfg` predicates, e.g. `#[cfg(true)]` and `#[cfg(false)]`.
- [Fully de-stabilize the `#[bench]` attribute.](https://github.com/rust-lang/rust/pull/134273)
  Usage of `#[bench]` without `#![feature(custom_test_frameworks)]` has triggered a deny-by-default future-incompatibility lint since Rust 1.77, and now becomes a hard error.
- [Add the warn-by-default `dangerous_implicit_autorefs` lint against implicit autoref of raw pointer dereferences.](https://github.com/rust-lang/rust/pull/123239)
  The lint [will be bumped to deny-by-default](https://github.com/rust-lang/rust/pull/141661) in the next version of Rust.
- [Add the `invalid_null_arguments` lint to prevent invalid usage of null pointers.](https://github.com/rust-lang/rust/pull/119220)
  This lint is uplifted from `clippy::invalid_null_ptr_usage`.
- [Change trait impl candidate preference for builtin impls and trivial where-clauses.](https://github.com/rust-lang/rust/pull/138176)
- [Check the types of generic const parameter defaults.](https://github.com/rust-lang/rust/pull/139646)

<a id="1.88.0-Compiler"></a>

Compiler
--------
- [Stabilize `-Cdwarf-version` for selecting the version of DWARF debug information to generate.](https://github.com/rust-lang/rust/pull/136926)

<a id="1.88.0-Platform-Support"></a>

Platform Support
----------------
- [Demote `i686-pc-windows-gnu` to Tier 2.](https://blog.rust-lang.org/2025/05/26/demoting-i686-pc-windows-gnu/)

Refer to Rust's [platform support page][platform-support-doc]
for more information on Rust's tiered platform support.

[platform-support-doc]: https://doc.rust-lang.org/rustc/platform-support.html

<a id="1.88.0-Libraries"></a>

Libraries
---------
- [Remove backticks from `#[should_panic]` test failure messages.](https://github.com/rust-lang/rust/pull/136160)
- [Guarantee that `[T; N]::from_fn` calls its closure in order of increasing indices](https://github.com/rust-lang/rust/pull/139099), which matters when passing it a stateful closure.
- [The libtest flag `--nocapture` is deprecated in favor of the more consistent `--no-capture` flag.](https://github.com/rust-lang/rust/pull/139224)
- [Guarantee that `{float}::NAN` is a quiet NaN.](https://github.com/rust-lang/rust/pull/139483)

<a id="1.88.0-Stabilized-APIs"></a>

Stabilized APIs
---------------

- [`Cell::update`](https://doc.rust-lang.org/stable/std/cell/struct.Cell.html#method.update)
- [`impl Default for *const T`](https://doc.rust-lang.org/nightly/std/primitive.pointer.html#impl-Default-for-*const+T)
- [`impl Default for *mut T`](https://doc.rust-lang.org/nightly/std/primitive.pointer.html#impl-Default-for-*mut+T)
- [`HashMap::extract_if`](https://doc.rust-lang.org/stable/std/collections/struct.HashMap.html#method.extract_if)
- [`HashSet::extract_if`](https://doc.rust-lang.org/stable/std/collections/struct.HashSet.html#method.extract_if)
- [`proc_macro::Span::line`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.line)
- [`proc_macro::Span::column`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.column)
- [`proc_macro::Span::start`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.start)
- [`proc_macro::Span::end`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.end)
- [`proc_macro::Span::file`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.file)
- [`proc_macro::Span::local_file`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.local_file)

These previously stable APIs are now stable in const contexts:

- [`NonNull<T>::replace`](https://doc.rust-lang.org/stable/std/ptr/struct.NonNull.html#method.replace)
- [`<*mut T>::replace`](https://doc.rust-lang.org/stable/std/primitive.pointer.html#method.replace)
- [`std::ptr::swap_nonoverlapping`](https://github.com/rust-lang/rust/pull/137280)
- [`Cell::{replace, get, get_mut, from_mut, as_slice_of_cells}`](https://github.com/rust-lang/rust/pull/137928)

<a id="1.88.0-Cargo"></a>

Cargo
-----
- [Stabilize automatic garbage collection.](https://github.com/rust-lang/cargo/pull/14287/)
- [Use `zlib-rs` for gzip compression in Rust code.](https://github.com/rust-lang/cargo/pull/15417/)

<a id="1.88.0-Rustdoc"></a>

Rustdoc
-------
- [Doctests can be ignored based on target names using `ignore-*` attributes.](https://github.com/rust-lang/rust/pull/137096)
- [Stabilize the `--test-runtool` and `--test-runtool-arg` CLI options to specify a program (like qemu) and its arguments for running a doctest.](https://github.com/rust-lang/rust/pull/137096)

<a id="1.88.0-Compatibility-Notes"></a>

Compatibility Notes
-------------------
- [Finish changing the internal representation of pasted tokens.](https://github.com/rust-lang/rust/pull/124141)
  Certain invalid declarative macros that were previously accepted in obscure circumstances are now correctly rejected by the compiler. Using a `tt` fragment specifier can often fix these macros.
- [Fully de-stabilize the `#[bench]` attribute.](https://github.com/rust-lang/rust/pull/134273)
  Usage of `#[bench]` without `#![feature(custom_test_frameworks)]` has triggered a deny-by-default future-incompatibility lint since Rust 1.77, and now becomes a hard error.
- [Fix borrow checking of some always-true patterns.](https://github.com/rust-lang/rust/pull/139042)
  The borrow checker was overly permissive in some cases, allowing programs that shouldn't have compiled.
- [Update the minimum external LLVM to 19.](https://github.com/rust-lang/rust/pull/139275)
- [Make it a hard error to use a vector type with a non-Rust ABI without enabling the required target feature.](https://github.com/rust-lang/rust/pull/139309)

Version 1.87.0 (2025-05-15)
===========================

@@ -8,6 +8,14 @@
# `bootstrap.toml` in the current directory of a build for build configuration, but
# a custom configuration file can also be specified with `--config` to the build
# system.
#
# Note that the following are equivalent; for more details see <https://toml.io/en/v1.0.0>.
#
# build.verbose = 1
#
# [build]
# verbose = 1

# =============================================================================
# Global Settings

@@ -44,7 +52,6 @@
# =============================================================================
# Tweaking how LLVM is compiled
# =============================================================================
[llvm]

# Whether to use Rust CI built LLVM instead of locally building it.
#

@@ -62,50 +69,50 @@
#
# Note that many of the LLVM options are not currently supported for
# downloading. Currently only the "assertions" option can be toggled.
#download-ci-llvm = true
#llvm.download-ci-llvm = true

# Indicates whether the LLVM build is a Release or Debug build
#optimize = true
#llvm.optimize = true

# Indicates whether LLVM should be built with ThinLTO. Note that this will
# only succeed if you use clang, lld, llvm-ar, and llvm-ranlib in your C/C++
# toolchain (see the `cc`, `cxx`, `linker`, `ar`, and `ranlib` options below).
# More info at: https://clang.llvm.org/docs/ThinLTO.html#clang-bootstrap
#thin-lto = false
#llvm.thin-lto = false

# Indicates whether an LLVM Release build should include debug info
#release-debuginfo = false
#llvm.release-debuginfo = false

# Indicates whether the LLVM assertions are enabled or not
# NOTE: When assertions are disabled, bugs in the integration between rustc and LLVM can lead to
# unsoundness (segfaults, etc.) in the rustc process itself, not just in the generated code.
#assertions = false
#llvm.assertions = false

# Indicates whether the LLVM testsuite is enabled in the build or not. Does
# not execute the tests as part of `x.py build` et al.;
# just makes it possible to do `ninja check-llvm` in the staged LLVM build
# directory when doing LLVM development as part of Rust development.
#tests = false
#llvm.tests = false

# Indicates whether the LLVM plugin is enabled or not
#plugins = false
#llvm.plugins = false

# Whether to build Enzyme as the AutoDiff backend.
#enzyme = false
#llvm.enzyme = false

# Whether to build LLVM with support for its GPU offload runtime.
#offload = false
#llvm.offload = false

# When true, link libstdc++ statically into rustc_llvm.
# This is useful if you don't want to use the dynamic version of that
# library provided by LLVM.
#static-libstdcpp = false
#llvm.static-libstdcpp = false

# Enable LLVM to use zstd for compression.
#libzstd = false
#llvm.libzstd = false

# Whether to use Ninja to build LLVM. This runs much faster than make.
#ninja = true
#llvm.ninja = true

# LLVM targets to build support for.
# Note: this is NOT related to Rust compilation targets. However, as Rust is

@@ -113,13 +120,13 @@
# the resulting rustc being unable to compile for the disabled architectures.
#
# To add support for new targets, see https://rustc-dev-guide.rust-lang.org/building/new-target.html.
#targets = "AArch64;AMDGPU;ARM;BPF;Hexagon;LoongArch;MSP430;Mips;NVPTX;PowerPC;RISCV;Sparc;SystemZ;WebAssembly;X86"
#llvm.targets = "AArch64;AMDGPU;ARM;BPF;Hexagon;LoongArch;MSP430;Mips;NVPTX;PowerPC;RISCV;Sparc;SystemZ;WebAssembly;X86"

# LLVM experimental targets to build support for. These targets are specified in
# the same format as above, but since these targets are experimental, they are
# not built by default and the experimental Rust compilation targets that depend
# on them will not work unless the user opts in to building them.
#experimental-targets = "AVR;M68k;CSKY"
#llvm.experimental-targets = "AVR;M68k;CSKY"

# Cap the number of parallel linker invocations when compiling LLVM.
# This can be useful when building LLVM with debug info, which significantly

@@ -127,86 +134,84 @@
# each linker process.
# If set to 0, linker invocations are treated like any other job and
# controlled by bootstrap's -j parameter.
#link-jobs = 0
#llvm.link-jobs = 0

# Whether to build LLVM as a dynamically linked library (as opposed to statically linked).
# Under the hood, this passes `--shared` to llvm-config.
# NOTE: To avoid performing LTO multiple times, we suggest setting this to `true` when `thin-lto` is enabled.
#link-shared = llvm.thin-lto
#llvm.link-shared = llvm.thin-lto

# When building llvm, this configures what is being appended to the version.
# To use LLVM version as is, provide an empty string.
#version-suffix = if rust.channel == "dev" { "-rust-dev" } else { "-rust-$version-$channel" }
#llvm.version-suffix = if rust.channel == "dev" { "-rust-dev" } else { "-rust-$version-$channel" }

# On MSVC you can compile LLVM with clang-cl, but the test suite doesn't pass
# with clang-cl, so this is special in that it only compiles LLVM with clang-cl.
# Note that this takes a /path/to/clang-cl, not a boolean.
#clang-cl = cc
#llvm.clang-cl = cc

# Pass extra compiler and linker flags to the LLVM CMake build.
#cflags = ""
#cxxflags = ""
#ldflags = ""
#llvm.cflags = ""
#llvm.cxxflags = ""
#llvm.ldflags = ""

# Use libc++ when building LLVM instead of libstdc++. This is the default on
# platforms that already use libc++ as the default C++ library, but this option
# allows you to use libc++ even on platforms where it's not. You need to ensure
# that your host compiler ships with libc++.
#use-libcxx = false
#llvm.use-libcxx = false

# The value specified here will be passed as `-DLLVM_USE_LINKER` to CMake.
#use-linker = <none> (path)
#llvm.use-linker = <none> (path)

# Whether or not to specify `-DLLVM_TEMPORARILY_ALLOW_OLD_TOOLCHAIN=YES`
#allow-old-toolchain = false
#llvm.allow-old-toolchain = false

# Whether to include the Polly optimizer.
#polly = false
#llvm.polly = false

# Whether to build the clang compiler.
#clang = false
#llvm.clang = false

# Whether to enable LLVM compilation warnings.
#enable-warnings = false
#llvm.enable-warnings = false

# Custom CMake defines to set when building LLVM.
#build-config = {}
#llvm.build-config = {}

# =============================================================================
# Tweaking how GCC is compiled
# =============================================================================
[gcc]
# Download GCC from CI instead of building it locally.
# Note that this will attempt to download GCC even if there are local
# modifications to the `src/gcc` submodule.
# Currently, this is only supported for the `x86_64-unknown-linux-gnu` target.
#download-ci-gcc = false
#gcc.download-ci-gcc = false

# =============================================================================
# General build configuration options
# =============================================================================
[build]

# The default stage to use for the `check` subcommand
#check-stage = 0
#build.check-stage = 0

# The default stage to use for the `doc` subcommand
#doc-stage = 0
#build.doc-stage = 0

# The default stage to use for the `build` subcommand
#build-stage = 1
#build.build-stage = 1

# The default stage to use for the `test` subcommand
#test-stage = 1
#build.test-stage = 1

# The default stage to use for the `dist` subcommand
#dist-stage = 2
#build.dist-stage = 2

# The default stage to use for the `install` subcommand
#install-stage = 2
#build.install-stage = 2

# The default stage to use for the `bench` subcommand
#bench-stage = 2
#build.bench-stage = 2
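As the header of this file notes, each commented default can be set either under its table or as a dotted key. A hypothetical local override (the values here are examples only, not recommendations) might look like:

```toml
# Hypothetical bootstrap.toml overrides; every value is an example.
[build]
check-stage = 1        # same setting as `build.check-stage = 1`
doc-stage = 1

[llvm]
download-ci-llvm = true  # same setting as `llvm.download-ci-llvm = true`
```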

# A descriptive string to be appended to version output (e.g., `rustc --version`),
# which is also used in places like debuginfo `DW_AT_producer`. This may be useful for

@@ -217,7 +222,7 @@
# upstream Rust you need to set this to "". However, note that if you set this to "" but
# are not actually compatible -- for example if you've backported patches that change
# behavior -- this may lead to miscompilations or other bugs.
#description = ""
#build.description = ""

# Build triple for the pre-compiled snapshot compiler. If `rustc` is set, this must match its host
# triple (see `rustc --version --verbose`; cross-compiling the rust build system itself is NOT

@@ -229,14 +234,14 @@
# Otherwise, `x.py` will try to infer it from the output of `uname`.
# If `uname` is not found in PATH, we assume this is `x86_64-pc-windows-msvc`.
# This may be changed in the future.
#build = "x86_64-unknown-linux-gnu" (as an example)
#build.build = "x86_64-unknown-linux-gnu" (as an example)

# Which triples to produce a compiler toolchain for. Each of these triples will be bootstrapped from
# the build triple themselves. In other words, this is the list of triples for which to build a
# compiler that can RUN on that triple.
#
# Defaults to just the `build` triple.
#host = [build.build] (list of triples)
#build.host = [build.build] (list of triples)

# Which triples to build libraries (core/alloc/std/test/proc_macro) for. Each of these triples will
# be bootstrapped from the build triple themselves. In other words, this is the list of triples for

@@ -245,32 +250,32 @@
# Defaults to `host`. If you set this explicitly, you likely want to add all
# host triples to this list as well in order for those host toolchains to be
# able to compile programs for their native target.
#target = build.host (list of triples)
#build.target = build.host (list of triples)

# Use this directory to store build artifacts. Paths are relative to the current directory, not to
# the root of the repository.
#build-dir = "build"
#build.build-dir = "build"

# Instead of downloading the src/stage0 version of Cargo specified, use
# this Cargo binary instead to build all Rust code.
# If you set this, you likely want to set `rustc` as well.
#cargo = "/path/to/cargo"
#build.cargo = "/path/to/cargo"

# Instead of downloading the src/stage0 version of the compiler
# specified, use this rustc binary instead as the stage0 snapshot compiler.
# If you set this, you likely want to set `cargo` as well.
#rustc = "/path/to/rustc"
#build.rustc = "/path/to/rustc"

# Instead of downloading the src/stage0 version of rustfmt specified,
# use this rustfmt binary instead as the stage0 snapshot rustfmt.
#rustfmt = "/path/to/rustfmt"
#build.rustfmt = "/path/to/rustfmt"

# Instead of downloading the src/stage0 version of cargo-clippy specified,
# use this cargo-clippy binary instead as the stage0 snapshot cargo-clippy.
#
# Note that this option should be used with the same toolchain as the `rustc` option above.
# Otherwise, clippy is likely to fail due to a toolchain conflict.
#cargo-clippy = "/path/to/cargo-clippy"
#build.cargo-clippy = "/path/to/cargo-clippy"

# Whether to build documentation by default. If false, rustdoc and
# friends will still be compiled but they will not be used to generate any

@@ -278,47 +283,47 @@
#
# You can still build documentation when this is disabled by explicitly passing paths,
# e.g. `x doc library`.
#docs = true
#build.docs = true

# Flag to specify whether CSS, JavaScript, and HTML are minified when
# docs are generated. JSON is always minified, because it's enormous,
# and generated in already-minified form from the beginning.
#docs-minification = true
#build.docs-minification = true

# Flag to specify whether private items should be included in the library docs.
#library-docs-private-items = false
#build.library-docs-private-items = false

# Indicate whether to build compiler documentation by default.
# You can still build documentation when this is disabled by explicitly passing a path: `x doc compiler`.
#compiler-docs = false
#build.compiler-docs = false

# Indicate whether git submodules are managed and updated automatically.
#submodules = true
#build.submodules = true

# The path to (or name of) the GDB executable to use. This is only used for
# executing the debuginfo test suite.
#gdb = "gdb"
#build.gdb = "gdb"

# The path to (or name of) the LLDB executable to use. This is only used for
# executing the debuginfo test suite.
#lldb = "lldb"
#build.lldb = "lldb"

# The node.js executable to use. Note that this is only used for the emscripten
# target when running tests, otherwise this can be omitted.
#nodejs = "node"
#build.nodejs = "node"

# The npm executable to use. Note that this is only used for rustdoc-gui tests,
# otherwise this can be omitted.
#
# On Windows this should be `npm.cmd` or a path to it (verified on nodejs v18.06),
# or an error will be emitted.
#npm = "npm"
#build.npm = "npm"

# Python interpreter to use for various tasks throughout the build, notably
# rustdoc tests, the lldb python interpreter, and some dist bits and pieces.
#
# Defaults to the Python interpreter used to execute x.py.
#python = "python"
#build.python = "python"

# The path to the REUSE executable to use. Note that REUSE is not required in
# most cases, as our tooling relies on a cached (and shrunk) copy of the

@@ -328,17 +333,17 @@
# repository to change, and the cached copy has to be regenerated.
#
# Defaults to the "reuse" command in the system path.
#reuse = "reuse"
#build.reuse = "reuse"

# Force Cargo to check that Cargo.lock describes the precise dependency
# set that all the Cargo.toml files create, instead of updating it.
#locked-deps = false
#build.locked-deps = false

# Indicate whether the vendored sources are used for Rust dependencies or not.
#
# Vendoring requires additional setup. We recommend using the pre-generated source tarballs if you
# want to use vendoring. See https://forge.rust-lang.org/infra/other-installation-methods.html#source-code.
#vendor = if "is a tarball source" && "vendor" dir exists && ".cargo/config.toml" file exists { true } else { false }
#build.vendor = if "is a tarball source" && "vendor" dir exists && ".cargo/config.toml" file exists { true } else { false }

# Typically the build system will build the Rust compiler twice. The second
# compiler, however, will simply use its own libraries to link against. If you

@@ -346,11 +351,11 @@
# then you can set this option to true.
#
# This is only useful for verifying that rustc generates reproducible builds.
#full-bootstrap = false
#build.full-bootstrap = false

# Set the bootstrap/download cache path. It is useful when building rust
# repeatedly in a CI environment.
#bootstrap-cache-path = /path/to/shared/cache
#build.bootstrap-cache-path = /path/to/shared/cache

# Enable a build of the extended Rust tool set which is not only the compiler
# but also tools such as Cargo. This will also produce "combined installers"

@@ -359,7 +364,7 @@
# which tools should be built if `extended = true`.
#
# This is disabled by default.
#extended = false
#build.extended = false

# Set of tools to be included in the installation.
#

@@ -368,7 +373,7 @@
# If `extended = true`, they are all included.
#
# If any enabled tool fails to build, the installation fails.
#tools = [
#build.tools = [
# "cargo",
# "clippy",
# "rustdoc",

@@ -388,17 +393,17 @@
#
# The default value for the `features` array is `[]`. However, please note that other flags in
# `bootstrap.toml` might influence the features enabled for some tools.
#tool.TOOL_NAME.features = [FEATURE1, FEATURE2]
#build.tool.TOOL_NAME.features = [FEATURE1, FEATURE2]

# Verbosity level: 0 == not verbose, 1 == verbose, 2 == very verbose, 3 == print environment variables on each rustc invocation
#verbose = 0
#build.verbose = 0

# Build the sanitizer runtimes
#sanitizers = false
#build.sanitizers = false

# Build the profiler runtime (required when compiling with options that depend
# on this runtime, such as `-C profile-generate` or `-C instrument-coverage`).
#profiler = false
#build.profiler = false

# Use the optimized LLVM C intrinsics for `compiler_builtins`, rather than Rust intrinsics.
# Requires the LLVM submodule to be managed by bootstrap (i.e. not external) so that `compiler-rt`
@ -406,102 +411,100 @@
|
|||
#
|
||||
# Setting this to `false` generates slower code, but removes the requirement for a C toolchain in
|
||||
# order to run `x check`.
|
||||
#optimized-compiler-builtins = if rust.channel == "dev" { false } else { true }
|
||||
#build.optimized-compiler-builtins = if rust.channel == "dev" { false } else { true }
|
||||
|
||||
# Indicates whether the native libraries linked into Cargo will be statically
|
||||
# linked or not.
|
||||
#cargo-native-static = false
|
||||
#build.cargo-native-static = false
|
||||
|
||||
# Run the build with low priority, by setting the process group's "nice" value
|
||||
# to +10 on Unix platforms, and by using a "low priority" job object on Windows.
|
||||
#low-priority = false
|
||||
#build.low-priority = false
|
||||
|
||||
# Arguments passed to the `./configure` script, used during distcheck. You
|
||||
# probably won't fill this in but rather it's filled in by the `./configure`
|
||||
# script. Useful for debugging.
|
||||
#configure-args = []
|
||||
#build.configure-args = []
|
||||
|
||||
# Indicates that a local rebuild is occurring instead of a full bootstrap,
|
||||
# essentially skipping stage0 as the local compiler is recompiling itself again.
|
||||
# Useful for modifying only the stage2 compiler without having to pass `--keep-stage 0` each time.
|
||||
#local-rebuild = false
|
||||
#build.local-rebuild = false
|
||||
|
||||
# Print out how long each bootstrap step took (mostly intended for CI and
|
||||
# tracking over time)
|
||||
#print-step-timings = false
|
||||
#build.print-step-timings = false
|
||||
|
||||
# Print out resource usage data for each bootstrap step, as defined by the Unix
|
||||
# struct rusage. (Note that this setting is completely unstable: the data it
|
||||
# captures, what platforms it supports, the format of its associated output, and
|
||||
# this setting's very existence, are all subject to change.)
|
||||
#print-step-rusage = false
|
||||
#build.print-step-rusage = false
|
||||
|
||||
# Always patch binaries for usage with Nix toolchains. If `true` then binaries
|
||||
# will be patched unconditionally. If `false` or unset, binaries will be patched
|
||||
# only if the current distribution is NixOS. This option is useful when using
|
||||
# a Nix toolchain on non-NixOS distributions.
|
||||
#patch-binaries-for-nix = false
|
||||
#build.patch-binaries-for-nix = false
|
||||
|
||||
# Collect information and statistics about the current build, and write it to
|
||||
# disk. Enabling this has no impact on the resulting build output. The
|
||||
# schema of the file generated by the build metrics feature is unstable, and
|
||||
# this is not intended to be used during local development.
|
||||
#metrics = false
|
||||
#build.metrics = false
|
||||
|
||||
# Specify the location of the Android NDK. Used when targeting Android.
|
||||
#android-ndk = "/path/to/android-ndk-r26d"
|
||||
#build.android-ndk = "/path/to/android-ndk-r26d"
|
||||
|
||||
# Number of parallel jobs to be used for building and testing. If set to `0` or
|
||||
# omitted, it will be automatically determined. This is the `-j`/`--jobs` flag
|
||||
# passed to cargo invocations.
|
||||
#jobs = 0
|
||||
#build.jobs = 0
|
||||
|
||||
# What custom diff tool to use for displaying compiletest tests.
|
||||
#compiletest-diff-tool = <none>
|
||||
#build.compiletest-diff-tool = <none>
|
||||
|
||||
# Whether to use the precompiled stage0 libtest with compiletest.
|
||||
#compiletest-use-stage0-libtest = true
|
||||
#build.compiletest-use-stage0-libtest = true
|
||||
|
||||
# Indicates whether ccache is used when building certain artifacts (e.g. LLVM).
|
||||
# Set to `true` to use the first `ccache` in PATH, or set an absolute path to use
|
||||
# a specific version.
|
||||
#ccache = false
|
||||
#build.ccache = false
|
||||
|
||||
# List of paths to exclude from the build and test processes.
|
||||
# For example, exclude = ["tests/ui", "src/tools/tidy"].
|
||||
#exclude = []
|
||||
#build.exclude = []
|
||||
|
||||
# =============================================================================
|
||||
# General install configuration options
|
||||
# =============================================================================
|
||||
[install]
|
||||
|
||||
# Where to install the generated toolchain. Must be an absolute path.
|
||||
#prefix = "/usr/local"
|
||||
#install.prefix = "/usr/local"
|
||||
|
||||
# Where to install system configuration files.
|
||||
# If this is a relative path, it will get installed in `prefix` above
|
||||
#sysconfdir = "/etc"
|
||||
#install.sysconfdir = "/etc"
|
||||
|
||||
# Where to install documentation in `prefix` above
|
||||
#docdir = "share/doc/rust"
|
||||
#install.docdir = "share/doc/rust"
|
||||
|
||||
# Where to install binaries in `prefix` above
|
||||
#bindir = "bin"
|
||||
#install.bindir = "bin"
|
||||
|
||||
# Where to install libraries in `prefix` above
|
||||
#libdir = "lib"
|
||||
#install.libdir = "lib"
|
||||
|
||||
# Where to install man pages in `prefix` above
|
||||
#mandir = "share/man"
|
||||
#install.mandir = "share/man"
|
||||
|
||||
# Where to install data in `prefix` above
|
||||
#datadir = "share"
|
||||
#install.datadir = "share"
|
||||
|
||||
# =============================================================================
|
||||
# Options for compiling Rust code itself
|
||||
# =============================================================================
|
||||
[rust]
|
||||
|
||||
# Whether or not to optimize when compiling the compiler and standard library,
|
||||
# and what level of optimization to use.
|
||||
|
|
@ -517,7 +520,7 @@
|
|||
# 3 - All optimizations.
|
||||
# "s" - Optimize for binary size.
|
||||
# "z" - Optimize for binary size, but also turn off loop vectorization.
|
||||
#optimize = true
|
||||
#rust.optimize = true
|
||||
|
||||
# Indicates that the build should be configured for debugging Rust. A
|
||||
# `debug`-enabled compiler and standard library will be somewhat
|
||||
|
|
@ -540,7 +543,7 @@
|
|||
# "maximally debuggable" environment (notably libstd) takes
|
||||
# hours to build.
|
||||
#
|
||||
#debug = false
|
||||
#rust.debug = false
|
||||
|
||||
# Whether to download the stage 1 and 2 compilers from CI. This is useful if you
|
||||
# are working on tools, doc-comments, or library (you will be able to build the
|
||||
|
|
@ -553,37 +556,37 @@
|
|||
#
|
||||
# Set this to `true` to always download or `false` to always use the in-tree
|
||||
# compiler.
|
||||
#download-rustc = false
|
||||
#rust.download-rustc = false
|
||||
|
||||
# Number of codegen units to use for each compiler invocation. A value of 0
|
||||
# means "the number of cores on this machine", and 1+ is passed through to the
|
||||
# compiler.
|
||||
#
|
||||
# Uses the rustc defaults: https://doc.rust-lang.org/rustc/codegen-options/index.html#codegen-units
|
||||
#codegen-units = if incremental { 256 } else { 16 }
|
||||
#rust.codegen-units = if incremental { 256 } else { 16 }
|
||||
|
||||
# Sets the number of codegen units to build the standard library with,
|
||||
# regardless of what the codegen-unit setting for the rest of the compiler is.
|
||||
# NOTE: building with anything other than 1 is known to occasionally have bugs.
|
||||
#codegen-units-std = codegen-units
|
||||
#rust.codegen-units-std = codegen-units
|
||||
|
||||
# Whether or not debug assertions are enabled for the compiler and standard library.
|
||||
# These can help find bugs at the cost of a small runtime slowdown.
|
||||
#
|
||||
# Defaults to rust.debug value
|
||||
#debug-assertions = rust.debug (boolean)
|
||||
#rust.debug-assertions = rust.debug (boolean)
|
||||
|
||||
# Whether or not debug assertions are enabled for the standard library.
|
||||
# Overrides the `debug-assertions` option, if defined.
|
||||
#
|
||||
# Defaults to rust.debug-assertions value
|
||||
#debug-assertions-std = rust.debug-assertions (boolean)
|
||||
#rust.debug-assertions-std = rust.debug-assertions (boolean)
|
||||
|
||||
# Whether or not debug assertions are enabled for the tools built by bootstrap.
|
||||
# Overrides the `debug-assertions` option, if defined.
|
||||
#
|
||||
# Defaults to rust.debug-assertions value
|
||||
#debug-assertions-tools = rust.debug-assertions (boolean)
|
||||
#rust.debug-assertions-tools = rust.debug-assertions (boolean)
|
||||
|
||||
# Whether or not to leave debug! and trace! calls in the rust binary.
|
||||
#
|
||||
|
|
@ -591,22 +594,22 @@
|
|||
#
|
||||
# If you see a message from `tracing` saying "some trace filter directives would enable traces that
|
||||
# are disabled statically" because `max_level_info` is enabled, set this value to `true`.
|
||||
#debug-logging = rust.debug-assertions (boolean)
|
||||
#rust.debug-logging = rust.debug-assertions (boolean)
|
||||
|
||||
# Whether or not to build rustc, tools and the libraries with randomized type layout
|
||||
#randomize-layout = false
|
||||
#rust.randomize-layout = false
|
||||
|
||||
# Whether or not overflow checks are enabled for the compiler and standard
|
||||
# library.
|
||||
#
|
||||
# Defaults to rust.debug value
|
||||
#overflow-checks = rust.debug (boolean)
|
||||
#rust.overflow-checks = rust.debug (boolean)
|
||||
|
||||
# Whether or not overflow checks are enabled for the standard library.
|
||||
# Overrides the `overflow-checks` option, if defined.
|
||||
#
|
||||
# Defaults to rust.overflow-checks value
|
||||
#overflow-checks-std = rust.overflow-checks (boolean)
|
||||
#rust.overflow-checks-std = rust.overflow-checks (boolean)
|
||||
|
||||
# Debuginfo level for most of Rust code, corresponds to the `-C debuginfo=N` option of `rustc`.
|
||||
# See https://doc.rust-lang.org/rustc/codegen-options/index.html#debuginfo for available options.
|
||||
|
|
@ -617,20 +620,20 @@
|
|||
#
|
||||
# Note that debuginfo-level = 2 generates several gigabytes of debuginfo
|
||||
# and will slow down the linking process significantly.
|
||||
#debuginfo-level = if rust.debug { 1 } else { 0 }
|
||||
#rust.debuginfo-level = if rust.debug { 1 } else { 0 }
|
||||
|
||||
# Debuginfo level for the compiler.
|
||||
#debuginfo-level-rustc = rust.debuginfo-level
|
||||
#rust.debuginfo-level-rustc = rust.debuginfo-level
|
||||
|
||||
# Debuginfo level for the standard library.
|
||||
#debuginfo-level-std = rust.debuginfo-level
|
||||
#rust.debuginfo-level-std = rust.debuginfo-level
|
||||
|
||||
# Debuginfo level for the tools.
|
||||
#debuginfo-level-tools = rust.debuginfo-level
|
||||
#rust.debuginfo-level-tools = rust.debuginfo-level
|
||||
|
||||
# Debuginfo level for the test suites run with compiletest.
|
||||
# FIXME(#61117): Some tests fail when this option is enabled.
|
||||
#debuginfo-level-tests = 0
|
||||
#rust.debuginfo-level-tests = 0
|
||||
|
||||
# Should rustc and the standard library be built with split debuginfo? Default
|
||||
# is platform dependent.
|
||||
|
|
@ -640,13 +643,13 @@
|
|||
# The value specified here is only used when targeting the `build.build` triple,
|
||||
# and is overridden by `target.<triple>.split-debuginfo` if specified.
|
||||
#
|
||||
#split-debuginfo = see target.<triple>.split-debuginfo
|
||||
#rust.split-debuginfo = see target.<triple>.split-debuginfo
|
||||
|
||||
# Whether or not `panic!`s generate backtraces (RUST_BACKTRACE)
|
||||
#backtrace = true
|
||||
#rust.backtrace = true
|
||||
|
||||
# Whether to always use incremental compilation when building rustc
|
||||
#incremental = false
|
||||
#rust.incremental = false
|
||||
|
||||
# The default linker that will be hard-coded into the generated
|
||||
# compiler for targets that don't specify a default linker explicitly
|
||||
|
|
@ -656,7 +659,7 @@
|
|||
# setting.
|
||||
#
|
||||
# See https://doc.rust-lang.org/rustc/codegen-options/index.html#linker for more information.
|
||||
#default-linker = <none> (path)
|
||||
#rust.default-linker = <none> (path)
|
||||
|
||||
# The "channel" for the Rust build to produce. The stable/beta channels only
|
||||
# allow using stable features, whereas the nightly and dev channels allow using
|
||||
|
|
@ -665,7 +668,7 @@
|
|||
# You can set the channel to "auto-detect" to load the channel name from `src/ci/channel`.
|
||||
#
|
||||
# If using tarball sources, default value is "auto-detect", otherwise, it's "dev".
|
||||
#channel = if "is a tarball source" { "auto-detect" } else { "dev" }
|
||||
#rust.channel = if "is a tarball source" { "auto-detect" } else { "dev" }
|
||||
|
||||
# The root location of the musl installation directory. The library directory
|
||||
# will also need to contain libunwind.a for an unwinding implementation. Note
|
||||
|
|
@ -673,65 +676,65 @@
|
|||
# linked binaries.
|
||||
#
|
||||
# Defaults to /usr on musl hosts. Has no default otherwise.
|
||||
#musl-root = <platform specific> (path)
|
||||
#rust.musl-root = <platform specific> (path)
|
||||
|
||||
# By default the `rustc` executable is built with `-Wl,-rpath` flags on Unix
|
||||
# platforms to ensure that the compiler is usable by default from the build
|
||||
# directory (as it links to a number of dynamic libraries). This may not be
|
||||
# desired in distributions, for example.
|
||||
#rpath = true
|
||||
#rust.rpath = true
|
||||
|
||||
# Indicates whether symbols should be stripped using `-Cstrip=symbols`.
|
||||
#strip = false
|
||||
#rust.strip = false
|
||||
|
||||
# Forces frame pointers to be used with `-Cforce-frame-pointers`.
|
||||
# This can be helpful for profiling at a small performance cost.
|
||||
#frame-pointers = false
|
||||
#rust.frame-pointers = false
|
||||
|
||||
# Indicates whether stack protectors should be used
|
||||
# via the unstable option `-Zstack-protector`.
|
||||
#
|
||||
# Valid options are : `none`(default),`basic`,`strong`, or `all`.
|
||||
# `strong` and `basic` options may be buggy and are not recommended, see rust-lang/rust#114903.
|
||||
#stack-protector = "none"
|
||||
#rust.stack-protector = "none"
|
||||
|
||||
# Prints each test name as it is executed, to help debug issues in the test harness itself.
|
||||
#verbose-tests = if is_verbose { true } else { false }
|
||||
#rust.verbose-tests = if is_verbose { true } else { false }
|
||||
|
||||
# Flag indicating whether tests are compiled with optimizations (the -O flag).
|
||||
#optimize-tests = true
|
||||
#rust.optimize-tests = true
|
||||
|
||||
# Flag indicating whether codegen tests will be run or not. If you get an error
|
||||
# saying that the FileCheck executable is missing, you may want to disable this.
|
||||
# Also see the target's llvm-filecheck option.
|
||||
#codegen-tests = true
|
||||
#rust.codegen-tests = true
|
||||
|
||||
# Flag indicating whether git info will be retrieved from .git automatically.
|
||||
# Having the git information can cause a lot of rebuilds during development.
|
||||
#omit-git-hash = if rust.channel == "dev" { true } else { false }
|
||||
#rust.omit-git-hash = if rust.channel == "dev" { true } else { false }
|
||||
|
||||
# Whether to create a source tarball by default when running `x dist`.
|
||||
#
|
||||
# You can still build a source tarball when this is disabled by explicitly passing `x dist rustc-src`.
|
||||
#dist-src = true
|
||||
#rust.dist-src = true
|
||||
|
||||
# After building or testing an optional component (e.g. the nomicon or reference), append the
|
||||
# result (broken, compiling, testing) into this JSON file.
|
||||
#save-toolstates = <none> (path)
|
||||
#rust.save-toolstates = <none> (path)
|
||||
|
||||
# This is an array of the codegen backends that will be compiled for the rustc
|
||||
# that's being compiled. The default is to only build the LLVM codegen backend,
|
||||
# and currently the only standard options supported are `"llvm"`, `"cranelift"`
|
||||
# and `"gcc"`. The first backend in this list will be used as default by rustc
|
||||
# when no explicit backend is specified.
|
||||
#codegen-backends = ["llvm"]
|
||||
#rust.codegen-backends = ["llvm"]
|
||||
|
||||
# Indicates whether LLD will be compiled and made available in the sysroot for rustc to execute, and
|
||||
# whether to set it as rustc's default linker on `x86_64-unknown-linux-gnu`. This will also only be
|
||||
# when *not* building an external LLVM (so only when using `download-ci-llvm` or building LLVM from
|
||||
# the in-tree source): setting `llvm-config` in the `[target.x86_64-unknown-linux-gnu]` section will
|
||||
# make this default to false.
|
||||
#lld = false in all cases, except on `x86_64-unknown-linux-gnu` as described above, where it is true
|
||||
#rust.lld = false in all cases, except on `x86_64-unknown-linux-gnu` as described above, where it is true
|
||||
|
||||
# Indicates whether LLD will be used to link Rust crates during bootstrap on
|
||||
# supported platforms.
|
||||
|
|
@ -742,56 +745,56 @@
|
|||
# On MSVC, LLD will not be used if we're cross linking.
|
||||
#
|
||||
# Explicitly setting the linker for a target will override this option when targeting MSVC.
|
||||
#use-lld = false
|
||||
#rust.use-lld = false
|
||||
|
||||
# Indicates whether some LLVM tools, like llvm-objdump, will be made available in the
|
||||
# sysroot.
|
||||
#llvm-tools = true
|
||||
#rust.llvm-tools = true
|
||||
|
||||
# Indicates whether the `self-contained` llvm-bitcode-linker, will be made available
|
||||
# in the sysroot. It is required for running nvptx tests.
|
||||
#llvm-bitcode-linker = false
|
||||
#rust.llvm-bitcode-linker = false
|
||||
|
||||
# Whether to deny warnings in crates
|
||||
#deny-warnings = true
|
||||
#rust.deny-warnings = true
|
||||
|
||||
# Print backtrace on internal compiler errors during bootstrap
|
||||
#backtrace-on-ice = false
|
||||
#rust.backtrace-on-ice = false
|
||||
|
||||
# Whether to verify generated LLVM IR
|
||||
#verify-llvm-ir = false
|
||||
#rust.verify-llvm-ir = false
|
||||
|
||||
# Compile the compiler with a non-default ThinLTO import limit. This import
|
||||
# limit controls the maximum size of functions imported by ThinLTO. Decreasing
|
||||
# will make code compile faster at the expense of lower runtime performance.
|
||||
#thin-lto-import-instr-limit = if incremental { 10 } else { LLVM default (currently 100) }
|
||||
#rust.thin-lto-import-instr-limit = if incremental { 10 } else { LLVM default (currently 100) }
|
||||
|
||||
# Map debuginfo paths to `/rust/$sha/...`.
|
||||
# Useful for reproducible builds. Generally only set for releases
|
||||
#remap-debuginfo = false
|
||||
#rust.remap-debuginfo = false
|
||||
|
||||
# Link the compiler and LLVM against `jemalloc` instead of the default libc allocator.
|
||||
# This option is only tested on Linux and OSX. It can also be configured per-target in the
|
||||
# [target.<tuple>] section.
|
||||
#jemalloc = false
|
||||
#rust.jemalloc = false
|
||||
|
||||
# Run tests in various test suites with the "nll compare mode" in addition to
|
||||
# running the tests in normal mode. Largely only used on CI and during local
|
||||
# development of NLL
|
||||
#test-compare-mode = false
|
||||
#rust.test-compare-mode = false
|
||||
|
||||
# Global default for llvm-libunwind for all targets. See the target-specific
|
||||
# documentation for llvm-libunwind below. Note that the target-specific
|
||||
# option will override this if set.
|
||||
#llvm-libunwind = 'no'
|
||||
#rust.llvm-libunwind = 'no'
|
||||
|
||||
# Enable Windows Control Flow Guard checks in the standard library.
|
||||
# This only applies from stage 1 onwards, and only for Windows targets.
|
||||
#control-flow-guard = false
|
||||
#rust.control-flow-guard = false
|
||||
|
||||
# Enable Windows EHCont Guard checks in the standard library.
|
||||
# This only applies from stage 1 onwards, and only for Windows targets.
|
||||
#ehcont-guard = false
|
||||
#rust.ehcont-guard = false
|
||||
|
||||
# Enable symbol-mangling-version v0. This can be helpful when profiling rustc,
|
||||
# as generics will be preserved in symbols (rather than erased into opaque T).
|
||||
|
|
@ -799,16 +802,16 @@
|
|||
# compiler and its tools and the legacy scheme will be used when compiling the
|
||||
# standard library.
|
||||
# If an explicit setting is given, it will be used for all parts of the codebase.
|
||||
#new-symbol-mangling = true|false (see comment)
|
||||
#rust.new-symbol-mangling = true|false (see comment)
|
||||
|
||||
# Select LTO mode that will be used for compiling rustc. By default, thin local LTO
|
||||
# (LTO within a single crate) is used (like for any Rust crate). You can also select
|
||||
# "thin" or "fat" to apply Thin/Fat LTO to the `rustc_driver` dylib, or "off" to disable
|
||||
# LTO entirely.
|
||||
#lto = "thin-local"
|
||||
#rust.lto = "thin-local"
|
||||
|
||||
# Build compiler with the optimization enabled and -Zvalidate-mir, currently only for `std`
|
||||
#validate-mir-opts = 3
|
||||
#rust.validate-mir-opts = 3
|
||||
|
||||
# Configure `std` features used during bootstrap.
|
||||
#
|
||||
|
|
@ -822,7 +825,57 @@
|
|||
#
|
||||
# Since libstd also builds libcore and liballoc as dependencies and all their features are mirrored
|
||||
# as libstd features, this option can also be used to configure features such as optimize_for_size.
|
||||
#std-features = ["panic_unwind"]
|
||||
#rust.std-features = ["panic_unwind"]
|
||||
|
||||
# =============================================================================
|
||||
# Distribution options
|
||||
#
|
||||
# These options are related to distribution, mostly for the Rust project itself.
|
||||
# You probably won't need to concern yourself with any of these options
|
||||
# =============================================================================
|
||||
|
||||
# This is the folder of artifacts that the build system will sign. All files in
|
||||
# this directory will be signed with the default gpg key using the system `gpg`
|
||||
# binary. The `asc` and `sha256` files will all be output into the standard dist
|
||||
# output folder (currently `build/dist`)
|
||||
#
|
||||
# This folder should be populated ahead of time before the build system is
|
||||
# invoked.
|
||||
#dist.sign-folder = <none> (path)
|
||||
|
||||
# The remote address that all artifacts will eventually be uploaded to. The
|
||||
# build system generates manifests which will point to these urls, and for the
|
||||
# manifests to be correct they'll have to have the right URLs encoded.
|
||||
#
|
||||
# Note that this address should not contain a trailing slash as file names will
|
||||
# be appended to it.
|
||||
#dist.upload-addr = <none> (URL)
|
||||
|
||||
# Whether to build a plain source tarball to upload
|
||||
# We disable that on Windows not to override the one already uploaded on S3
|
||||
# as the one built on Windows will contain backslashes in paths causing problems
|
||||
# on linux
|
||||
#dist.src-tarball = true
|
||||
|
||||
# List of compression formats to use when generating dist tarballs. The list of
|
||||
# formats is provided to rust-installer, which must support all of them.
|
||||
#
|
||||
# This list must be non-empty.
|
||||
#dist.compression-formats = ["gz", "xz"]
|
||||
|
||||
# How much time should be spent compressing the tarballs. The better the
|
||||
# compression profile, the longer compression will take.
|
||||
#
|
||||
# Available options: fast, balanced, best
|
||||
#dist.compression-profile = "fast"
|
||||
|
||||
# Copy the linker, DLLs, and various libraries from MinGW into the Rust toolchain.
|
||||
# Only applies when the host or target is pc-windows-gnu.
|
||||
#dist.include-mingw-linker = true
|
||||
|
||||
# Whether to vendor dependencies for the dist tarball.
|
||||
#dist.vendor = if "is a tarball source" || "is a git repository" { true } else { false }
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Options for specific targets
|
||||
|
|
@ -973,53 +1026,3 @@
|
|||
# Link the compiler and LLVM against `jemalloc` instead of the default libc allocator.
|
||||
# This overrides the global `rust.jemalloc` option. See that option for more info.
|
||||
#jemalloc = rust.jemalloc (bool)
|
||||
|
||||
# =============================================================================
|
||||
# Distribution options
|
||||
#
|
||||
# These options are related to distribution, mostly for the Rust project itself.
|
||||
# You probably won't need to concern yourself with any of these options
|
||||
# =============================================================================
|
||||
[dist]
|
||||
|
||||
# This is the folder of artifacts that the build system will sign. All files in
|
||||
# this directory will be signed with the default gpg key using the system `gpg`
|
||||
# binary. The `asc` and `sha256` files will all be output into the standard dist
|
||||
# output folder (currently `build/dist`)
|
||||
#
|
||||
# This folder should be populated ahead of time before the build system is
|
||||
# invoked.
|
||||
#sign-folder = <none> (path)
|
||||
|
||||
# The remote address that all artifacts will eventually be uploaded to. The
|
||||
# build system generates manifests which will point to these urls, and for the
|
||||
# manifests to be correct they'll have to have the right URLs encoded.
|
||||
#
|
||||
# Note that this address should not contain a trailing slash as file names will
|
||||
# be appended to it.
|
||||
#upload-addr = <none> (URL)
|
||||
|
||||
# Whether to build a plain source tarball to upload
|
||||
# We disable that on Windows not to override the one already uploaded on S3
|
||||
# as the one built on Windows will contain backslashes in paths causing problems
|
||||
# on linux
|
||||
#src-tarball = true
|
||||
|
||||
# List of compression formats to use when generating dist tarballs. The list of
|
||||
# formats is provided to rust-installer, which must support all of them.
|
||||
#
|
||||
# This list must be non-empty.
|
||||
#compression-formats = ["gz", "xz"]
|
||||
|
||||
# How much time should be spent compressing the tarballs. The better the
|
||||
# compression profile, the longer compression will take.
|
||||
#
|
||||
# Available options: fast, balanced, best
|
||||
#compression-profile = "fast"
|
||||
|
||||
# Copy the linker, DLLs, and various libraries from MinGW into the Rust toolchain.
|
||||
# Only applies when the host or target is pc-windows-gnu.
|
||||
#include-mingw-linker = true
|
||||
|
||||
# Whether to vendor dependencies for the dist tarball.
|
||||
#vendor = if "is a tarball source" || "is a git repository" { true } else { false }
|
||||
|
|
|
|||
|
|
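The hunks above perform one mechanical rewrite: every commented-out option gains its TOML section name as a dotted prefix (`#jobs = 0` in `[build]` becomes `#build.jobs = 0`). The transformation can be sketched as a small helper; `prefix_option` is a hypothetical name for illustration, not part of bootstrap:

```rust
/// Prefix a commented-out `#key = value` line with its TOML section name,
/// e.g. `#jobs = 0` under `[build]` becomes `#build.jobs = 0`.
/// `prefix_option` is a hypothetical helper, not part of bootstrap itself.
fn prefix_option(section: &str, line: &str) -> String {
    match line.strip_prefix('#') {
        // Only rewrite lines that look like `#key = value`; leave plain
        // comments (`# text ...`) and everything else untouched.
        Some(rest) if rest.contains(" = ") && !rest.starts_with(' ') => {
            format!("#{section}.{rest}")
        }
        _ => line.to_string(),
    }
}

fn main() {
    assert_eq!(prefix_option("build", "#jobs = 0"), "#build.jobs = 0");
    assert_eq!(prefix_option("rust", "#optimize = true"), "#rust.optimize = true");
    // Plain comment lines are left alone.
    assert_eq!(prefix_option("build", "# linked or not."), "# linked or not.");
}
```

The dotted form is valid TOML (`build.jobs = 0` at top level is equivalent to `jobs = 0` inside `[build]`), so uncommenting a line works without first locating its section header.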
@@ -63,8 +63,8 @@ impl fmt::Display for CanonAbi
             CanonAbi::Custom => ExternAbi::Custom,
             CanonAbi::Arm(arm_call) => match arm_call {
                 ArmCall::Aapcs => ExternAbi::Aapcs { unwind: false },
-                ArmCall::CCmseNonSecureCall => ExternAbi::CCmseNonSecureCall,
-                ArmCall::CCmseNonSecureEntry => ExternAbi::CCmseNonSecureEntry,
+                ArmCall::CCmseNonSecureCall => ExternAbi::CmseNonSecureCall,
+                ArmCall::CCmseNonSecureEntry => ExternAbi::CmseNonSecureEntry,
             },
             CanonAbi::GpuKernel => ExternAbi::GpuKernel,
             CanonAbi::Interrupt(interrupt_kind) => match interrupt_kind {
@@ -36,6 +36,10 @@ pub enum ExternAbi {
     /// Stronger than just `#[cold]` because `fn` pointers might be incompatible.
     RustCold,

+    /// An always-invalid ABI that's used to test "this ABI is not supported by this platform"
+    /// in a platform-agnostic way.
+    RustInvalid,
+
     /// Unstable impl detail that directly uses Rust types to describe the ABI to LLVM.
     /// Even normally-compatible Rust types can become ABI-incompatible with this ABI!
     Unadjusted,

@@ -55,9 +59,9 @@ pub enum ExternAbi {
         unwind: bool,
     },
     /// extremely constrained barely-C ABI for TrustZone
-    CCmseNonSecureCall,
+    CmseNonSecureCall,
     /// extremely constrained barely-C ABI for TrustZone
-    CCmseNonSecureEntry,
+    CmseNonSecureEntry,

     /* gpu */
     /// An entry-point function called by the GPU's host

@@ -136,8 +140,6 @@ macro_rules! abi_impls {
 abi_impls! {
     ExternAbi = {
         C { unwind: false } =><= "C",
-        CCmseNonSecureCall =><= "C-cmse-nonsecure-call",
-        CCmseNonSecureEntry =><= "C-cmse-nonsecure-entry",
         C { unwind: true } =><= "C-unwind",
         Rust =><= "Rust",
         Aapcs { unwind: false } =><= "aapcs",

@@ -146,6 +148,8 @@ abi_impls! {
         AvrNonBlockingInterrupt =><= "avr-non-blocking-interrupt",
         Cdecl { unwind: false } =><= "cdecl",
         Cdecl { unwind: true } =><= "cdecl-unwind",
+        CmseNonSecureCall =><= "cmse-nonsecure-call",
+        CmseNonSecureEntry =><= "cmse-nonsecure-entry",
         Custom =><= "custom",
         EfiApi =><= "efiapi",
         Fastcall { unwind: false } =><= "fastcall",

@@ -157,6 +161,7 @@ abi_impls! {
         RiscvInterruptS =><= "riscv-interrupt-s",
         RustCall =><= "rust-call",
         RustCold =><= "rust-cold",
+        RustInvalid =><= "rust-invalid",
         Stdcall { unwind: false } =><= "stdcall",
         Stdcall { unwind: true } =><= "stdcall-unwind",
         System { unwind: false } =><= "system",
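The net effect of the hunks above is that the CMSE variants and their string forms drop the leading `C`/`C-`, and a new `RustInvalid` variant is registered as `"rust-invalid"`. A minimal sketch of the resulting name mapping (a simplified stand-in enum, not the real `ExternAbi`, and `as_str` is a hypothetical helper standing in for the `abi_impls!` machinery):

```rust
// Simplified stand-in for the renamed `ExternAbi` variants; the real enum
// lives in `rustc_abi` and carries many more variants.
#[derive(Debug, PartialEq)]
enum Abi {
    CmseNonSecureCall,
    CmseNonSecureEntry,
    RustInvalid,
}

// String forms as registered in the `abi_impls!` table after this change;
// the old `C-cmse-nonsecure-*` spellings are gone.
fn as_str(abi: &Abi) -> &'static str {
    match abi {
        Abi::CmseNonSecureCall => "cmse-nonsecure-call",
        Abi::CmseNonSecureEntry => "cmse-nonsecure-entry",
        Abi::RustInvalid => "rust-invalid",
    }
}

fn main() {
    assert_eq!(as_str(&Abi::CmseNonSecureCall), "cmse-nonsecure-call");
    assert_eq!(as_str(&Abi::CmseNonSecureEntry), "cmse-nonsecure-entry");
    assert_eq!(as_str(&Abi::RustInvalid), "rust-invalid");
}
```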
@@ -432,7 +432,7 @@ impl<Cx: HasDataLayout> LayoutCalculator<Cx> {
             align = align.min(AbiAlign::new(pack));
         }
         // The unadjusted ABI alignment does not include repr(align), but does include repr(pack).
-        // See documentation on `LayoutS::unadjusted_abi_align`.
+        // See documentation on `LayoutData::unadjusted_abi_align`.
         let unadjusted_abi_align = align.abi;
         if let Some(repr_align) = repr.align {
             align = align.max(AbiAlign::new(repr_align));

@@ -602,10 +602,10 @@ impl<Cx: HasDataLayout> LayoutCalculator<Cx> {
         dont_niche_optimize_enum: bool,
     ) -> LayoutCalculatorResult<FieldIdx, VariantIdx, F> {
         // Until we've decided whether to use the tagged or
-        // niche filling LayoutS, we don't want to intern the
+        // niche filling LayoutData, we don't want to intern the
         // variant layouts, so we can't store them in the
-        // overall LayoutS. Store the overall LayoutS
-        // and the variant LayoutSs here until then.
+        // overall LayoutData. Store the overall LayoutData
+        // and the variant LayoutDatas here until then.
         struct TmpLayout<FieldIdx: Idx, VariantIdx: Idx> {
             layout: LayoutData<FieldIdx, VariantIdx>,
             variants: IndexVec<VariantIdx, LayoutData<FieldIdx, VariantIdx>>,

@@ -1214,7 +1214,7 @@ impl<Cx: HasDataLayout> LayoutCalculator<Cx> {

         match kind {
             StructKind::AlwaysSized | StructKind::MaybeUnsized => {
-                // Currently `LayoutS` only exposes a single niche so sorting is usually
+                // Currently `LayoutData` only exposes a single niche so sorting is usually
                 // sufficient to get one niche into the preferred position. If it ever
                 // supported multiple niches then a more advanced pick-and-pack approach could
                 // provide better results. But even for the single-niche cache it's not

@@ -1333,7 +1333,7 @@ impl<Cx: HasDataLayout> LayoutCalculator<Cx> {
         }

         // The unadjusted ABI alignment does not include repr(align), but does include repr(pack).
-        // See documentation on `LayoutS::unadjusted_abi_align`.
+        // See documentation on `LayoutData::unadjusted_abi_align`.
         let unadjusted_abi_align = align.abi;
         if let Some(repr_align) = repr.align {
             align = align.max(AbiAlign::new(repr_align));
@@ -71,7 +71,7 @@ pub struct Layout<'a>(pub Interned<'a, LayoutData<FieldIdx, VariantIdx>>);

 impl<'a> fmt::Debug for Layout<'a> {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        // See comment on `<LayoutS as Debug>::fmt` above.
+        // See comment on `<LayoutData as Debug>::fmt` above.
         self.0.0.fmt(f)
     }
 }
@@ -1785,7 +1785,7 @@ where
 {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
         // This is how `Layout` used to print before it become
-        // `Interned<LayoutS>`. We print it like this to avoid having to update
+        // `Interned<LayoutData>`. We print it like this to avoid having to update
         // expected output in a lot of tests.
         let LayoutData {
             size,
@@ -7,7 +7,7 @@ edition = "2024"
 # tidy-alphabetical-start
 bitflags = "2.4.1"
 memchr = "2.7.4"
-rustc-literal-escaper = "0.0.2"
+rustc-literal-escaper = "0.0.4"
 rustc_ast_ir = { path = "../rustc_ast_ir" }
 rustc_data_structures = { path = "../rustc_data_structures" }
 rustc_index = { path = "../rustc_index" }
@@ -904,6 +904,10 @@ pub enum BorrowKind {
     /// The resulting type is either `*const T` or `*mut T`
     /// where `T = typeof($expr)`.
     Raw,
+    /// A pinned borrow, `&pin const $expr` or `&pin mut $expr`.
+    /// The resulting type is either `Pin<&'a T>` or `Pin<&'a mut T>`
+    /// where `T = typeof($expr)` and `'a` is some lifetime.
+    Pin,
 }

 #[derive(Clone, Copy, Debug, PartialEq, Encodable, Decodable, HashStable_Generic)]
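The new `Pin` variant sits alongside the existing reference and raw-pointer borrow kinds. A rough sketch of how the three kinds map to the shape of the resulting type (a simplified mirror of the AST enum; `result_type` is a hypothetical helper for illustration):

```rust
// Simplified mirror of the AST `BorrowKind` after this change.
enum BorrowKind {
    Ref, // `&mut $expr`
    Raw, // `&raw mut $expr`
    Pin, // `&pin mut $expr`
}

// Hypothetical helper: render the shape of the type produced by a
// mutable borrow of some expression of type `T`.
fn result_type(kind: &BorrowKind) -> &'static str {
    match kind {
        BorrowKind::Ref => "&'a mut T",
        BorrowKind::Raw => "*mut T",
        // Per the new doc comment, `&pin mut $expr` produces `Pin<&'a mut T>`.
        BorrowKind::Pin => "Pin<&'a mut T>",
    }
}

fn main() {
    assert_eq!(result_type(&BorrowKind::Pin), "Pin<&'a mut T>");
    assert_eq!(result_type(&BorrowKind::Raw), "*mut T");
    assert_eq!(result_type(&BorrowKind::Ref), "&'a mut T");
}
```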
@@ -321,6 +321,13 @@ impl<Wrapped, Tag> AstNodeWrapper<Wrapped, Tag> {
     }
 }

+// FIXME: remove after `stmt_expr_attributes` is stabilized.
+impl<T, Tag> From<AstNodeWrapper<P<T>, Tag>> for AstNodeWrapper<T, Tag> {
+    fn from(value: AstNodeWrapper<P<T>, Tag>) -> Self {
+        AstNodeWrapper { wrapped: *value.wrapped, tag: value.tag }
+    }
+}
+
 impl<Wrapped: HasNodeId, Tag> HasNodeId for AstNodeWrapper<Wrapped, Tag> {
     fn node_id(&self) -> NodeId {
         self.wrapped.node_id()
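The new `From` impl above unboxes the wrapped node while carrying the tag through unchanged. A standalone sketch of the same pattern, with `Box<T>` playing the role of the AST's `P<T>` pointer (the `Wrapper` type here is a stand-in, not the real `AstNodeWrapper`):

```rust
// Stand-in for `AstNodeWrapper`, with `Box<T>` in place of `P<T>`.
struct Wrapper<W, Tag> {
    wrapped: W,
    tag: Tag,
}

// Mirrors the new impl: unbox the payload, keep the tag.
impl<T, Tag> From<Wrapper<Box<T>, Tag>> for Wrapper<T, Tag> {
    fn from(value: Wrapper<Box<T>, Tag>) -> Self {
        Wrapper { wrapped: *value.wrapped, tag: value.tag }
    }
}

fn main() {
    let boxed = Wrapper { wrapped: Box::new(41_i32), tag: "expr" };
    // `into()` moves the value out of the box without cloning it.
    let flat: Wrapper<i32, &str> = boxed.into();
    assert_eq!(flat.wrapped, 41);
    assert_eq!(flat.tag, "expr");
}
```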
@@ -206,12 +206,24 @@ impl AttributeExt for Attribute {
        }
    }

    fn style(&self) -> AttrStyle {
        self.style
    fn doc_resolution_scope(&self) -> Option<AttrStyle> {
        match &self.kind {
            AttrKind::DocComment(..) => Some(self.style),
            AttrKind::Normal(normal)
                if normal.item.path == sym::doc && normal.item.value_str().is_some() =>
            {
                Some(self.style)
            }
            _ => None,
        }
    }
}

impl Attribute {
    pub fn style(&self) -> AttrStyle {
        self.style
    }

    pub fn may_have_doc_links(&self) -> bool {
        self.doc_str().is_some_and(|s| comments::may_have_doc_links(s.as_str()))
    }
@@ -806,7 +818,14 @@ pub trait AttributeExt: Debug {
    /// * `#[doc(...)]` returns `None`.
    fn doc_str_and_comment_kind(&self) -> Option<(Symbol, CommentKind)>;

    fn style(&self) -> AttrStyle;
    /// Returns outer or inner if this is a doc attribute or a sugared doc
    /// comment, otherwise None.
    ///
    /// This is used in the case of doc comments on modules, to decide whether
    /// to resolve intra-doc links against the symbols in scope within the
    /// commented module (for inner doc) vs within its parent module (for outer
    /// doc).
    fn doc_resolution_scope(&self) -> Option<AttrStyle>;
}

// FIXME(fn_delegation): use function delegation instead of manually forwarding
@@ -881,8 +900,4 @@ impl Attribute {
    pub fn doc_str_and_comment_kind(&self) -> Option<(Symbol, CommentKind)> {
        AttributeExt::doc_str_and_comment_kind(self)
    }

    pub fn style(&self) -> AttrStyle {
        AttributeExt::style(self)
    }
}

@@ -50,6 +50,14 @@ pub struct FormatArgs {
    ///
    /// Generally only useful for lints that care about the raw bytes the user wrote.
    pub uncooked_fmt_str: (LitKind, Symbol),
    /// Was the format literal written in the source?
    /// - `format!("boo")` => true,
    /// - `format!(concat!("b", "o", "o"))` => false,
    /// - `format!(include_str!("boo.txt"))` => false,
    ///
    /// If it wasn't written in the source then we have to be careful with spans pointing into it
    /// and suggestions about rewriting it.
    pub is_source_literal: bool,
}

/// A piece of a format template string.

@@ -13,7 +13,7 @@ use std::panic;
use rustc_data_structures::flat_map_in_place::FlatMapInPlace;
use rustc_span::source_map::Spanned;
use rustc_span::{Ident, Span};
use smallvec::{Array, SmallVec, smallvec};
use smallvec::{SmallVec, smallvec};
use thin_vec::ThinVec;

use crate::ast::*;
@@ -21,17 +21,6 @@ use crate::ptr::P;
use crate::tokenstream::*;
use crate::visit::{AssocCtxt, BoundKind, FnCtxt, VisitorResult, try_visit, visit_opt, walk_list};

pub trait ExpectOne<A: Array> {
    fn expect_one(self, err: &'static str) -> A::Item;
}

impl<A: Array> ExpectOne<A> for SmallVec<A> {
    fn expect_one(self, err: &'static str) -> A::Item {
        assert!(self.len() == 1, "{}", err);
        self.into_iter().next().unwrap()
    }
}

mod sealed {
    use rustc_ast_ir::visit::VisitorResult;

@@ -47,323 +36,6 @@ mod sealed {

use sealed::MutVisitorResult;

pub trait MutVisitor: Sized + MutVisitorResult<Result = ()> {
    // Methods in this trait have one of three forms:
    //
    //   fn visit_t(&mut self, t: &mut T); // common
    //   fn flat_map_t(&mut self, t: T) -> SmallVec<[T; 1]>; // rare
    //   fn filter_map_t(&mut self, t: T) -> Option<T>; // rarest
    //
    // When writing these methods, it is better to use destructuring like this:
    //
    //   fn visit_abc(&mut self, ABC { a, b, c: _ }: &mut ABC) {
    //       visit_a(a);
    //       visit_b(b);
    //   }
    //
    // than to use field access like this:
    //
    //   fn visit_abc(&mut self, abc: &mut ABC) {
    //       visit_a(&mut abc.a);
    //       visit_b(&mut abc.b);
    //       // ignore abc.c
    //   }
    //
    // As well as being more concise, the former is explicit about which fields
    // are skipped. Furthermore, if a new field is added, the destructuring
    // version will cause a compile error, which is good. In comparison, the
    // field access version will continue working and it would be easy to
    // forget to add handling for it.

    fn visit_crate(&mut self, c: &mut Crate) {
        walk_crate(self, c)
    }

    fn visit_meta_list_item(&mut self, list_item: &mut MetaItemInner) {
        walk_meta_list_item(self, list_item);
    }

    fn visit_meta_item(&mut self, meta_item: &mut MetaItem) {
        walk_meta_item(self, meta_item);
    }

    fn visit_use_tree(&mut self, use_tree: &mut UseTree) {
        walk_use_tree(self, use_tree);
    }

    fn visit_foreign_item(&mut self, ni: &mut ForeignItem) {
        walk_item(self, ni);
    }

    fn flat_map_foreign_item(&mut self, ni: P<ForeignItem>) -> SmallVec<[P<ForeignItem>; 1]> {
        walk_flat_map_foreign_item(self, ni)
    }

    fn visit_item(&mut self, i: &mut Item) {
        walk_item(self, i);
    }

    fn flat_map_item(&mut self, i: P<Item>) -> SmallVec<[P<Item>; 1]> {
        walk_flat_map_item(self, i)
    }

    fn visit_fn_header(&mut self, header: &mut FnHeader) {
        walk_fn_header(self, header);
    }

    fn visit_field_def(&mut self, fd: &mut FieldDef) {
        walk_field_def(self, fd);
    }

    fn flat_map_field_def(&mut self, fd: FieldDef) -> SmallVec<[FieldDef; 1]> {
        walk_flat_map_field_def(self, fd)
    }

    fn visit_assoc_item(&mut self, i: &mut AssocItem, ctxt: AssocCtxt) {
        walk_assoc_item(self, i, ctxt)
    }

    fn flat_map_assoc_item(
        &mut self,
        i: P<AssocItem>,
        ctxt: AssocCtxt,
    ) -> SmallVec<[P<AssocItem>; 1]> {
        walk_flat_map_assoc_item(self, i, ctxt)
    }

    fn visit_contract(&mut self, c: &mut FnContract) {
        walk_contract(self, c);
    }

    fn visit_fn_decl(&mut self, d: &mut FnDecl) {
        walk_fn_decl(self, d);
    }

    /// `Span` and `NodeId` are mutated at the caller site.
    fn visit_fn(&mut self, fk: FnKind<'_>, _: Span, _: NodeId) {
        walk_fn(self, fk)
    }

    fn visit_coroutine_kind(&mut self, a: &mut CoroutineKind) {
        walk_coroutine_kind(self, a);
    }

    fn visit_closure_binder(&mut self, b: &mut ClosureBinder) {
        walk_closure_binder(self, b);
    }

    fn visit_block(&mut self, b: &mut Block) {
        walk_block(self, b);
    }

    fn flat_map_stmt(&mut self, s: Stmt) -> SmallVec<[Stmt; 1]> {
        walk_flat_map_stmt(self, s)
    }

    fn visit_arm(&mut self, arm: &mut Arm) {
        walk_arm(self, arm);
    }

    fn flat_map_arm(&mut self, arm: Arm) -> SmallVec<[Arm; 1]> {
        walk_flat_map_arm(self, arm)
    }

    fn visit_pat(&mut self, p: &mut Pat) {
        walk_pat(self, p);
    }

    fn visit_anon_const(&mut self, c: &mut AnonConst) {
        walk_anon_const(self, c);
    }

    fn visit_expr(&mut self, e: &mut Expr) {
        walk_expr(self, e);
    }

    /// This method is a hack to work around the instability of `stmt_expr_attributes`.
    /// It can be removed once that feature is stabilized.
    fn visit_method_receiver_expr(&mut self, ex: &mut P<Expr>) {
        self.visit_expr(ex)
    }

    fn filter_map_expr(&mut self, e: P<Expr>) -> Option<P<Expr>> {
        walk_filter_map_expr(self, e)
    }

    fn visit_generic_arg(&mut self, arg: &mut GenericArg) {
        walk_generic_arg(self, arg);
    }

    fn visit_ty(&mut self, t: &mut Ty) {
        walk_ty(self, t);
    }

    fn visit_ty_pat(&mut self, t: &mut TyPat) {
        walk_ty_pat(self, t);
    }

    fn visit_lifetime(&mut self, l: &mut Lifetime) {
        walk_lifetime(self, l);
    }

    fn visit_assoc_item_constraint(&mut self, c: &mut AssocItemConstraint) {
        walk_assoc_item_constraint(self, c);
    }

    fn visit_foreign_mod(&mut self, nm: &mut ForeignMod) {
        walk_foreign_mod(self, nm);
    }

    fn visit_variant(&mut self, v: &mut Variant) {
        walk_variant(self, v);
    }

    fn flat_map_variant(&mut self, v: Variant) -> SmallVec<[Variant; 1]> {
        walk_flat_map_variant(self, v)
    }

    fn visit_ident(&mut self, i: &mut Ident) {
        self.visit_span(&mut i.span);
    }

    fn visit_path(&mut self, p: &mut Path) {
        walk_path(self, p);
    }

    fn visit_path_segment(&mut self, p: &mut PathSegment) {
        walk_path_segment(self, p)
    }

    fn visit_qself(&mut self, qs: &mut Option<P<QSelf>>) {
        walk_qself(self, qs);
    }

    fn visit_generic_args(&mut self, p: &mut GenericArgs) {
        walk_generic_args(self, p);
    }

    fn visit_local(&mut self, l: &mut Local) {
        walk_local(self, l);
    }

    fn visit_mac_call(&mut self, mac: &mut MacCall) {
        walk_mac(self, mac);
    }

    fn visit_macro_def(&mut self, def: &mut MacroDef) {
        walk_macro_def(self, def);
    }

    fn visit_label(&mut self, label: &mut Label) {
        walk_label(self, label);
    }

    fn visit_attribute(&mut self, at: &mut Attribute) {
        walk_attribute(self, at);
    }

    fn visit_param(&mut self, param: &mut Param) {
        walk_param(self, param);
    }

    fn flat_map_param(&mut self, param: Param) -> SmallVec<[Param; 1]> {
        walk_flat_map_param(self, param)
    }

    fn visit_generics(&mut self, generics: &mut Generics) {
        walk_generics(self, generics);
    }

    fn visit_trait_ref(&mut self, tr: &mut TraitRef) {
        walk_trait_ref(self, tr);
    }

    fn visit_poly_trait_ref(&mut self, p: &mut PolyTraitRef) {
        walk_poly_trait_ref(self, p);
    }

    fn visit_variant_data(&mut self, vdata: &mut VariantData) {
        walk_variant_data(self, vdata);
    }

    fn visit_generic_param(&mut self, param: &mut GenericParam) {
        walk_generic_param(self, param)
    }

    fn flat_map_generic_param(&mut self, param: GenericParam) -> SmallVec<[GenericParam; 1]> {
        walk_flat_map_generic_param(self, param)
    }

    fn visit_param_bound(&mut self, tpb: &mut GenericBound, _ctxt: BoundKind) {
        walk_param_bound(self, tpb);
    }

    fn visit_precise_capturing_arg(&mut self, arg: &mut PreciseCapturingArg) {
        walk_precise_capturing_arg(self, arg);
    }

    fn visit_expr_field(&mut self, f: &mut ExprField) {
        walk_expr_field(self, f);
    }

    fn flat_map_expr_field(&mut self, f: ExprField) -> SmallVec<[ExprField; 1]> {
        walk_flat_map_expr_field(self, f)
    }

    fn flat_map_where_predicate(
        &mut self,
        where_predicate: WherePredicate,
    ) -> SmallVec<[WherePredicate; 1]> {
        walk_flat_map_where_predicate(self, where_predicate)
    }

    fn visit_where_predicate_kind(&mut self, kind: &mut WherePredicateKind) {
        walk_where_predicate_kind(self, kind)
    }

    fn visit_vis(&mut self, vis: &mut Visibility) {
        walk_vis(self, vis);
    }

    fn visit_id(&mut self, _id: &mut NodeId) {
        // Do nothing.
    }

    // Span visiting is no longer used, but we keep it for now,
    // in case it's needed for something like #127241.
    fn visit_span(&mut self, _sp: &mut Span) {
        // Do nothing.
    }

    fn visit_pat_field(&mut self, fp: &mut PatField) {
        walk_pat_field(self, fp)
    }

    fn flat_map_pat_field(&mut self, fp: PatField) -> SmallVec<[PatField; 1]> {
        walk_flat_map_pat_field(self, fp)
    }

    fn visit_inline_asm(&mut self, asm: &mut InlineAsm) {
        walk_inline_asm(self, asm)
    }

    fn visit_inline_asm_sym(&mut self, sym: &mut InlineAsmSym) {
        walk_inline_asm_sym(self, sym)
    }

    fn visit_format_args(&mut self, fmt: &mut FormatArgs) {
        walk_format_args(self, fmt)
    }

    fn visit_capture_by(&mut self, capture_by: &mut CaptureBy) {
        walk_capture_by(self, capture_by)
    }

    fn visit_fn_ret_ty(&mut self, fn_ret_ty: &mut FnRetTy) {
        walk_fn_ret_ty(self, fn_ret_ty)
    }
}

super::common_visitor_and_walkers!((mut) MutVisitor);

macro_rules! generate_flat_map_visitor_fns {
@@ -398,22 +70,6 @@ generate_flat_map_visitor_fns! {
    visit_arms, Arm, flat_map_arm;
}

#[inline]
fn visit_thin_vec<T, F>(elems: &mut ThinVec<T>, mut visit_elem: F)
where
    F: FnMut(&mut T),
{
    for elem in elems {
        visit_elem(elem);
    }
}

fn visit_attrs<T: MutVisitor>(vis: &mut T, attrs: &mut AttrVec) {
    for attr in attrs.iter_mut() {
        vis.visit_attribute(attr);
    }
}

pub fn walk_flat_map_pat_field<T: MutVisitor>(
    vis: &mut T,
    mut fp: PatField,
@@ -431,47 +87,26 @@ fn visit_nested_use_tree<V: MutVisitor>(
    vis.visit_use_tree(nested_tree);
}

pub fn walk_flat_map_arm<T: MutVisitor>(vis: &mut T, mut arm: Arm) -> SmallVec<[Arm; 1]> {
    vis.visit_arm(&mut arm);
    smallvec![arm]
macro_rules! generate_walk_flat_map_fns {
    ($($fn_name:ident($Ty:ty$(,$extra_name:ident: $ExtraTy:ty)*) => $visit_fn_name:ident;)+) => {$(
        pub fn $fn_name<V: MutVisitor>(vis: &mut V, mut value: $Ty$(,$extra_name: $ExtraTy)*) -> SmallVec<[$Ty; 1]> {
            vis.$visit_fn_name(&mut value$(,$extra_name)*);
            smallvec![value]
        }
    )+};
}

pub fn walk_flat_map_variant<T: MutVisitor>(
    vis: &mut T,
    mut variant: Variant,
) -> SmallVec<[Variant; 1]> {
    vis.visit_variant(&mut variant);
    smallvec![variant]
}

fn walk_meta_list_item<T: MutVisitor>(vis: &mut T, li: &mut MetaItemInner) {
    match li {
        MetaItemInner::MetaItem(mi) => vis.visit_meta_item(mi),
        MetaItemInner::Lit(_lit) => {}
    }
}

fn walk_meta_item<T: MutVisitor>(vis: &mut T, mi: &mut MetaItem) {
    let MetaItem { unsafety: _, path: _, kind, span } = mi;
    match kind {
        MetaItemKind::Word => {}
        MetaItemKind::List(mis) => visit_thin_vec(mis, |mi| vis.visit_meta_list_item(mi)),
        MetaItemKind::NameValue(_s) => {}
    }
    vis.visit_span(span);
}

pub fn walk_flat_map_param<T: MutVisitor>(vis: &mut T, mut param: Param) -> SmallVec<[Param; 1]> {
    vis.visit_param(&mut param);
    smallvec![param]
}

pub fn walk_flat_map_generic_param<T: MutVisitor>(
    vis: &mut T,
    mut param: GenericParam,
) -> SmallVec<[GenericParam; 1]> {
    vis.visit_generic_param(&mut param);
    smallvec![param]
generate_walk_flat_map_fns! {
    walk_flat_map_arm(Arm) => visit_arm;
    walk_flat_map_variant(Variant) => visit_variant;
    walk_flat_map_param(Param) => visit_param;
    walk_flat_map_generic_param(GenericParam) => visit_generic_param;
    walk_flat_map_where_predicate(WherePredicate) => visit_where_predicate;
    walk_flat_map_field_def(FieldDef) => visit_field_def;
    walk_flat_map_expr_field(ExprField) => visit_expr_field;
    walk_flat_map_item(P<Item>) => visit_item;
    walk_flat_map_foreign_item(P<ForeignItem>) => visit_foreign_item;
    walk_flat_map_assoc_item(P<AssocItem>, ctxt: AssocCtxt) => visit_assoc_item;
}

fn walk_ty_alias_where_clauses<T: MutVisitor>(vis: &mut T, tawcs: &mut TyAliasWhereClauses) {
@@ -482,63 +117,6 @@ fn walk_ty_alias_where_clauses<T: MutVisitor>(vis: &mut T, tawcs: &mut TyAliasWh
    vis.visit_span(span_after);
}

pub fn walk_flat_map_where_predicate<T: MutVisitor>(
    vis: &mut T,
    mut pred: WherePredicate,
) -> SmallVec<[WherePredicate; 1]> {
    walk_where_predicate(vis, &mut pred);
    smallvec![pred]
}

pub fn walk_flat_map_field_def<T: MutVisitor>(
    vis: &mut T,
    mut fd: FieldDef,
) -> SmallVec<[FieldDef; 1]> {
    vis.visit_field_def(&mut fd);
    smallvec![fd]
}

pub fn walk_flat_map_expr_field<T: MutVisitor>(
    vis: &mut T,
    mut f: ExprField,
) -> SmallVec<[ExprField; 1]> {
    vis.visit_expr_field(&mut f);
    smallvec![f]
}

pub fn walk_item_kind<K: WalkItemKind>(
    kind: &mut K,
    span: Span,
    id: NodeId,
    visibility: &mut Visibility,
    ctxt: K::Ctxt,
    vis: &mut impl MutVisitor,
) {
    kind.walk(span, id, visibility, ctxt, vis)
}

pub fn walk_flat_map_item(vis: &mut impl MutVisitor, mut item: P<Item>) -> SmallVec<[P<Item>; 1]> {
    vis.visit_item(&mut item);
    smallvec![item]
}

pub fn walk_flat_map_foreign_item(
    vis: &mut impl MutVisitor,
    mut item: P<ForeignItem>,
) -> SmallVec<[P<ForeignItem>; 1]> {
    vis.visit_foreign_item(&mut item);
    smallvec![item]
}

pub fn walk_flat_map_assoc_item(
    vis: &mut impl MutVisitor,
    mut item: P<AssocItem>,
    ctxt: AssocCtxt,
) -> SmallVec<[P<AssocItem>; 1]> {
    vis.visit_assoc_item(&mut item, ctxt);
    smallvec![item]
}

pub fn walk_filter_map_expr<T: MutVisitor>(vis: &mut T, mut e: P<Expr>) -> Option<P<Expr>> {
    vis.visit_expr(&mut e);
    Some(e)
@@ -576,35 +154,11 @@ fn walk_flat_map_stmt_kind<T: MutVisitor>(vis: &mut T, kind: StmtKind) -> SmallV
        StmtKind::Empty => smallvec![StmtKind::Empty],
        StmtKind::MacCall(mut mac) => {
            let MacCallStmt { mac: mac_, style: _, attrs, tokens: _ } = mac.deref_mut();
            visit_attrs(vis, attrs);
            for attr in attrs {
                vis.visit_attribute(attr);
            }
            vis.visit_mac_call(mac_);
            smallvec![StmtKind::MacCall(mac)]
        }
    }
}

fn walk_capture_by<T: MutVisitor>(vis: &mut T, capture_by: &mut CaptureBy) {
    match capture_by {
        CaptureBy::Ref => {}
        CaptureBy::Value { move_kw } => {
            vis.visit_span(move_kw);
        }
        CaptureBy::Use { use_kw } => {
            vis.visit_span(use_kw);
        }
    }
}

#[derive(Debug)]
pub enum FnKind<'a> {
    /// E.g., `fn foo()`, `fn foo(&self)`, or `extern "Abi" fn foo()`.
    Fn(FnCtxt, &'a mut Visibility, &'a mut Fn),

    /// E.g., `|x, y| body`.
    Closure(
        &'a mut ClosureBinder,
        &'a mut Option<CoroutineKind>,
        &'a mut P<FnDecl>,
        &'a mut P<Expr>,
    ),
}

@@ -1085,6 +1085,7 @@ pub enum NtExprKind {
    Expr2021 { inferred: bool },
}

/// A macro nonterminal, known in documentation as a fragment specifier.
#[derive(Debug, Copy, Clone, PartialEq, Eq, Encodable, Decodable, Hash, HashStable_Generic)]
pub enum NonterminalKind {
    Item,

@@ -3,7 +3,7 @@
use std::{ascii, fmt, str};

use rustc_literal_escaper::{
    MixedUnit, Mode, byte_from_char, unescape_byte, unescape_char, unescape_mixed, unescape_unicode,
    MixedUnit, unescape_byte, unescape_byte_str, unescape_c_str, unescape_char, unescape_str,
};
use rustc_span::{Span, Symbol, kw, sym};
use tracing::debug;
@@ -87,11 +87,10 @@ impl LitKind {
                // Force-inlining here is aggressive but the closure is
                // called on every char in the string, so it can be hot in
                // programs with many long strings containing escapes.
                unescape_unicode(
                unescape_str(
                    s,
                    Mode::Str,
                    &mut #[inline(always)]
                    |_, c| match c {
                    #[inline(always)]
                    |_, res| match res {
                        Ok(c) => buf.push(c),
                        Err(err) => {
                            assert!(!err.is_fatal(), "failed to unescape string literal")
@@ -111,8 +110,8 @@ impl LitKind {
            token::ByteStr => {
                let s = symbol.as_str();
                let mut buf = Vec::with_capacity(s.len());
                unescape_unicode(s, Mode::ByteStr, &mut |_, c| match c {
                    Ok(c) => buf.push(byte_from_char(c)),
                unescape_byte_str(s, |_, res| match res {
                    Ok(b) => buf.push(b),
                    Err(err) => {
                        assert!(!err.is_fatal(), "failed to unescape string literal")
                    }
@@ -128,7 +127,7 @@ impl LitKind {
            token::CStr => {
                let s = symbol.as_str();
                let mut buf = Vec::with_capacity(s.len());
                unescape_mixed(s, Mode::CStr, &mut |_span, c| match c {
                unescape_c_str(s, |_span, c| match c {
                    Ok(MixedUnit::Char(c)) => {
                        buf.extend_from_slice(c.encode_utf8(&mut [0; 4]).as_bytes())
                    }

@@ -65,45 +65,6 @@ impl BoundKind {
    }
}

#[derive(Copy, Clone, Debug)]
pub enum FnKind<'a> {
    /// E.g., `fn foo()`, `fn foo(&self)`, or `extern "Abi" fn foo()`.
    Fn(FnCtxt, &'a Visibility, &'a Fn),

    /// E.g., `|x, y| body`.
    Closure(&'a ClosureBinder, &'a Option<CoroutineKind>, &'a FnDecl, &'a Expr),
}

impl<'a> FnKind<'a> {
    pub fn header(&self) -> Option<&'a FnHeader> {
        match *self {
            FnKind::Fn(_, _, Fn { sig, .. }) => Some(&sig.header),
            FnKind::Closure(..) => None,
        }
    }

    pub fn ident(&self) -> Option<&Ident> {
        match self {
            FnKind::Fn(_, _, Fn { ident, .. }) => Some(ident),
            _ => None,
        }
    }

    pub fn decl(&self) -> &'a FnDecl {
        match self {
            FnKind::Fn(_, _, Fn { sig, .. }) => &sig.decl,
            FnKind::Closure(_, _, decl, _) => decl,
        }
    }

    pub fn ctxt(&self) -> Option<FnCtxt> {
        match self {
            FnKind::Fn(ctxt, ..) => Some(*ctxt),
            FnKind::Closure(..) => None,
        }
    }
}

#[derive(Copy, Clone, Debug)]
pub enum LifetimeCtxt {
    /// Appears in a reference type.
@ -114,206 +75,405 @@ pub enum LifetimeCtxt {
|
|||
GenericArg,
|
||||
}
|
||||
|
||||
/// Each method of the `Visitor` trait is a hook to be potentially
|
||||
/// overridden. Each method's default implementation recursively visits
|
||||
/// the substructure of the input via the corresponding `walk` method;
|
||||
/// e.g., the `visit_item` method by default calls `visit::walk_item`.
|
||||
///
|
||||
/// If you want to ensure that your code handles every variant
|
||||
/// explicitly, you need to override each method. (And you also need
|
||||
/// to monitor future changes to `Visitor` in case a new method with a
|
||||
/// new default implementation gets introduced.)
|
||||
///
|
||||
/// Every `walk_*` method uses deconstruction to access fields of structs and
|
||||
/// enums. This will result in a compile error if a field is added, which makes
|
||||
/// it more likely the appropriate visit call will be added for it.
|
||||
pub trait Visitor<'ast>: Sized {
|
||||
/// The result type of the `visit_*` methods. Can be either `()`,
|
||||
/// or `ControlFlow<T>`.
|
||||
type Result: VisitorResult = ();
|
||||
|
||||
fn visit_ident(&mut self, _ident: &'ast Ident) -> Self::Result {
|
||||
Self::Result::output()
|
||||
}
|
||||
fn visit_foreign_mod(&mut self, nm: &'ast ForeignMod) -> Self::Result {
|
||||
walk_foreign_mod(self, nm)
|
||||
}
|
||||
fn visit_foreign_item(&mut self, i: &'ast ForeignItem) -> Self::Result {
|
||||
walk_item(self, i)
|
||||
}
|
||||
fn visit_item(&mut self, i: &'ast Item) -> Self::Result {
|
||||
walk_item(self, i)
|
||||
}
|
||||
fn visit_local(&mut self, l: &'ast Local) -> Self::Result {
|
||||
walk_local(self, l)
|
||||
}
|
||||
fn visit_block(&mut self, b: &'ast Block) -> Self::Result {
|
||||
walk_block(self, b)
|
||||
}
|
||||
fn visit_stmt(&mut self, s: &'ast Stmt) -> Self::Result {
|
||||
walk_stmt(self, s)
|
||||
}
|
||||
fn visit_param(&mut self, param: &'ast Param) -> Self::Result {
|
||||
walk_param(self, param)
|
||||
}
|
||||
fn visit_arm(&mut self, a: &'ast Arm) -> Self::Result {
|
||||
walk_arm(self, a)
|
||||
}
|
||||
fn visit_pat(&mut self, p: &'ast Pat) -> Self::Result {
|
||||
walk_pat(self, p)
|
||||
}
|
||||
fn visit_anon_const(&mut self, c: &'ast AnonConst) -> Self::Result {
|
||||
walk_anon_const(self, c)
|
||||
}
|
||||
fn visit_expr(&mut self, ex: &'ast Expr) -> Self::Result {
|
||||
walk_expr(self, ex)
|
||||
}
|
||||
/// This method is a hack to workaround unstable of `stmt_expr_attributes`.
|
||||
/// It can be removed once that feature is stabilized.
|
||||
fn visit_method_receiver_expr(&mut self, ex: &'ast Expr) -> Self::Result {
|
||||
self.visit_expr(ex)
|
||||
}
|
||||
fn visit_ty(&mut self, t: &'ast Ty) -> Self::Result {
|
||||
walk_ty(self, t)
|
||||
}
|
||||
fn visit_ty_pat(&mut self, t: &'ast TyPat) -> Self::Result {
|
||||
walk_ty_pat(self, t)
|
||||
}
|
||||
fn visit_generic_param(&mut self, param: &'ast GenericParam) -> Self::Result {
|
||||
walk_generic_param(self, param)
|
||||
}
|
||||
fn visit_generics(&mut self, g: &'ast Generics) -> Self::Result {
|
||||
walk_generics(self, g)
|
||||
}
|
||||
fn visit_closure_binder(&mut self, b: &'ast ClosureBinder) -> Self::Result {
|
||||
walk_closure_binder(self, b)
|
||||
}
|
||||
fn visit_contract(&mut self, c: &'ast FnContract) -> Self::Result {
|
||||
walk_contract(self, c)
|
||||
}
|
||||
fn visit_where_predicate(&mut self, p: &'ast WherePredicate) -> Self::Result {
|
||||
walk_where_predicate(self, p)
|
||||
}
|
||||
fn visit_where_predicate_kind(&mut self, k: &'ast WherePredicateKind) -> Self::Result {
|
||||
walk_where_predicate_kind(self, k)
|
||||
}
|
||||
fn visit_fn(&mut self, fk: FnKind<'ast>, _: Span, _: NodeId) -> Self::Result {
|
||||
walk_fn(self, fk)
|
||||
}
|
||||
fn visit_assoc_item(&mut self, i: &'ast AssocItem, ctxt: AssocCtxt) -> Self::Result {
|
||||
walk_assoc_item(self, i, ctxt)
|
||||
}
|
||||
fn visit_trait_ref(&mut self, t: &'ast TraitRef) -> Self::Result {
|
||||
walk_trait_ref(self, t)
|
||||
}
|
||||
fn visit_param_bound(&mut self, bounds: &'ast GenericBound, _ctxt: BoundKind) -> Self::Result {
|
||||
walk_param_bound(self, bounds)
|
||||
}
|
||||
fn visit_precise_capturing_arg(&mut self, arg: &'ast PreciseCapturingArg) -> Self::Result {
|
||||
walk_precise_capturing_arg(self, arg)
|
||||
}
|
||||
fn visit_poly_trait_ref(&mut self, t: &'ast PolyTraitRef) -> Self::Result {
|
||||
walk_poly_trait_ref(self, t)
|
||||
}
|
||||
fn visit_variant_data(&mut self, s: &'ast VariantData) -> Self::Result {
|
||||
walk_variant_data(self, s)
|
||||
}
|
||||
fn visit_field_def(&mut self, s: &'ast FieldDef) -> Self::Result {
|
||||
walk_field_def(self, s)
|
||||
}
|
||||
fn visit_variant(&mut self, v: &'ast Variant) -> Self::Result {
|
||||
walk_variant(self, v)
|
||||
}
|
||||
fn visit_variant_discr(&mut self, discr: &'ast AnonConst) -> Self::Result {
|
||||
self.visit_anon_const(discr)
|
||||
}
|
||||
fn visit_label(&mut self, label: &'ast Label) -> Self::Result {
|
||||
walk_label(self, label)
|
||||
}
|
||||
fn visit_lifetime(&mut self, lifetime: &'ast Lifetime, _: LifetimeCtxt) -> Self::Result {
|
||||
walk_lifetime(self, lifetime)
|
||||
}
|
||||
fn visit_mac_call(&mut self, mac: &'ast MacCall) -> Self::Result {
|
||||
walk_mac(self, mac)
|
||||
}
|
||||
fn visit_id(&mut self, _id: NodeId) -> Self::Result {
|
||||
Self::Result::output()
|
||||
}
|
||||
fn visit_macro_def(&mut self, macro_def: &'ast MacroDef) -> Self::Result {
|
||||
walk_macro_def(self, macro_def)
|
||||
}
|
||||
fn visit_path(&mut self, path: &'ast Path) -> Self::Result {
|
||||
walk_path(self, path)
|
||||
}
|
||||
fn visit_use_tree(&mut self, use_tree: &'ast UseTree) -> Self::Result {
|
||||
walk_use_tree(self, use_tree)
|
||||
}
|
||||
fn visit_nested_use_tree(&mut self, use_tree: &'ast UseTree, id: NodeId) -> Self::Result {
|
||||
try_visit!(self.visit_id(id));
|
||||
self.visit_use_tree(use_tree)
|
||||
}
|
||||
fn visit_path_segment(&mut self, path_segment: &'ast PathSegment) -> Self::Result {
|
||||
walk_path_segment(self, path_segment)
|
||||
}
|
||||
fn visit_generic_args(&mut self, generic_args: &'ast GenericArgs) -> Self::Result {
|
||||
walk_generic_args(self, generic_args)
|
||||
}
|
||||
fn visit_generic_arg(&mut self, generic_arg: &'ast GenericArg) -> Self::Result {
|
||||
walk_generic_arg(self, generic_arg)
|
||||
}
|
||||
fn visit_assoc_item_constraint(
|
||||
&mut self,
|
||||
constraint: &'ast AssocItemConstraint,
|
||||
) -> Self::Result {
|
||||
walk_assoc_item_constraint(self, constraint)
|
||||
}
|
||||
fn visit_attribute(&mut self, attr: &'ast Attribute) -> Self::Result {
|
||||
walk_attribute(self, attr)
|
||||
}
|
||||
fn visit_vis(&mut self, vis: &'ast Visibility) -> Self::Result {
|
||||
walk_vis(self, vis)
|
||||
}
|
||||
fn visit_fn_ret_ty(&mut self, ret_ty: &'ast FnRetTy) -> Self::Result {
|
||||
walk_fn_ret_ty(self, ret_ty)
|
||||
}
|
||||
fn visit_fn_header(&mut self, header: &'ast FnHeader) -> Self::Result {
|
||||
walk_fn_header(self, header)
|
||||
}
|
||||
fn visit_expr_field(&mut self, f: &'ast ExprField) -> Self::Result {
|
||||
walk_expr_field(self, f)
|
||||
}
|
||||
fn visit_pat_field(&mut self, fp: &'ast PatField) -> Self::Result {
|
||||
walk_pat_field(self, fp)
|
||||
}
|
||||
fn visit_crate(&mut self, krate: &'ast Crate) -> Self::Result {
|
||||
walk_crate(self, krate)
|
||||
}
|
||||
fn visit_inline_asm(&mut self, asm: &'ast InlineAsm) -> Self::Result {
|
||||
walk_inline_asm(self, asm)
|
||||
}
|
||||
fn visit_format_args(&mut self, fmt: &'ast FormatArgs) -> Self::Result {
|
||||
walk_format_args(self, fmt)
|
||||
}
|
||||
fn visit_inline_asm_sym(&mut self, sym: &'ast InlineAsmSym) -> Self::Result {
|
||||
walk_inline_asm_sym(self, sym)
|
||||
}
|
||||
fn visit_capture_by(&mut self, _capture_by: &'ast CaptureBy) -> Self::Result {
|
||||
Self::Result::output()
|
||||
}
|
||||
fn visit_coroutine_kind(&mut self, coroutine_kind: &'ast CoroutineKind) -> Self::Result {
|
||||
walk_coroutine_kind(self, coroutine_kind)
|
||||
}
|
||||
fn visit_fn_decl(&mut self, fn_decl: &'ast FnDecl) -> Self::Result {
|
||||
walk_fn_decl(self, fn_decl)
|
||||
}
|
||||
fn visit_qself(&mut self, qs: &'ast Option<P<QSelf>>) -> Self::Result {
|
||||
walk_qself(self, qs)
|
||||
}
|
||||
}

#[macro_export]
macro_rules! common_visitor_and_walkers {
    ($(($mut: ident))? $Visitor:ident$(<$lt:lifetime>)?) => {
        $(${ignore($lt)}
            #[derive(Copy, Clone)]
        )?
        #[derive(Debug)]
        pub enum FnKind<'a> {
            /// E.g., `fn foo()`, `fn foo(&self)`, or `extern "Abi" fn foo()`.
            Fn(FnCtxt, &'a $($mut)? Visibility, &'a $($mut)? Fn),

            /// E.g., `|x, y| body`.
            Closure(&'a $($mut)? ClosureBinder, &'a $($mut)? Option<CoroutineKind>, &'a $($mut)? P<FnDecl>, &'a $($mut)? P<Expr>),
        }

        impl<'a> FnKind<'a> {
            pub fn header(&'a $($mut)? self) -> Option<&'a $($mut)? FnHeader> {
                match *self {
                    FnKind::Fn(_, _, Fn { sig, .. }) => Some(&$($mut)? sig.header),
                    FnKind::Closure(..) => None,
                }
            }

            pub fn ident(&'a $($mut)? self) -> Option<&'a $($mut)? Ident> {
                match self {
                    FnKind::Fn(_, _, Fn { ident, .. }) => Some(ident),
                    _ => None,
                }
            }

            pub fn decl(&'a $($mut)? self) -> &'a $($mut)? FnDecl {
                match self {
                    FnKind::Fn(_, _, Fn { sig, .. }) => &$($mut)? sig.decl,
                    FnKind::Closure(_, _, decl, _) => decl,
                }
            }

            pub fn ctxt(&self) -> Option<FnCtxt> {
                match self {
                    FnKind::Fn(ctxt, ..) => Some(*ctxt),
                    FnKind::Closure(..) => None,
                }
            }
        }

        /// Each method of this trait is a hook to be potentially
        /// overridden. Each method's default implementation recursively visits
        /// the substructure of the input via the corresponding `walk` method;
        #[doc = concat!(" e.g., the `visit_item` method by default calls `visit"$(, "_", stringify!($mut))?, "::walk_item`.")]
        ///
        /// If you want to ensure that your code handles every variant
        /// explicitly, you need to override each method. (And you also need
        /// to monitor future changes to this trait in case a new method with a
        /// new default implementation gets introduced.)
        ///
        /// Every `walk_*` method uses destructuring to access fields of structs and
        /// enums. This will result in a compile error if a field is added, which makes
        /// it more likely the appropriate visit call will be added for it.
        pub trait $Visitor<$($lt)?> : Sized $(${ignore($mut)} + MutVisitorResult<Result = ()>)? {
            $(
                ${ignore($lt)}
                /// The result type of the `visit_*` methods. Can be either `()`,
                /// or `ControlFlow<T>`.
                type Result: VisitorResult = ();
            )?

            // Methods in this trait have one of three forms, with the last two forms
            // only occurring on `MutVisitor`:
            //
            //   fn visit_t(&mut self, t: &mut T);                      // common
            //   fn flat_map_t(&mut self, t: T) -> SmallVec<[T; 1]>;    // rare
            //   fn filter_map_t(&mut self, t: T) -> Option<T>;         // rarest
            //
            // When writing these methods, it is better to use destructuring like this:
            //
            //   fn visit_abc(&mut self, ABC { a, b, c: _ }: &mut ABC) {
            //       visit_a(a);
            //       visit_b(b);
            //   }
            //
            // than to use field access like this:
            //
            //   fn visit_abc(&mut self, abc: &mut ABC) {
            //       visit_a(&mut abc.a);
            //       visit_b(&mut abc.b);
            //       // ignore abc.c
            //   }
            //
            // As well as being more concise, the former is explicit about which fields
            // are skipped. Furthermore, if a new field is added, the destructuring
            // version will cause a compile error, which is good. In comparison, the
            // field access version will continue working and it would be easy to
            // forget to add handling for it.
            fn visit_ident(&mut self, Ident { name: _, span }: &$($lt)? $($mut)? Ident) -> Self::Result {
                visit_span(self, span)
            }

            fn visit_foreign_mod(&mut self, nm: &$($lt)? $($mut)? ForeignMod) -> Self::Result {
                walk_foreign_mod(self, nm)
            }

            fn visit_foreign_item(&mut self, i: &$($lt)? $($mut)? ForeignItem) -> Self::Result {
                walk_item(self, i)
            }

            fn visit_item(&mut self, i: &$($lt)? $($mut)? Item) -> Self::Result {
                walk_item(self, i)
            }

            fn visit_local(&mut self, l: &$($lt)? $($mut)? Local) -> Self::Result {
                walk_local(self, l)
            }

            fn visit_block(&mut self, b: &$($lt)? $($mut)? Block) -> Self::Result {
                walk_block(self, b)
            }

            fn visit_param(&mut self, param: &$($lt)? $($mut)? Param) -> Self::Result {
                walk_param(self, param)
            }

            fn visit_arm(&mut self, a: &$($lt)? $($mut)? Arm) -> Self::Result {
                walk_arm(self, a)
            }

            fn visit_pat(&mut self, p: &$($lt)? $($mut)? Pat) -> Self::Result {
                walk_pat(self, p)
            }

            fn visit_anon_const(&mut self, c: &$($lt)? $($mut)? AnonConst) -> Self::Result {
                walk_anon_const(self, c)
            }

            fn visit_expr(&mut self, ex: &$($lt)? $($mut)? Expr) -> Self::Result {
                walk_expr(self, ex)
            }

            /// This method is a hack to work around the unstable `stmt_expr_attributes`
            /// feature. It can be removed once that feature is stabilized.
            fn visit_method_receiver_expr(&mut self, ex: &$($lt)? $($mut)? Expr) -> Self::Result {
                self.visit_expr(ex)
            }

            fn visit_ty(&mut self, t: &$($lt)? $($mut)? Ty) -> Self::Result {
                walk_ty(self, t)
            }

            fn visit_ty_pat(&mut self, t: &$($lt)? $($mut)? TyPat) -> Self::Result {
                walk_ty_pat(self, t)
            }

            fn visit_generic_param(&mut self, param: &$($lt)? $($mut)? GenericParam) -> Self::Result {
                walk_generic_param(self, param)
            }

            fn visit_generics(&mut self, g: &$($lt)? $($mut)? Generics) -> Self::Result {
                walk_generics(self, g)
            }

            fn visit_closure_binder(&mut self, b: &$($lt)? $($mut)? ClosureBinder) -> Self::Result {
                walk_closure_binder(self, b)
            }

            fn visit_contract(&mut self, c: &$($lt)? $($mut)? FnContract) -> Self::Result {
                walk_contract(self, c)
            }

            fn visit_where_predicate(&mut self, p: &$($lt)? $($mut)? WherePredicate) -> Self::Result {
                walk_where_predicate(self, p)
            }

            fn visit_where_predicate_kind(&mut self, k: &$($lt)? $($mut)? WherePredicateKind) -> Self::Result {
                walk_where_predicate_kind(self, k)
            }

            // for `MutVisitor`: `Span` and `NodeId` are mutated at the caller site.
            fn visit_fn(
                &mut self,
                fk: FnKind<$($lt)? $(${ignore($mut)} '_)?>,
                _: Span,
                _: NodeId
            ) -> Self::Result {
                walk_fn(self, fk)
            }

            fn visit_assoc_item(&mut self, i: &$($lt)? $($mut)? AssocItem, ctxt: AssocCtxt) -> Self::Result {
                walk_assoc_item(self, i, ctxt)
            }

            fn visit_trait_ref(&mut self, t: &$($lt)? $($mut)? TraitRef) -> Self::Result {
                walk_trait_ref(self, t)
            }

            fn visit_param_bound(&mut self, bounds: &$($lt)? $($mut)? GenericBound, _ctxt: BoundKind) -> Self::Result {
                walk_param_bound(self, bounds)
            }

            fn visit_precise_capturing_arg(&mut self, arg: &$($lt)? $($mut)? PreciseCapturingArg) -> Self::Result {
                walk_precise_capturing_arg(self, arg)
            }

            fn visit_poly_trait_ref(&mut self, t: &$($lt)? $($mut)? PolyTraitRef) -> Self::Result {
                walk_poly_trait_ref(self, t)
            }

            fn visit_variant_data(&mut self, s: &$($lt)? $($mut)? VariantData) -> Self::Result {
                walk_variant_data(self, s)
            }

            fn visit_field_def(&mut self, s: &$($lt)? $($mut)? FieldDef) -> Self::Result {
                walk_field_def(self, s)
            }

            fn visit_variant(&mut self, v: &$($lt)? $($mut)? Variant) -> Self::Result {
                walk_variant(self, v)
            }

            fn visit_label(&mut self, label: &$($lt)? $($mut)? Label) -> Self::Result {
                walk_label(self, label)
            }

            fn visit_lifetime(&mut self, lifetime: &$($lt)? $($mut)? Lifetime, $(${ignore($lt)} _: LifetimeCtxt )?) -> Self::Result {
                walk_lifetime(self, lifetime)
            }

            fn visit_mac_call(&mut self, mac: &$($lt)? $($mut)? MacCall) -> Self::Result {
                walk_mac(self, mac)
            }

            fn visit_id(&mut self, _id: $(&$mut)? NodeId) -> Self::Result {
                Self::Result::output()
            }

            fn visit_macro_def(&mut self, macro_def: &$($lt)? $($mut)? MacroDef) -> Self::Result {
                walk_macro_def(self, macro_def)
            }

            fn visit_path(&mut self, path: &$($lt)? $($mut)? Path) -> Self::Result {
                walk_path(self, path)
            }

            fn visit_use_tree(&mut self, use_tree: &$($lt)? $($mut)? UseTree) -> Self::Result {
                walk_use_tree(self, use_tree)
            }

            fn visit_path_segment(&mut self, path_segment: &$($lt)? $($mut)? PathSegment) -> Self::Result {
                walk_path_segment(self, path_segment)
            }

            fn visit_generic_args(&mut self, generic_args: &$($lt)? $($mut)? GenericArgs) -> Self::Result {
                walk_generic_args(self, generic_args)
            }

            fn visit_generic_arg(&mut self, generic_arg: &$($lt)? $($mut)? GenericArg) -> Self::Result {
                walk_generic_arg(self, generic_arg)
            }

            fn visit_assoc_item_constraint(
                &mut self,
                constraint: &$($lt)? $($mut)? AssocItemConstraint,
            ) -> Self::Result {
                walk_assoc_item_constraint(self, constraint)
            }

            fn visit_attribute(&mut self, attr: &$($lt)? $($mut)? Attribute) -> Self::Result {
                walk_attribute(self, attr)
            }

            fn visit_vis(&mut self, vis: &$($lt)? $($mut)? Visibility) -> Self::Result {
                walk_vis(self, vis)
            }

            fn visit_fn_ret_ty(&mut self, ret_ty: &$($lt)? $($mut)? FnRetTy) -> Self::Result {
                walk_fn_ret_ty(self, ret_ty)
            }

            fn visit_fn_header(&mut self, header: &$($lt)? $($mut)? FnHeader) -> Self::Result {
                walk_fn_header(self, header)
            }

            fn visit_expr_field(&mut self, f: &$($lt)? $($mut)? ExprField) -> Self::Result {
                walk_expr_field(self, f)
            }

            fn visit_pat_field(&mut self, fp: &$($lt)? $($mut)? PatField) -> Self::Result {
                walk_pat_field(self, fp)
            }

            fn visit_crate(&mut self, krate: &$($lt)? $($mut)? Crate) -> Self::Result {
                walk_crate(self, krate)
            }

            fn visit_inline_asm(&mut self, asm: &$($lt)? $($mut)? InlineAsm) -> Self::Result {
                walk_inline_asm(self, asm)
            }

            fn visit_format_args(&mut self, fmt: &$($lt)? $($mut)? FormatArgs) -> Self::Result {
                walk_format_args(self, fmt)
            }

            fn visit_inline_asm_sym(&mut self, sym: &$($lt)? $($mut)? InlineAsmSym) -> Self::Result {
                walk_inline_asm_sym(self, sym)
            }

            fn visit_capture_by(&mut self, capture_by: &$($lt)? $($mut)? CaptureBy) -> Self::Result {
                walk_capture_by(self, capture_by)
            }

            fn visit_coroutine_kind(&mut self, coroutine_kind: &$($lt)? $($mut)? CoroutineKind) -> Self::Result {
                walk_coroutine_kind(self, coroutine_kind)
            }

            fn visit_fn_decl(&mut self, fn_decl: &$($lt)? $($mut)? FnDecl) -> Self::Result {
                walk_fn_decl(self, fn_decl)
            }

            fn visit_qself(&mut self, qs: &$($lt)? $($mut)? Option<P<QSelf>>) -> Self::Result {
                walk_qself(self, qs)
            }

            // (non-mut) `Visitor`-only methods
            $(
                fn visit_stmt(&mut self, s: &$lt Stmt) -> Self::Result {
                    walk_stmt(self, s)
                }

                fn visit_nested_use_tree(&mut self, use_tree: &$lt UseTree, id: NodeId) -> Self::Result {
                    try_visit!(self.visit_id(id));
                    self.visit_use_tree(use_tree)
                }
            )?

            // `MutVisitor`-only methods
            $(
                fn flat_map_foreign_item(&mut self, ni: P<ForeignItem>) -> SmallVec<[P<ForeignItem>; 1]> {
                    walk_flat_map_foreign_item(self, ni)
                }

                fn flat_map_item(&mut self, i: P<Item>) -> SmallVec<[P<Item>; 1]> {
                    walk_flat_map_item(self, i)
                }

                fn flat_map_field_def(&mut self, fd: FieldDef) -> SmallVec<[FieldDef; 1]> {
                    walk_flat_map_field_def(self, fd)
                }

                fn flat_map_assoc_item(
                    &mut self,
                    i: P<AssocItem>,
                    ctxt: AssocCtxt,
                ) -> SmallVec<[P<AssocItem>; 1]> {
                    walk_flat_map_assoc_item(self, i, ctxt)
                }

                fn flat_map_stmt(&mut self, s: Stmt) -> SmallVec<[Stmt; 1]> {
                    walk_flat_map_stmt(self, s)
                }

                fn flat_map_arm(&mut self, arm: Arm) -> SmallVec<[Arm; 1]> {
                    walk_flat_map_arm(self, arm)
                }

                fn filter_map_expr(&mut self, e: P<Expr>) -> Option<P<Expr>> {
                    walk_filter_map_expr(self, e)
                }

                fn flat_map_variant(&mut self, v: Variant) -> SmallVec<[Variant; 1]> {
                    walk_flat_map_variant(self, v)
                }

                fn flat_map_param(&mut self, param: Param) -> SmallVec<[Param; 1]> {
                    walk_flat_map_param(self, param)
                }

                fn flat_map_generic_param(&mut self, param: GenericParam) -> SmallVec<[GenericParam; 1]> {
                    walk_flat_map_generic_param(self, param)
                }

                fn flat_map_expr_field(&mut self, f: ExprField) -> SmallVec<[ExprField; 1]> {
                    walk_flat_map_expr_field(self, f)
                }

                fn flat_map_where_predicate(
                    &mut self,
                    where_predicate: WherePredicate,
                ) -> SmallVec<[WherePredicate; 1]> {
                    walk_flat_map_where_predicate(self, where_predicate)
                }

                // Span visiting is no longer used, but we keep it for now,
                // in case it's needed for something like #127241.
                fn visit_span(&mut self, _sp: &$mut Span) {
                    // Do nothing.
                }

                fn flat_map_pat_field(&mut self, fp: PatField) -> SmallVec<[PatField; 1]> {
                    walk_flat_map_pat_field(self, fp)
                }
            )?
        }

        pub trait WalkItemKind {
            type Ctxt;
            fn walk<$($lt,)? V: $Visitor$(<$lt>)?>(
@@ -409,6 +569,24 @@ macro_rules! common_visitor_and_walkers {
            V::Result::output()
        }

        $(${ignore($lt)}
            #[inline]
        )?
        fn walk_capture_by<$($lt,)? V: $Visitor$(<$lt>)?>(
            vis: &mut V,
            capture_by: &$($lt)? $($mut)? CaptureBy
        ) -> V::Result {
            match capture_by {
                CaptureBy::Ref => { V::Result::output() }
                CaptureBy::Value { move_kw } => {
                    visit_span(vis, move_kw)
                }
                CaptureBy::Use { use_kw } => {
                    visit_span(vis, use_kw)
                }
            }
        }

        fn visit_bounds<$($lt,)? V: $Visitor$(<$lt>)?>(visitor: &mut V, bounds: &$($lt)? $($mut)? GenericBounds, ctxt: BoundKind) -> V::Result {
            walk_list!(visitor, visit_param_bound, bounds, ctxt);
            V::Result::output()
@@ -989,8 +1167,7 @@ macro_rules! common_visitor_and_walkers {
            try_visit!(vis.visit_vis(visibility));
            try_visit!(vis.visit_ident(ident));
            try_visit!(vis.visit_variant_data(data));
            $(${ignore($lt)} visit_opt!(vis, visit_variant_discr, disr_expr); )?
            $(${ignore($mut)} visit_opt!(vis, visit_anon_const, disr_expr); )?
            visit_opt!(vis, visit_anon_const, disr_expr);
            visit_span(vis, span)
        }

@@ -1389,7 +1566,7 @@ macro_rules! common_visitor_and_walkers {

        // FIXME: visit the template exhaustively.
        pub fn walk_format_args<$($lt,)? V: $Visitor$(<$lt>)?>(vis: &mut V, fmt: &$($lt)? $($mut)? FormatArgs) -> V::Result {
            let FormatArgs { span, template: _, arguments, uncooked_fmt_str: _ } = fmt;
            let FormatArgs { span, template: _, arguments, uncooked_fmt_str: _, is_source_literal: _ } = fmt;
            let args = $(${ignore($mut)} arguments.all_args_mut())? $(${ignore($lt)} arguments.all_args())? ;
            for FormatArgument { kind, expr } in args {
                match kind {

@@ -11,6 +11,7 @@ doctest = false
rustc_abi = { path = "../rustc_abi" }
rustc_ast = { path = "../rustc_ast" }
rustc_ast_pretty = { path = "../rustc_ast_pretty" }
rustc_attr_data_structures = { path = "../rustc_attr_data_structures" }
rustc_attr_parsing = { path = "../rustc_attr_parsing" }
rustc_data_structures = { path = "../rustc_data_structures" }
rustc_errors = { path = "../rustc_errors" }

@@ -4,6 +4,7 @@ use std::sync::Arc;
use rustc_ast::ptr::P as AstP;
use rustc_ast::*;
use rustc_ast_pretty::pprust::expr_to_string;
use rustc_attr_data_structures::{AttributeKind, find_attr};
use rustc_data_structures::stack::ensure_sufficient_stack;
use rustc_hir as hir;
use rustc_hir::HirId;

@@ -831,7 +832,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
    ) {
        if self.tcx.features().async_fn_track_caller()
            && let Some(attrs) = self.attrs.get(&outer_hir_id.local_id)
            && attrs.into_iter().any(|attr| attr.has_name(sym::track_caller))
            && find_attr!(*attrs, AttributeKind::TrackCaller(_))
        {
            let unstable_span = self.mark_span_with_reason(
                DesugaringKind::Async,
@@ -2289,12 +2290,12 @@ impl<'hir> LoweringContext<'_, 'hir> {
        span: Span,
        elements: &'hir [hir::Expr<'hir>],
    ) -> hir::Expr<'hir> {
        let addrof = hir::ExprKind::AddrOf(
            hir::BorrowKind::Ref,
            hir::Mutability::Not,
            self.arena.alloc(self.expr(span, hir::ExprKind::Array(elements))),
        );
        self.expr(span, addrof)
        let array = self.arena.alloc(self.expr(span, hir::ExprKind::Array(elements)));
        self.expr_ref(span, array)
    }

    pub(super) fn expr_ref(&mut self, span: Span, expr: &'hir hir::Expr<'hir>) -> hir::Expr<'hir> {
        self.expr(span, hir::ExprKind::AddrOf(hir::BorrowKind::Ref, hir::Mutability::Not, expr))
    }

    pub(super) fn expr(&mut self, span: Span, kind: hir::ExprKind<'hir>) -> hir::Expr<'hir> {

@@ -1,12 +1,10 @@
use core::ops::ControlFlow;
use std::borrow::Cow;

use rustc_ast::visit::Visitor;
use rustc_ast::*;
use rustc_data_structures::fx::FxIndexMap;
use rustc_hir as hir;
use rustc_session::config::FmtDebug;
use rustc_span::{Ident, Span, Symbol, sym};
use rustc_span::{DesugaringKind, Ident, Span, Symbol, sym};

use super::LoweringContext;

@@ -16,6 +14,13 @@ impl<'hir> LoweringContext<'_, 'hir> {
        // format_args!() had any arguments _before_ flattening/inlining.
        let allow_const = fmt.arguments.all_args().is_empty();
        let mut fmt = Cow::Borrowed(fmt);

        let sp = self.mark_span_with_reason(
            DesugaringKind::FormatLiteral { source: fmt.is_source_literal },
            sp,
            sp.ctxt().outer_expn_data().allow_internal_unstable,
        );

        if self.tcx.sess.opts.unstable_opts.flatten_format_args {
            fmt = flatten_format_args(fmt);
            fmt = self.inline_literals(fmt);
@@ -476,77 +481,52 @@ fn expand_format_args<'hir>(
        return hir::ExprKind::Call(new, new_args);
    }

    // If the args array contains exactly all the original arguments once,
    // in order, we can use a simple array instead of a `match` construction.
    // However, if there's a yield point in any argument except the first one,
    // we don't do this, because an Argument cannot be kept across yield points.
    //
    // This is an optimization, speeding up compilation about 1-2% in some cases.
    // See https://github.com/rust-lang/rust/pull/106770#issuecomment-1380790609
    let use_simple_array = argmap.len() == arguments.len()
        && argmap.iter().enumerate().all(|(i, (&(j, _), _))| i == j)
        && arguments.iter().skip(1).all(|arg| !may_contain_yield_point(&arg.expr));

    let args = if arguments.is_empty() {
    let (let_statements, args) = if arguments.is_empty() {
        // Generate:
        //     &<core::fmt::Argument>::none()
        //     []
        (vec![], ctx.arena.alloc(ctx.expr(macsp, hir::ExprKind::Array(&[]))))
    } else if argmap.len() == 1 && arguments.len() == 1 {
        // Only one argument, so we don't need to make the `args` tuple.
        //
        // Note:
        // `none()` just returns `[]`. We use `none()` rather than `[]` to limit the lifetime.
        //
        // This makes sure that this still fails to compile, even when the argument is inlined:
        //
        // ```
        // let f = format_args!("{}", "a");
        // println!("{f}"); // error E0716
        // ```
        //
        // Cases where keeping the object around is allowed, such as `format_args!("a")`,
        // are handled above by the `allow_const` case.
        let none_fn = ctx.arena.alloc(ctx.expr_lang_item_type_relative(
            macsp,
            hir::LangItem::FormatArgument,
            sym::none,
        ));
        let none = ctx.expr_call(macsp, none_fn, &[]);
        ctx.expr(macsp, hir::ExprKind::AddrOf(hir::BorrowKind::Ref, hir::Mutability::Not, none))
    } else if use_simple_array {
        // Generate:
        //     &[
        //         <core::fmt::Argument>::new_display(&arg0),
        //         <core::fmt::Argument>::new_lower_hex(&arg1),
        //         <core::fmt::Argument>::new_debug(&arg2),
        //         …
        //     ]
        let elements = ctx.arena.alloc_from_iter(arguments.iter().zip(argmap).map(
            |(arg, ((_, ty), placeholder_span))| {
        // super let args = [<core::fmt::Argument>::new_display(&arg)];
        let args = ctx.arena.alloc_from_iter(argmap.iter().map(
            |(&(arg_index, ty), &placeholder_span)| {
                let arg = &arguments[arg_index];
                let placeholder_span =
                    placeholder_span.unwrap_or(arg.expr.span).with_ctxt(macsp.ctxt());
                let arg_span = match arg.kind {
                    FormatArgumentKind::Captured(_) => placeholder_span,
                    _ => arg.expr.span.with_ctxt(macsp.ctxt()),
                };
                let arg = ctx.lower_expr(&arg.expr);
                let ref_arg = ctx.arena.alloc(ctx.expr(
                    arg_span,
                    hir::ExprKind::AddrOf(hir::BorrowKind::Ref, hir::Mutability::Not, arg),
                ));
                let ref_arg = ctx.arena.alloc(ctx.expr_ref(arg.span.with_ctxt(macsp.ctxt()), arg));
                make_argument(ctx, placeholder_span, ref_arg, ty)
            },
        ));
        ctx.expr_array_ref(macsp, elements)
    } else {
        // Generate:
        //     &match (&arg0, &arg1, &…) {
        //         args => [
        //             <core::fmt::Argument>::new_display(args.0),
        //             <core::fmt::Argument>::new_lower_hex(args.1),
        //             <core::fmt::Argument>::new_debug(args.0),
        //             …
        //         ]
        //     }
        let args = ctx.arena.alloc(ctx.expr(macsp, hir::ExprKind::Array(args)));
        let args_ident = Ident::new(sym::args, macsp);
        let (args_pat, args_hir_id) = ctx.pat_ident(macsp, args_ident);
        let let_statement = ctx.stmt_super_let_pat(macsp, args_pat, Some(args));
        (vec![let_statement], ctx.arena.alloc(ctx.expr_ident_mut(macsp, args_ident, args_hir_id)))
    } else {
        // Generate:
        //     super let args = (&arg0, &arg1, &…);
        let args_ident = Ident::new(sym::args, macsp);
        let (args_pat, args_hir_id) = ctx.pat_ident(macsp, args_ident);
        let elements = ctx.arena.alloc_from_iter(arguments.iter().map(|arg| {
            let arg_expr = ctx.lower_expr(&arg.expr);
            ctx.expr(
                arg.expr.span.with_ctxt(macsp.ctxt()),
                hir::ExprKind::AddrOf(hir::BorrowKind::Ref, hir::Mutability::Not, arg_expr),
            )
        }));
        let args_tuple = ctx.arena.alloc(ctx.expr(macsp, hir::ExprKind::Tup(elements)));
        let let_statement_1 = ctx.stmt_super_let_pat(macsp, args_pat, Some(args_tuple));

        // Generate:
        //     super let args = [
        //         <core::fmt::Argument>::new_display(args.0),
        //         <core::fmt::Argument>::new_lower_hex(args.1),
        //         <core::fmt::Argument>::new_debug(args.0),
        //         …
        //     ];
        let args = ctx.arena.alloc_from_iter(argmap.iter().map(
            |(&(arg_index, ty), &placeholder_span)| {
                let arg = &arguments[arg_index];
@@ -567,58 +547,47 @@ fn expand_format_args<'hir>(
                make_argument(ctx, placeholder_span, arg, ty)
            },
        ));
        let elements = ctx.arena.alloc_from_iter(arguments.iter().map(|arg| {
            let arg_expr = ctx.lower_expr(&arg.expr);
            ctx.expr(
                arg.expr.span.with_ctxt(macsp.ctxt()),
                hir::ExprKind::AddrOf(hir::BorrowKind::Ref, hir::Mutability::Not, arg_expr),
            )
        }));
        let args_tuple = ctx.arena.alloc(ctx.expr(macsp, hir::ExprKind::Tup(elements)));
        let array = ctx.arena.alloc(ctx.expr(macsp, hir::ExprKind::Array(args)));
        let match_arms = ctx.arena.alloc_from_iter([ctx.arm(args_pat, array)]);
        let match_expr = ctx.arena.alloc(ctx.expr_match(
            macsp,
            args_tuple,
            match_arms,
            hir::MatchSource::FormatArgs,
        ));
        ctx.expr(
            macsp,
            hir::ExprKind::AddrOf(hir::BorrowKind::Ref, hir::Mutability::Not, match_expr),
        let args = ctx.arena.alloc(ctx.expr(macsp, hir::ExprKind::Array(args)));
        let (args_pat, args_hir_id) = ctx.pat_ident(macsp, args_ident);
        let let_statement_2 = ctx.stmt_super_let_pat(macsp, args_pat, Some(args));
        (
            vec![let_statement_1, let_statement_2],
            ctx.arena.alloc(ctx.expr_ident_mut(macsp, args_ident, args_hir_id)),
        )
    };

    if let Some(format_options) = format_options {
        // Generate:
        //     &args
    let args = ctx.expr_ref(macsp, args);

    let call = if let Some(format_options) = format_options {
        // Generate:
        //     <core::fmt::Arguments>::new_v1_formatted(
        //         lit_pieces,
        //         args,
        //         format_options,
        //         unsafe { ::core::fmt::UnsafeArg::new() }
        //     )
        //     unsafe {
        //         <core::fmt::Arguments>::new_v1_formatted(
        //             lit_pieces,
        //             args,
        //             format_options,
        //         )
        //     }
        let new_v1_formatted = ctx.arena.alloc(ctx.expr_lang_item_type_relative(
            macsp,
            hir::LangItem::FormatArguments,
            sym::new_v1_formatted,
        ));
        let unsafe_arg_new = ctx.arena.alloc(ctx.expr_lang_item_type_relative(
            macsp,
            hir::LangItem::FormatUnsafeArg,
            sym::new,
        ));
        let unsafe_arg_new_call = ctx.expr_call(macsp, unsafe_arg_new, &[]);
        let args = ctx.arena.alloc_from_iter([lit_pieces, args, format_options]);
        let call = ctx.expr_call(macsp, new_v1_formatted, args);
        let hir_id = ctx.next_id();
        let unsafe_arg = ctx.expr_block(ctx.arena.alloc(hir::Block {
            stmts: &[],
            expr: Some(unsafe_arg_new_call),
            hir_id,
            rules: hir::BlockCheckMode::UnsafeBlock(hir::UnsafeSource::CompilerGenerated),
            span: macsp,
            targeted_by_break: false,
        }));
        let args = ctx.arena.alloc_from_iter([lit_pieces, args, format_options, unsafe_arg]);
        hir::ExprKind::Call(new_v1_formatted, args)
        hir::ExprKind::Block(
            ctx.arena.alloc(hir::Block {
                stmts: &[],
                expr: Some(call),
                hir_id,
                rules: hir::BlockCheckMode::UnsafeBlock(hir::UnsafeSource::CompilerGenerated),
                span: macsp,
                targeted_by_break: false,
            }),
            None,
        )
    } else {
        // Generate:
        //     <core::fmt::Arguments>::new_v1(
@@ -632,37 +601,23 @@ fn expand_format_args<'hir>(
        ));
        let new_args = ctx.arena.alloc_from_iter([lit_pieces, args]);
        hir::ExprKind::Call(new_v1, new_args)
    };

    if !let_statements.is_empty() {
        // Generate:
        //     {
        //         super let …
        //         super let …
        //         <core::fmt::Arguments>::new_…(…)
        //     }
        let call = ctx.arena.alloc(ctx.expr(macsp, call));
        let block = ctx.block_all(macsp, ctx.arena.alloc_from_iter(let_statements), Some(call));
        hir::ExprKind::Block(block, None)
    } else {
        call
    }
}

fn may_contain_yield_point(e: &ast::Expr) -> bool {
    struct MayContainYieldPoint;

    impl Visitor<'_> for MayContainYieldPoint {
        type Result = ControlFlow<()>;

        fn visit_expr(&mut self, e: &ast::Expr) -> ControlFlow<()> {
            if let ast::ExprKind::Await(_, _) | ast::ExprKind::Yield(_) = e.kind {
                ControlFlow::Break(())
            } else {
                visit::walk_expr(self, e)
            }
        }

        fn visit_mac_call(&mut self, _: &ast::MacCall) -> ControlFlow<()> {
            // Macros should be expanded at this point.
            unreachable!("unexpanded macro in ast lowering");
        }

        fn visit_item(&mut self, _: &ast::Item) -> ControlFlow<()> {
            // Do not recurse into nested items.
            ControlFlow::Continue(())
        }
    }

    MayContainYieldPoint.visit_expr(e).is_break()
}
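The `MayContainYieldPoint` helper above is a compact instance of the short-circuiting `ControlFlow` visitor pattern that the `type Result` hook on the AST visitor trait enables. As a hedged, standalone sketch of that technique, the following example reimplements the same idea against a hypothetical toy `Expr` type (not part of this diff or of rustc's AST), so it compiles on its own:

```rust
use std::ops::ControlFlow;

// Toy expression tree; a hypothetical stand-in for `ast::Expr`.
enum Expr {
    Lit(i64),
    Await(Box<Expr>),
    Add(Box<Expr>, Box<Expr>),
}

// Same shape as `may_contain_yield_point`: walk the tree and
// short-circuit with `ControlFlow::Break` at the first "yield point".
fn contains_await(e: &Expr) -> bool {
    fn visit(e: &Expr) -> ControlFlow<()> {
        match e {
            // A yield point: stop the whole traversal immediately.
            Expr::Await(_) => ControlFlow::Break(()),
            Expr::Lit(_) => ControlFlow::Continue(()),
            // Recurse, propagating an early break from either child.
            Expr::Add(a, b) => match visit(a) {
                ControlFlow::Break(()) => ControlFlow::Break(()),
                ControlFlow::Continue(()) => visit(b),
            },
        }
    }
    visit(e).is_break()
}

fn main() {
    let plain = Expr::Add(Box::new(Expr::Lit(1)), Box::new(Expr::Lit(2)));
    let with_await =
        Expr::Add(Box::new(Expr::Lit(1)), Box::new(Expr::Await(Box::new(Expr::Lit(2)))));
    assert!(!contains_await(&plain));
    assert!(contains_await(&with_await));
}
```

Returning `ControlFlow<()>` from every `visit_*` method is what lets a traversal stop as soon as an answer is known, instead of threading a mutable `found` flag through the whole walk.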

fn for_all_argument_indexes(template: &mut [FormatArgsPiece], mut f: impl FnMut(&mut usize)) {
    for piece in template {
        let FormatArgsPiece::Placeholder(placeholder) = piece else { continue };

@@ -2,7 +2,7 @@ use rustc_abi::ExternAbi;
use rustc_ast::ptr::P;
use rustc_ast::visit::AssocCtxt;
use rustc_ast::*;
use rustc_errors::ErrorGuaranteed;
use rustc_errors::{E0570, ErrorGuaranteed, struct_span_code_err};
use rustc_hir::def::{DefKind, PerNS, Res};
use rustc_hir::def_id::{CRATE_DEF_ID, LocalDefId};
use rustc_hir::{self as hir, HirId, LifetimeSource, PredicateOrigin};

|||
self.error_on_invalid_abi(abi_str);
|
||||
ExternAbi::Rust
|
||||
});
|
||||
let sess = self.tcx.sess;
|
||||
let features = self.tcx.features();
|
||||
gate_unstable_abi(sess, features, span, extern_abi);
|
||||
let tcx = self.tcx;
|
||||
|
||||
// we can't do codegen for unsupported ABIs, so error now so we won't get farther
|
||||
if !tcx.sess.target.is_abi_supported(extern_abi) {
|
||||
let mut err = struct_span_code_err!(
|
||||
tcx.dcx(),
|
||||
span,
|
||||
E0570,
|
||||
"{extern_abi} is not a supported ABI for the current target",
|
||||
);
|
||||
|
||||
if let ExternAbi::Stdcall { unwind } = extern_abi {
|
||||
let c_abi = ExternAbi::C { unwind };
|
||||
let system_abi = ExternAbi::System { unwind };
|
||||
err.help(format!("if you need `extern {extern_abi}` on win32 and `extern {c_abi}` everywhere else, \
|
||||
use `extern {system_abi}`"
|
||||
));
|
||||
}
|
||||
err.emit();
|
||||
}
|
||||
// Show required feature gate even if we already errored, as the user is likely to build the code
|
||||
// for the actually intended target next and then they will need the feature gate.
|
||||
gate_unstable_abi(tcx.sess, tcx.features(), span, extern_abi);
|
||||
extern_abi
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@@ -2292,6 +2292,26 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
        self.stmt(span, hir::StmtKind::Let(self.arena.alloc(local)))
    }

    fn stmt_super_let_pat(
        &mut self,
        span: Span,
        pat: &'hir hir::Pat<'hir>,
        init: Option<&'hir hir::Expr<'hir>>,
    ) -> hir::Stmt<'hir> {
        let hir_id = self.next_id();
        let local = hir::LetStmt {
            super_: Some(span),
            hir_id,
            init,
            pat,
            els: None,
            source: hir::LocalSource::Normal,
            span: self.lower_span(span),
            ty: None,
        };
        self.stmt(span, hir::StmtKind::Let(self.arena.alloc(local)))
    }

    fn block_expr(&mut self, expr: &'hir hir::Expr<'hir>) -> &'hir hir::Block<'hir> {
        self.block_all(expr.span, &[], Some(expr))
    }

@@ -96,6 +96,9 @@ pub fn extern_abi_stability(abi: ExternAbi) -> Result<(), UnstableAbi> {
        ExternAbi::RustCold => {
            Err(UnstableAbi { abi, feature: sym::rust_cold_cc, explain: GateReason::Experimental })
        }
        ExternAbi::RustInvalid => {
            Err(UnstableAbi { abi, feature: sym::rustc_attrs, explain: GateReason::ImplDetail })
        }
        ExternAbi::GpuKernel => Err(UnstableAbi {
            abi,
            feature: sym::abi_gpu_kernel,

@@ -124,12 +127,12 @@ pub fn extern_abi_stability(abi: ExternAbi) -> Result<(), UnstableAbi> {
            feature: sym::abi_riscv_interrupt,
            explain: GateReason::Experimental,
        }),
        ExternAbi::CCmseNonSecureCall => Err(UnstableAbi {
        ExternAbi::CmseNonSecureCall => Err(UnstableAbi {
            abi,
            feature: sym::abi_c_cmse_nonsecure_call,
            feature: sym::abi_cmse_nonsecure_call,
            explain: GateReason::Experimental,
        }),
        ExternAbi::CCmseNonSecureEntry => Err(UnstableAbi {
        ExternAbi::CmseNonSecureEntry => Err(UnstableAbi {
            abi,
            feature: sym::cmse_nonsecure_entry,
            explain: GateReason::Experimental,

@ -18,5 +18,6 @@ rustc_macros = { path = "../rustc_macros" }
|
|||
rustc_parse = { path = "../rustc_parse" }
|
||||
rustc_session = { path = "../rustc_session" }
|
||||
rustc_span = { path = "../rustc_span" }
|
||||
rustc_target = { path = "../rustc_target" }
|
||||
thin-vec = "0.2.12"
|
||||
# tidy-alphabetical-end
|
||||
|
|
|
|||
|
|
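The gating pattern in the hunk above — each unstable ABI maps to the feature gate that unlocks it — can be sketched outside the compiler. This is a minimal model with stand-in types: `Abi` and `required_feature` are illustrative, not rustc's real `ExternAbi`/`extern_abi_stability`.

```rust
// A minimal sketch (stand-in types, not rustc's) of per-ABI feature gating,
// modeled on the `extern_abi_stability` match above.
#[derive(Clone, Copy)]
enum Abi {
    C,                 // stable
    RustCold,          // unstable, behind `rust_cold_cc`
    CmseNonSecureCall, // unstable, behind `abi_cmse_nonsecure_call`
}

/// Returns `Err(feature_name)` when the ABI is only usable behind a feature gate.
fn required_feature(abi: Abi) -> Result<(), &'static str> {
    match abi {
        Abi::C => Ok(()),
        Abi::RustCold => Err("rust_cold_cc"),
        Abi::CmseNonSecureCall => Err("abi_cmse_nonsecure_call"),
    }
}

fn main() {
    // Stable ABIs pass; unstable ones report which gate is missing.
    assert!(required_feature(Abi::C).is_ok());
    assert_eq!(required_feature(Abi::RustCold), Err("rust_cold_cc"));
}
```

The real function additionally distinguishes *why* an ABI is gated (`GateReason::Experimental` vs `GateReason::ImplDetail`), which drives the wording of the diagnostic.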
@@ -1,20 +1,25 @@
-ast_passes_abi_custom_coroutine =
-    functions with the `"custom"` ABI cannot be `{$coroutine_kind_str}`
+ast_passes_abi_cannot_be_coroutine =
+    functions with the {$abi} ABI cannot be `{$coroutine_kind_str}`
+    .suggestion = remove the `{$coroutine_kind_str}` keyword from this definition
 
 ast_passes_abi_custom_invalid_signature =
     invalid signature for `extern "custom"` function
     .note = functions with the `"custom"` ABI cannot have any parameters or return type
     .suggestion = remove the parameters and return type
 
 ast_passes_abi_custom_safe_foreign_function =
-    foreign functions with the `"custom"` ABI cannot be safe
+    foreign functions with the "custom" ABI cannot be safe
     .suggestion = remove the `safe` keyword from this definition
 
 ast_passes_abi_custom_safe_function =
-    functions with the `"custom"` ABI must be unsafe
+    functions with the "custom" ABI must be unsafe
     .suggestion = add the `unsafe` keyword to this definition
 
+ast_passes_abi_must_not_have_parameters_or_return_type =
+    invalid signature for `extern {$abi}` function
+    .note = functions with the {$abi} ABI cannot have any parameters or return type
+    .suggestion = remove the parameters and return type
+
+ast_passes_abi_must_not_have_return_type =
+    invalid signature for `extern {$abi}` function
+    .note = functions with the "custom" ABI cannot have a return type
+    .help = remove the return type
+
 ast_passes_assoc_const_without_body =
     associated constant in `impl` without body
     .suggestion = provide a definition for the constant
@@ -21,7 +21,7 @@ use std::ops::{Deref, DerefMut};
 use std::str::FromStr;
 
 use itertools::{Either, Itertools};
-use rustc_abi::ExternAbi;
+use rustc_abi::{CanonAbi, ExternAbi, InterruptKind};
 use rustc_ast::ptr::P;
 use rustc_ast::visit::{AssocCtxt, BoundKind, FnCtxt, FnKind, Visitor, walk_list};
 use rustc_ast::*;
@@ -37,6 +37,7 @@ use rustc_session::lint::builtin::{
 };
 use rustc_session::lint::{BuiltinLintDiag, LintBuffer};
 use rustc_span::{Ident, Span, kw, sym};
+use rustc_target::spec::{AbiMap, AbiMapping};
 use thin_vec::thin_vec;
 
 use crate::errors::{self, TildeConstReason};
@@ -365,31 +366,77 @@ impl<'a> AstValidator<'a> {
         }
     }
 
-    /// An `extern "custom"` function must be unsafe, and must not have any parameters or return
-    /// type.
-    fn check_custom_abi(&self, ctxt: FnCtxt, ident: &Ident, sig: &FnSig) {
+    /// Check that the signature of this function does not violate the constraints of its ABI.
+    fn check_extern_fn_signature(&self, abi: ExternAbi, ctxt: FnCtxt, ident: &Ident, sig: &FnSig) {
+        match AbiMap::from_target(&self.sess.target).canonize_abi(abi, false) {
+            AbiMapping::Direct(canon_abi) | AbiMapping::Deprecated(canon_abi) => {
+                match canon_abi {
+                    CanonAbi::C
+                    | CanonAbi::Rust
+                    | CanonAbi::RustCold
+                    | CanonAbi::Arm(_)
+                    | CanonAbi::GpuKernel
+                    | CanonAbi::X86(_) => { /* nothing to check */ }
+
+                    CanonAbi::Custom => {
+                        // An `extern "custom"` function must be unsafe.
+                        self.reject_safe_fn(abi, ctxt, sig);
+
+                        // An `extern "custom"` function cannot be `async` and/or `gen`.
+                        self.reject_coroutine(abi, sig);
+
+                        // An `extern "custom"` function must have type `fn()`.
+                        self.reject_params_or_return(abi, ident, sig);
+                    }
+
+                    CanonAbi::Interrupt(interrupt_kind) => {
+                        // An interrupt handler cannot be `async` and/or `gen`.
+                        self.reject_coroutine(abi, sig);
+
+                        if let InterruptKind::X86 = interrupt_kind {
+                            // "x86-interrupt" is special because it does have arguments.
+                            // FIXME(workingjubilee): properly lint on acceptable input types.
+                            if let FnRetTy::Ty(ref ret_ty) = sig.decl.output {
+                                self.dcx().emit_err(errors::AbiMustNotHaveReturnType {
+                                    span: ret_ty.span,
+                                    abi,
+                                });
+                            }
+                        } else {
+                            // An `extern "interrupt"` function must have type `fn()`.
+                            self.reject_params_or_return(abi, ident, sig);
+                        }
+                    }
+                }
+            }
+            AbiMapping::Invalid => { /* ignore */ }
+        }
+    }
+
+    fn reject_safe_fn(&self, abi: ExternAbi, ctxt: FnCtxt, sig: &FnSig) {
+        let dcx = self.dcx();
+
         // An `extern "custom"` function must be unsafe.
         match sig.header.safety {
             Safety::Unsafe(_) => { /* all good */ }
             Safety::Safe(safe_span) => {
-                let safe_span =
-                    self.sess.psess.source_map().span_until_non_whitespace(safe_span.to(sig.span));
+                let source_map = self.sess.psess.source_map();
+                let safe_span = source_map.span_until_non_whitespace(safe_span.to(sig.span));
                 dcx.emit_err(errors::AbiCustomSafeForeignFunction { span: sig.span, safe_span });
             }
             Safety::Default => match ctxt {
                 FnCtxt::Foreign => { /* all good */ }
                 FnCtxt::Free | FnCtxt::Assoc(_) => {
-                    self.dcx().emit_err(errors::AbiCustomSafeFunction {
+                    dcx.emit_err(errors::AbiCustomSafeFunction {
                         span: sig.span,
+                        abi,
                         unsafe_span: sig.span.shrink_to_lo(),
                     });
                 }
             },
         }
     }
 
     // An `extern "custom"` function cannot be `async` and/or `gen`.
     fn reject_coroutine(&self, abi: ExternAbi, sig: &FnSig) {
         if let Some(coroutine_kind) = sig.header.coroutine_kind {
             let coroutine_kind_span = self
                 .sess
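The per-ABI signature rules that `check_extern_fn_signature` enforces above can be modeled in isolation. The sketch below uses simplified stand-in types (`CanonAbi`, `Sig`, `check`) rather than rustc's AST and diagnostics machinery, but follows the same branching: `"custom"` functions must be unsafe, non-coroutine, and of type `fn()`; interrupt handlers must be non-coroutine, and only `"x86-interrupt"` may take arguments (while still forbidding a return type).

```rust
// Simplified model of the per-ABI signature rules; stand-in types only.
#[derive(Clone, Copy)]
enum CanonAbi {
    C,
    Custom,
    InterruptX86,
    InterruptOther,
}

#[derive(Clone, Copy)]
struct Sig {
    is_unsafe: bool,
    params: usize,
    has_return: bool,
    is_coroutine: bool,
}

fn check(abi: CanonAbi, sig: Sig) -> Result<(), &'static str> {
    match abi {
        // Ordinary ABIs: nothing to check.
        CanonAbi::C => Ok(()),
        CanonAbi::Custom => {
            if !sig.is_unsafe {
                return Err("`extern \"custom\"` functions must be unsafe");
            }
            if sig.is_coroutine {
                return Err("cannot be `async` and/or `gen`");
            }
            if sig.params > 0 || sig.has_return {
                return Err("must have type `fn()`");
            }
            Ok(())
        }
        CanonAbi::InterruptX86 => {
            // "x86-interrupt" may take arguments, but must not return a value.
            if sig.is_coroutine {
                return Err("cannot be `async` and/or `gen`");
            }
            if sig.has_return {
                return Err("must not have a return type");
            }
            Ok(())
        }
        CanonAbi::InterruptOther => {
            if sig.is_coroutine {
                return Err("cannot be `async` and/or `gen`");
            }
            if sig.params > 0 || sig.has_return {
                return Err("must have type `fn()`");
            }
            Ok(())
        }
    }
}

fn main() {
    let plain = Sig { is_unsafe: true, params: 0, has_return: false, is_coroutine: false };
    assert!(check(CanonAbi::Custom, plain).is_ok());
    assert!(check(CanonAbi::Custom, Sig { is_unsafe: false, ..plain }).is_err());
    assert!(check(CanonAbi::InterruptX86, Sig { params: 1, ..plain }).is_ok());
    assert!(check(CanonAbi::InterruptX86, Sig { has_return: true, ..plain }).is_err());
}
```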
@@ -397,14 +444,16 @@ impl<'a> AstValidator<'a> {
                 .source_map()
                 .span_until_non_whitespace(coroutine_kind.span().to(sig.span));
 
-            self.dcx().emit_err(errors::AbiCustomCoroutine {
+            self.dcx().emit_err(errors::AbiCannotBeCoroutine {
                 span: sig.span,
+                abi,
                 coroutine_kind_span,
                 coroutine_kind_str: coroutine_kind.as_str(),
             });
         }
     }
 
     // An `extern "custom"` function must not have any parameters or return type.
     fn reject_params_or_return(&self, abi: ExternAbi, ident: &Ident, sig: &FnSig) {
         let mut spans: Vec<_> = sig.decl.inputs.iter().map(|p| p.span).collect();
         if let FnRetTy::Ty(ref ret_ty) = sig.decl.output {
             spans.push(ret_ty.span);
@@ -415,11 +464,12 @@ impl<'a> AstValidator<'a> {
         let suggestion_span = header_span.shrink_to_hi().to(sig.decl.output.span());
         let padding = if header_span.is_empty() { "" } else { " " };
 
-        self.dcx().emit_err(errors::AbiCustomInvalidSignature {
+        self.dcx().emit_err(errors::AbiMustNotHaveParametersOrReturnType {
             spans,
             symbol: ident.name,
             suggestion_span,
             padding,
+            abi,
         });
     }
 }
@@ -1199,9 +1249,12 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
                 self.check_foreign_fn_bodyless(*ident, body.as_deref());
                 self.check_foreign_fn_headerless(sig.header);
                 self.check_foreign_item_ascii_only(*ident);
-                if self.extern_mod_abi == Some(ExternAbi::Custom) {
-                    self.check_custom_abi(FnCtxt::Foreign, ident, sig);
-                }
+                self.check_extern_fn_signature(
+                    self.extern_mod_abi.unwrap_or(ExternAbi::FALLBACK),
+                    FnCtxt::Foreign,
+                    ident,
+                    sig,
+                );
             }
             ForeignItemKind::TyAlias(box TyAlias {
                 defaultness,
@@ -1411,9 +1464,9 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
 
         if let FnKind::Fn(ctxt, _, fun) = fk
             && let Extern::Explicit(str_lit, _) = fun.sig.header.ext
-            && let Ok(ExternAbi::Custom) = ExternAbi::from_str(str_lit.symbol.as_str())
+            && let Ok(abi) = ExternAbi::from_str(str_lit.symbol.as_str())
         {
-            self.check_custom_abi(ctxt, &fun.ident, &fun.sig);
+            self.check_extern_fn_signature(abi, ctxt, &fun.ident, &fun.sig);
         }
 
         self.check_c_variadic_type(fk);
@@ -1,5 +1,6 @@
 //! Errors emitted by ast_passes.
 
+use rustc_abi::ExternAbi;
 use rustc_ast::ParamKindOrd;
 use rustc_errors::codes::*;
 use rustc_errors::{Applicability, Diag, EmissionGuarantee, Subdiagnostic};
@@ -845,6 +846,7 @@ pub(crate) struct AbiCustomSafeForeignFunction {
 pub(crate) struct AbiCustomSafeFunction {
     #[primary_span]
     pub span: Span,
+    pub abi: ExternAbi,
 
     #[suggestion(
         ast_passes_suggestion,
@@ -856,10 +858,11 @@ pub(crate) struct AbiCustomSafeFunction {
 }
 
 #[derive(Diagnostic)]
-#[diag(ast_passes_abi_custom_coroutine)]
-pub(crate) struct AbiCustomCoroutine {
+#[diag(ast_passes_abi_cannot_be_coroutine)]
+pub(crate) struct AbiCannotBeCoroutine {
     #[primary_span]
     pub span: Span,
+    pub abi: ExternAbi,
 
     #[suggestion(
         ast_passes_suggestion,
@@ -872,11 +875,12 @@ pub(crate) struct AbiCustomCoroutine {
 }
 
 #[derive(Diagnostic)]
-#[diag(ast_passes_abi_custom_invalid_signature)]
+#[diag(ast_passes_abi_must_not_have_parameters_or_return_type)]
 #[note]
-pub(crate) struct AbiCustomInvalidSignature {
+pub(crate) struct AbiMustNotHaveParametersOrReturnType {
     #[primary_span]
     pub spans: Vec<Span>,
+    pub abi: ExternAbi,
 
     #[suggestion(
         ast_passes_suggestion,
@@ -888,3 +892,13 @@ pub(crate) struct AbiCustomInvalidSignature {
     pub symbol: Symbol,
     pub padding: &'static str,
 }
+
+#[derive(Diagnostic)]
+#[diag(ast_passes_abi_must_not_have_return_type)]
+#[note]
+pub(crate) struct AbiMustNotHaveReturnType {
+    #[primary_span]
+    #[help]
+    pub span: Span,
+    pub abi: ExternAbi,
+}
@@ -357,6 +357,10 @@ impl<'a> State<'a> {
                 self.word_nbsp("raw");
                 self.print_mutability(mutability, true);
             }
+            ast::BorrowKind::Pin => {
+                self.word_nbsp("pin");
+                self.print_mutability(mutability, true);
+            }
         }
         self.print_expr_cond_paren(
             expr,
@@ -386,18 +390,44 @@ impl<'a> State<'a> {
 
         let ib = self.ibox(INDENT_UNIT);
 
-        // The Match subexpression in `match x {} - 1` must be parenthesized if
-        // it is the leftmost subexpression in a statement:
-        //
-        //     (match x {}) - 1;
-        //
-        // But not otherwise:
-        //
-        //     let _ = match x {} - 1;
-        //
-        // Same applies to a small set of other expression kinds which eagerly
-        // terminate a statement which opens with them.
-        let needs_par = fixup.would_cause_statement_boundary(expr);
+        let needs_par = {
+            // The Match subexpression in `match x {} - 1` must be parenthesized
+            // if it is the leftmost subexpression in a statement:
+            //
+            //     (match x {}) - 1;
+            //
+            // But not otherwise:
+            //
+            //     let _ = match x {} - 1;
+            //
+            // Same applies to a small set of other expression kinds which
+            // eagerly terminate a statement which opens with them.
+            fixup.would_cause_statement_boundary(expr)
+        } || {
+            // If a binary operation ends up with an attribute, such as
+            // resulting from the following macro expansion, then parentheses
+            // are required so that the attribute encompasses the right
+            // subexpression and not just the left one.
+            //
+            //     #![feature(stmt_expr_attributes)]
+            //
+            //     macro_rules! add_attr {
+            //         ($e:expr) => { #[attr] $e };
+            //     }
+            //
+            //     let _ = add_attr!(1 + 1);
+            //
+            // We must pretty-print `#[attr] (1 + 1)` not `#[attr] 1 + 1`.
+            !attrs.is_empty()
+                && matches!(
+                    expr.kind,
+                    ast::ExprKind::Binary(..)
+                        | ast::ExprKind::Cast(..)
+                        | ast::ExprKind::Assign(..)
+                        | ast::ExprKind::AssignOp(..)
+                        | ast::ExprKind::Range(..)
+                )
+        };
         if needs_par {
            self.popen();
            fixup = FixupContext::default();
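The two parenthesization conditions that the pretty-printer combines into `needs_par` above can be sketched as a standalone decision function. `Expr` here is a toy stand-in for `rustc_ast`'s expression type, and the boolean flags stand in for the fixup context and attribute list.

```rust
// Sketch of the two `needs_par` conditions: a statement-boundary hazard,
// or an attribute attached to a low-precedence (e.g. binary) expression.
enum Expr {
    Match,
    Binary,
    Call,
}

fn would_cause_statement_boundary(e: &Expr, is_stmt_leftmost: bool) -> bool {
    // As a statement, `match x {} - 1` would parse as `match x {}; -1;`,
    // so the leftmost `match` must be parenthesized.
    matches!(e, Expr::Match) && is_stmt_leftmost
}

fn needs_par(e: &Expr, is_stmt_leftmost: bool, has_attrs: bool) -> bool {
    would_cause_statement_boundary(e, is_stmt_leftmost)
        // `#[attr] 1 + 1` would scope the attribute to `1` alone;
        // print `#[attr] (1 + 1)` instead.
        || (has_attrs && matches!(e, Expr::Binary))
}

fn main() {
    assert!(needs_par(&Expr::Match, true, false)); // (match x {}) - 1;
    assert!(!needs_par(&Expr::Match, false, false)); // let _ = match x {} - 1;
    assert!(needs_par(&Expr::Binary, false, true)); // #[attr] (1 + 1)
    assert!(!needs_par(&Expr::Call, false, true));
}
```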
@@ -38,7 +38,8 @@ pub enum InstructionSetAttr {
     ArmT32,
 }
 
-#[derive(Clone, Encodable, Decodable, Debug, PartialEq, Eq, HashStable_Generic, Default)]
+#[derive(Copy, Clone, Debug, PartialEq, Eq, Default, PrintAttribute)]
+#[derive(Encodable, Decodable, HashStable_Generic)]
 pub enum OptimizeAttr {
     /// No `#[optimize(..)]` attribute
     #[default]
@@ -182,6 +183,9 @@ impl Deprecation {
 #[derive(Clone, Debug, HashStable_Generic, Encodable, Decodable, PrintAttribute)]
 pub enum AttributeKind {
     // tidy-alphabetical-start
+    /// Represents `#[align(N)]`.
+    Align { align: Align, span: Span },
+
     /// Represents `#[rustc_allow_const_fn_unstable]`.
     AllowConstFnUnstable(ThinVec<Symbol>),
 
@@ -198,6 +202,9 @@ pub enum AttributeKind {
         span: Span,
     },
 
+    /// Represents `#[cold]`.
+    Cold(Span),
+
     /// Represents `#[rustc_confusables]`.
     Confusables {
         symbols: ThinVec<Symbol>,
@@ -205,6 +212,9 @@ pub enum AttributeKind {
         first_span: Span,
     },
 
+    /// Represents `#[const_continue]`.
+    ConstContinue(Span),
+
     /// Represents `#[rustc_const_stable]` and `#[rustc_const_unstable]`.
     ConstStability {
         stability: PartialConstStability,
@@ -224,17 +234,48 @@ pub enum AttributeKind {
     /// Represents `#[inline]` and `#[rustc_force_inline]`.
     Inline(InlineAttr, Span),
 
+    /// Represents `#[loop_match]`.
+    LoopMatch(Span),
+
     /// Represents `#[rustc_macro_transparency]`.
     MacroTransparency(Transparency),
 
+    /// Represents [`#[may_dangle]`](https://std-dev-guide.rust-lang.org/tricky/may-dangle.html).
+    MayDangle(Span),
+
+    /// Represents `#[must_use]`.
+    MustUse {
+        span: Span,
+        /// must_use can optionally have a reason: `#[must_use = "reason this must be used"]`
+        reason: Option<Symbol>,
+    },
+
+    /// Represents `#[naked]`
+    Naked(Span),
+
+    /// Represents `#[no_mangle]`
+    NoMangle(Span),
+
+    /// Represents `#[optimize(size|speed)]`
+    Optimize(OptimizeAttr, Span),
+
+    /// Represents `#[rustc_pub_transparent]` (used by the `repr_transparent_external_private_fields` lint).
+    PubTransparent(Span),
+
     /// Represents [`#[repr]`](https://doc.rust-lang.org/stable/reference/type-layout.html#representations).
     Repr(ThinVec<(ReprAttr, Span)>),
 
+    /// Represents `#[rustc_skip_during_method_dispatch]`.
+    SkipDuringMethodDispatch { array: bool, boxed_slice: bool, span: Span },
+
     /// Represents `#[stable]`, `#[unstable]` and `#[rustc_allowed_through_unstable_modules]`.
     Stability {
         stability: Stability,
         /// Span of the attribute.
         span: Span,
     },
 
+    /// Represents `#[track_caller]`
+    TrackCaller(Span),
     // tidy-alphabetical-end
 }
@@ -0,0 +1,42 @@
+use crate::AttributeKind;
+
+#[derive(PartialEq)]
+pub enum EncodeCrossCrate {
+    Yes,
+    No,
+}
+
+impl AttributeKind {
+    pub fn encode_cross_crate(&self) -> EncodeCrossCrate {
+        use AttributeKind::*;
+        use EncodeCrossCrate::*;
+
+        match self {
+            Align { .. } => No,
+            AllowConstFnUnstable(..) => No,
+            AllowInternalUnstable(..) => Yes,
+            AsPtr(..) => Yes,
+            BodyStability { .. } => No,
+            Confusables { .. } => Yes,
+            ConstStability { .. } => Yes,
+            ConstStabilityIndirect => No,
+            Deprecation { .. } => Yes,
+            DocComment { .. } => Yes,
+            Inline(..) => No,
+            MacroTransparency(..) => Yes,
+            Repr(..) => No,
+            Stability { .. } => Yes,
+            Cold(..) => No,
+            ConstContinue(..) => No,
+            LoopMatch(..) => No,
+            MayDangle(..) => No,
+            MustUse { .. } => Yes,
+            Naked(..) => No,
+            NoMangle(..) => No,
+            Optimize(..) => No,
+            PubTransparent(..) => Yes,
+            SkipDuringMethodDispatch { .. } => No,
+            TrackCaller(..) => Yes,
+        }
+    }
+}
@@ -9,6 +9,7 @@
 // tidy-alphabetical-end
 
 mod attributes;
+mod encode_cross_crate;
 mod stability;
 mod version;
 
@@ -17,6 +18,7 @@ pub mod lints;
 use std::num::NonZero;
 
 pub use attributes::*;
+pub use encode_cross_crate::EncodeCrossCrate;
 use rustc_abi::Align;
 use rustc_ast::token::CommentKind;
 use rustc_ast::{AttrStyle, IntTy, UintTy};
@@ -44,6 +44,9 @@ attr_parsing_incorrect_repr_format_packed_expect_integer =
 attr_parsing_incorrect_repr_format_packed_one_or_zero_arg =
     incorrect `repr(packed)` attribute format: `packed` takes exactly one parenthesized argument, or no parentheses at all
 
+attr_parsing_invalid_alignment_value =
+    invalid alignment value: {$error_part}
+
 attr_parsing_invalid_issue_string =
     `issue` must be a non-zero numeric string or "none"
     .must_not_be_zero = `issue` must not be "0", use "none" instead
@@ -86,6 +89,10 @@ attr_parsing_missing_since =
 attr_parsing_multiple_stability_levels =
     multiple stability levels
 
+attr_parsing_naked_functions_incompatible_attribute =
+    attribute incompatible with `#[unsafe(naked)]`
+    .label = the `{$attr}` attribute is incompatible with `#[unsafe(naked)]`
+    .naked_attribute = function marked with `#[unsafe(naked)]` here
 attr_parsing_non_ident_feature =
     'feature' is not an identifier
compiler/rustc_attr_parsing/src/attributes/codegen_attrs.rs (new file, 203 lines)
@@ -0,0 +1,203 @@
+use rustc_attr_data_structures::{AttributeKind, OptimizeAttr};
+use rustc_feature::{AttributeTemplate, template};
+use rustc_session::parse::feature_err;
+use rustc_span::{Span, Symbol, sym};
+
+use super::{AcceptMapping, AttributeOrder, AttributeParser, OnDuplicate, SingleAttributeParser};
+use crate::context::{AcceptContext, FinalizeContext, Stage};
+use crate::parser::ArgParser;
+use crate::session_diagnostics::NakedFunctionIncompatibleAttribute;
+
+pub(crate) struct OptimizeParser;
+
+impl<S: Stage> SingleAttributeParser<S> for OptimizeParser {
+    const PATH: &[Symbol] = &[sym::optimize];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepLast;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::WarnButFutureError;
+    const TEMPLATE: AttributeTemplate = template!(List: "size|speed|none");
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
+        let Some(list) = args.list() else {
+            cx.expected_list(cx.attr_span);
+            return None;
+        };
+
+        let Some(single) = list.single() else {
+            cx.expected_single_argument(list.span);
+            return None;
+        };
+
+        let res = match single.meta_item().and_then(|i| i.path().word().map(|i| i.name)) {
+            Some(sym::size) => OptimizeAttr::Size,
+            Some(sym::speed) => OptimizeAttr::Speed,
+            Some(sym::none) => OptimizeAttr::DoNotOptimize,
+            _ => {
+                cx.expected_specific_argument(single.span(), vec!["size", "speed", "none"]);
+                OptimizeAttr::Default
+            }
+        };
+
+        Some(AttributeKind::Optimize(res, cx.attr_span))
+    }
+}
+
+pub(crate) struct ColdParser;
+
+impl<S: Stage> SingleAttributeParser<S> for ColdParser {
+    const PATH: &[Symbol] = &[sym::cold];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepLast;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Warn;
+    const TEMPLATE: AttributeTemplate = template!(Word);
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
+        if let Err(span) = args.no_args() {
+            cx.expected_no_args(span);
+            return None;
+        }
+
+        Some(AttributeKind::Cold(cx.attr_span))
+    }
+}
+
+#[derive(Default)]
+pub(crate) struct NakedParser {
+    span: Option<Span>,
+}
+
+impl<S: Stage> AttributeParser<S> for NakedParser {
+    const ATTRIBUTES: AcceptMapping<Self, S> =
+        &[(&[sym::naked], template!(Word), |this, cx, args| {
+            if let Err(span) = args.no_args() {
+                cx.expected_no_args(span);
+                return;
+            }
+
+            if let Some(earlier) = this.span {
+                let span = cx.attr_span;
+                cx.warn_unused_duplicate(earlier, span);
+            } else {
+                this.span = Some(cx.attr_span);
+            }
+        })];
+
+    fn finalize(self, cx: &FinalizeContext<'_, '_, S>) -> Option<AttributeKind> {
+        // FIXME(jdonszelmann): upgrade this list to *parsed* attributes
+        // once all of these have parsed forms. That'd make the check much nicer...
+        //
+        // many attributes don't make sense in combination with #[naked].
+        // Notable attributes that are incompatible with `#[naked]` are:
+        //
+        // * `#[inline]`
+        // * `#[track_caller]`
+        // * `#[test]`, `#[ignore]`, `#[should_panic]`
+        //
+        // NOTE: when making changes to this list, check that `error_codes/E0736.md` remains
+        // accurate.
+        const ALLOW_LIST: &[rustc_span::Symbol] = &[
+            // conditional compilation
+            sym::cfg_trace,
+            sym::cfg_attr_trace,
+            // testing (allowed here so better errors can be generated in `rustc_builtin_macros::test`)
+            sym::test,
+            sym::ignore,
+            sym::should_panic,
+            sym::bench,
+            // diagnostics
+            sym::allow,
+            sym::warn,
+            sym::deny,
+            sym::forbid,
+            sym::deprecated,
+            sym::must_use,
+            // abi, linking and FFI
+            sym::cold,
+            sym::export_name,
+            sym::link_section,
+            sym::linkage,
+            sym::no_mangle,
+            sym::instruction_set,
+            sym::repr,
+            sym::rustc_std_internal_symbol,
+            sym::align,
+            // obviously compatible with self
+            sym::naked,
+            // documentation
+            sym::doc,
+        ];
+
+        let span = self.span?;
+
+        // only if we found a naked attribute do we do the somewhat expensive check
+        'outer: for other_attr in cx.all_attrs {
+            for allowed_attr in ALLOW_LIST {
+                if other_attr.segments().next().is_some_and(|i| cx.tools.contains(&i.name)) {
+                    // effectively skips the error message being emitted below
+                    // if it's a tool attribute
+                    continue 'outer;
+                }
+                if other_attr.word_is(*allowed_attr) {
+                    // effectively skips the error message being emitted below
+                    // if it's an allowed attribute
+                    continue 'outer;
+                }
+
+                if other_attr.word_is(sym::target_feature) {
+                    if !cx.features().naked_functions_target_feature() {
+                        feature_err(
+                            &cx.sess(),
+                            sym::naked_functions_target_feature,
+                            other_attr.span(),
+                            "`#[target_feature(/* ... */)]` is currently unstable on `#[naked]` functions",
+                        ).emit();
+                    }
+
+                    continue 'outer;
+                }
+            }
+
+            cx.emit_err(NakedFunctionIncompatibleAttribute {
+                span: other_attr.span(),
+                naked_span: span,
+                attr: other_attr.get_attribute_path().to_string(),
+            });
+        }
+
+        Some(AttributeKind::Naked(span))
+    }
+}
+
+pub(crate) struct TrackCallerParser;
+
+impl<S: Stage> SingleAttributeParser<S> for TrackCallerParser {
+    const PATH: &[Symbol] = &[sym::track_caller];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepLast;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Warn;
+    const TEMPLATE: AttributeTemplate = template!(Word);
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
+        if let Err(span) = args.no_args() {
+            cx.expected_no_args(span);
+            return None;
+        }
+
+        Some(AttributeKind::TrackCaller(cx.attr_span))
+    }
+}
+
+pub(crate) struct NoMangleParser;
+
+impl<S: Stage> SingleAttributeParser<S> for NoMangleParser {
+    const PATH: &[rustc_span::Symbol] = &[sym::no_mangle];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepLast;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Warn;
+    const TEMPLATE: AttributeTemplate = template!(Word);
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
+        if let Err(span) = args.no_args() {
+            cx.expected_no_args(span);
+            return None;
+        }
+
+        Some(AttributeKind::NoMangle(cx.attr_span))
+    }
+}
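The argument handling in `OptimizeParser::convert` above — `#[optimize(..)]` takes exactly one of `size`, `speed`, or `none` — can be sketched without the compiler's parser infrastructure. `parse_optimize` and its string-based argument list are illustrative stand-ins for `ArgParser` and the interned `sym::` symbols.

```rust
// Sketch of `#[optimize(..)]` argument validation: exactly one argument,
// drawn from a fixed set of words. Stand-in types, not rustc_attr_parsing's.
#[derive(Debug, PartialEq)]
enum OptimizeAttr {
    Size,
    Speed,
    DoNotOptimize,
}

fn parse_optimize(args: &[&str]) -> Result<OptimizeAttr, String> {
    // Mirror `list.single()`: reject zero or multiple arguments.
    let &[single] = args else {
        return Err("expected exactly one argument".to_string());
    };
    match single {
        "size" => Ok(OptimizeAttr::Size),
        "speed" => Ok(OptimizeAttr::Speed),
        "none" => Ok(OptimizeAttr::DoNotOptimize),
        other => Err(format!("expected one of size|speed|none, got `{other}`")),
    }
}

fn main() {
    assert_eq!(parse_optimize(&["size"]), Ok(OptimizeAttr::Size));
    assert!(parse_optimize(&["size", "speed"]).is_err());
    assert!(parse_optimize(&["fast"]).is_err());
}
```

The real parser differs in one deliberate way: on an unrecognized word it emits a diagnostic but still returns a default value, so later passes can keep going.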
@@ -45,10 +45,8 @@ impl<S: Stage> SingleAttributeParser<S> for InlineParser {
             ArgParser::NameValue(_) => {
                 let suggestions =
                     <Self as SingleAttributeParser<S>>::TEMPLATE.suggestions(false, "inline");
-                cx.emit_lint(
-                    AttributeLintKind::IllFormedAttributeInput { suggestions },
-                    cx.attr_span,
-                );
+                let span = cx.attr_span;
+                cx.emit_lint(AttributeLintKind::IllFormedAttributeInput { suggestions }, span);
                 return None;
             }
         }
@@ -14,8 +14,25 @@ impl<S: Stage> SingleAttributeParser<S> for AsPtrParser {
     const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Error;
     const TEMPLATE: AttributeTemplate = template!(Word);
 
-    fn convert(cx: &mut AcceptContext<'_, '_, S>, _args: &ArgParser<'_>) -> Option<AttributeKind> {
-        // FIXME: check that there's no args (this is currently checked elsewhere)
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
+        if let Err(span) = args.no_args() {
+            cx.expected_no_args(span);
+        }
         Some(AttributeKind::AsPtr(cx.attr_span))
     }
 }
+
+pub(crate) struct PubTransparentParser;
+impl<S: Stage> SingleAttributeParser<S> for PubTransparentParser {
+    const PATH: &[Symbol] = &[sym::rustc_pub_transparent];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepFirst;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Error;
+    const TEMPLATE: AttributeTemplate = template!(Word);
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
+        if let Err(span) = args.no_args() {
+            cx.expected_no_args(span);
+        }
+        Some(AttributeKind::PubTransparent(cx.attr_span))
+    }
+}
compiler/rustc_attr_parsing/src/attributes/loop_match.rs (new file, 31 lines)
@@ -0,0 +1,31 @@
+use rustc_attr_data_structures::AttributeKind;
+use rustc_feature::{AttributeTemplate, template};
+use rustc_span::{Symbol, sym};
+
+use crate::attributes::{AttributeOrder, OnDuplicate, SingleAttributeParser};
+use crate::context::{AcceptContext, Stage};
+use crate::parser::ArgParser;
+
+pub(crate) struct LoopMatchParser;
+impl<S: Stage> SingleAttributeParser<S> for LoopMatchParser {
+    const PATH: &[Symbol] = &[sym::loop_match];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepFirst;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Warn;
+    const TEMPLATE: AttributeTemplate = template!(Word);
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, _args: &ArgParser<'_>) -> Option<AttributeKind> {
+        Some(AttributeKind::LoopMatch(cx.attr_span))
+    }
+}
+
+pub(crate) struct ConstContinueParser;
+impl<S: Stage> SingleAttributeParser<S> for ConstContinueParser {
+    const PATH: &[Symbol] = &[sym::const_continue];
+    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepFirst;
+    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Warn;
+    const TEMPLATE: AttributeTemplate = template!(Word);
+
+    fn convert(cx: &mut AcceptContext<'_, '_, S>, _args: &ArgParser<'_>) -> Option<AttributeKind> {
+        Some(AttributeKind::ConstContinue(cx.attr_span))
+    }
+}
@ -17,7 +17,6 @@
|
|||
use std::marker::PhantomData;
|
||||
|
||||
use rustc_attr_data_structures::AttributeKind;
|
||||
use rustc_attr_data_structures::lints::AttributeLintKind;
|
||||
use rustc_feature::AttributeTemplate;
|
||||
use rustc_span::{Span, Symbol};
|
||||
use thin_vec::ThinVec;
|
||||
|
|
@ -28,12 +27,17 @@ use crate::session_diagnostics::UnusedMultiple;
|
|||
|
||||
pub(crate) mod allow_unstable;
|
||||
pub(crate) mod cfg;
|
||||
pub(crate) mod codegen_attrs;
|
||||
pub(crate) mod confusables;
|
||||
pub(crate) mod deprecation;
|
||||
pub(crate) mod inline;
|
||||
pub(crate) mod lint_helpers;
|
||||
pub(crate) mod loop_match;
|
||||
pub(crate) mod must_use;
|
||||
pub(crate) mod repr;
|
||||
pub(crate) mod semantics;
|
||||
pub(crate) mod stability;
|
||||
pub(crate) mod traits;
|
||||
pub(crate) mod transparency;
|
||||
pub(crate) mod util;
|
||||
|
||||
|
|
@ -86,8 +90,19 @@ pub(crate) trait AttributeParser<S: Stage>: Default + 'static {
|
|||
/// [`SingleAttributeParser`] can only convert attributes one-to-one, and cannot combine multiple
|
||||
/// attributes together like is necessary for `#[stable()]` and `#[unstable()]` for example.
|
||||
pub(crate) trait SingleAttributeParser<S: Stage>: 'static {
|
||||
/// The single path of the attribute this parser accepts.
|
||||
///
|
||||
/// If you need the parser to accept more than one path, use [`AttributeParser`] instead
|
||||
const PATH: &[Symbol];
|
||||
|
||||
/// Configures the precedence of attributes with the same `PATH` on a syntax node.
|
||||
const ATTRIBUTE_ORDER: AttributeOrder;
|
||||
|
||||
/// Configures what to do when when the same attribute is
|
||||
/// applied more than once on the same syntax node.
|
||||
///
|
||||
/// [`ATTRIBUTE_ORDER`](Self::ATTRIBUTE_ORDER) specified which one is assumed to be correct,
|
||||
/// and this specified whether to, for example, warn or error on the other one.
|
||||
const ON_DUPLICATE: OnDuplicate<S>;
|
||||
|
||||
/// The template this attribute parser should implement. Used for diagnostics.
|
||||
|
|
@ -97,6 +112,8 @@ pub(crate) trait SingleAttributeParser<S: Stage>: 'static {
|
|||
fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind>;
|
||||
}
|
||||
|
||||
/// Use in combination with [`SingleAttributeParser`].
|
||||
/// `Single<T: SingleAttributeParser>` implements [`AttributeParser`].
|
||||
pub(crate) struct Single<T: SingleAttributeParser<S>, S: Stage>(
|
||||
PhantomData<(S, T)>,
|
||||
Option<(AttributeKind, Span)>,
|
||||
|
|
@@ -173,14 +190,8 @@ impl<S: Stage> OnDuplicate<S> {
        unused: Span,
    ) {
        match self {
            OnDuplicate::Warn => cx.emit_lint(
                AttributeLintKind::UnusedDuplicate { this: unused, other: used, warning: false },
                unused,
            ),
            OnDuplicate::WarnButFutureError => cx.emit_lint(
                AttributeLintKind::UnusedDuplicate { this: unused, other: used, warning: true },
                unused,
            ),
            OnDuplicate::Warn => cx.warn_unused_duplicate(used, unused),
            OnDuplicate::WarnButFutureError => cx.warn_unused_duplicate_future_error(used, unused),
            OnDuplicate::Error => {
                cx.emit_err(UnusedMultiple {
                    this: used,
@@ -229,6 +240,10 @@ pub(crate) trait CombineAttributeParser<S: Stage>: 'static {
    const PATH: &[rustc_span::Symbol];

    type Item;
    /// A function that converts individual items (of type [`Item`](Self::Item)) into the final attribute.
    ///
    /// For example, individual representations from `#[repr(...)]` attributes into an `AttributeKind::Repr(x)`,
    /// where `x` is a vec of these individual reprs.
    const CONVERT: ConvertFn<Self::Item>;

    /// The template this attribute parser should implement. Used for diagnostics.
@@ -241,6 +256,8 @@ pub(crate) trait CombineAttributeParser<S: Stage>: 'static {
    ) -> impl IntoIterator<Item = Self::Item> + 'c;
}

/// Use in combination with [`CombineAttributeParser`].
/// `Combine<T: CombineAttributeParser>` implements [`AttributeParser`].
pub(crate) struct Combine<T: CombineAttributeParser<S>, S: Stage>(
    PhantomData<(S, T)>,
    ThinVec<<T as CombineAttributeParser<S>>::Item>,
40
compiler/rustc_attr_parsing/src/attributes/must_use.rs
Normal file
@@ -0,0 +1,40 @@
use rustc_attr_data_structures::AttributeKind;
use rustc_errors::DiagArgValue;
use rustc_feature::{AttributeTemplate, template};
use rustc_span::{Symbol, sym};

use crate::attributes::{AttributeOrder, OnDuplicate, SingleAttributeParser};
use crate::context::{AcceptContext, Stage};
use crate::parser::ArgParser;
use crate::session_diagnostics;

pub(crate) struct MustUseParser;

impl<S: Stage> SingleAttributeParser<S> for MustUseParser {
    const PATH: &[Symbol] = &[sym::must_use];
    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepLast;
    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::WarnButFutureError;
    const TEMPLATE: AttributeTemplate = template!(Word, NameValueStr: "reason");

    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
        Some(AttributeKind::MustUse {
            span: cx.attr_span,
            reason: match args {
                ArgParser::NoArgs => None,
                ArgParser::NameValue(name_value) => name_value.value_as_str(),
                ArgParser::List(_) => {
                    let suggestions =
                        <Self as SingleAttributeParser<S>>::TEMPLATE.suggestions(false, "must_use");
                    cx.emit_err(session_diagnostics::MustUseIllFormedAttributeInput {
                        num_suggestions: suggestions.len(),
                        suggestions: DiagArgValue::StrListSepByAnd(
                            suggestions.into_iter().map(|s| format!("`{s}`").into()).collect(),
                        ),
                        span: cx.attr_span,
                    });
                    return None;
                }
            },
        })
    }
}
@@ -4,7 +4,7 @@ use rustc_attr_data_structures::{AttributeKind, IntType, ReprAttr};
use rustc_feature::{AttributeTemplate, template};
use rustc_span::{DUMMY_SP, Span, Symbol, sym};

use super::{CombineAttributeParser, ConvertFn};
use super::{AcceptMapping, AttributeParser, CombineAttributeParser, ConvertFn, FinalizeContext};
use crate::context::{AcceptContext, Stage};
use crate::parser::{ArgParser, MetaItemListParser, MetaItemParser};
use crate::session_diagnostics;
@@ -25,7 +25,8 @@ impl<S: Stage> CombineAttributeParser<S> for ReprParser {
    const PATH: &[Symbol] = &[sym::repr];
    const CONVERT: ConvertFn<Self::Item> = AttributeKind::Repr;
    // FIXME(jdonszelmann): never used
    const TEMPLATE: AttributeTemplate = template!(List: "C");
    const TEMPLATE: AttributeTemplate =
        template!(List: "C | Rust | align(...) | packed(...) | <integer type> | transparent");

    fn extend<'c>(
        cx: &'c mut AcceptContext<'_, '_, S>,
@@ -203,7 +204,7 @@ fn parse_repr_align<S: Stage>(
            });
        }
        Align => {
            cx.dcx().emit_err(session_diagnostics::IncorrectReprFormatAlignOneArg {
            cx.emit_err(session_diagnostics::IncorrectReprFormatAlignOneArg {
                span: param_span,
            });
        }
@@ -266,3 +267,57 @@ fn parse_alignment(node: &LitKind) -> Result<Align, &'static str> {
        Err("not an unsuffixed integer")
    }
}

/// Parse #[align(N)].
#[derive(Default)]
pub(crate) struct AlignParser(Option<(Align, Span)>);

impl AlignParser {
    const PATH: &'static [Symbol] = &[sym::align];
    const TEMPLATE: AttributeTemplate = template!(List: "<alignment in bytes>");

    fn parse<'c, S: Stage>(
        &mut self,
        cx: &'c mut AcceptContext<'_, '_, S>,
        args: &'c ArgParser<'_>,
    ) {
        match args {
            ArgParser::NoArgs | ArgParser::NameValue(_) => {
                cx.expected_list(cx.attr_span);
            }
            ArgParser::List(list) => {
                let Some(align) = list.single() else {
                    cx.expected_single_argument(list.span);
                    return;
                };

                let Some(lit) = align.lit() else {
                    cx.emit_err(session_diagnostics::IncorrectReprFormatExpectInteger {
                        span: align.span(),
                    });

                    return;
                };

                match parse_alignment(&lit.kind) {
                    Ok(literal) => self.0 = Ord::max(self.0, Some((literal, cx.attr_span))),
                    Err(message) => {
                        cx.emit_err(session_diagnostics::InvalidAlignmentValue {
                            span: lit.span,
                            error_part: message,
                        });
                    }
                }
            }
        }
    }
}

impl<S: Stage> AttributeParser<S> for AlignParser {
    const ATTRIBUTES: AcceptMapping<Self, S> = &[(Self::PATH, Self::TEMPLATE, Self::parse)];

    fn finalize(self, _cx: &FinalizeContext<'_, '_, S>) -> Option<AttributeKind> {
        let (align, span) = self.0?;
        Some(AttributeKind::Align { align, span })
    }
}
22
compiler/rustc_attr_parsing/src/attributes/semantics.rs
Normal file
@@ -0,0 +1,22 @@
use rustc_attr_data_structures::AttributeKind;
use rustc_feature::{AttributeTemplate, template};
use rustc_span::{Symbol, sym};

use crate::attributes::{AttributeOrder, OnDuplicate, SingleAttributeParser};
use crate::context::{AcceptContext, Stage};
use crate::parser::ArgParser;

pub(crate) struct MayDangleParser;
impl<S: Stage> SingleAttributeParser<S> for MayDangleParser {
    const PATH: &[Symbol] = &[sym::may_dangle];
    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepFirst;
    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Warn;
    const TEMPLATE: AttributeTemplate = template!(Word);

    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
        if let Err(span) = args.no_args() {
            cx.expected_no_args(span);
        }
        Some(AttributeKind::MayDangle(cx.attr_span))
    }
}
@@ -139,7 +139,10 @@ impl<S: Stage> SingleAttributeParser<S> for ConstStabilityIndirectParser {
    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Ignore;
    const TEMPLATE: AttributeTemplate = template!(Word);

    fn convert(_cx: &mut AcceptContext<'_, '_, S>, _args: &ArgParser<'_>) -> Option<AttributeKind> {
    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
        if let Err(span) = args.no_args() {
            cx.expected_no_args(span);
        }
        Some(AttributeKind::ConstStabilityIndirect)
    }
}
@@ -361,8 +364,8 @@ pub(crate) fn parse_unstability<S: Stage>(
            };
        }
        Some(sym::soft) => {
            if !param.args().no_args() {
                cx.emit_err(session_diagnostics::SoftNoArgs { span: param.span() });
            if let Err(span) = args.no_args() {
                cx.emit_err(session_diagnostics::SoftNoArgs { span });
            }
            is_soft = true;
        }
54
compiler/rustc_attr_parsing/src/attributes/traits.rs
Normal file
@@ -0,0 +1,54 @@
use core::mem;

use rustc_attr_data_structures::AttributeKind;
use rustc_feature::{AttributeTemplate, template};
use rustc_span::{Symbol, sym};

use crate::attributes::{AttributeOrder, OnDuplicate, SingleAttributeParser};
use crate::context::{AcceptContext, Stage};
use crate::parser::ArgParser;

pub(crate) struct SkipDuringMethodDispatchParser;

impl<S: Stage> SingleAttributeParser<S> for SkipDuringMethodDispatchParser {
    const PATH: &[Symbol] = &[sym::rustc_skip_during_method_dispatch];
    const ATTRIBUTE_ORDER: AttributeOrder = AttributeOrder::KeepFirst;
    const ON_DUPLICATE: OnDuplicate<S> = OnDuplicate::Error;

    const TEMPLATE: AttributeTemplate = template!(List: "array, boxed_slice");

    fn convert(cx: &mut AcceptContext<'_, '_, S>, args: &ArgParser<'_>) -> Option<AttributeKind> {
        let mut array = false;
        let mut boxed_slice = false;
        let Some(args) = args.list() else {
            cx.expected_list(cx.attr_span);
            return None;
        };
        if args.is_empty() {
            cx.expected_at_least_one_argument(args.span);
            return None;
        }
        for arg in args.mixed() {
            let Some(arg) = arg.meta_item() else {
                cx.unexpected_literal(arg.span());
                continue;
            };
            if let Err(span) = arg.args().no_args() {
                cx.expected_no_args(span);
            }
            let path = arg.path();
            let (key, skip): (Symbol, &mut bool) = match path.word_sym() {
                Some(key @ sym::array) => (key, &mut array),
                Some(key @ sym::boxed_slice) => (key, &mut boxed_slice),
                _ => {
                    cx.expected_specific_argument(path.span(), vec!["array", "boxed_slice"]);
                    continue;
                }
            };
            if mem::replace(skip, true) {
                cx.duplicate_key(arg.span(), key);
            }
        }
        Some(AttributeKind::SkipDuringMethodDispatch { array, boxed_slice, span: cx.attr_span })
    }
}
@@ -15,17 +15,24 @@ use rustc_session::Session;
use rustc_span::{DUMMY_SP, ErrorGuaranteed, Span, Symbol, sym};

use crate::attributes::allow_unstable::{AllowConstFnUnstableParser, AllowInternalUnstableParser};
use crate::attributes::codegen_attrs::{
    ColdParser, NakedParser, NoMangleParser, OptimizeParser, TrackCallerParser,
};
use crate::attributes::confusables::ConfusablesParser;
use crate::attributes::deprecation::DeprecationParser;
use crate::attributes::inline::{InlineParser, RustcForceInlineParser};
use crate::attributes::lint_helpers::AsPtrParser;
use crate::attributes::repr::ReprParser;
use crate::attributes::lint_helpers::{AsPtrParser, PubTransparentParser};
use crate::attributes::loop_match::{ConstContinueParser, LoopMatchParser};
use crate::attributes::must_use::MustUseParser;
use crate::attributes::repr::{AlignParser, ReprParser};
use crate::attributes::semantics::MayDangleParser;
use crate::attributes::stability::{
    BodyStabilityParser, ConstStabilityIndirectParser, ConstStabilityParser, StabilityParser,
};
use crate::attributes::traits::SkipDuringMethodDispatchParser;
use crate::attributes::transparency::TransparencyParser;
use crate::attributes::{AttributeParser as _, Combine, Single};
use crate::parser::{ArgParser, MetaItemParser};
use crate::parser::{ArgParser, MetaItemParser, PathParser};
use crate::session_diagnostics::{AttributeParseError, AttributeParseErrorReason, UnknownMetaItem};

macro_rules! group_type {
@@ -90,9 +97,11 @@ macro_rules! attribute_parsers {
attribute_parsers!(
    pub(crate) static ATTRIBUTE_PARSERS = [
        // tidy-alphabetical-start
        AlignParser,
        BodyStabilityParser,
        ConfusablesParser,
        ConstStabilityParser,
        NakedParser,
        StabilityParser,
        // tidy-alphabetical-end
@@ -104,10 +113,20 @@ attribute_parsers!(

        // tidy-alphabetical-start
        Single<AsPtrParser>,
        Single<ColdParser>,
        Single<ConstContinueParser>,
        Single<ConstStabilityIndirectParser>,
        Single<DeprecationParser>,
        Single<InlineParser>,
        Single<LoopMatchParser>,
        Single<MayDangleParser>,
        Single<MustUseParser>,
        Single<NoMangleParser>,
        Single<OptimizeParser>,
        Single<PubTransparentParser>,
        Single<RustcForceInlineParser>,
        Single<SkipDuringMethodDispatchParser>,
        Single<TrackCallerParser>,
        Single<TransparencyParser>,
        // tidy-alphabetical-end
    ];
@@ -164,7 +183,7 @@ pub struct Late;
///
/// Gives [`AttributeParser`]s enough information to create errors, for example.
pub(crate) struct AcceptContext<'f, 'sess, S: Stage> {
    pub(crate) finalize_cx: FinalizeContext<'f, 'sess, S>,
    pub(crate) shared: SharedContext<'f, 'sess, S>,
    /// The span of the attribute currently being parsed
    pub(crate) attr_span: Span,
@@ -177,7 +196,7 @@ pub(crate) struct AcceptContext<'f, 'sess, S: Stage> {
    pub(crate) attr_path: AttrPath,
}

impl<'f, 'sess: 'f, S: Stage> AcceptContext<'f, 'sess, S> {
impl<'f, 'sess: 'f, S: Stage> SharedContext<'f, 'sess, S> {
    pub(crate) fn emit_err(&self, diag: impl for<'x> Diagnostic<'x>) -> ErrorGuaranteed {
        S::emit_err(&self.sess, diag)
    }
@@ -190,6 +209,34 @@ impl<'f, 'sess: 'f, S: Stage> AcceptContext<'f, 'sess, S> {
        (self.emit_lint)(AttributeLint { id, span, kind: lint });
    }

    pub(crate) fn warn_unused_duplicate(&mut self, used_span: Span, unused_span: Span) {
        self.emit_lint(
            AttributeLintKind::UnusedDuplicate {
                this: unused_span,
                other: used_span,
                warning: false,
            },
            unused_span,
        )
    }

    pub(crate) fn warn_unused_duplicate_future_error(
        &mut self,
        used_span: Span,
        unused_span: Span,
    ) {
        self.emit_lint(
            AttributeLintKind::UnusedDuplicate {
                this: unused_span,
                other: used_span,
                warning: true,
            },
            unused_span,
        )
    }
}

impl<'f, 'sess: 'f, S: Stage> AcceptContext<'f, 'sess, S> {
    pub(crate) fn unknown_key(
        &self,
        span: Span,
@@ -231,6 +278,16 @@ impl<'f, 'sess: 'f, S: Stage> AcceptContext<'f, 'sess, S> {
        })
    }

    pub(crate) fn expected_no_args(&self, args_span: Span) -> ErrorGuaranteed {
        self.emit_err(AttributeParseError {
            span: args_span,
            attr_span: self.attr_span,
            template: self.template.clone(),
            attribute: self.attr_path.clone(),
            reason: AttributeParseErrorReason::ExpectedNoArgs,
        })
    }

    /// Emit an error that a `name = value` pair was expected at this span. The symbol can be given
    /// for a nicer error message talking about the specific name that was found lacking a value.
    pub(crate) fn expected_name_value(&self, span: Span, name: Option<Symbol>) -> ErrorGuaranteed {
@@ -276,6 +333,16 @@ impl<'f, 'sess: 'f, S: Stage> AcceptContext<'f, 'sess, S> {
        })
    }

    pub(crate) fn expected_at_least_one_argument(&self, span: Span) -> ErrorGuaranteed {
        self.emit_err(AttributeParseError {
            span,
            attr_span: self.attr_span,
            template: self.template.clone(),
            attribute: self.attr_path.clone(),
            reason: AttributeParseErrorReason::ExpectedAtLeastOneArgument,
        })
    }

    pub(crate) fn expected_specific_argument(
        &self,
        span: Span,
@@ -312,16 +379,16 @@ impl<'f, 'sess: 'f, S: Stage> AcceptContext<'f, 'sess, S> {
}

impl<'f, 'sess, S: Stage> Deref for AcceptContext<'f, 'sess, S> {
    type Target = FinalizeContext<'f, 'sess, S>;
    type Target = SharedContext<'f, 'sess, S>;

    fn deref(&self) -> &Self::Target {
        &self.finalize_cx
        &self.shared
    }
}

impl<'f, 'sess, S: Stage> DerefMut for AcceptContext<'f, 'sess, S> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        &mut self.finalize_cx
        &mut self.shared
    }
}

@@ -329,7 +396,7 @@ impl<'f, 'sess, S: Stage> DerefMut for AcceptContext<'f, 'sess, S> {
///
/// Gives [`AttributeParser`](crate::attributes::AttributeParser)s enough information to create
/// errors, for example.
pub(crate) struct FinalizeContext<'p, 'sess, S: Stage> {
pub(crate) struct SharedContext<'p, 'sess, S: Stage> {
    /// The parse context, gives access to the session and the
    /// diagnostics context.
    pub(crate) cx: &'p mut AttributeParser<'sess, S>,
@@ -338,10 +405,40 @@ pub(crate) struct SharedContext<'p, 'sess, S: Stage> {
    /// The id ([`NodeId`] if `S` is `Early`, [`HirId`] if `S` is `Late`) of the syntactical component this attribute was applied to
    pub(crate) target_id: S::Id,

    pub(crate) emit_lint: &'p mut dyn FnMut(AttributeLint<S::Id>),
    emit_lint: &'p mut dyn FnMut(AttributeLint<S::Id>),
}

/// Context given to every attribute parser during finalization.
///
/// Gives [`AttributeParser`](crate::attributes::AttributeParser)s enough information to create
/// errors, for example.
pub(crate) struct FinalizeContext<'p, 'sess, S: Stage> {
    pub(crate) shared: SharedContext<'p, 'sess, S>,

    /// A list of all attributes on this syntax node.
    ///
    /// Useful for compatibility checks with other attributes in [`finalize`](crate::attributes::AttributeParser::finalize)
    ///
    /// Usually, you should use normal attribute parsing logic instead,
    /// especially when making a *denylist* of other attributes.
    pub(crate) all_attrs: &'p [PathParser<'p>],
}

impl<'p, 'sess: 'p, S: Stage> Deref for FinalizeContext<'p, 'sess, S> {
    type Target = SharedContext<'p, 'sess, S>;

    fn deref(&self) -> &Self::Target {
        &self.shared
    }
}

impl<'p, 'sess: 'p, S: Stage> DerefMut for FinalizeContext<'p, 'sess, S> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        &mut self.shared
    }
}

impl<'p, 'sess: 'p, S: Stage> Deref for SharedContext<'p, 'sess, S> {
    type Target = AttributeParser<'sess, S>;

    fn deref(&self) -> &Self::Target {
@@ -349,7 +446,7 @@ impl<'p, 'sess: 'p, S: Stage> Deref for FinalizeContext<'p, 'sess, S> {
    }
}

impl<'p, 'sess: 'p, S: Stage> DerefMut for FinalizeContext<'p, 'sess, S> {
impl<'p, 'sess: 'p, S: Stage> DerefMut for SharedContext<'p, 'sess, S> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        self.cx
    }
@@ -364,8 +461,7 @@ pub enum OmitDoc {
/// Context created once, for example as part of the ast lowering
/// context, through which all attributes can be lowered.
pub struct AttributeParser<'sess, S: Stage = Late> {
    #[expect(dead_code)] // FIXME(jdonszelmann): needed later to verify we parsed all attributes
    tools: Vec<Symbol>,
    pub(crate) tools: Vec<Symbol>,
    features: Option<&'sess Features>,
    sess: &'sess Session,
    stage: PhantomData<S>,
@@ -419,19 +515,13 @@ impl<'sess> AttributeParser<'sess, Early> {

        parsed.pop()
    }

    pub fn new_early(sess: &'sess Session, features: &'sess Features, tools: Vec<Symbol>) -> Self {
        Self { features: Some(features), tools, parse_only: None, sess, stage: PhantomData }
    }
}

impl<'sess> AttributeParser<'sess, Late> {
    pub fn new(sess: &'sess Session, features: &'sess Features, tools: Vec<Symbol>) -> Self {
        Self { features: Some(features), tools, parse_only: None, sess, stage: PhantomData }
    }
}

impl<'sess, S: Stage> AttributeParser<'sess, S> {
    pub fn new(sess: &'sess Session, features: &'sess Features, tools: Vec<Symbol>) -> Self {
        Self { features: Some(features), tools, parse_only: None, sess, stage: PhantomData }
    }

    pub(crate) fn sess(&self) -> &'sess Session {
        &self.sess
    }
@@ -459,6 +549,7 @@ impl<'sess, S: Stage> AttributeParser<'sess, S> {
        mut emit_lint: impl FnMut(AttributeLint<S::Id>),
    ) -> Vec<Attribute> {
        let mut attributes = Vec::new();
        let mut attr_paths = Vec::new();

        for attr in attrs {
            // If we're only looking for a single attribute, skip all the ones we don't care about.
@@ -502,6 +593,8 @@ impl<'sess, S: Stage> AttributeParser<'sess, S> {
                //     }))
                // }
                ast::AttrKind::Normal(n) => {
                    attr_paths.push(PathParser::Ast(&n.item.path));

                    let parser = MetaItemParser::from_attr(n, self.dcx());
                    let path = parser.path();
                    let args = parser.args();
@@ -510,7 +603,7 @@ impl<'sess, S: Stage> AttributeParser<'sess, S> {
                    if let Some(accepts) = S::parsers().0.get(parts.as_slice()) {
                        for (template, accept) in accepts {
                            let mut cx: AcceptContext<'_, 'sess, S> = AcceptContext {
                                finalize_cx: FinalizeContext {
                                shared: SharedContext {
                                    cx: self,
                                    target_span,
                                    target_id,
@@ -554,10 +647,13 @@ impl<'sess, S: Stage> AttributeParser<'sess, S> {
        let mut parsed_attributes = Vec::new();
        for f in &S::parsers().1 {
            if let Some(attr) = f(&mut FinalizeContext {
                cx: self,
                target_span,
                target_id,
                emit_lint: &mut emit_lint,
                shared: SharedContext {
                    cx: self,
                    target_span,
                    target_id,
                    emit_lint: &mut emit_lint,
                },
                all_attrs: &attr_paths,
            }) {
                parsed_attributes.push(Attribute::Parsed(attr));
            }
@@ -87,6 +87,14 @@ impl<'a> PathParser<'a> {
    pub fn word_is(&self, sym: Symbol) -> bool {
        self.word().map(|i| i.name == sym).unwrap_or(false)
    }

    /// Checks whether the first segments match the given symbols.
    ///
    /// Unlike [`segments_is`](Self::segments_is),
    /// `self` may contain more segments than the number matched against.
    pub fn starts_with(&self, segments: &[Symbol]) -> bool {
        segments.len() < self.len() && self.segments().zip(segments).all(|(a, b)| a.name == *b)
    }
}

impl Display for PathParser<'_> {
@@ -161,9 +169,15 @@ impl<'a> ArgParser<'a> {
        }
    }

    /// Asserts that there are no arguments
    pub fn no_args(&self) -> bool {
        matches!(self, Self::NoArgs)
    /// Assert that there were no args.
    /// If there were, get a span to the arguments
    /// (to pass to [`AcceptContext::expected_no_args`](crate::context::AcceptContext::expected_no_args)).
    pub fn no_args(&self) -> Result<(), Span> {
        match self {
            Self::NoArgs => Ok(()),
            Self::List(args) => Err(args.span),
            Self::NameValue(args) => Err(args.eq_span.to(args.value_span)),
        }
    }
}
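The switch from `no_args(&self) -> bool` to `Result<(), Span>` can be sketched in isolation. In this sketch, `Span` is reduced to a `(u32, u32)` byte range and the variants carry only what the example needs; both are illustrative assumptions, not the real rustc types.

```rust
// Sketch of the new `no_args` contract: instead of a bool, return the span of
// the unexpected arguments so the caller can point an error directly at them.
// `Span` is a simplified stand-in for rustc's span type.
type Span = (u32, u32);

enum ArgParser {
    NoArgs,
    List { span: Span },
    NameValue { eq_span: Span, value_span: Span },
}

impl ArgParser {
    fn no_args(&self) -> Result<(), Span> {
        match self {
            ArgParser::NoArgs => Ok(()),
            ArgParser::List { span } => Err(*span),
            // Join the `=` span and the value span, like `eq_span.to(value_span)`.
            ArgParser::NameValue { eq_span, value_span } => Err((eq_span.0, value_span.1)),
        }
    }
}

fn main() {
    assert_eq!(ArgParser::NoArgs.no_args(), Ok(()));
    assert_eq!(ArgParser::List { span: (3, 9) }.no_args(), Err((3, 9)));
    assert_eq!(
        ArgParser::NameValue { eq_span: (4, 5), value_span: (6, 10) }.no_args(),
        Err((4, 10))
    );
}
```

Returning the offending span instead of a bare `false` is what lets callers such as `expected_no_args` produce a label at the arguments themselves rather than at the whole attribute.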
@@ -436,6 +436,15 @@ pub(crate) struct IllFormedAttributeInput {
    pub suggestions: DiagArgValue,
}

#[derive(Diagnostic)]
#[diag(attr_parsing_ill_formed_attribute_input)]
pub(crate) struct MustUseIllFormedAttributeInput {
    #[primary_span]
    pub span: Span,
    pub num_suggestions: usize,
    pub suggestions: DiagArgValue,
}

#[derive(Diagnostic)]
#[diag(attr_parsing_stability_outside_std, code = E0734)]
pub(crate) struct StabilityOutsideStd {
@@ -450,6 +459,14 @@ pub(crate) struct EmptyConfusables {
    pub span: Span,
}

#[derive(Diagnostic)]
#[diag(attr_parsing_invalid_alignment_value, code = E0589)]
pub(crate) struct InvalidAlignmentValue {
    #[primary_span]
    pub span: Span,
    pub error_part: &'static str,
}

#[derive(Diagnostic)]
#[diag(attr_parsing_repr_ident, code = E0565)]
pub(crate) struct ReprIdent {
@@ -465,8 +482,21 @@ pub(crate) struct UnrecognizedReprHint {
    pub span: Span,
}

#[derive(Diagnostic)]
#[diag(attr_parsing_naked_functions_incompatible_attribute, code = E0736)]
pub(crate) struct NakedFunctionIncompatibleAttribute {
    #[primary_span]
    #[label]
    pub span: Span,
    #[label(attr_parsing_naked_attribute)]
    pub naked_span: Span,
    pub attr: String,
}

pub(crate) enum AttributeParseErrorReason {
    ExpectedNoArgs,
    ExpectedStringLiteral { byte_string: Option<Span> },
    ExpectedAtLeastOneArgument,
    ExpectedSingleArgument,
    ExpectedList,
    UnexpectedLiteral,
@@ -510,6 +540,9 @@ impl<'a, G: EmissionGuarantee> Diagnostic<'a, G> for AttributeParseError {
                diag.span_label(self.span, "expected a single argument here");
                diag.code(E0805);
            }
            AttributeParseErrorReason::ExpectedAtLeastOneArgument => {
                diag.span_label(self.span, "expected at least 1 argument here");
            }
            AttributeParseErrorReason::ExpectedList => {
                diag.span_label(self.span, "expected this to be a list");
            }
@@ -521,6 +554,10 @@ impl<'a, G: EmissionGuarantee> Diagnostic<'a, G> for AttributeParseError {
                diag.span_label(self.span, format!("didn't expect a literal here"));
                diag.code(E0565);
            }
            AttributeParseErrorReason::ExpectedNoArgs => {
                diag.span_label(self.span, format!("didn't expect any arguments here"));
                diag.code(E0565);
            }
            AttributeParseErrorReason::ExpectedNameValue(None) => {
                diag.span_label(
                    self.span,
@@ -3201,14 +3201,6 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
        let expr_ty: Option<Ty<'_>> =
            visitor.prop_expr.map(|expr| typeck_results.expr_ty(expr).peel_refs());

        let is_format_arguments_item = if let Some(expr_ty) = expr_ty
            && let ty::Adt(adt, _) = expr_ty.kind()
        {
            self.infcx.tcx.is_lang_item(adt.did(), LangItem::FormatArguments)
        } else {
            false
        };

        if visitor.found == 0
            && stmt.span.contains(proper_span)
            && let Some(p) = sm.span_to_margin(stmt.span)
@@ -3236,25 +3228,17 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
                ""
            };

            if !is_format_arguments_item {
                let addition = format!(
                    "let {}binding = {};\n{}",
                    mutability,
                    s,
                    " ".repeat(p)
                );
                err.multipart_suggestion_verbose(
                    msg,
                    vec![
                        (stmt.span.shrink_to_lo(), addition),
                        (proper_span, "binding".to_string()),
                    ],
                    Applicability::MaybeIncorrect,
                );
            } else {
                err.note("the result of `format_args!` can only be assigned directly if no placeholders in its arguments are used");
                err.note("to learn more, visit <https://doc.rust-lang.org/std/macro.format_args.html>");
            }
            let addition =
                format!("let {}binding = {};\n{}", mutability, s, " ".repeat(p));
            err.multipart_suggestion_verbose(
                msg,
                vec![
                    (stmt.span.shrink_to_lo(), addition),
                    (proper_span, "binding".to_string()),
                ],
                Applicability::MaybeIncorrect,
            );

            suggested = true;
            break;
        }
@@ -71,7 +71,6 @@ impl<'tcx> BorrowExplanation<'tcx> {
    ) {
        let tcx = cx.infcx.tcx;
        let body = cx.body;
        let local_names = &cx.local_names;

        if let Some(span) = borrow_span {
            let def_id = body.source.def_id();
@@ -220,7 +219,7 @@ impl<'tcx> BorrowExplanation<'tcx> {
            _ => ("destructor", format!("type `{}`", local_decl.ty)),
        };

        match local_names[dropped_local] {
        match cx.local_name(dropped_local) {
            Some(local_name) if !local_decl.from_compiler_desugaring() => {
                let message = format!(
                    "{borrow_desc}borrow might be used here, when `{local_name}` is dropped \
@@ -670,10 +669,10 @@ impl<'tcx> MirBorrowckCtxt<'_, '_, 'tcx> {

            Some(Cause::DropVar(local, location)) => {
                let mut should_note_order = false;
                if self.local_names[local].is_some()
                if self.local_name(local).is_some()
                    && let Some((WriteKind::StorageDeadOrDrop, place)) = kind_place
                    && let Some(borrowed_local) = place.as_local()
                    && self.local_names[borrowed_local].is_some()
                    && self.local_name(borrowed_local).is_some()
                    && local != borrowed_local
                {
                    should_note_order = true;
@@ -748,7 +747,7 @@ impl<'tcx> MirBorrowckCtxt<'_, '_, 'tcx> {
            Operand::Copy(place) | Operand::Move(place) => {
                if let Some(l) = place.as_local() {
                    let local_decl = &self.body.local_decls[l];
                    if self.local_names[l].is_none() {
                    if self.local_name(l).is_none() {
                        local_decl.source_info.span
                    } else {
                        span
@@ -793,7 +792,7 @@ impl<'tcx> MirBorrowckCtxt<'_, '_, 'tcx> {
            Operand::Copy(place) | Operand::Move(place) => {
                if let Some(l) = place.as_local() {
                    let local_decl = &self.body.local_decls[l];
                    if self.local_names[l].is_none() {
                    if self.local_name(l).is_none() {
                        local_decl.source_info.span
                    } else {
                        span
@@ -7,17 +7,17 @@ use rustc_data_structures::fx::FxIndexMap;
use rustc_errors::{Applicability, Diag, EmissionGuarantee, MultiSpan, listify};
use rustc_hir::def::{CtorKind, Namespace};
use rustc_hir::{self as hir, CoroutineKind, LangItem};
use rustc_index::IndexSlice;
use rustc_index::{IndexSlice, IndexVec};
use rustc_infer::infer::{BoundRegionConversionTime, NllRegionVariableOrigin};
use rustc_infer::traits::SelectionError;
use rustc_middle::bug;
use rustc_middle::mir::{
    AggregateKind, CallSource, ConstOperand, ConstraintCategory, FakeReadCause, Local, LocalInfo,
    LocalKind, Location, Operand, Place, PlaceRef, PlaceTy, ProjectionElem, Rvalue, Statement,
    StatementKind, Terminator, TerminatorKind, find_self_call,
    StatementKind, Terminator, TerminatorKind, VarDebugInfoContents, find_self_call,
};
use rustc_middle::ty::print::Print;
use rustc_middle::ty::{self, Ty, TyCtxt};
use rustc_middle::{bug, span_bug};
use rustc_mir_dataflow::move_paths::{InitLocation, LookupResult, MoveOutIndex};
use rustc_span::def_id::LocalDefId;
use rustc_span::source_map::Spanned;
@@ -190,6 +190,36 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
     ) -> Option<&(PlaceRef<'tcx>, Diag<'infcx>)> {
         self.diags_buffer.buffered_move_errors.get(move_out_indices)
     }
+
+    /// Uses `body.var_debug_info` to find the symbol
+    fn local_name(&self, index: Local) -> Option<Symbol> {
+        *self.local_names().get(index)?
+    }
+
+    fn local_names(&self) -> &IndexSlice<Local, Option<Symbol>> {
+        self.local_names.get_or_init(|| {
+            let mut local_names = IndexVec::from_elem(None, &self.body.local_decls);
+            for var_debug_info in &self.body.var_debug_info {
+                if let VarDebugInfoContents::Place(place) = var_debug_info.value {
+                    if let Some(local) = place.as_local() {
+                        if let Some(prev_name) = local_names[local]
+                            && var_debug_info.name != prev_name
+                        {
+                            span_bug!(
+                                var_debug_info.source_info.span,
+                                "local {:?} has many names (`{}` vs `{}`)",
+                                local,
+                                prev_name,
+                                var_debug_info.name
+                            );
+                        }
+                        local_names[local] = Some(var_debug_info.name);
+                    }
+                }
+            }
+            local_names
+        })
+    }
 }

 impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
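The new `local_names` accessor replaces an eagerly built table with a `OnceCell` that is populated on first use. As a standalone sketch of that caching pattern (the `Ctxt` type, plain `Vec`, and `String` names here are hypothetical stand-ins for the borrowck context, `IndexVec`, and `Symbol`):

```rust
use std::cell::OnceCell;

// Hypothetical stand-in for the borrowck context: the name table is
// derived at most once, on first access, instead of in the constructor.
struct Ctxt {
    raw: Vec<&'static str>,
    names: OnceCell<Vec<Option<String>>>,
}

impl Ctxt {
    fn names(&self) -> &[Option<String>] {
        self.names.get_or_init(|| {
            // Expensive derivation, analogous to scanning `var_debug_info`.
            self.raw
                .iter()
                .map(|s| if s.is_empty() { None } else { Some(s.to_string()) })
                .collect()
        })
    }

    fn name(&self, i: usize) -> Option<&str> {
        self.names().get(i)?.as_deref()
    }
}

fn main() {
    let cx = Ctxt { raw: vec!["x", ""], names: OnceCell::new() };
    println!("{:?}", cx.name(0)); // Some("x")
    println!("{:?}", cx.name(1)); // None
}
```

Because `get_or_init` takes `&self`, callers that previously read the field directly can keep working through a shared reference, which is what lets the diagnostics paths below switch from `self.local_names[local]` to `self.local_name(local)`.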
@@ -430,7 +460,7 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
     /// a name, or its name was generated by the compiler, then `Err` is returned
     fn append_local_to_string(&self, local: Local, buf: &mut String) -> Result<(), ()> {
         let decl = &self.body.local_decls[local];
-        match self.local_names[local] {
+        match self.local_name(local) {
             Some(name) if !decl.from_compiler_desugaring() => {
                 buf.push_str(name.as_str());
                 Ok(())

@@ -1254,8 +1284,14 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
             && !spans.is_empty()
         {
             let mut span: MultiSpan = spans.clone().into();
+            err.arg("ty", param_ty.to_string());
+            let msg = err.dcx.eagerly_translate_to_string(
+                fluent::borrowck_moved_a_fn_once_in_call_def,
+                err.args.iter(),
+            );
+            err.remove_arg("ty");
             for sp in spans {
-                span.push_span_label(sp, fluent::borrowck_moved_a_fn_once_in_call_def);
+                span.push_span_label(sp, msg.clone());
             }
             span.push_span_label(
                 fn_call_span,

@@ -1500,4 +1536,9 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
             }
         }
     }
+
+    /// Skip over locals that begin with an underscore or have no name
+    pub(crate) fn local_excluded_from_unused_mut_lint(&self, index: Local) -> bool {
+        self.local_name(index).is_none_or(|name| name.as_str().starts_with('_'))
+    }
 }
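The new helper leans on `Option::is_none_or` (stable since Rust 1.82): a local is excluded from the `unused_mut` lint either when it has no user-visible name at all or when its name is deliberately underscored. The predicate in isolation:

```rust
// Mirrors the helper's logic: unnamed locals (compiler temporaries)
// and underscore-prefixed names are both excluded from the lint.
fn excluded(name: Option<&str>) -> bool {
    name.is_none_or(|n| n.starts_with('_'))
}

fn main() {
    assert!(excluded(None)); // compiler temporary: no name
    assert!(excluded(Some("_unused"))); // deliberately underscored
    assert!(!excluded(Some("count"))); // a real user variable
    println!("ok");
}
```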
@@ -465,11 +465,15 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {

         if let PlaceRef { local, projection: [] } = deref_base {
             let decl = &self.body.local_decls[local];
+            let local_name = self.local_name(local).map(|sym| format!("`{sym}`"));
             if decl.is_ref_for_guard() {
                 return self
                     .cannot_move_out_of(
                         span,
-                        &format!("`{}` in pattern guard", self.local_names[local].unwrap()),
+                        &format!(
+                            "{} in pattern guard",
+                            local_name.as_deref().unwrap_or("the place")
+                        ),
                     )
                     .with_note(
                         "variables bound in patterns cannot be moved from \

@@ -825,7 +829,7 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
         }

         if binds_to.len() == 1 {
-            let place_desc = &format!("`{}`", self.local_names[*local].unwrap());
+            let place_desc = self.local_name(*local).map(|sym| format!("`{sym}`"));

             if let Some(expr) = self.find_expr(binding_span) {
                 self.suggest_cloning(err, bind_to.ty, expr, None);

@@ -834,7 +838,7 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
             err.subdiagnostic(crate::session_diagnostics::TypeNoCopy::Label {
                 is_partial_move: false,
                 ty: bind_to.ty,
-                place: place_desc,
+                place: place_desc.as_deref().unwrap_or("the place"),
                 span: binding_span,
             });
         }
@@ -60,7 +60,7 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
             if access_place.as_local().is_some() {
                 reason = ", as it is not declared as mutable".to_string();
             } else {
-                let name = self.local_names[local].expect("immutable unnamed local");
+                let name = self.local_name(local).expect("immutable unnamed local");
                 reason = format!(", as `{name}` is not declared as mutable");
             }
         }

@@ -285,7 +285,7 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
                 .body
                 .local_decls
                 .get(local)
-                .is_some_and(|l| mut_borrow_of_mutable_ref(l, self.local_names[local])) =>
+                .is_some_and(|l| mut_borrow_of_mutable_ref(l, self.local_name(local))) =>
             {
                 let decl = &self.body.local_decls[local];
                 err.span_label(span, format!("cannot {act}"));

@@ -481,7 +481,7 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
         let (pointer_sigil, pointer_desc) =
             if local_decl.ty.is_ref() { ("&", "reference") } else { ("*const", "pointer") };

-        match self.local_names[local] {
+        match self.local_name(local) {
             Some(name) if !local_decl.from_compiler_desugaring() => {
                 err.span_label(
                     span,
@@ -664,14 +664,14 @@ impl<'infcx, 'tcx> MirBorrowckCtxt<'_, 'infcx, 'tcx> {
         let fr_name_and_span = self.regioncx.get_var_name_and_span_for_region(
             self.infcx.tcx,
             self.body,
-            &self.local_names,
+            &self.local_names(),
             &self.upvars,
             errci.fr,
         );
         let outlived_fr_name_and_span = self.regioncx.get_var_name_and_span_for_region(
             self.infcx.tcx,
             self.body,
-            &self.local_names,
+            &self.local_names(),
             &self.upvars,
             errci.outlived_fr,
         );

@@ -399,7 +399,7 @@ impl<'tcx> MirBorrowckCtxt<'_, '_, 'tcx> {
             [implicit_inputs + argument_index];
         let (_, span) = self.regioncx.get_argument_name_and_span_for_region(
             self.body,
-            &self.local_names,
+            self.local_names(),
             argument_index,
         );

@@ -973,7 +973,7 @@ impl<'tcx> MirBorrowckCtxt<'_, '_, 'tcx> {
         {
             let (arg_name, arg_span) = self.regioncx.get_argument_name_and_span_for_region(
                 self.body,
-                &self.local_names,
+                self.local_names(),
                 arg_index,
             );
             let region_name = self.synthesize_region_name();
@@ -16,7 +16,7 @@
 // tidy-alphabetical-end

 use std::borrow::Cow;
-use std::cell::RefCell;
+use std::cell::{OnceCell, RefCell};
 use std::marker::PhantomData;
 use std::ops::{ControlFlow, Deref};

@@ -391,7 +391,7 @@ fn do_mir_borrowck<'tcx>(
         used_mut_upvars: SmallVec::new(),
         borrow_set: &borrow_set,
         upvars: &[],
-        local_names: IndexVec::from_elem(None, &promoted_body.local_decls),
+        local_names: OnceCell::from(IndexVec::from_elem(None, &promoted_body.local_decls)),
         region_names: RefCell::default(),
         next_region_name: RefCell::new(1),
         polonius_output: None,

@@ -414,26 +414,6 @@ fn do_mir_borrowck<'tcx>(
         promoted_mbcx.report_move_errors();
     }

-    let mut local_names = IndexVec::from_elem(None, &body.local_decls);
-    for var_debug_info in &body.var_debug_info {
-        if let VarDebugInfoContents::Place(place) = var_debug_info.value {
-            if let Some(local) = place.as_local() {
-                if let Some(prev_name) = local_names[local]
-                    && var_debug_info.name != prev_name
-                {
-                    span_bug!(
-                        var_debug_info.source_info.span,
-                        "local {:?} has many names (`{}` vs `{}`)",
-                        local,
-                        prev_name,
-                        var_debug_info.name
-                    );
-                }
-                local_names[local] = Some(var_debug_info.name);
-            }
-        }
-    }
-
     let mut mbcx = MirBorrowckCtxt {
         root_cx,
         infcx: &infcx,
@@ -450,7 +430,7 @@ fn do_mir_borrowck<'tcx>(
         used_mut_upvars: SmallVec::new(),
         borrow_set: &borrow_set,
         upvars: tcx.closure_captures(def),
-        local_names,
+        local_names: OnceCell::new(),
         region_names: RefCell::default(),
         next_region_name: RefCell::new(1),
         move_errors: Vec::new(),

@@ -682,7 +662,7 @@ struct MirBorrowckCtxt<'a, 'infcx, 'tcx> {
     upvars: &'tcx [&'tcx ty::CapturedPlace<'tcx>],

     /// Names of local (user) variables (extracted from `var_debug_info`).
-    local_names: IndexVec<Local, Option<Symbol>>,
+    local_names: OnceCell<IndexVec<Local, Option<Symbol>>>,

     /// Record the region names generated for each region in the given
     /// MIR def so that we can reuse them later in help/error messages.

@@ -2610,7 +2590,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, '_, 'tcx> {
         };

         // Skip over locals that begin with an underscore or have no name
-        if self.local_names[local].is_none_or(|name| name.as_str().starts_with('_')) {
+        if self.local_excluded_from_unused_mut_lint(local) {
             continue;
         }
@@ -104,6 +104,8 @@ builtin_macros_concat_bytes_bad_repeat = repeat count is not a positive number
 builtin_macros_concat_bytes_invalid = cannot concatenate {$lit_kind} literals
     .byte_char = try using a byte character
     .byte_str = try using a byte string
+    .c_str = try using a null-terminated byte string
+    .c_str_note = concatenating C strings is ambiguous about including the '\0'
     .number_array = try wrapping the number in an array

 builtin_macros_concat_bytes_missing_literal = expected a byte literal

@@ -116,10 +118,6 @@ builtin_macros_concat_bytes_oob = numeric literal is out of bounds
 builtin_macros_concat_bytestr = cannot concatenate a byte string literal
-builtin_macros_concat_c_str_lit = cannot concatenate a C string literal

-builtin_macros_concat_idents_ident_args = `concat_idents!()` requires ident args

-builtin_macros_concat_idents_missing_args = `concat_idents!()` takes 1 or more arguments
-builtin_macros_concat_idents_missing_comma = `concat_idents!()` expecting comma
 builtin_macros_concat_missing_literal = expected a literal
     .note = only literals (like `"foo"`, `-42` and `3.14`) can be passed to `concat!()`
@@ -62,8 +62,8 @@ pub(crate) fn expand(
 fn generate_handler(cx: &ExtCtxt<'_>, handler: Ident, span: Span, sig_span: Span) -> Stmt {
     let usize = cx.path_ident(span, Ident::new(sym::usize, span));
     let ty_usize = cx.ty_path(usize);
-    let size = Ident::from_str_and_span("size", span);
-    let align = Ident::from_str_and_span("align", span);
+    let size = Ident::new(sym::size, span);
+    let align = Ident::new(sym::align, span);

     let layout_new = cx.std_path(&[sym::alloc, sym::Layout, sym::from_size_align_unchecked]);
     let layout_new = cx.expr_path(cx.path(span, layout_new));

@@ -652,8 +652,10 @@ mod llvm_enzyme {
             exprs = ecx.expr_call(new_decl_span, bb_call_expr, thin_vec![exprs]);
         } else {
             let q = QSelf { ty: d_ret_ty, path_span: span, position: 0 };
-            let y =
-                ExprKind::Path(Some(P(q)), ecx.path_ident(span, Ident::from_str("default")));
+            let y = ExprKind::Path(
+                Some(P(q)),
+                ecx.path_ident(span, Ident::with_dummy_span(kw::Default)),
+            );
             let default_call_expr = ecx.expr(span, y);
             let default_call_expr =
                 ecx.expr_call(new_decl_span, default_call_expr, thin_vec![]);
@@ -161,7 +161,7 @@ impl MutVisitor for CfgEval<'_> {
     }

     #[instrument(level = "trace", skip(self))]
-    fn visit_method_receiver_expr(&mut self, expr: &mut P<ast::Expr>) {
+    fn visit_method_receiver_expr(&mut self, expr: &mut ast::Expr) {
         self.0.configure_expr(expr, true);
         mut_visit::walk_expr(self, expr);
     }
@@ -1,6 +1,6 @@
 use rustc_ast::ptr::P;
 use rustc_ast::tokenstream::TokenStream;
-use rustc_ast::{ExprKind, LitIntType, LitKind, UintTy, token};
+use rustc_ast::{ExprKind, LitIntType, LitKind, StrStyle, UintTy, token};
 use rustc_expand::base::{DummyResult, ExpandResult, ExtCtxt, MacEager, MacroExpanderResult};
 use rustc_session::errors::report_lit_error;
 use rustc_span::{ErrorGuaranteed, Span};

@@ -21,15 +21,32 @@ fn invalid_type_err(
     let snippet = cx.sess.source_map().span_to_snippet(span).ok();
     let dcx = cx.dcx();
     match LitKind::from_token_lit(token_lit) {
-        Ok(LitKind::CStr(_, _)) => {
-            // Avoid ambiguity in handling of terminal `NUL` by refusing to
-            // concatenate C string literals as bytes.
-            dcx.emit_err(errors::ConcatCStrLit { span })
+        Ok(LitKind::CStr(_, style)) => {
+            let sugg = if let Some(mut as_bstr) = snippet
+                && style == StrStyle::Cooked
+                && as_bstr.starts_with('c')
+                && as_bstr.ends_with('"')
+            {
+                // Suggest `c"foo"` -> `b"foo\0"` if we can
+                as_bstr.replace_range(0..1, "b");
+                as_bstr.pop();
+                as_bstr.push_str(r#"\0""#);
+                Some(ConcatBytesInvalidSuggestion::CStrLit { span, as_bstr })
+            } else {
+                // No suggestion for a missing snippet, raw strings, or if for some reason we have
+                // a span that doesn't match `c"foo"` (possible if a proc macro assigns a span
+                // that doesn't actually point to a C string).
+                None
+            };
+            // We can only provide a suggestion if we have a snippet and it is not a raw string
+            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "C string", sugg, cs_note: Some(()) })
         }
         Ok(LitKind::Char(_)) => {
             let sugg =
                 snippet.map(|snippet| ConcatBytesInvalidSuggestion::CharLit { span, snippet });
-            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "character", sugg })
+            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "character", sugg, cs_note: None })
         }
         Ok(LitKind::Str(_, _)) => {
             // suggestion would be invalid if we are nested
@@ -38,18 +55,21 @@ fn invalid_type_err(
             } else {
                 None
             };
-            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "string", sugg })
+            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "string", sugg, cs_note: None })
         }
         Ok(LitKind::Float(_, _)) => {
-            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "float", sugg: None })
-        }
-        Ok(LitKind::Bool(_)) => {
-            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "boolean", sugg: None })
+            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "float", sugg: None, cs_note: None })
         }
+        Ok(LitKind::Bool(_)) => dcx.emit_err(ConcatBytesInvalid {
+            span,
+            lit_kind: "boolean",
+            sugg: None,
+            cs_note: None,
+        }),
         Ok(LitKind::Int(_, _)) if !is_nested => {
             let sugg =
                 snippet.map(|snippet| ConcatBytesInvalidSuggestion::IntLit { span, snippet });
-            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "numeric", sugg })
+            dcx.emit_err(ConcatBytesInvalid { span, lit_kind: "numeric", sugg, cs_note: None })
         }
         Ok(LitKind::Int(val, LitIntType::Unsuffixed | LitIntType::Unsigned(UintTy::U8))) => {
             assert!(val.get() > u8::MAX.into()); // must be an error
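The machine-applicable suggestion above rewrites a cooked C-string literal `c"foo"` into the byte string `b"foo\0"` by patching the snippet text in place: swap the `c` prefix for `b`, drop the closing quote, and re-append it after an explicit NUL escape. The string surgery, isolated from the compiler types into a hypothetical helper:

```rust
// Rewrites a cooked C string snippet `c"…"` into `b"…\0"`, following the
// same three-step edit the suggestion performs. Returns None when the
// snippet does not look like a C string literal, mirroring the guard.
fn c_str_to_byte_str(snippet: &str) -> Option<String> {
    let mut as_bstr = snippet.to_string();
    if !(as_bstr.starts_with('c') && as_bstr.ends_with('"')) {
        return None;
    }
    as_bstr.replace_range(0..1, "b"); // `c"foo"` -> `b"foo"`
    as_bstr.pop(); //                    `b"foo"` -> `b"foo`
    as_bstr.push_str(r#"\0""#); //       `b"foo`  -> `b"foo\0"`
    Some(as_bstr)
}

fn main() {
    assert_eq!(c_str_to_byte_str(r#"c"foo""#).as_deref(), Some(r#"b"foo\0""#));
    assert_eq!(c_str_to_byte_str("42"), None);
    println!("ok");
}
```

Note the emitted `\0` is the two-character escape sequence in the source snippet, not a literal NUL byte, which is why the replacement text stays printable.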
@@ -1,71 +0,0 @@
-use rustc_ast::ptr::P;
-use rustc_ast::token::{self, Token};
-use rustc_ast::tokenstream::{TokenStream, TokenTree};
-use rustc_ast::{AttrVec, DUMMY_NODE_ID, Expr, ExprKind, Path, Ty, TyKind};
-use rustc_expand::base::{DummyResult, ExpandResult, ExtCtxt, MacResult, MacroExpanderResult};
-use rustc_span::{Ident, Span, Symbol};
-
-use crate::errors;
-
-pub(crate) fn expand_concat_idents<'cx>(
-    cx: &'cx mut ExtCtxt<'_>,
-    sp: Span,
-    tts: TokenStream,
-) -> MacroExpanderResult<'cx> {
-    if tts.is_empty() {
-        let guar = cx.dcx().emit_err(errors::ConcatIdentsMissingArgs { span: sp });
-        return ExpandResult::Ready(DummyResult::any(sp, guar));
-    }
-
-    let mut res_str = String::new();
-    for (i, e) in tts.iter().enumerate() {
-        if i & 1 == 1 {
-            match e {
-                TokenTree::Token(Token { kind: token::Comma, .. }, _) => {}
-                _ => {
-                    let guar = cx.dcx().emit_err(errors::ConcatIdentsMissingComma { span: sp });
-                    return ExpandResult::Ready(DummyResult::any(sp, guar));
-                }
-            }
-        } else {
-            if let TokenTree::Token(token, _) = e {
-                if let Some((ident, _)) = token.ident() {
-                    res_str.push_str(ident.name.as_str());
-                    continue;
-                }
-            }
-
-            let guar = cx.dcx().emit_err(errors::ConcatIdentsIdentArgs { span: sp });
-            return ExpandResult::Ready(DummyResult::any(sp, guar));
-        }
-    }
-
-    let ident = Ident::new(Symbol::intern(&res_str), cx.with_call_site_ctxt(sp));
-
-    struct ConcatIdentsResult {
-        ident: Ident,
-    }
-
-    impl MacResult for ConcatIdentsResult {
-        fn make_expr(self: Box<Self>) -> Option<P<Expr>> {
-            Some(P(Expr {
-                id: DUMMY_NODE_ID,
-                kind: ExprKind::Path(None, Path::from_ident(self.ident)),
-                span: self.ident.span,
-                attrs: AttrVec::new(),
-                tokens: None,
-            }))
-        }
-
-        fn make_ty(self: Box<Self>) -> Option<P<Ty>> {
-            Some(P(Ty {
-                id: DUMMY_NODE_ID,
-                kind: TyKind::Path(None, Path::from_ident(self.ident)),
-                span: self.ident.span,
-                tokens: None,
-            }))
-        }
-    }
-
-    ExpandResult::Ready(Box::new(ConcatIdentsResult { ident }))
-}
@@ -215,6 +215,8 @@ pub(crate) struct ConcatBytesInvalid {
     pub(crate) lit_kind: &'static str,
     #[subdiagnostic]
     pub(crate) sugg: Option<ConcatBytesInvalidSuggestion>,
+    #[note(builtin_macros_c_str_note)]
+    pub(crate) cs_note: Option<()>,
 }

 #[derive(Subdiagnostic)]

@@ -239,6 +241,13 @@ pub(crate) enum ConcatBytesInvalidSuggestion {
         span: Span,
         snippet: String,
     },
+    #[note(builtin_macros_c_str_note)]
+    #[suggestion(builtin_macros_c_str, code = "{as_bstr}", applicability = "machine-applicable")]
+    CStrLit {
+        #[primary_span]
+        span: Span,
+        as_bstr: String,
+    },
     #[suggestion(
         builtin_macros_number_array,
         code = "[{snippet}]",

@@ -290,27 +299,6 @@ pub(crate) struct ConcatBytesBadRepeat {
     pub(crate) span: Span,
 }

-#[derive(Diagnostic)]
-#[diag(builtin_macros_concat_idents_missing_args)]
-pub(crate) struct ConcatIdentsMissingArgs {
-    #[primary_span]
-    pub(crate) span: Span,
-}
-
-#[derive(Diagnostic)]
-#[diag(builtin_macros_concat_idents_missing_comma)]
-pub(crate) struct ConcatIdentsMissingComma {
-    #[primary_span]
-    pub(crate) span: Span,
-}
-
-#[derive(Diagnostic)]
-#[diag(builtin_macros_concat_idents_ident_args)]
-pub(crate) struct ConcatIdentsIdentArgs {
-    #[primary_span]
-    pub(crate) span: Span,
-}
-
 #[derive(Diagnostic)]
 #[diag(builtin_macros_bad_derive_target, code = E0774)]
 pub(crate) struct BadDeriveTarget {
@@ -672,6 +660,7 @@ impl Subdiagnostic for FormatUnusedArg {
     fn add_to_diag<G: EmissionGuarantee>(self, diag: &mut Diag<'_, G>) {
         diag.arg("named", self.named);
         let msg = diag.eagerly_translate(crate::fluent_generated::builtin_macros_format_unused_arg);
+        diag.remove_arg("named");
         diag.span_label(self.span, msg);
     }
 }

@@ -606,6 +606,7 @@ fn make_format_args(
         template,
         arguments: args,
         uncooked_fmt_str,
+        is_source_literal,
     }))
 }
@@ -36,7 +36,6 @@ mod cfg_eval;
 mod compile_error;
 mod concat;
 mod concat_bytes;
-mod concat_idents;
 mod define_opaque;
 mod derive;
 mod deriving;

@@ -84,7 +83,6 @@ pub fn register_builtin_macros(resolver: &mut dyn ResolverExpand) {
     compile_error: compile_error::expand_compile_error,
     concat: concat::expand_concat,
     concat_bytes: concat_bytes::expand_concat_bytes,
-    concat_idents: concat_idents::expand_concat_idents,
     const_format_args: format::expand_format_args,
     core_panic: edition_panic::expand_panic,
     env: env::expand_env,
@@ -56,7 +56,7 @@ pub fn inject(
     is_test_crate: bool,
     dcx: DiagCtxtHandle<'_>,
 ) {
-    let ecfg = ExpansionConfig::default("proc_macro".to_string(), features);
+    let ecfg = ExpansionConfig::default(sym::proc_macro, features);
     let mut cx = ExtCtxt::new(sess, ecfg, resolver, None);

     let mut collect = CollectProcMacros {

@@ -36,7 +36,7 @@ pub fn inject(
     let span = DUMMY_SP.with_def_site_ctxt(expn_id.to_expn_id());
     let call_site = DUMMY_SP.with_call_site_ctxt(expn_id.to_expn_id());

-    let ecfg = ExpansionConfig::default("std_lib_injection".to_string(), features);
+    let ecfg = ExpansionConfig::default(sym::std_lib_injection, features);
     let cx = ExtCtxt::new(sess, ecfg, resolver, None);

     let ident_span = if edition >= Edition2018 { span } else { call_site };
@@ -6,7 +6,7 @@ use rustc_ast as ast;
 use rustc_ast::entry::EntryPointType;
 use rustc_ast::mut_visit::*;
 use rustc_ast::ptr::P;
-use rustc_ast::visit::{Visitor, walk_item};
+use rustc_ast::visit::Visitor;
 use rustc_ast::{ModKind, attr};
 use rustc_errors::DiagCtxtHandle;
 use rustc_expand::base::{ExtCtxt, ResolverExpand};

@@ -146,11 +146,11 @@ impl<'a> MutVisitor for TestHarnessGenerator<'a> {
         ) = item.kind
         {
             let prev_tests = mem::take(&mut self.tests);
-            walk_item_kind(&mut item.kind, item.span, item.id, &mut item.vis, (), self);
+            ast::mut_visit::walk_item(self, item);
             self.add_test_cases(item.id, span, prev_tests);
         } else {
             // But in those cases, we emit a lint to warn the user of these missing tests.
-            walk_item(&mut InnerItemLinter { sess: self.cx.ext_cx.sess }, &item);
+            ast::visit::walk_item(&mut InnerItemLinter { sess: self.cx.ext_cx.sess }, &item);
         }
     }
 }

@@ -227,7 +227,7 @@ fn generate_test_harness(
     panic_strategy: PanicStrategy,
     test_runner: Option<ast::Path>,
 ) {
-    let econfig = ExpansionConfig::default("test".to_string(), features);
+    let econfig = ExpansionConfig::default(sym::test, features);
     let ext_cx = ExtCtxt::new(sess, econfig, resolver, None);

     let expn_id = ext_cx.resolver.expansion_for_ast_pass(
@@ -43,42 +43,42 @@ checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"

 [[package]]
 name = "cranelift-assembler-x64"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9ff8e35182c7372df00447cb90a04e584e032c42b9b9b6e8c50ddaaf0d7900d5"
+checksum = "f6f53499803b1607b6ee0ba0de4ba036e6da700c2e489fe8f9d0f683d0b84d31"
 dependencies = [
  "cranelift-assembler-x64-meta",
 ]

 [[package]]
 name = "cranelift-assembler-x64-meta"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "14220f9c2698015c3b94dc6b84ae045c1c45509ddc406e43c6139252757fdb7a"
+checksum = "1aadaa5bc8430d0e7bb999459369bedd0e5816ad4a82a0e20748341c4e333eda"
 dependencies = [
  "cranelift-srcgen",
 ]

 [[package]]
 name = "cranelift-bforest"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d372ef2777ceefd75829e1390211ac240e9196bc60699218f7ea2419038288ee"
+checksum = "2005fda2fc52a2dbce58229b4fb4483b70cbc806ba8ecc11b3f050c1a2d26cac"
 dependencies = [
  "cranelift-entity",
 ]

 [[package]]
 name = "cranelift-bitset"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "56323783e423818fa89ce8078e90a3913d2a6e0810399bfce8ebd7ee87baa81f"
+checksum = "56935e02452ca1249d39ad5c45a96304d0b4300a158a391fd113451e0cd4483d"

 [[package]]
 name = "cranelift-codegen"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "74ffb780aab6186c6e9ba26519654b1ac55a09c0a866f6088a4efbbd84da68ed"
+checksum = "62612786bf00e10999f50217d6f455d02b31591155881a45a903d1a95d1a4043"
 dependencies = [
  "bumpalo",
  "cranelift-assembler-x64",
@@ -97,13 +97,14 @@ dependencies = [
  "serde",
  "smallvec",
  "target-lexicon",
+ "wasmtime-math",
 ]

 [[package]]
 name = "cranelift-codegen-meta"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c23ef13814d3b39c869650d5961128cbbecad83fbdff4e6836a03ecf6862d7ed"
+checksum = "07bae789df91ef236079733af9df11d852256c64af196f0bc6471ea0f5f301be"
 dependencies = [
  "cranelift-assembler-x64-meta",
  "cranelift-codegen-shared",

@@ -112,33 +113,33 @@ dependencies = [

 [[package]]
 name = "cranelift-codegen-shared"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b9f623300657679f847803ce80811454bfff89cea4f6bf684be5c468d4a73631"
+checksum = "1be319616d36527782558a8312508757815f64deb19b094c7b8f4337229a9bc6"

 [[package]]
 name = "cranelift-control"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "31f4168af69989aa6b91fab46799ed4df6096f3209f4a6c8fb4358f49c60188f"
+checksum = "8810ee1ab5e9bd5cff4c0c8d240e2009cb5c2b79888fde1d5256d605712314b7"
 dependencies = [
  "arbitrary",
 ]

 [[package]]
 name = "cranelift-entity"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "ca6fa9bae1c8de26d71ac2162f069447610fd91e7780cb480ee0d76ac81eabb8"
+checksum = "086452c97cfbe116bf17dbe622dc5fdf2ea97299c7d4ce42460f284387c9928a"
 dependencies = [
  "cranelift-bitset",
 ]

 [[package]]
 name = "cranelift-frontend"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b8219205608aa0b0e6769b580284a7e055c7e0c323c1041cde7ca078add3e412"
+checksum = "4c27947010ab759330f252610c17a8cd64d123358be4f33164233d04fcd77b80"
 dependencies = [
  "cranelift-codegen",
  "log",
@@ -148,15 +149,15 @@ dependencies = [

 [[package]]
 name = "cranelift-isle"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "588d0c5964f10860b04043e55aab26d7f7a206b0fd4f10c5260e8aa5773832bd"
+checksum = "ec67bfb8bd55b1e9760eb9f5186dca8d81bd4d86110f8d5af01154a044c91802"

 [[package]]
 name = "cranelift-jit"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "56bd917ddc524f84f4066f954062875bdfc0dffea068ee94e906d98de5ac7c33"
+checksum = "d67cdfc447f2abdb46bb30a6582cce189539c3c051c1d5330692376e1400edff"
 dependencies = [
  "anyhow",
  "cranelift-codegen",

@@ -174,9 +175,9 @@ dependencies = [

 [[package]]
 name = "cranelift-module"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "68a03c057d8a992e06596c871341e446af43ff9224f941e5b8adea39137a5391"
+checksum = "e4597eaa52bca1ed111986c7a7f70cdbe192f83d271d627201365078e37b7e84"
 dependencies = [
  "anyhow",
  "cranelift-codegen",

@@ -185,9 +186,9 @@ dependencies = [

 [[package]]
 name = "cranelift-native"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "19ed3c94cb97b14f92b6a94a1d45ef8c851f6a2ad9114e5d91d233f7da638fed"
+checksum = "75a9b63edea46e013fce459c46e500462cb03a0490fdd9c18fe42b1dd7b93aa1"
 dependencies = [
  "cranelift-codegen",
  "libc",

@@ -196,9 +197,9 @@ dependencies = [

 [[package]]
 name = "cranelift-object"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a64dacef362a69375a604f6636e5e9a174fb96dba3b273646fcd9fa85c1d0997"
+checksum = "ce706f0166d5b7f31693dff521e87cb9858e12adf22ffcde93c4a2826f8f04a9"
 dependencies = [
  "anyhow",
  "cranelift-codegen",
@@ -211,9 +212,9 @@ dependencies = [

 [[package]]
 name = "cranelift-srcgen"
-version = "0.120.0"
+version = "0.121.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "85256fac1519a7d25a040c1d850fba67478f3f021ad5fdf738ba4425ee862dbf"
+checksum = "7d5870e266df8237b56cc98b04f5739c228565c92dd629ec6c66efa87271a158"

 [[package]]
 name = "crc32fast"

@@ -288,6 +289,12 @@ dependencies = [
  "windows-targets",
 ]

+[[package]]
+name = "libm"
+version = "0.2.15"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "f9fbbcab51052fe104eb5e5d351cf728d30a5be1fe14d9be8a3b097481fb97de"
+
 [[package]]
 name = "log"
 version = "0.4.22"

@@ -446,9 +453,9 @@ checksum = "adb9e6ca4f869e1180728b7950e35922a7fc6397f7b641499e8f3ef06e50dc83"

 [[package]]
 name = "wasmtime-jit-icache-coherence"
-version = "33.0.0"
+version = "34.0.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "175e924dbc944c185808466d1e90b5a7feb610f3b9abdfe26f8ee25fd1086d1c"
+checksum = "2eedc0324e37cf39b049f4dca0c30997eaab49f09006d5f4c1994e64e7b7dba8"
 dependencies = [
  "anyhow",
  "cfg-if",
@@ -456,6 +463,15 @@ dependencies = [
  "windows-sys 0.59.0",
 ]

+[[package]]
+name = "wasmtime-math"
+version = "34.0.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "1cd35fae4cf51d2b4a9bd2ef04b0eb309fa1849cab6a6ab5ac27cbd054ea284d"
+dependencies = [
+ "libm",
+]
+
 [[package]]
 name = "windows-sys"
 version = "0.52.0"
@@ -8,12 +8,12 @@ crate-type = ["dylib"]

 [dependencies]
 # These have to be in sync with each other
-cranelift-codegen = { version = "0.120.0", default-features = false, features = ["std", "timing", "unwind", "all-native-arch"] }
-cranelift-frontend = { version = "0.120.0" }
-cranelift-module = { version = "0.120.0" }
-cranelift-native = { version = "0.120.0" }
-cranelift-jit = { version = "0.120.0", optional = true }
-cranelift-object = { version = "0.120.0" }
+cranelift-codegen = { version = "0.121.0", default-features = false, features = ["std", "timing", "unwind", "all-native-arch"] }
+cranelift-frontend = { version = "0.121.0" }
+cranelift-module = { version = "0.121.0" }
+cranelift-native = { version = "0.121.0" }
+cranelift-jit = { version = "0.121.0", optional = true }
+cranelift-object = { version = "0.121.0" }
 target-lexicon = "0.13"
 gimli = { version = "0.31", default-features = false, features = ["write"] }
 object = { version = "0.36", default-features = false, features = ["std", "read_core", "write", "archive", "coff", "elf", "macho", "pe"] }
@@ -24,12 +24,12 @@ smallvec = "1.8.1"

 [patch.crates-io]
 # Uncomment to use an unreleased version of cranelift
-#cranelift-codegen = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-33.0.0", version = "0.120.0" }
-#cranelift-frontend = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-33.0.0", version = "0.120.0" }
-#cranelift-module = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-33.0.0", version = "0.120.0" }
-#cranelift-native = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-33.0.0", version = "0.120.0" }
-#cranelift-jit = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-33.0.0", version = "0.120.0" }
-#cranelift-object = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-33.0.0", version = "0.120.0" }
+#cranelift-codegen = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-34.0.0", version = "0.121.0" }
+#cranelift-frontend = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-34.0.0", version = "0.121.0" }
+#cranelift-module = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-34.0.0", version = "0.121.0" }
+#cranelift-native = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-34.0.0", version = "0.121.0" }
+#cranelift-jit = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-34.0.0", version = "0.121.0" }
+#cranelift-object = { git = "https://github.com/bytecodealliance/wasmtime.git", branch = "release-34.0.0", version = "0.121.0" }

 # Uncomment to use local checkout of cranelift
 #cranelift-codegen = { path = "../wasmtime/cranelift/codegen" }
@@ -6,8 +6,8 @@ use crate::{CodegenBackend, SysrootKind, build_sysroot};
 static ABI_CAFE_REPO: GitRepo = GitRepo::github(
     "Gankra",
     "abi-cafe",
-    "f1220cfd13b57f5c0082c26529163865ee25e115",
-    "fe93a9acd461425d",
+    "94d38030419eb00a1ba80e5e2b4d763dcee58db4",
+    "6efb4457893c8670",
     "abi-cafe",
 );
@@ -46,6 +46,10 @@ pub(crate) fn run(
     let mut cmd = ABI_CAFE.run(bootstrap_host_compiler, dirs);
     cmd.arg("--");

+    cmd.arg("--debug");
+
+    cmd.arg("--rules").arg(dirs.source_dir.join("scripts/abi-cafe-rules.toml"));
+
     // stdcall, vectorcall and such don't work yet
     cmd.arg("--conventions").arg("c").arg("--conventions").arg("rust");
@@ -1,69 +0,0 @@
-From 236df390f3bc4ed69c26f4d51d584bea246da886 Mon Sep 17 00:00:00 2001
-From: bjorn3 <17426603+bjorn3@users.noreply.github.com>
-Date: Tue, 9 Jul 2024 11:25:14 +0000
-Subject: [PATCH] Disable broken tests
-
----
- src/report.rs | 36 ++++++++++++++++++++++++++++++++++++
- 1 file changed, 36 insertions(+)
-
-diff --git a/src/toolchains/rust.rs b/src/toolchains/rust.rs
-index 0c50f7a..bfde2b1 100644
---- a/src/toolchains/rust.rs
-+++ b/src/toolchains/rust.rs
-@@ -83,6 +83,7 @@ impl Toolchain for RustcToolchain {
-             .arg(out_dir)
-             .arg("--target")
-             .arg(built_info::TARGET)
-+            .arg("-g")
-             .arg(format!("-Cmetadata={lib_name}"))
-             .arg(src_path);
-         if let Some(codegen_backend) = &self.codegen_backend {
-diff --git a/src/report.rs b/src/report.rs
-index 958ab43..dcf1044 100644
---- a/src/report.rs
-+++ b/src/report.rs
-@@ -48,6 +48,40 @@ pub fn get_test_rules(test: &TestKey, caller: &dyn Toolchain, callee: &dyn Toolc
-     //
-     // THIS AREA RESERVED FOR VENDORS TO APPLY PATCHES
-
-+    if cfg!(all(target_arch = "aarch64", target_os = "linux")) {
-+        if test.test == "F32Array" && test.options.convention == CallingConvention::C {
-+            result.check = Busted(Check);
-+        }
-+    }
-+
-+    if cfg!(all(target_arch = "aarch64", target_os = "macos")) {
-+        if test.test == "SingleVariantUnion" && test.options.convention == CallingConvention::C && test.options.repr == LangRepr::C {
-+            result.check = Busted(Check);
-+        }
-+
-+        if test.test == "OptionU128" && test.caller == "rustc" && test.options.convention == CallingConvention::Rust && test.options.repr == LangRepr::C {
-+            result.check = Busted(Run);
-+        }
-+
-+        if test.test == "OptionU128" && test.caller == "cgclif" && test.options.convention == CallingConvention::Rust && test.options.repr == LangRepr::C {
-+            result.check = Busted(Check);
-+        }
-+    }
-+
-+    if cfg!(all(target_arch = "x86_64", windows)) {
-+        if test.test == "simple" && test.options.convention == CallingConvention::Rust {
-+            result.check = Busted(Check);
-+        }
-+
-+        if test.test == "simple" && test.options.convention == CallingConvention::Rust && test.caller == "rustc" {
-+            result.check = Busted(Run);
-+        }
-+    }
-+
-+    if test.test == "f16" || test.test == "f128" {
-+        result.run = Skip;
-+    }
-+
-     // END OF VENDOR RESERVED AREA
-     //
-     //
---
-2.34.1
-
@@ -1,4 +1,4 @@
 [toolchain]
-channel = "nightly-2025-05-25"
+channel = "nightly-2025-06-24"
 components = ["rust-src", "rustc-dev", "llvm-tools"]
 profile = "minimal"
17
compiler/rustc_codegen_cranelift/scripts/abi-cafe-rules.toml
Normal file
@@ -0,0 +1,17 @@
+[target.'cfg(all(target_arch = "aarch64", target_os = "linux"))']
+'F32Array::conv_c'.busted = "check"
+
+[target.'cfg(all(target_arch = "aarch64", target_os = "macos"))']
+'SingleVariantUnion::conv_c::repr_c'.busted = "check"
+'OptionU128::conv_rust::repr_c::rustc_caller'.busted = "run"
+'OptionU128::conv_rust::repr_c::cgclif_caller'.busted = "check"
+
+[target.'cfg(all(target_arch = "x86_64", windows))']
+'simple::conv_rust'.busted = "check"
+'simple::conv_rust::rustc_caller'.busted = "run"
+
+[target.'*'.'f16']
+run = "skip"
+
+[target.'*'.'f128']
+run = "skip"
@@ -151,20 +151,6 @@ rm tests/ui/process/process-panic-after-fork.rs # same
 cp ../dist/bin/rustdoc-clif ../dist/bin/rustdoc # some tests expect bin/rustdoc to exist

 cat <<EOF | git apply -
-diff --git a/tests/run-make/linker-warning/rmake.rs b/tests/run-make/linker-warning/rmake.rs
-index 30387af428c..f7895b12961 100644
---- a/tests/run-make/linker-warning/rmake.rs
-+++ b/tests/run-make/linker-warning/rmake.rs
-@@ -57,7 +57,8 @@ fn main() {
-         .actual_text("(linker error)", out.stderr())
--        .normalize(r#"/rustc[^/]*/"#, "/rustc/")
-+        .normalize(r#"/tmp/rustc[^/]*/"#, "/tmp/rustc/")
-+        .normalize("libpanic_abort", "libpanic_unwind")
-         .normalize(
-             regex::escape(run_make_support::build_root().to_str().unwrap()),
-             "/build-root",
-         )
-        .normalize(r#""[^"]*\/symbols.o""#, "\\"/symbols.o\\"")
 diff --git a/src/tools/compiletest/src/runtest/run_make.rs b/src/tools/compiletest/src/runtest/run_make.rs
 index 073116933bd..c3e4578204d 100644
 --- a/src/tools/compiletest/src/runtest/run_make.rs
@@ -228,7 +228,7 @@ fn pointer_for_allocation<'tcx>(
     crate::pointer::Pointer::new(global_ptr)
 }

-pub(crate) fn data_id_for_alloc_id(
+fn data_id_for_alloc_id(
     cx: &mut ConstantCx,
     module: &mut dyn Module,
     alloc_id: AllocId,
@@ -202,9 +202,10 @@ pub(super) fn codegen_x86_llvm_intrinsic_call<'tcx>(
     };
     let x = codegen_operand(fx, &x.node);
     let y = codegen_operand(fx, &y.node);
-    let kind = match &kind.node {
-        Operand::Constant(const_) => crate::constant::eval_mir_constant(fx, const_).0,
-        Operand::Copy(_) | Operand::Move(_) => unreachable!("{kind:?}"),
+    let kind = if let Some(const_) = kind.node.constant() {
+        crate::constant::eval_mir_constant(fx, const_).0
+    } else {
+        unreachable!("{kind:?}")
     };

     let flt_cc = match kind
@@ -205,9 +205,10 @@ pub(super) fn codegen_simd_intrinsic_call<'tcx>(
     // Find a way to reuse `immediate_const_vector` from `codegen_ssa` instead.
     let indexes = {
         use rustc_middle::mir::interpret::*;
-        let idx_const = match &idx.node {
-            Operand::Constant(const_) => crate::constant::eval_mir_constant(fx, const_).0,
-            Operand::Copy(_) | Operand::Move(_) => unreachable!("{idx:?}"),
+        let idx_const = if let Some(const_) = idx.node.constant() {
+            crate::constant::eval_mir_constant(fx, const_).0
+        } else {
+            unreachable!("{idx:?}")
         };

         let idx_bytes = match idx_const {
@@ -184,7 +184,7 @@ impl CodegenBackend for CraneliftCodegenBackend {
         // FIXME return the actually used target features. this is necessary for #[cfg(target_feature)]
         let target_features = if sess.target.arch == "x86_64" && sess.target.os != "none" {
             // x86_64 mandates SSE2 support and rustc requires the x87 feature to be enabled
-            vec![sym::fsxr, sym::sse, sym::sse2, Symbol::intern("x87")]
+            vec![sym::fxsr, sym::sse, sym::sse2, Symbol::intern("x87")]
         } else if sess.target.arch == "aarch64" {
             match &*sess.target.os {
                 "none" => vec![],
@@ -1,7 +1,3 @@
-codegen_gcc_unknown_ctarget_feature_prefix =
-    unknown feature specified for `-Ctarget-feature`: `{$feature}`
-    .note = features must begin with a `+` to enable or `-` to disable it
-
 codegen_gcc_unwinding_inline_asm =
     GCC backend does not support unwinding from inline asm
@@ -16,15 +12,3 @@ codegen_gcc_lto_disallowed = lto can only be run for executables, cdylibs and st
 codegen_gcc_lto_dylib = lto cannot be used for `dylib` crate type without `-Zdylib-lto`

 codegen_gcc_lto_bitcode_from_rlib = failed to get bitcode from object file for LTO ({$gcc_err})
-
-codegen_gcc_unknown_ctarget_feature =
-    unknown and unstable feature specified for `-Ctarget-feature`: `{$feature}`
-    .note = it is still passed through to the codegen backend, but use of this feature might be unsound and the behavior of this feature can change in the future
-    .possible_feature = you might have meant: `{$rust_feature}`
-    .consider_filing_feature_request = consider filing a feature request
-
-codegen_gcc_missing_features =
-    add the missing features in a `target_feature` attribute
-
-codegen_gcc_target_feature_disable_or_enable =
-    the target features {$features} must all be either enabled or disabled together
@@ -1591,9 +1591,9 @@ impl<'a, 'gcc, 'tcx> BuilderMethods<'a, 'tcx> for Builder<'a, 'gcc, 'tcx> {
         (value1, value2)
     }

-    fn filter_landing_pad(&mut self, pers_fn: RValue<'gcc>) -> (RValue<'gcc>, RValue<'gcc>) {
+    fn filter_landing_pad(&mut self, pers_fn: RValue<'gcc>) {
         // TODO(antoyo): generate the correct landing pad
-        self.cleanup_landing_pad(pers_fn)
+        self.cleanup_landing_pad(pers_fn);
     }

     #[cfg(feature = "master")]
@@ -1,30 +1,6 @@
-use rustc_macros::{Diagnostic, Subdiagnostic};
+use rustc_macros::Diagnostic;
 use rustc_span::Span;

-#[derive(Diagnostic)]
-#[diag(codegen_gcc_unknown_ctarget_feature_prefix)]
-#[note]
-pub(crate) struct UnknownCTargetFeaturePrefix<'a> {
-    pub feature: &'a str,
-}
-
-#[derive(Diagnostic)]
-#[diag(codegen_gcc_unknown_ctarget_feature)]
-#[note]
-pub(crate) struct UnknownCTargetFeature<'a> {
-    pub feature: &'a str,
-    #[subdiagnostic]
-    pub rust_feature: PossibleFeature<'a>,
-}
-
-#[derive(Subdiagnostic)]
-pub(crate) enum PossibleFeature<'a> {
-    #[help(codegen_gcc_possible_feature)]
-    Some { rust_feature: &'a str },
-    #[help(codegen_gcc_consider_filing_feature_request)]
-    None,
-}
-
 #[derive(Diagnostic)]
 #[diag(codegen_gcc_unwinding_inline_asm)]
 pub(crate) struct UnwindingInlineAsm {
@@ -1,20 +1,12 @@
 #[cfg(feature = "master")]
 use gccjit::Context;
-use rustc_codegen_ssa::codegen_attrs::check_tied_features;
-use rustc_codegen_ssa::errors::TargetFeatureDisableOrEnable;
-use rustc_data_structures::fx::FxHashMap;
-use rustc_data_structures::unord::UnordSet;
+use rustc_codegen_ssa::target_features;
 use rustc_session::Session;
-use rustc_session::features::{StabilityExt, retpoline_features_by_flags};
-use rustc_target::target_features::RUSTC_SPECIFIC_FEATURES;
 use smallvec::{SmallVec, smallvec};

-use crate::errors::{PossibleFeature, UnknownCTargetFeature, UnknownCTargetFeaturePrefix};
-
-fn gcc_features_by_flags(sess: &Session) -> Vec<&str> {
-    let mut features: Vec<&str> = Vec::new();
-    retpoline_features_by_flags(sess, &mut features);
-    features
+fn gcc_features_by_flags(sess: &Session, features: &mut Vec<String>) {
+    target_features::retpoline_features_by_flags(sess, features);
+    // FIXME: LLVM also sets +reserve-x18 here under some conditions.
 }

 /// The list of GCC features computed from CLI flags (`-Ctarget-cpu`, `-Ctarget-feature`,
@@ -44,98 +36,29 @@ pub(crate) fn global_gcc_features(sess: &Session, diagnostics: bool) -> Vec<Stri
     features.extend(sess.target.features.split(',').filter(|v| !v.is_empty()).map(String::from));

     // -Ctarget-features
-    let known_features = sess.target.rust_target_features();
-    let mut featsmap = FxHashMap::default();
-
-    // Compute implied features
-    let mut all_rust_features = vec![];
-    for feature in sess.opts.cg.target_feature.split(',').chain(gcc_features_by_flags(sess)) {
-        if let Some(feature) = feature.strip_prefix('+') {
-            all_rust_features.extend(
-                UnordSet::from(sess.target.implied_target_features(feature))
-                    .to_sorted_stable_ord()
-                    .iter()
-                    .map(|&&s| (true, s)),
-            )
-        } else if let Some(feature) = feature.strip_prefix('-') {
-            // FIXME: Why do we not remove implied features on "-" here?
-            // We do the equivalent above in `target_config`.
-            // See <https://github.com/rust-lang/rust/issues/134792>.
-            all_rust_features.push((false, feature));
-        } else if !feature.is_empty() && diagnostics {
-            sess.dcx().emit_warn(UnknownCTargetFeaturePrefix { feature });
-        }
-    }
-    // Remove features that are meant for rustc, not codegen.
-    all_rust_features.retain(|&(_, feature)| {
-        // Retain if it is not a rustc feature
-        !RUSTC_SPECIFIC_FEATURES.contains(&feature)
-    });
-
-    // Check feature validity.
-    if diagnostics {
-        for &(enable, feature) in &all_rust_features {
-            let feature_state = known_features.iter().find(|&&(v, _, _)| v == feature);
-            match feature_state {
-                None => {
-                    let rust_feature = known_features.iter().find_map(|&(rust_feature, _, _)| {
-                        let gcc_features = to_gcc_features(sess, rust_feature);
-                        if gcc_features.contains(&feature) && !gcc_features.contains(&rust_feature)
-                        {
-                            Some(rust_feature)
-                        } else {
-                            None
-                        }
-                    });
-                    let unknown_feature = if let Some(rust_feature) = rust_feature {
-                        UnknownCTargetFeature {
-                            feature,
-                            rust_feature: PossibleFeature::Some { rust_feature },
-                        }
-                    } else {
-                        UnknownCTargetFeature { feature, rust_feature: PossibleFeature::None }
-                    };
-                    sess.dcx().emit_warn(unknown_feature);
-                }
-                Some(&(_, stability, _)) => {
-                    stability.verify_feature_enabled_by_flag(sess, enable, feature);
-                }
-            }
-
-            // FIXME(nagisa): figure out how to not allocate a full hashset here.
-            featsmap.insert(feature, enable);
-        }
-    }
-
-    // Translate this into GCC features.
-    let feats =
-        all_rust_features.iter().flat_map(|&(enable, feature)| {
-            let enable_disable = if enable { '+' } else { '-' };
+    target_features::flag_to_backend_features(
+        sess,
+        diagnostics,
+        |feature| to_gcc_features(sess, feature),
+        |feature, enable| {
             // We run through `to_gcc_features` when
             // passing requests down to GCC. This means that all in-language
             // features also work on the command line instead of having two
             // different names when the GCC name and the Rust name differ.
-            to_gcc_features(sess, feature)
-                .iter()
-                .flat_map(|feat| to_gcc_features(sess, feat).into_iter())
-                .map(|feature| {
-                    if enable_disable == '-' {
-                        format!("-{}", feature)
-                    } else {
-                        feature.to_string()
-                    }
-                })
-                .collect::<Vec<_>>()
-        });
-    features.extend(feats);
+            features.extend(
+                to_gcc_features(sess, feature)
+                    .iter()
+                    .flat_map(|feat| to_gcc_features(sess, feat).into_iter())
+                    .map(
+                        |feature| {
+                            if !enable { format!("-{}", feature) } else { feature.to_string() }
+                        },
+                    ),
+            );
+        },
+    );

-    if diagnostics && let Some(f) = check_tied_features(sess, &featsmap) {
-        sess.dcx().emit_err(TargetFeatureDisableOrEnable {
-            features: f,
-            span: None,
-            missing_features: None,
-        });
-    }
+    gcc_features_by_flags(sess, &mut features);

     features
 }
@@ -102,6 +102,7 @@ use rustc_codegen_ssa::back::write::{
     CodegenContext, FatLtoInput, ModuleConfig, TargetMachineFactoryFn,
 };
 use rustc_codegen_ssa::base::codegen_crate;
+use rustc_codegen_ssa::target_features::cfg_target_feature;
 use rustc_codegen_ssa::traits::{CodegenBackend, ExtraBackendMethods, WriteBackendMethods};
 use rustc_codegen_ssa::{CodegenResults, CompiledModule, ModuleCodegen, TargetConfig};
 use rustc_data_structures::fx::FxIndexMap;
@@ -476,42 +477,21 @@ fn to_gcc_opt_level(optlevel: Option<OptLevel>) -> OptimizationLevel {

 /// Returns the features that should be set in `cfg(target_feature)`.
 fn target_config(sess: &Session, target_info: &LockedTargetInfo) -> TargetConfig {
     // TODO(antoyo): use global_gcc_features.
-    let f = |allow_unstable| {
-        sess.target
-            .rust_target_features()
-            .iter()
-            .filter_map(|&(feature, gate, _)| {
-                if allow_unstable
-                    || (gate.in_cfg()
-                        && (sess.is_nightly_build() || gate.requires_nightly().is_none()))
-                {
-                    Some(feature)
-                } else {
-                    None
-                }
-            })
-            .filter(|feature| {
-                // TODO: we disable Neon for now since we don't support the LLVM intrinsics for it.
-                if *feature == "neon" {
-                    return false;
-                }
-                target_info.cpu_supports(feature)
-                // cSpell:disable
-                /*
-                adx, aes, avx, avx2, avx512bf16, avx512bitalg, avx512bw, avx512cd, avx512dq, avx512er, avx512f, avx512fp16, avx512ifma,
-                avx512pf, avx512vbmi, avx512vbmi2, avx512vl, avx512vnni, avx512vp2intersect, avx512vpopcntdq,
-                bmi1, bmi2, cmpxchg16b, ermsb, f16c, fma, fxsr, gfni, lzcnt, movbe, pclmulqdq, popcnt, rdrand, rdseed, rtm,
-                sha, sse, sse2, sse3, sse4.1, sse4.2, sse4a, ssse3, tbm, vaes, vpclmulqdq, xsave, xsavec, xsaveopt, xsaves
-                */
-                // cSpell:enable
-            })
-            .map(Symbol::intern)
-            .collect()
-    };
-
-    let target_features = f(false);
-    let unstable_target_features = f(true);
+    let (unstable_target_features, target_features) = cfg_target_feature(sess, |feature| {
+        // TODO: we disable Neon for now since we don't support the LLVM intrinsics for it.
+        if feature == "neon" {
+            return false;
+        }
+        target_info.cpu_supports(feature)
+        // cSpell:disable
+        /*
+        adx, aes, avx, avx2, avx512bf16, avx512bitalg, avx512bw, avx512cd, avx512dq, avx512er, avx512f, avx512fp16, avx512ifma,
+        avx512pf, avx512vbmi, avx512vbmi2, avx512vl, avx512vnni, avx512vp2intersect, avx512vpopcntdq,
+        bmi1, bmi2, cmpxchg16b, ermsb, f16c, fma, fxsr, gfni, lzcnt, movbe, pclmulqdq, popcnt, rdrand, rdseed, rtm,
+        sha, sse, sse2, sse3, sse4.1, sse4.2, sse4a, ssse3, tbm, vaes, vpclmulqdq, xsave, xsavec, xsaveopt, xsaves
+        */
+        // cSpell:enable
+    });

     let has_reliable_f16 = target_info.supports_target_dependent_type(CType::Float16);
     let has_reliable_f128 = target_info.supports_target_dependent_type(CType::Float128);
@@ -59,16 +59,6 @@ codegen_llvm_symbol_already_defined =
 codegen_llvm_target_machine = could not create LLVM TargetMachine for triple: {$triple}
 codegen_llvm_target_machine_with_llvm_err = could not create LLVM TargetMachine for triple: {$triple}: {$llvm_err}

-codegen_llvm_unknown_ctarget_feature =
-    unknown and unstable feature specified for `-Ctarget-feature`: `{$feature}`
-    .note = it is still passed through to the codegen backend, but use of this feature might be unsound and the behavior of this feature can change in the future
-    .possible_feature = you might have meant: `{$rust_feature}`
-    .consider_filing_feature_request = consider filing a feature request
-
-codegen_llvm_unknown_ctarget_feature_prefix =
-    unknown feature specified for `-Ctarget-feature`: `{$feature}`
-    .note = features must begin with a `+` to enable or `-` to disable it
-
 codegen_llvm_unknown_debuginfo_compression = unknown debuginfo compression algorithm {$algorithm} - will fall back to uncompressed debuginfo

 codegen_llvm_write_bytecode = failed to write bytecode to {$path}: {$err}
@@ -491,11 +491,7 @@ pub(crate) fn llfn_attrs_from_instance<'ll, 'tcx>(
         let allocated_pointer = AttributeKind::AllocatedPointer.create_attr(cx.llcx);
         attributes::apply_to_llfn(llfn, AttributePlace::Argument(0), &[allocated_pointer]);
     }
-    // function alignment can be set globally with the `-Zmin-function-alignment=<n>` flag;
-    // the alignment from a `#[repr(align(<n>))]` is used if it specifies a higher alignment.
-    if let Some(align) =
-        Ord::max(cx.tcx.sess.opts.unstable_opts.min_function_alignment, codegen_fn_attrs.alignment)
-    {
+    if let Some(align) = codegen_fn_attrs.alignment {
         llvm::set_alignment(llfn, align);
     }
     if let Some(backchain) = backchain_attr(cx) {
@@ -587,7 +587,7 @@ fn thin_lto(
 }

 fn enable_autodiff_settings(ad: &[config::AutoDiff]) {
-    for &val in ad {
+    for val in ad {
         // We intentionally don't use a wildcard, to not forget handling anything new.
         match val {
             config::AutoDiff::PrintPerf => {
@@ -599,6 +599,10 @@ fn enable_autodiff_settings(ad: &[config::AutoDiff]) {
             config::AutoDiff::PrintTA => {
                 llvm::set_print_type(true);
             }
+            config::AutoDiff::PrintTAFn(fun) => {
+                llvm::set_print_type(true); // Enable general type printing
+                llvm::set_print_type_fun(&fun); // Set specific function to analyze
+            }
             config::AutoDiff::Inline => {
                 llvm::set_inline(true);
             }
@@ -1166,11 +1166,10 @@ impl<'a, 'll, 'tcx> BuilderMethods<'a, 'tcx> for Builder<'a, 'll, 'tcx> {
         (self.extract_value(landing_pad, 0), self.extract_value(landing_pad, 1))
     }

-    fn filter_landing_pad(&mut self, pers_fn: &'ll Value) -> (&'ll Value, &'ll Value) {
+    fn filter_landing_pad(&mut self, pers_fn: &'ll Value) {
         let ty = self.type_struct(&[self.type_ptr(), self.type_i32()], false);
         let landing_pad = self.landing_pad(ty, pers_fn, 1);
         self.add_clause(landing_pad, self.const_array(self.type_ptr(), &[]));
-        (self.extract_value(landing_pad, 0), self.extract_value(landing_pad, 1))
     }

     fn resume(&mut self, exn0: &'ll Value, exn1: &'ll Value) {
@@ -3,35 +3,11 @@ use std::path::Path;

 use rustc_data_structures::small_c_str::SmallCStr;
 use rustc_errors::{Diag, DiagCtxtHandle, Diagnostic, EmissionGuarantee, Level};
-use rustc_macros::{Diagnostic, Subdiagnostic};
+use rustc_macros::Diagnostic;
 use rustc_span::Span;

 use crate::fluent_generated as fluent;

-#[derive(Diagnostic)]
-#[diag(codegen_llvm_unknown_ctarget_feature_prefix)]
-#[note]
-pub(crate) struct UnknownCTargetFeaturePrefix<'a> {
-    pub feature: &'a str,
-}
-
-#[derive(Diagnostic)]
-#[diag(codegen_llvm_unknown_ctarget_feature)]
-#[note]
-pub(crate) struct UnknownCTargetFeature<'a> {
-    pub feature: &'a str,
-    #[subdiagnostic]
-    pub rust_feature: PossibleFeature<'a>,
-}
-
-#[derive(Subdiagnostic)]
-pub(crate) enum PossibleFeature<'a> {
-    #[help(codegen_llvm_possible_feature)]
-    Some { rust_feature: &'a str },
-    #[help(codegen_llvm_consider_filing_feature_request)]
-    None,
-}
-
 #[derive(Diagnostic)]
 #[diag(codegen_llvm_symbol_already_defined)]
 pub(crate) struct SymbolAlreadyDefined<'a> {
@@ -57,14 +57,19 @@ pub(crate) use self::Enzyme_AD::*;

 #[cfg(llvm_enzyme)]
 pub(crate) mod Enzyme_AD {
+    use std::ffi::{CString, c_char};
+
     use libc::c_void;

     unsafe extern "C" {
         pub(crate) fn EnzymeSetCLBool(arg1: *mut ::std::os::raw::c_void, arg2: u8);
+        pub(crate) fn EnzymeSetCLString(arg1: *mut ::std::os::raw::c_void, arg2: *const c_char);
     }
     unsafe extern "C" {
         static mut EnzymePrintPerf: c_void;
         static mut EnzymePrintActivity: c_void;
         static mut EnzymePrintType: c_void;
+        static mut EnzymeFunctionToAnalyze: c_void;
         static mut EnzymePrint: c_void;
         static mut EnzymeStrictAliasing: c_void;
         static mut looseTypeAnalysis: c_void;
@@ -86,6 +91,15 @@ pub(crate) mod Enzyme_AD {
             EnzymeSetCLBool(std::ptr::addr_of_mut!(EnzymePrintType), print as u8);
         }
     }
+    pub(crate) fn set_print_type_fun(fun_name: &str) {
+        let c_fun_name = CString::new(fun_name).unwrap();
+        unsafe {
+            EnzymeSetCLString(
+                std::ptr::addr_of_mut!(EnzymeFunctionToAnalyze),
+                c_fun_name.as_ptr() as *const c_char,
+            );
+        }
+    }
     pub(crate) fn set_print(print: bool) {
         unsafe {
             EnzymeSetCLBool(std::ptr::addr_of_mut!(EnzymePrint), print as u8);
@@ -132,6 +146,9 @@ pub(crate) mod Fallback_AD {
     pub(crate) fn set_print_type(print: bool) {
         unimplemented!()
    }
+    pub(crate) fn set_print_type_fun(fun_name: &str) {
+        unimplemented!()
+    }
     pub(crate) fn set_print(print: bool) {
         unimplemented!()
     }
@@ -6,27 +6,20 @@ use std::sync::Once;
 use std::{ptr, slice, str};

 use libc::c_int;
-use rustc_codegen_ssa::TargetConfig;
 use rustc_codegen_ssa::base::wants_wasm_eh;
-use rustc_codegen_ssa::codegen_attrs::check_tied_features;
-use rustc_data_structures::fx::{FxHashMap, FxHashSet};
+use rustc_codegen_ssa::target_features::cfg_target_feature;
+use rustc_codegen_ssa::{TargetConfig, target_features};
+use rustc_data_structures::fx::FxHashSet;
 use rustc_data_structures::small_c_str::SmallCStr;
-use rustc_data_structures::unord::UnordSet;
 use rustc_fs_util::path_to_c_string;
 use rustc_middle::bug;
 use rustc_session::Session;
 use rustc_session::config::{PrintKind, PrintRequest};
-use rustc_session::features::{StabilityExt, retpoline_features_by_flags};
 use rustc_span::Symbol;
 use rustc_target::spec::{MergeFunctions, PanicStrategy, SmallDataThresholdSupport};
 use rustc_target::target_features::{RUSTC_SPECIAL_FEATURES, RUSTC_SPECIFIC_FEATURES};
 use smallvec::{SmallVec, smallvec};

 use crate::back::write::create_informational_target_machine;
-use crate::errors::{
-    FixedX18InvalidArch, PossibleFeature, UnknownCTargetFeature, UnknownCTargetFeaturePrefix,
-};
-use crate::llvm;
+use crate::{errors, llvm};

 static INIT: Once = Once::new();
@@ -195,15 +188,6 @@ impl<'a> LLVMFeature<'a> {
     ) -> Self {
         Self { llvm_feature_name, dependencies }
     }
-
-    fn contains(&'a self, feat: &str) -> bool {
-        self.iter().any(|dep| dep == feat)
-    }
-
-    fn iter(&'a self) -> impl Iterator<Item = &'a str> {
-        let dependencies = self.dependencies.iter().map(|feat| feat.as_str());
-        std::iter::once(self.llvm_feature_name).chain(dependencies)
-    }
 }

 impl<'a> IntoIterator for LLVMFeature<'a> {
@ -216,18 +200,22 @@ impl<'a> IntoIterator for LLVMFeature<'a> {
|
|||
}
|
||||
}
|
||||
|
||||
// WARNING: the features after applying `to_llvm_features` must be known
// to LLVM or the feature detection code will walk past the end of the feature
// array, leading to crashes.
//
// To find a list of LLVM's names, see llvm-project/llvm/lib/Target/{ARCH}/*.td
// where `{ARCH}` is the architecture name. Look for instances of `SubtargetFeature`.
//
// Check the current rustc fork of LLVM in the repo at https://github.com/rust-lang/llvm-project/.
// The commit in use can be found via the `llvm-project` submodule in
// https://github.com/rust-lang/rust/tree/master/src. Though note that Rust can also be built with
// an external precompiled version of LLVM which might lead to failures if the oldest tested /
// supported LLVM version doesn't yet support the relevant intrinsics.
/// Convert a Rust feature name to an LLVM feature name. Returning `None` means the
/// feature should be skipped, usually because it is not supported by the current
/// LLVM version.
///
/// WARNING: the features after applying `to_llvm_features` must be known
/// to LLVM or the feature detection code will walk past the end of the feature
/// array, leading to crashes.
///
/// To find a list of LLVM's names, see llvm-project/llvm/lib/Target/{ARCH}/*.td
/// where `{ARCH}` is the architecture name. Look for instances of `SubtargetFeature`.
///
/// Check the current rustc fork of LLVM in the repo at
/// <https://github.com/rust-lang/llvm-project/>. The commit in use can be found via the
/// `llvm-project` submodule in <https://github.com/rust-lang/rust/tree/master/src>. Though note that
/// Rust can also be built with an external precompiled version of LLVM which might lead to failures
/// if the oldest tested / supported LLVM version doesn't yet support the relevant intrinsics.
pub(crate) fn to_llvm_features<'a>(sess: &Session, s: &'a str) -> Option<LLVMFeature<'a>> {
    let arch = if sess.target.arch == "x86_64" {
        "x86"
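The doc comment above describes a mapping from Rust feature names to LLVM feature names that returns `None` for features the current LLVM does not know, so callers can skip them instead of crashing LLVM's feature detection. A minimal standalone sketch of that shape (the table below is hypothetical, not rustc's real mapping):

```rust
// Hypothetical miniature of a Rust-to-LLVM feature-name mapping: some Rust
// feature names differ from LLVM's, and unsupported names map to None so the
// caller can skip them instead of handing LLVM an unknown feature.
fn to_llvm_feature(arch: &str, rust_feature: &str) -> Option<&'static str> {
    match (arch, rust_feature) {
        // Same name on both sides.
        ("x86", "avx2") => Some("avx2"),
        // Rust and LLVM use different names.
        ("x86", "pclmulqdq") => Some("pclmul"),
        // Anything else: pretend this LLVM version doesn't support it.
        _ => None,
    }
}

fn main() {
    assert_eq!(to_llvm_feature("x86", "pclmulqdq"), Some("pclmul"));
    assert_eq!(to_llvm_feature("x86", "not-yet-in-llvm"), None);
    println!("ok");
}
```

The `Option` return is the key design point: a missing entry is a skip, not an error, matching the "Returning `None` means the feature should be skipped" contract above.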
@@ -343,98 +331,25 @@ pub(crate) fn target_config(sess: &Session) -> TargetConfig {
    // the target CPU, that is still expanded to target features (with all their implied features)
    // by LLVM.
    let target_machine = create_informational_target_machine(sess, true);
    // Compute which of the known target features are enabled in the 'base' target machine. We only
    // consider "supported" features; "forbidden" features are not reflected in `cfg` as of now.
    let mut features: FxHashSet<Symbol> = sess
        .target
        .rust_target_features()
        .iter()
        .filter(|(feature, _, _)| {
            // skip checking special features, as LLVM may not understand them
            if RUSTC_SPECIAL_FEATURES.contains(feature) {
                return true;
            }
            if let Some(feat) = to_llvm_features(sess, feature) {
                for llvm_feature in feat {
                    let cstr = SmallCStr::new(llvm_feature);
                    // `LLVMRustHasFeature` is moderately expensive. On targets with many
                    // features (e.g. x86) these calls take a non-trivial fraction of runtime
                    // when compiling very small programs.
                    if !unsafe { llvm::LLVMRustHasFeature(target_machine.raw(), cstr.as_ptr()) } {
                        return false;
                    }

    let (unstable_target_features, target_features) = cfg_target_feature(sess, |feature| {
        if let Some(feat) = to_llvm_features(sess, feature) {
            // All the LLVM features this expands to must be enabled.
            for llvm_feature in feat {
                let cstr = SmallCStr::new(llvm_feature);
                // `LLVMRustHasFeature` is moderately expensive. On targets with many
                // features (e.g. x86) these calls take a non-trivial fraction of runtime
                // when compiling very small programs.
                if !unsafe { llvm::LLVMRustHasFeature(target_machine.raw(), cstr.as_ptr()) } {
                    return false;
                }
            }
            true
        } else {
            false
        }
    })
        .map(|(feature, _, _)| Symbol::intern(feature))
        .collect();

    // Add enabled and remove disabled features.
    for (enabled, feature) in
        sess.opts.cg.target_feature.split(',').filter_map(|s| match s.chars().next() {
            Some('+') => Some((true, Symbol::intern(&s[1..]))),
            Some('-') => Some((false, Symbol::intern(&s[1..]))),
            _ => None,
        })
    {
        if enabled {
            // Also add all transitively implied features.

            // We don't care about the order in `features` since the only thing we use it for is the
            // `features.contains` below.
            #[allow(rustc::potential_query_instability)]
            features.extend(
                sess.target
                    .implied_target_features(feature.as_str())
                    .iter()
                    .map(|s| Symbol::intern(s)),
            );
            true
        } else {
            // Remove transitively reverse-implied features.

            // We don't care about the order in `features` since the only thing we use it for is the
            // `features.contains` below.
            #[allow(rustc::potential_query_instability)]
            features.retain(|f| {
                if sess.target.implied_target_features(f.as_str()).contains(&feature.as_str()) {
                    // If `f` implies `feature`, then `!feature` implies `!f`, so we have to
                    // remove `f`. (This is the standard logical contraposition principle.)
                    false
                } else {
                    // We can keep `f`.
                    true
                }
            });
            false
        }
    }
});

    // Filter enabled features based on feature gates.
    let f = |allow_unstable| {
        sess.target
            .rust_target_features()
            .iter()
            .filter_map(|(feature, gate, _)| {
                // The `allow_unstable` set is used by rustc internally to determine which target
                // features are truly available, so we want to return even perma-unstable
                // "forbidden" features.
                if allow_unstable
                    || (gate.in_cfg()
                        && (sess.is_nightly_build() || gate.requires_nightly().is_none()))
                {
                    Some(Symbol::intern(feature))
                } else {
                    None
                }
            })
            .filter(|feature| features.contains(&feature))
            .collect()
    };

    let target_features = f(false);
    let unstable_target_features = f(true);
    let mut cfg = TargetConfig {
        target_features,
        unstable_target_features,
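The `target_config` hunk above parses `-Ctarget-feature`-style `+f`/`-f` entries, adds transitively implied features on `+`, and on `-` removes every feature that implies the disabled one (the contraposition argument in the comment). A self-contained sketch of that logic, with a hypothetical implication table standing in for `implied_target_features`:

```rust
use std::collections::HashSet;

// Hypothetical implication table: a feature implies itself plus everything it
// transitively requires (e.g. avx2 implies avx implies sse2).
fn implied(feature: &str) -> &'static [&'static str] {
    match feature {
        "avx2" => &["avx2", "avx", "sse2"],
        "avx" => &["avx", "sse2"],
        _ => &[],
    }
}

// Apply a `-Ctarget-feature`-style string to a feature set: `+f` adds `f` and
// its implied features; `-f` removes `f` and, by contraposition, every feature
// that implies `f` (if g implies f, then !f implies !g).
fn apply(flags: &str, features: &mut HashSet<&'static str>) {
    for s in flags.split(',') {
        match s.chars().next() {
            Some('+') => features.extend(implied(&s[1..]).iter().copied()),
            Some('-') => {
                let removed = &s[1..];
                features.retain(|f| *f != removed && !implied(f).contains(&removed));
            }
            _ => {} // ignore malformed entries in this sketch
        }
    }
}

fn main() {
    let mut features = HashSet::new();
    apply("+avx2", &mut features);
    assert!(features.contains("sse2")); // implied transitively
    // Disabling sse2 must also drop avx and avx2, which imply it.
    apply("-sse2", &mut features);
    assert!(features.is_empty());
    println!("ok");
}
```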
@@ -707,10 +622,18 @@ pub(crate) fn target_cpu(sess: &Session) -> &str {
    handle_native(cpu_name)
}

fn llvm_features_by_flags(sess: &Session) -> Vec<&str> {
    let mut features: Vec<&str> = Vec::new();
    retpoline_features_by_flags(sess, &mut features);
    features
/// The target features for compiler flags other than `-Ctarget-features`.
fn llvm_features_by_flags(sess: &Session, features: &mut Vec<String>) {
    target_features::retpoline_features_by_flags(sess, features);

    // -Zfixed-x18
    if sess.opts.unstable_opts.fixed_x18 {
        if sess.target.arch != "aarch64" {
            sess.dcx().emit_fatal(errors::FixedX18InvalidArch { arch: &sess.target.arch });
        } else {
            features.push("+reserve-x18".into());
        }
    }
}

/// The list of LLVM features computed from CLI flags (`-Ctarget-cpu`, `-Ctarget-feature`,
@@ -777,6 +700,8 @@ pub(crate) fn global_llvm_features(
            .split(',')
            .filter(|v| !v.is_empty())
            // Drop +v8plus feature introduced in LLVM 20.
            // (Hard-coded target features do not go through `to_llvm_feature` since they already
            // are LLVM feature names, hence we need a special case here.)
            .filter(|v| *v != "+v8plus" || get_version() >= (20, 0, 0))
            .map(String::from),
    );
@@ -787,86 +712,23 @@ pub(crate) fn global_llvm_features(

    // -Ctarget-features
    if !only_base_features {
        let known_features = sess.target.rust_target_features();
        // Will only be filled when `diagnostics` is set!
        let mut featsmap = FxHashMap::default();

        // Compute implied features
        let mut all_rust_features = vec![];
        for feature in sess.opts.cg.target_feature.split(',').chain(llvm_features_by_flags(sess)) {
            if let Some(feature) = feature.strip_prefix('+') {
                all_rust_features.extend(
                    UnordSet::from(sess.target.implied_target_features(feature))
                        .to_sorted_stable_ord()
                        .iter()
                        .map(|&&s| (true, s)),
                )
            } else if let Some(feature) = feature.strip_prefix('-') {
                // FIXME: Why do we not remove implied features on "-" here?
                // We do the equivalent above in `target_config`.
                // See <https://github.com/rust-lang/rust/issues/134792>.
                all_rust_features.push((false, feature));
            } else if !feature.is_empty() {
                if diagnostics {
                    sess.dcx().emit_warn(UnknownCTargetFeaturePrefix { feature });
                }
            }
        }
        // Remove features that are meant for rustc, not LLVM.
        all_rust_features.retain(|(_, feature)| {
            // Retain if it is not a rustc feature
            !RUSTC_SPECIFIC_FEATURES.contains(feature)
        });

        // Check feature validity.
        if diagnostics {
            for &(enable, feature) in &all_rust_features {
                let feature_state = known_features.iter().find(|&&(v, _, _)| v == feature);
                match feature_state {
                    None => {
                        let rust_feature =
                            known_features.iter().find_map(|&(rust_feature, _, _)| {
                                let llvm_features = to_llvm_features(sess, rust_feature)?;
                                if llvm_features.contains(feature)
                                    && !llvm_features.contains(rust_feature)
                                {
                                    Some(rust_feature)
                                } else {
                                    None
                                }
                            });
                        let unknown_feature = if let Some(rust_feature) = rust_feature {
                            UnknownCTargetFeature {
                                feature,
                                rust_feature: PossibleFeature::Some { rust_feature },
                            }
                        } else {
                            UnknownCTargetFeature { feature, rust_feature: PossibleFeature::None }
                        };
                        sess.dcx().emit_warn(unknown_feature);
                    }
                    Some((_, stability, _)) => {
                        stability.verify_feature_enabled_by_flag(sess, enable, feature);
                    }
                }

                // FIXME(nagisa): figure out how to not allocate a full hashset here.
                featsmap.insert(feature, enable);
            }
        }

        // Translate this into LLVM features.
        let feats = all_rust_features
            .iter()
            .filter_map(|&(enable, feature)| {
        target_features::flag_to_backend_features(
            sess,
            diagnostics,
            |feature| {
                to_llvm_features(sess, feature)
                    .map(|f| SmallVec::<[&str; 2]>::from_iter(f.into_iter()))
                    .unwrap_or_default()
            },
            |feature, enable| {
                let enable_disable = if enable { '+' } else { '-' };
                // We run through `to_llvm_features` when
                // passing requests down to LLVM. This means that all in-language
                // features also work on the command line instead of having two
                // different names when the LLVM name and the Rust name differ.
                let llvm_feature = to_llvm_features(sess, feature)?;
                let Some(llvm_feature) = to_llvm_features(sess, feature) else { return };

                Some(
                features.extend(
                    std::iter::once(format!(
                        "{}{}",
                        enable_disable, llvm_feature.llvm_feature_name

@@ -881,27 +743,12 @@ pub(crate) fn global_llvm_features(
                    },
                )),
            )
        })
        .flatten();
        features.extend(feats);

        if diagnostics && let Some(f) = check_tied_features(sess, &featsmap) {
            sess.dcx().emit_err(rustc_codegen_ssa::errors::TargetFeatureDisableOrEnable {
                features: f,
                span: None,
                missing_features: None,
            });
        }
            },
        );
    }

    // -Zfixed-x18
    if sess.opts.unstable_opts.fixed_x18 {
        if sess.target.arch != "aarch64" {
            sess.dcx().emit_fatal(FixedX18InvalidArch { arch: &sess.target.arch });
        } else {
            features.push("+reserve-x18".into());
        }
    }
    // We add this in the "base target" so that these show up in `sess.unstable_target_features`.
    llvm_features_by_flags(sess, &mut features);

    features
}
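The hunk above ultimately renders each `(enable, feature)` request as a `+name`/`-name` string for LLVM's feature list, via the `format!("{}{}", enable_disable, llvm_feature.llvm_feature_name)` call. A tiny sketch of that final translation step (function name hypothetical):

```rust
// Turn (enable, llvm_feature_name) pairs into the "+feat"/"-feat" strings
// that an LLVM target-machine feature list expects.
fn to_backend_flags(requests: &[(bool, &str)]) -> Vec<String> {
    requests
        .iter()
        .map(|&(enable, name)| format!("{}{}", if enable { '+' } else { '-' }, name))
        .collect()
}

fn main() {
    let flags = to_backend_flags(&[(true, "avx2"), (false, "sse4.2")]);
    assert_eq!(flags, vec!["+avx2".to_string(), "-sse4.2".to_string()]);
    println!("ok");
}
```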
@@ -48,8 +48,6 @@ codegen_ssa_error_writing_def_file =

codegen_ssa_expected_name_value_pair = expected name value pair

codegen_ssa_expected_one_argument = expected one argument

codegen_ssa_expected_used_symbol = expected `used`, `used(compiler)` or `used(linker)`

codegen_ssa_extern_funcs_not_found = some `extern` functions couldn't be found; some native libraries may need to be installed or have their path specified

@@ -68,6 +66,11 @@ codegen_ssa_failed_to_write = failed to write {$path}: {$error}

codegen_ssa_field_associated_value_expected = associated value expected for `{$name}`

codegen_ssa_forbidden_ctarget_feature =
    target feature `{$feature}` cannot be {$enabled} with `-Ctarget-feature`: {$reason}
    .note = this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
codegen_ssa_forbidden_ctarget_feature_issue = for more information, see issue #116344 <https://github.com/rust-lang/rust/issues/116344>

codegen_ssa_forbidden_target_feature_attr =
    target feature `{$feature}` cannot be enabled with `#[target_feature]`: {$reason}

@@ -86,9 +89,6 @@ codegen_ssa_incorrect_cgu_reuse_type =

codegen_ssa_insufficient_vs_code_product = VS Code is a different product, and is not sufficient.

codegen_ssa_invalid_argument = invalid argument
    .help = valid inline arguments are `always` and `never`

codegen_ssa_invalid_instruction_set = invalid instruction set specified

codegen_ssa_invalid_link_ordinal_nargs = incorrect number of arguments to `#[link_ordinal]`

@@ -221,6 +221,8 @@ codegen_ssa_multiple_main_functions = entry symbol `main` declared multiple time

codegen_ssa_no_field = no field `{$name}`

codegen_ssa_no_mangle_nameless = `#[no_mangle]` cannot be used on {$definition} as it has no name

codegen_ssa_no_module_named =
    no module named `{$user_path}` (mangled: {$cgu_name}). available modules: {$cgu_names}

@@ -368,8 +370,22 @@ codegen_ssa_unexpected_parameter_name = unexpected parameter name
codegen_ssa_unknown_archive_kind =
    Don't know how to build archive of type: {$kind}

codegen_ssa_unknown_ctarget_feature =
    unknown and unstable feature specified for `-Ctarget-feature`: `{$feature}`
    .note = it is still passed through to the codegen backend, but use of this feature might be unsound and the behavior of this feature can change in the future
    .possible_feature = you might have meant: `{$rust_feature}`
    .consider_filing_feature_request = consider filing a feature request

codegen_ssa_unknown_ctarget_feature_prefix =
    unknown feature specified for `-Ctarget-feature`: `{$feature}`
    .note = features must begin with a `+` to enable or `-` to disable it

codegen_ssa_unknown_reuse_kind = unknown cgu-reuse-kind `{$kind}` specified

codegen_ssa_unstable_ctarget_feature =
    unstable feature specified for `-Ctarget-feature`: `{$feature}`
    .note = this feature is not stably supported; its behavior can change in the future

codegen_ssa_unsupported_instruction_set = target does not support `#[instruction_set]`

codegen_ssa_unsupported_link_self_contained = option `-C link-self-contained` is not supported on this target
@@ -865,7 +865,7 @@ fn link_natively(
        command: cmd,
        escaped_output,
        verbose: sess.opts.verbose,
        sysroot_dir: sess.sysroot.clone(),
        sysroot_dir: sess.opts.sysroot.path().to_owned(),
    };
    sess.dcx().emit_err(err);
    // If MSVC's `link.exe` was expected but the return code

@@ -1249,10 +1249,10 @@ fn link_sanitizer_runtime(
    if path.exists() {
        sess.target_tlib_path.dir.clone()
    } else {
        let default_sysroot = filesearch::get_or_default_sysroot();
        let default_tlib =
            filesearch::make_target_lib_path(&default_sysroot, sess.opts.target_triple.tuple());
        default_tlib
        filesearch::make_target_lib_path(
            &sess.opts.sysroot.default,
            sess.opts.target_triple.tuple(),
        )
    }
}

@@ -1758,7 +1758,7 @@ fn detect_self_contained_mingw(sess: &Session, linker: &Path) -> bool {
    for dir in env::split_paths(&env::var_os("PATH").unwrap_or_default()) {
        let full_path = dir.join(&linker_with_extension);
        // If linker comes from sysroot assume self-contained mode
        if full_path.is_file() && !full_path.starts_with(&sess.sysroot) {
        if full_path.is_file() && !full_path.starts_with(sess.opts.sysroot.path()) {
            return false;
        }
    }
@@ -337,7 +337,12 @@ pub(crate) trait Linker {
    fn debuginfo(&mut self, strip: Strip, natvis_debugger_visualizers: &[PathBuf]);
    fn no_crt_objects(&mut self);
    fn no_default_libraries(&mut self);
    fn export_symbols(&mut self, tmpdir: &Path, crate_type: CrateType, symbols: &[String]);
    fn export_symbols(
        &mut self,
        tmpdir: &Path,
        crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    );
    fn subsystem(&mut self, subsystem: &str);
    fn linker_plugin_lto(&mut self);
    fn add_eh_frame_header(&mut self) {}

@@ -770,7 +775,12 @@ impl<'a> Linker for GccLinker<'a> {
        }
    }

    fn export_symbols(&mut self, tmpdir: &Path, crate_type: CrateType, symbols: &[String]) {
    fn export_symbols(
        &mut self,
        tmpdir: &Path,
        crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        // Symbol visibility in object files typically takes care of this.
        if crate_type == CrateType::Executable {
            let should_export_executable_symbols =

@@ -799,7 +809,7 @@ impl<'a> Linker for GccLinker<'a> {
            // Write a plain, newline-separated list of symbols
            let res: io::Result<()> = try {
                let mut f = File::create_buffered(&path)?;
                for sym in symbols {
                for (sym, _) in symbols {
                    debug!(" _{sym}");
                    writeln!(f, "_{sym}")?;
                }

@@ -814,11 +824,12 @@ impl<'a> Linker for GccLinker<'a> {
                // .def file similar to MSVC one but without LIBRARY section
                // because LD doesn't like when it's empty
                writeln!(f, "EXPORTS")?;
                for symbol in symbols {
                for (symbol, kind) in symbols {
                    let kind_marker = if *kind == SymbolExportKind::Data { " DATA" } else { "" };
                    debug!(" _{symbol}");
                    // Quote the name in case it's reserved by linker in some way
                    // (this accounts for names with dots in particular).
                    writeln!(f, " \"{symbol}\"")?;
                    writeln!(f, " \"{symbol}\"{kind_marker}")?;
                }
            };
            if let Err(error) = res {

@@ -831,7 +842,7 @@ impl<'a> Linker for GccLinker<'a> {
                writeln!(f, "{{")?;
                if !symbols.is_empty() {
                    writeln!(f, " global:")?;
                    for sym in symbols {
                    for (sym, _) in symbols {
                        debug!(" {sym};");
                        writeln!(f, " {sym};")?;
                    }

@@ -1059,7 +1070,7 @@ impl<'a> Linker for MsvcLinker<'a> {
            self.link_arg("/PDBALTPATH:%_PDB%");

            // This will cause the Microsoft linker to embed .natvis info into the PDB file
            let natvis_dir_path = self.sess.sysroot.join("lib\\rustlib\\etc");
            let natvis_dir_path = self.sess.opts.sysroot.path().join("lib\\rustlib\\etc");
            if let Ok(natvis_dir) = fs::read_dir(&natvis_dir_path) {
                for entry in natvis_dir {
                    match entry {

@@ -1098,7 +1109,12 @@ impl<'a> Linker for MsvcLinker<'a> {
    // crates. Upstream rlibs may be linked statically to this dynamic library,
    // in which case they may continue to transitively be used and hence need
    // their symbols exported.
    fn export_symbols(&mut self, tmpdir: &Path, crate_type: CrateType, symbols: &[String]) {
    fn export_symbols(
        &mut self,
        tmpdir: &Path,
        crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        // Symbol visibility takes care of this typically
        if crate_type == CrateType::Executable {
            let should_export_executable_symbols =

@@ -1116,9 +1132,10 @@ impl<'a> Linker for MsvcLinker<'a> {
            // straight to exports.
            writeln!(f, "LIBRARY")?;
            writeln!(f, "EXPORTS")?;
            for symbol in symbols {
            for (symbol, kind) in symbols {
                let kind_marker = if *kind == SymbolExportKind::Data { " DATA" } else { "" };
                debug!(" _{symbol}");
                writeln!(f, " {symbol}")?;
                writeln!(f, " {symbol}{kind_marker}")?;
            }
        };
        if let Err(error) = res {

@@ -1259,14 +1276,19 @@ impl<'a> Linker for EmLinker<'a> {
        self.cc_arg("-nodefaultlibs");
    }

    fn export_symbols(&mut self, _tmpdir: &Path, _crate_type: CrateType, symbols: &[String]) {
    fn export_symbols(
        &mut self,
        _tmpdir: &Path,
        _crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        debug!("EXPORTED SYMBOLS:");

        self.cc_arg("-s");

        let mut arg = OsString::from("EXPORTED_FUNCTIONS=");
        let encoded = serde_json::to_string(
            &symbols.iter().map(|sym| "_".to_owned() + sym).collect::<Vec<_>>(),
            &symbols.iter().map(|(sym, _)| "_".to_owned() + sym).collect::<Vec<_>>(),
        )
        .unwrap();
        debug!("{encoded}");

@@ -1428,8 +1450,13 @@ impl<'a> Linker for WasmLd<'a> {

    fn no_default_libraries(&mut self) {}

    fn export_symbols(&mut self, _tmpdir: &Path, _crate_type: CrateType, symbols: &[String]) {
        for sym in symbols {
    fn export_symbols(
        &mut self,
        _tmpdir: &Path,
        _crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        for (sym, _) in symbols {
            self.link_args(&["--export", sym]);
        }

@@ -1563,7 +1590,7 @@ impl<'a> Linker for L4Bender<'a> {
        self.cc_arg("-nostdlib");
    }

    fn export_symbols(&mut self, _: &Path, _: CrateType, _: &[String]) {
    fn export_symbols(&mut self, _: &Path, _: CrateType, _: &[(String, SymbolExportKind)]) {
        // ToDo, not implemented, copy from GCC
        self.sess.dcx().emit_warn(errors::L4BenderExportingSymbolsUnimplemented);
    }

@@ -1720,12 +1747,17 @@ impl<'a> Linker for AixLinker<'a> {

    fn no_default_libraries(&mut self) {}

    fn export_symbols(&mut self, tmpdir: &Path, _crate_type: CrateType, symbols: &[String]) {
    fn export_symbols(
        &mut self,
        tmpdir: &Path,
        _crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        let path = tmpdir.join("list.exp");
        let res: io::Result<()> = try {
            let mut f = File::create_buffered(&path)?;
            // FIXME: use llvm-nm to generate export list.
            for symbol in symbols {
            for (symbol, _) in symbols {
                debug!(" _{symbol}");
                writeln!(f, " {symbol}")?;
            }

@@ -1769,9 +1801,23 @@ fn for_each_exported_symbols_include_dep<'tcx>(
        }
    }
}

pub(crate) fn exported_symbols(tcx: TyCtxt<'_>, crate_type: CrateType) -> Vec<String> {
pub(crate) fn exported_symbols(
    tcx: TyCtxt<'_>,
    crate_type: CrateType,
) -> Vec<(String, SymbolExportKind)> {
    if let Some(ref exports) = tcx.sess.target.override_export_symbols {
        return exports.iter().map(ToString::to_string).collect();
        return exports
            .iter()
            .map(|name| {
                (
                    name.to_string(),
                    // FIXME use the correct export kind for this symbol. override_export_symbols
                    // can't directly specify the SymbolExportKind as it is defined in rustc_middle
                    // which rustc_target can't depend on.
                    SymbolExportKind::Text,
                )
            })
            .collect();
    }

    if let CrateType::ProcMacro = crate_type {

@@ -1781,7 +1827,10 @@ pub(crate) fn exported_symbols(tcx: TyCtxt<'_>, crate_type: CrateType) -> Vec<St
    }
}

fn exported_symbols_for_non_proc_macro(tcx: TyCtxt<'_>, crate_type: CrateType) -> Vec<String> {
fn exported_symbols_for_non_proc_macro(
    tcx: TyCtxt<'_>,
    crate_type: CrateType,
) -> Vec<(String, SymbolExportKind)> {
    let mut symbols = Vec::new();
    let export_threshold = symbol_export::crates_export_threshold(&[crate_type]);
    for_each_exported_symbols_include_dep(tcx, crate_type, |symbol, info, cnum| {

@@ -1789,8 +1838,9 @@ fn exported_symbols_for_non_proc_macro(tcx: TyCtxt<'_>, crate_type: CrateType) -
        // from any cdylib. The latter doesn't work anyway as we use hidden visibility for
        // compiler-builtins. Most linkers silently ignore it, but ld64 gives a warning.
        if info.level.is_below_threshold(export_threshold) && !tcx.is_compiler_builtins(cnum) {
            symbols.push(symbol_export::exporting_symbol_name_for_instance_in_crate(
                tcx, symbol, cnum,
            symbols.push((
                symbol_export::exporting_symbol_name_for_instance_in_crate(tcx, symbol, cnum),
                info.kind,
            ));
            symbol_export::extend_exported_symbols(&mut symbols, tcx, symbol, cnum);
        }

@@ -1799,7 +1849,7 @@ fn exported_symbols_for_non_proc_macro(tcx: TyCtxt<'_>, crate_type: CrateType) -
    symbols
}

fn exported_symbols_for_proc_macro_crate(tcx: TyCtxt<'_>) -> Vec<String> {
fn exported_symbols_for_proc_macro_crate(tcx: TyCtxt<'_>) -> Vec<(String, SymbolExportKind)> {
    // `exported_symbols` will be empty when !should_codegen.
    if !tcx.sess.opts.output_types.should_codegen() {
        return Vec::new();

@@ -1809,7 +1859,10 @@ fn exported_symbols_for_proc_macro_crate(tcx: TyCtxt<'_>) -> Vec<String> {
    let proc_macro_decls_name = tcx.sess.generate_proc_macro_decls_symbol(stable_crate_id);
    let metadata_symbol_name = exported_symbols::metadata_symbol_name(tcx);

    vec![proc_macro_decls_name, metadata_symbol_name]
    vec![
        (proc_macro_decls_name, SymbolExportKind::Data),
        (metadata_symbol_name, SymbolExportKind::Data),
    ]
}

pub(crate) fn linked_symbols(

@@ -1831,7 +1884,9 @@ pub(crate) fn linked_symbols(
            || info.used
        {
            symbols.push((
                symbol_export::linking_symbol_name_for_instance_in_crate(tcx, symbol, cnum),
                symbol_export::linking_symbol_name_for_instance_in_crate(
                    tcx, symbol, info.kind, cnum,
                ),
                info.kind,
            ));
        }

@@ -1906,7 +1961,13 @@ impl<'a> Linker for PtxLinker<'a> {

    fn ehcont_guard(&mut self) {}

    fn export_symbols(&mut self, _tmpdir: &Path, _crate_type: CrateType, _symbols: &[String]) {}
    fn export_symbols(
        &mut self,
        _tmpdir: &Path,
        _crate_type: CrateType,
        _symbols: &[(String, SymbolExportKind)],
    ) {
    }

    fn subsystem(&mut self, _subsystem: &str) {}

@@ -1975,10 +2036,15 @@ impl<'a> Linker for LlbcLinker<'a> {

    fn ehcont_guard(&mut self) {}

    fn export_symbols(&mut self, _tmpdir: &Path, _crate_type: CrateType, symbols: &[String]) {
    fn export_symbols(
        &mut self,
        _tmpdir: &Path,
        _crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        match _crate_type {
            CrateType::Cdylib => {
                for sym in symbols {
                for (sym, _) in symbols {
                    self.link_args(&["--export-symbol", sym]);
                }
            }

@@ -2052,11 +2118,16 @@ impl<'a> Linker for BpfLinker<'a> {

    fn ehcont_guard(&mut self) {}

    fn export_symbols(&mut self, tmpdir: &Path, _crate_type: CrateType, symbols: &[String]) {
    fn export_symbols(
        &mut self,
        tmpdir: &Path,
        _crate_type: CrateType,
        symbols: &[(String, SymbolExportKind)],
    ) {
        let path = tmpdir.join("symbols");
        let res: io::Result<()> = try {
            let mut f = File::create_buffered(&path)?;
            for sym in symbols {
            for (sym, _) in symbols {
                writeln!(f, "{sym}")?;
            }
        };
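The linker changes above thread a `SymbolExportKind` through `export_symbols` so that data symbols get a ` DATA` marker in the generated `.def` file, as in the `kind_marker` lines in the Gcc and Msvc implementations. A simplified, self-contained sketch of that file-rendering step (the `ExportKind` enum and `render_def` helper are stand-ins, not rustc's actual types):

```rust
use std::fmt::Write;

// Stand-in for rustc's SymbolExportKind.
#[derive(PartialEq)]
enum ExportKind {
    Text, // functions
    Data, // statics
}

// Render an MSVC-style .def EXPORTS section; data symbols are tagged with
// ` DATA` so the linker exports them as data rather than code.
fn render_def(symbols: &[(&str, ExportKind)]) -> String {
    let mut out = String::from("EXPORTS\n");
    for (sym, kind) in symbols {
        let kind_marker = if *kind == ExportKind::Data { " DATA" } else { "" };
        writeln!(out, "    {sym}{kind_marker}").unwrap();
    }
    out
}

fn main() {
    let def = render_def(&[("my_func", ExportKind::Text), ("MY_STATIC", ExportKind::Data)]);
    assert!(def.contains("    my_func\n"));
    assert!(def.contains("    MY_STATIC DATA\n"));
    println!("ok");
}
```

Without the `DATA` keyword, MSVC-style linkers treat an exported static as code, which is exactly the class of bug this diff addresses by carrying the kind alongside each symbol name.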
@@ -680,6 +680,7 @@ fn calling_convention_for_symbol<'tcx>(
pub(crate) fn linking_symbol_name_for_instance_in_crate<'tcx>(
    tcx: TyCtxt<'tcx>,
    symbol: ExportedSymbol<'tcx>,
    export_kind: SymbolExportKind,
    instantiating_crate: CrateNum,
) -> String {
    let mut undecorated = symbol_name_for_instance_in_crate(tcx, symbol, instantiating_crate);

@@ -700,8 +701,9 @@ pub(crate) fn linking_symbol_name_for_instance_in_crate<'tcx>(
    let prefix = match &target.arch[..] {
        "x86" => Some('_'),
        "x86_64" => None,
        "arm64ec" => Some('#'),
        // Only x86/64 use symbol decorations.
        // Only functions are decorated for arm64ec.
        "arm64ec" if export_kind == SymbolExportKind::Text => Some('#'),
        // Only x86/64 and arm64ec use symbol decorations.
        _ => return undecorated,
    };

@@ -741,7 +743,7 @@ pub(crate) fn exporting_symbol_name_for_instance_in_crate<'tcx>(
/// Add it to the symbols list for all kernel functions, so that it is exported in the linked
/// object.
pub(crate) fn extend_exported_symbols<'tcx>(
    symbols: &mut Vec<String>,
    symbols: &mut Vec<(String, SymbolExportKind)>,
    tcx: TyCtxt<'tcx>,
    symbol: ExportedSymbol<'tcx>,
    instantiating_crate: CrateNum,

@@ -755,7 +757,9 @@ pub(crate) fn extend_exported_symbols<'tcx>(
    let undecorated = symbol_name_for_instance_in_crate(tcx, symbol, instantiating_crate);

    // Add the symbol for the kernel descriptor (with .kd suffix)
    symbols.push(format!("{undecorated}.kd"));
    // Per https://llvm.org/docs/AMDGPUUsage.html#symbols these will always be `STT_OBJECT` so
    // export as data.
    symbols.push((format!("{undecorated}.kd"), SymbolExportKind::Data));
}

fn maybe_emutls_symbol_name<'tcx>(
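The `linking_symbol_name_for_instance_in_crate` change above makes arm64ec decoration conditional on the export kind: only function (`Text`) symbols get the `#` prefix, while x86 keeps its unconditional `_`. A standalone sketch of that prefix selection (the `ExportKind` enum and `decorate` function are hypothetical stand-ins for the real `SymbolExportKind` logic):

```rust
// Stand-in for rustc's SymbolExportKind.
#[derive(PartialEq, Clone, Copy)]
enum ExportKind {
    Text,
    Data,
}

// Mirror of the prefix selection above: arm64ec prepends `#` only to function
// (Text) symbols; x86 prepends `_` regardless of kind; everything else is
// left undecorated.
fn decorate(arch: &str, kind: ExportKind, name: &str) -> String {
    let prefix = match arch {
        "x86" => Some('_'),
        "arm64ec" if kind == ExportKind::Text => Some('#'),
        _ => None,
    };
    match prefix {
        Some(p) => format!("{p}{name}"),
        None => name.to_string(),
    }
}

fn main() {
    assert_eq!(decorate("arm64ec", ExportKind::Text, "f"), "#f");
    assert_eq!(decorate("arm64ec", ExportKind::Data, "G"), "G");
    assert_eq!(decorate("x86", ExportKind::Data, "g"), "_g");
    println!("ok");
}
```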
@@ -14,10 +14,10 @@ use rustc_data_structures::jobserver::{self, Acquired};
use rustc_data_structures::memmap::Mmap;
use rustc_data_structures::profiling::{SelfProfilerRef, VerboseTimingGuard};
use rustc_errors::emitter::Emitter;
use rustc_errors::translation::Translate;
use rustc_errors::translation::Translator;
use rustc_errors::{
    Diag, DiagArgMap, DiagCtxt, DiagMessage, ErrCode, FatalError, FluentBundle, Level, MultiSpan,
    Style, Suggestions,
    Diag, DiagArgMap, DiagCtxt, DiagMessage, ErrCode, FatalError, Level, MultiSpan, Style,
    Suggestions,
};
use rustc_fs_util::link_or_copy;
use rustc_hir::def_id::{CrateNum, LOCAL_CRATE};

@@ -1889,16 +1889,6 @@ impl SharedEmitter {
    }
}

impl Translate for SharedEmitter {
    fn fluent_bundle(&self) -> Option<&FluentBundle> {
        None
    }

    fn fallback_fluent_bundle(&self) -> &FluentBundle {
        panic!("shared emitter attempted to translate a diagnostic");
    }
}

impl Emitter for SharedEmitter {
    fn emit_diagnostic(
        &mut self,

@@ -1932,6 +1922,10 @@ impl Emitter for SharedEmitter {
    fn source_map(&self) -> Option<&SourceMap> {
        None
    }

    fn translator(&self) -> &Translator {
        panic!("shared emitter attempted to translate a diagnostic");
    }
}

impl SharedEmitterMain {
@ -12,9 +12,9 @@ use rustc_data_structures::fx::{FxHashMap, FxIndexSet};
|
|||
use rustc_data_structures::profiling::{get_resident_set_size, print_time_passes_entry};
|
||||
use rustc_data_structures::sync::{IntoDynSyncSend, par_map};
|
||||
use rustc_data_structures::unord::UnordMap;
|
||||
use rustc_hir::ItemId;
|
||||
use rustc_hir::def_id::{DefId, LOCAL_CRATE};
|
||||
use rustc_hir::lang_items::LangItem;
|
||||
use rustc_hir::{ItemId, Target};
|
||||
use rustc_middle::middle::codegen_fn_attrs::CodegenFnAttrs;
|
||||
use rustc_middle::middle::debugger_visualizer::{DebuggerVisualizerFile, DebuggerVisualizerType};
|
||||
use rustc_middle::middle::exported_symbols::{self, SymbolExportKind};
|
||||
|
|
@@ -1003,21 +1003,35 @@ impl CrateInfo {
         // by the compiler, but that's ok because all this stuff is unstable anyway.
         let target = &tcx.sess.target;
         if !are_upstream_rust_objects_already_included(tcx.sess) {
-            let missing_weak_lang_items: FxIndexSet<Symbol> = info
+            let add_prefix = match (target.is_like_windows, target.arch.as_ref()) {
+                (true, "x86") => |name: String, _: SymbolExportKind| format!("_{name}"),
+                (true, "arm64ec") => {
+                    // Only functions are decorated for arm64ec.
+                    |name: String, export_kind: SymbolExportKind| match export_kind {
+                        SymbolExportKind::Text => format!("#{name}"),
+                        _ => name,
+                    }
+                }
+                _ => |name: String, _: SymbolExportKind| name,
+            };
+            let missing_weak_lang_items: FxIndexSet<(Symbol, SymbolExportKind)> = info
                 .used_crates
                 .iter()
                 .flat_map(|&cnum| tcx.missing_lang_items(cnum))
                 .filter(|l| l.is_weak())
                 .filter_map(|&l| {
                     let name = l.link_name()?;
-                    lang_items::required(tcx, l).then_some(name)
+                    let export_kind = match l.target() {
+                        Target::Fn => SymbolExportKind::Text,
+                        Target::Static => SymbolExportKind::Data,
+                        _ => bug!(
+                            "Don't know what the export kind is for lang item of kind {:?}",
+                            l.target()
+                        ),
+                    };
+                    lang_items::required(tcx, l).then_some((name, export_kind))
                 })
                 .collect();
-            let prefix = match (target.is_like_windows, target.arch.as_ref()) {
-                (true, "x86") => "_",
-                (true, "arm64ec") => "#",
-                _ => "",
-            };

             // This loop only adds new items to values of the hash map, so the order in which we
             // iterate over the values is not important.
@@ -1030,10 +1044,13 @@ impl CrateInfo {
             .for_each(|(_, linked_symbols)| {
                 let mut symbols = missing_weak_lang_items
                     .iter()
-                    .map(|item| {
+                    .map(|(item, export_kind)| {
                         (
-                            format!("{prefix}{}", mangle_internal_symbol(tcx, item.as_str())),
-                            SymbolExportKind::Text,
+                            add_prefix(
+                                mangle_internal_symbol(tcx, item.as_str()),
+                                *export_kind,
+                            ),
+                            *export_kind,
                         )
                     })
                     .collect::<Vec<_>>();
@@ -1048,12 +1065,12 @@ impl CrateInfo {
             // errors.
             linked_symbols.extend(ALLOCATOR_METHODS.iter().map(|method| {
                 (
-                    format!(
-                        "{prefix}{}",
+                    add_prefix(
                         mangle_internal_symbol(
                             tcx,
-                            global_fn_name(method.name).as_str()
-                        )
+                            global_fn_name(method.name).as_str(),
+                        ),
+                        SymbolExportKind::Text,
                     ),
                     SymbolExportKind::Text,
                 )

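As a side note on the hunks above: the old code computed a fixed string `prefix` per target, while the new code selects a decoration *function* because arm64ec only decorates function (text) symbols. The rule can be sketched in isolation; `ExportKind` and `add_prefix_for` below are simplified stand-ins for `SymbolExportKind` and the target match, not the compiler's actual API:

```rust
#[derive(Clone, Copy, PartialEq)]
enum ExportKind {
    Text, // functions
    Data, // statics
}

// Pick a symbol-decoration rule per target, mirroring the match in the diff:
// x86 Windows prefixes every symbol with "_"; arm64ec prefixes only text
// symbols with "#"; everything else is left untouched.
fn add_prefix_for(is_like_windows: bool, arch: &str) -> fn(String, ExportKind) -> String {
    match (is_like_windows, arch) {
        (true, "x86") => |name, _| format!("_{name}"),
        (true, "arm64ec") => |name, kind| {
            // Only functions are decorated for arm64ec.
            if kind == ExportKind::Text { format!("#{name}") } else { name }
        },
        _ => |name, _| name,
    }
}

fn main() {
    let x86 = add_prefix_for(true, "x86");
    assert_eq!(x86("rust_symbol".to_string(), ExportKind::Data), "_rust_symbol");

    let arm64ec = add_prefix_for(true, "arm64ec");
    assert_eq!(arm64ec("func".to_string(), ExportKind::Text), "#func");
    assert_eq!(arm64ec("STATIC".to_string(), ExportKind::Data), "STATIC");
}
```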
@@ -3,11 +3,9 @@ use std::str::FromStr;
 use rustc_abi::ExternAbi;
 use rustc_ast::expand::autodiff_attrs::{AutoDiffAttrs, DiffActivity, DiffMode};
-use rustc_ast::{LitKind, MetaItem, MetaItemInner, attr};
-use rustc_attr_data_structures::ReprAttr::ReprAlign;
 use rustc_attr_data_structures::{
-    AttributeKind, InlineAttr, InstructionSetAttr, OptimizeAttr, find_attr,
+    AttributeKind, InlineAttr, InstructionSetAttr, OptimizeAttr, ReprAttr, find_attr,
 };
 use rustc_data_structures::fx::FxHashMap;
 use rustc_hir::def::DefKind;
 use rustc_hir::def_id::{DefId, LOCAL_CRATE, LocalDefId};
 use rustc_hir::weak_lang_items::WEAK_LANG_ITEMS;
@@ -19,13 +17,16 @@ use rustc_middle::mir::mono::Linkage;
 use rustc_middle::query::Providers;
 use rustc_middle::span_bug;
 use rustc_middle::ty::{self as ty, TyCtxt};
-use rustc_session::lint;
 use rustc_session::parse::feature_err;
+use rustc_session::{Session, lint};
 use rustc_span::{Ident, Span, sym};
 use rustc_target::spec::SanitizerSet;

 use crate::errors;
-use crate::target_features::{check_target_feature_trait_unsafe, from_target_feature_attr};
+use crate::errors::NoMangleNameless;
+use crate::target_features::{
+    check_target_feature_trait_unsafe, check_tied_features, from_target_feature_attr,
+};

 fn linkage_by_name(tcx: TyCtxt<'_>, def_id: LocalDefId, name: &str) -> Linkage {
     use rustc_middle::mir::mono::Linkage::*;
@@ -87,7 +88,6 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
     let mut link_ordinal_span = None;
     let mut no_sanitize_span = None;
     let mut mixed_export_name_no_mangle_lint_state = MixedExportNameAndNoMangleState::default();
-    let mut no_mangle_span = None;

     for attr in attrs.iter() {
         // In some cases, attribute are only valid on functions, but it's the `check_attr`
@@ -95,17 +95,15 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
         // In these cases, we bail from performing further checks that are only meaningful for
         // functions (such as calling `fn_sig`, which ICEs if given a non-function). We also
         // report a delayed bug, just in case `check_attr` isn't doing its job.
-        let fn_sig = || {
+        let fn_sig = |attr_span| {
             use DefKind::*;

             let def_kind = tcx.def_kind(did);
             if let Fn | AssocFn | Variant | Ctor(..) = def_kind {
                 Some(tcx.fn_sig(did))
             } else {
-                tcx.dcx().span_delayed_bug(
-                    attr.span(),
-                    "this attribute can only be applied to functions",
-                );
+                tcx.dcx()
+                    .span_delayed_bug(attr_span, "this attribute can only be applied to functions");
                 None
             }
         };
@@ -115,10 +113,56 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
                 AttributeKind::Repr(reprs) => {
                     codegen_fn_attrs.alignment = reprs
                         .iter()
-                        .filter_map(|(r, _)| if let ReprAlign(x) = r { Some(*x) } else { None })
+                        .filter_map(
+                            |(r, _)| if let ReprAttr::ReprAlign(x) = r { Some(*x) } else { None },
+                        )
                         .max();
                 }
                 AttributeKind::Cold(_) => codegen_fn_attrs.flags |= CodegenFnAttrFlags::COLD,
                 AttributeKind::Naked(_) => codegen_fn_attrs.flags |= CodegenFnAttrFlags::NAKED,
                 AttributeKind::Align { align, .. } => codegen_fn_attrs.alignment = Some(*align),
+                AttributeKind::NoMangle(attr_span) => {
+                    if tcx.opt_item_name(did.to_def_id()).is_some() {
+                        codegen_fn_attrs.flags |= CodegenFnAttrFlags::NO_MANGLE;
+                        mixed_export_name_no_mangle_lint_state.track_no_mangle(
+                            *attr_span,
+                            tcx.local_def_id_to_hir_id(did),
+                            attr,
+                        );
+                    } else {
+                        tcx.dcx().emit_err(NoMangleNameless {
+                            span: *attr_span,
+                            definition: format!(
+                                "{} {}",
+                                tcx.def_descr_article(did.to_def_id()),
+                                tcx.def_descr(did.to_def_id())
+                            ),
+                        });
+                    }
+                }
+                AttributeKind::TrackCaller(attr_span) => {
+                    let is_closure = tcx.is_closure_like(did.to_def_id());
+
+                    if !is_closure
+                        && let Some(fn_sig) = fn_sig(*attr_span)
+                        && fn_sig.skip_binder().abi() != ExternAbi::Rust
+                    {
+                        tcx.dcx().emit_err(errors::RequiresRustAbi { span: *attr_span });
+                    }
+                    if is_closure
+                        && !tcx.features().closure_track_caller()
+                        && !attr_span.allows_unstable(sym::closure_track_caller)
+                    {
+                        feature_err(
+                            &tcx.sess,
+                            sym::closure_track_caller,
+                            *attr_span,
+                            "`#[track_caller]` on closures is currently unstable",
+                        )
+                        .emit();
+                    }
+                    codegen_fn_attrs.flags |= CodegenFnAttrFlags::TRACK_CALLER
+                }
                 _ => {}
             }
         }
@@ -128,7 +172,6 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
         };

         match name {
-            sym::cold => codegen_fn_attrs.flags |= CodegenFnAttrFlags::COLD,
             sym::rustc_allocator => codegen_fn_attrs.flags |= CodegenFnAttrFlags::ALLOCATOR,
             sym::ffi_pure => codegen_fn_attrs.flags |= CodegenFnAttrFlags::FFI_PURE,
             sym::ffi_const => codegen_fn_attrs.flags |= CodegenFnAttrFlags::FFI_CONST,
@@ -138,29 +181,6 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
             sym::rustc_allocator_zeroed => {
                 codegen_fn_attrs.flags |= CodegenFnAttrFlags::ALLOCATOR_ZEROED
             }
-            sym::naked => codegen_fn_attrs.flags |= CodegenFnAttrFlags::NAKED,
-            sym::no_mangle => {
-                no_mangle_span = Some(attr.span());
-                if tcx.opt_item_name(did.to_def_id()).is_some() {
-                    codegen_fn_attrs.flags |= CodegenFnAttrFlags::NO_MANGLE;
-                    mixed_export_name_no_mangle_lint_state.track_no_mangle(
-                        attr.span(),
-                        tcx.local_def_id_to_hir_id(did),
-                        attr,
-                    );
-                } else {
-                    tcx.dcx()
-                        .struct_span_err(
-                            attr.span(),
-                            format!(
-                                "`#[no_mangle]` cannot be used on {} {} as it has no name",
-                                tcx.def_descr_article(did.to_def_id()),
-                                tcx.def_descr(did.to_def_id()),
-                            ),
-                        )
-                        .emit();
-                }
-            }
             sym::rustc_std_internal_symbol => {
                 codegen_fn_attrs.flags |= CodegenFnAttrFlags::RUSTC_STD_INTERNAL_SYMBOL
             }
@@ -203,29 +223,6 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
                 }
             }
             sym::thread_local => codegen_fn_attrs.flags |= CodegenFnAttrFlags::THREAD_LOCAL,
-            sym::track_caller => {
-                let is_closure = tcx.is_closure_like(did.to_def_id());
-
-                if !is_closure
-                    && let Some(fn_sig) = fn_sig()
-                    && fn_sig.skip_binder().abi() != ExternAbi::Rust
-                {
-                    tcx.dcx().emit_err(errors::RequiresRustAbi { span: attr.span() });
-                }
-                if is_closure
-                    && !tcx.features().closure_track_caller()
-                    && !attr.span().allows_unstable(sym::closure_track_caller)
-                {
-                    feature_err(
-                        &tcx.sess,
-                        sym::closure_track_caller,
-                        attr.span(),
-                        "`#[track_caller]` on closures is currently unstable",
-                    )
-                    .emit();
-                }
-                codegen_fn_attrs.flags |= CodegenFnAttrFlags::TRACK_CALLER
-            }
             sym::export_name => {
                 if let Some(s) = attr.value_str() {
                     if s.as_str().contains('\0') {
@@ -449,6 +446,10 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {

     mixed_export_name_no_mangle_lint_state.lint_if_mixed(tcx);

+    // Apply the minimum function alignment here, so that individual backends don't have to.
+    codegen_fn_attrs.alignment =
+        Ord::max(codegen_fn_attrs.alignment, tcx.sess.opts.unstable_opts.min_function_alignment);
+
     let inline_span;
     (codegen_fn_attrs.inline, inline_span) = if let Some((inline_attr, span)) =
         find_attr!(attrs, AttributeKind::Inline(i, span) => (*i, *span))
@@ -465,33 +466,8 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
         codegen_fn_attrs.inline = InlineAttr::Never;
     }

-    codegen_fn_attrs.optimize = attrs.iter().fold(OptimizeAttr::Default, |ia, attr| {
-        if !attr.has_name(sym::optimize) {
-            return ia;
-        }
-        if attr.is_word() {
-            tcx.dcx().emit_err(errors::ExpectedOneArgumentOptimize { span: attr.span() });
-            return ia;
-        }
-        let Some(ref items) = attr.meta_item_list() else {
-            return OptimizeAttr::Default;
-        };
-
-        let [item] = &items[..] else {
-            tcx.dcx().emit_err(errors::ExpectedOneArgumentOptimize { span: attr.span() });
-            return OptimizeAttr::Default;
-        };
-        if item.has_name(sym::size) {
-            OptimizeAttr::Size
-        } else if item.has_name(sym::speed) {
-            OptimizeAttr::Speed
-        } else if item.has_name(sym::none) {
-            OptimizeAttr::DoNotOptimize
-        } else {
-            tcx.dcx().emit_err(errors::InvalidArgumentOptimize { span: item.span() });
-            OptimizeAttr::Default
-        }
-    });
+    codegen_fn_attrs.optimize =
+        find_attr!(attrs, AttributeKind::Optimize(i, _) => *i).unwrap_or(OptimizeAttr::Default);

     // #73631: closures inherit `#[target_feature]` annotations
     //
@@ -567,12 +543,15 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
     if codegen_fn_attrs.flags.contains(CodegenFnAttrFlags::RUSTC_STD_INTERNAL_SYMBOL)
         && codegen_fn_attrs.flags.contains(CodegenFnAttrFlags::NO_MANGLE)
     {
+        let no_mangle_span =
+            find_attr!(attrs, AttributeKind::NoMangle(no_mangle_span) => *no_mangle_span)
+                .unwrap_or_default();
         let lang_item =
             lang_items::extract(attrs).map_or(None, |(name, _span)| LangItem::from_name(name));
         let mut err = tcx
             .dcx()
             .struct_span_err(
-                no_mangle_span.unwrap_or_default(),
+                no_mangle_span,
                 "`#[no_mangle]` cannot be used on internal language items",
             )
             .with_note("Rustc requires this item to have a specific mangled name.")
@@ -625,25 +604,6 @@ fn codegen_fn_attrs(tcx: TyCtxt<'_>, did: LocalDefId) -> CodegenFnAttrs {
     codegen_fn_attrs
 }

-/// Given a map from target_features to whether they are enabled or disabled, ensure only valid
-/// combinations are allowed.
-pub fn check_tied_features(
-    sess: &Session,
-    features: &FxHashMap<&str, bool>,
-) -> Option<&'static [&'static str]> {
-    if !features.is_empty() {
-        for tied in sess.target.tied_target_features() {
-            // Tied features must be set to the same value, or not set at all
-            let mut tied_iter = tied.iter();
-            let enabled = features.get(tied_iter.next().unwrap());
-            if tied_iter.any(|f| enabled != features.get(f)) {
-                return Some(tied);
-            }
-        }
-    }
-    None
-}
-
 /// Checks if the provided DefId is a method in a trait impl for a trait which has track_caller
 /// applied to the method prototype.
 fn should_inherit_track_caller(tcx: TyCtxt<'_>, def_id: DefId) -> bool {

@@ -208,20 +208,6 @@ pub(crate) struct OutOfRangeInteger {
     pub span: Span,
 }

-#[derive(Diagnostic)]
-#[diag(codegen_ssa_expected_one_argument, code = E0722)]
-pub(crate) struct ExpectedOneArgumentOptimize {
-    #[primary_span]
-    pub span: Span,
-}
-
-#[derive(Diagnostic)]
-#[diag(codegen_ssa_invalid_argument, code = E0722)]
-pub(crate) struct InvalidArgumentOptimize {
-    #[primary_span]
-    pub span: Span,
-}
-
 #[derive(Diagnostic)]
 #[diag(codegen_ssa_copy_path_buf)]
 pub(crate) struct CopyPathBuf {
@@ -1217,30 +1203,6 @@ pub(crate) struct ErrorCreatingImportLibrary<'a> {
     pub error: String,
 }

-pub struct TargetFeatureDisableOrEnable<'a> {
-    pub features: &'a [&'a str],
-    pub span: Option<Span>,
-    pub missing_features: Option<MissingFeatures>,
-}
-
-#[derive(Subdiagnostic)]
-#[help(codegen_ssa_missing_features)]
-pub struct MissingFeatures;
-
-impl<G: EmissionGuarantee> Diagnostic<'_, G> for TargetFeatureDisableOrEnable<'_> {
-    fn into_diag(self, dcx: DiagCtxtHandle<'_>, level: Level) -> Diag<'_, G> {
-        let mut diag = Diag::new(dcx, level, fluent::codegen_ssa_target_feature_disable_or_enable);
-        if let Some(span) = self.span {
-            diag.span(span);
-        };
-        if let Some(missing_features) = self.missing_features {
-            diag.subdiagnostic(missing_features);
-        }
-        diag.arg("features", self.features.join(", "));
-        diag
-    }
-}
-
 #[derive(Diagnostic)]
 #[diag(codegen_ssa_aix_strip_not_used)]
 pub(crate) struct AixStripNotUsed;
@@ -1283,3 +1245,76 @@ pub(crate) struct XcrunSdkPathWarning {
 #[derive(LintDiagnostic)]
 #[diag(codegen_ssa_aarch64_softfloat_neon)]
 pub(crate) struct Aarch64SoftfloatNeon;
+
+#[derive(Diagnostic)]
+#[diag(codegen_ssa_unknown_ctarget_feature_prefix)]
+#[note]
+pub(crate) struct UnknownCTargetFeaturePrefix<'a> {
+    pub feature: &'a str,
+}
+
+#[derive(Subdiagnostic)]
+pub(crate) enum PossibleFeature<'a> {
+    #[help(codegen_ssa_possible_feature)]
+    Some { rust_feature: &'a str },
+    #[help(codegen_ssa_consider_filing_feature_request)]
+    None,
+}
+
+#[derive(Diagnostic)]
+#[diag(codegen_ssa_unknown_ctarget_feature)]
+#[note]
+pub(crate) struct UnknownCTargetFeature<'a> {
+    pub feature: &'a str,
+    #[subdiagnostic]
+    pub rust_feature: PossibleFeature<'a>,
+}
+
+#[derive(Diagnostic)]
+#[diag(codegen_ssa_unstable_ctarget_feature)]
+#[note]
+pub(crate) struct UnstableCTargetFeature<'a> {
+    pub feature: &'a str,
+}
+
+#[derive(Diagnostic)]
+#[diag(codegen_ssa_forbidden_ctarget_feature)]
+#[note]
+#[note(codegen_ssa_forbidden_ctarget_feature_issue)]
+pub(crate) struct ForbiddenCTargetFeature<'a> {
+    pub feature: &'a str,
+    pub enabled: &'a str,
+    pub reason: &'a str,
+}
+
+pub struct TargetFeatureDisableOrEnable<'a> {
+    pub features: &'a [&'a str],
+    pub span: Option<Span>,
+    pub missing_features: Option<MissingFeatures>,
+}
+
+#[derive(Subdiagnostic)]
+#[help(codegen_ssa_missing_features)]
+pub struct MissingFeatures;
+
+impl<G: EmissionGuarantee> Diagnostic<'_, G> for TargetFeatureDisableOrEnable<'_> {
+    fn into_diag(self, dcx: DiagCtxtHandle<'_>, level: Level) -> Diag<'_, G> {
+        let mut diag = Diag::new(dcx, level, fluent::codegen_ssa_target_feature_disable_or_enable);
+        if let Some(span) = self.span {
+            diag.span(span);
+        };
+        if let Some(missing_features) = self.missing_features {
+            diag.subdiagnostic(missing_features);
+        }
+        diag.arg("features", self.features.join(", "));
+        diag
+    }
+}
+
+#[derive(Diagnostic)]
+#[diag(codegen_ssa_no_mangle_nameless)]
+pub(crate) struct NoMangleNameless {
+    #[primary_span]
+    pub span: Span,
+    pub definition: String,
+}

@@ -218,7 +218,7 @@ pub struct CrateInfo {
     pub target_cpu: String,
     pub target_features: Vec<String>,
     pub crate_types: Vec<CrateType>,
-    pub exported_symbols: UnordMap<CrateType, Vec<String>>,
+    pub exported_symbols: UnordMap<CrateType, Vec<(String, SymbolExportKind)>>,
     pub linked_symbols: FxIndexMap<CrateType, Vec<(String, SymbolExportKind)>>,
     pub local_crate_name: Symbol,
     pub compiler_builtins: Option<CrateNum>,
Some files were not shown because too many files have changed in this diff.
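For reviewers unfamiliar with the `check_tied_features` helper that this diff moves out of `codegen_attrs.rs` and into `target_features`: its rule is that every feature in a tied group must be enabled or disabled together, or left entirely unset. A standalone sketch of that logic follows; the `TIED` constant here is an illustrative stand-in for `sess.target.tied_target_features()` (aarch64 ties `paca`/`pacg`, for example), not the real target-spec lookup:

```rust
use std::collections::HashMap;

// Illustrative tied groups; in rustc the list comes from the target spec.
const TIED: &[&[&str]] = &[&["paca", "pacg"]];

// Mirrors the moved `check_tied_features`: tied features must all be set to
// the same value, or not set at all. Returns the first violated group.
fn check_tied(features: &HashMap<&str, bool>) -> Option<&'static [&'static str]> {
    if !features.is_empty() {
        for &tied in TIED {
            let mut tied_iter = tied.iter();
            let enabled = features.get(tied_iter.next().unwrap());
            if tied_iter.any(|f| enabled != features.get(f)) {
                return Some(tied);
            }
        }
    }
    None
}

fn main() {
    // Both members set consistently: valid.
    let ok: HashMap<&str, bool> = [("paca", true), ("pacg", true)].into();
    assert!(check_tied(&ok).is_none());

    // Only one member of the tied pair set: the group is reported.
    let bad: HashMap<&str, bool> = [("paca", true)].into();
    assert_eq!(check_tied(&bad), Some(&["paca", "pacg"][..]));
}
```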