commit 92f08b78a1
8787 changed files with 158490 additions and 108839 deletions
.gitignore (vendored, 34 lines changed)

@@ -1,3 +1,7 @@
# This file should only ignore things that are generated during a build,
# generated by common IDEs, and optional files controlled by the user
# that affect the build (such as config.toml).
# FIXME: This needs cleanup.
*~
.#*
.DS_Store

@@ -12,22 +16,19 @@ __pycache__/
.project
.settings/
.valgrindrc
.vscode/
.vscode
.favorites.json
/*-*-*-*/
/*-*-*/
/Makefile
/build
/build/
/config.toml
/dist/
/dl/
/doc
/doc/
/inst/
/llvm/
/mingw-build/
/nd/
# Created by default with `src/ci/docker/run.sh`:
/obj/
/rt/
/rustllvm/
/src/libcore/unicode/DerivedCoreProperties.txt
/src/libcore/unicode/DerivedNormalizationProps.txt

@@ -36,10 +37,9 @@ __pycache__/
/src/libcore/unicode/Scripts.txt
/src/libcore/unicode/SpecialCasing.txt
/src/libcore/unicode/UnicodeData.txt
/stage[0-9]+/
/target
target/
/test/
/src/libcore/unicode/downloaded
/target/
# Generated by compiletest for incremental:
/tmp/
tags
tags.*

@@ -49,18 +49,6 @@ TAGS.*
\#*\#
config.mk
config.stamp
keywords.md
lexer.ml
mir_dump
Session.vim
src/etc/dl
tmp.*.rs
version.md
version.ml
version.texi
.cargo
!src/vendor/**
/src/target/

no_llvm_build

.gitmodules (vendored, 8 lines changed)

@@ -31,9 +31,9 @@
[submodule "src/llvm-emscripten"]
    path = src/llvm-emscripten
    url = https://github.com/rust-lang/llvm.git
[submodule "src/stdsimd"]
    path = src/stdsimd
    url = https://github.com/rust-lang-nursery/stdsimd.git
[submodule "src/stdarch"]
    path = src/stdarch
    url = https://github.com/rust-lang/stdarch.git
[submodule "src/doc/rustc-guide"]
    path = src/doc/rustc-guide
    url = https://github.com/rust-lang/rustc-guide.git

@@ -43,7 +43,7 @@
[submodule "src/llvm-project"]
    path = src/llvm-project
    url = https://github.com/rust-lang/llvm-project.git
    branch = rustc/8.0-2019-03-18
    branch = rustc/9.0-2019-07-12
[submodule "src/doc/embedded-book"]
    path = src/doc/embedded-book
    url = https://github.com/rust-embedded/book.git

.mailmap (33 lines changed)

@@ -5,8 +5,8 @@
# email addresses.
#

Aaron Power <theaaronepower@gmail.com> Erin Power <xampprocky@gmail.com>
Aaron Todd <github@opprobrio.us>
Aaron Power <theaaronepower@gmail.com>
Abhishek Chanda <abhishek.becs@gmail.com> Abhishek Chanda <abhishek@cloudscaling.com>
Adolfo Ochagavía <aochagavia92@gmail.com>
Adrien Tétar <adri-from-59@hotmail.fr>

@@ -29,8 +29,8 @@ Ariel Ben-Yehuda <arielb1@mail.tau.ac.il> Ariel Ben-Yehuda <ariel.byd@gmail.com>
Ariel Ben-Yehuda <arielb1@mail.tau.ac.il> arielb1 <arielb1@mail.tau.ac.il>
Austin Seipp <mad.one@gmail.com> <as@hacks.yi.org>
Aydin Kim <ladinjin@hanmail.net> aydin.kim <aydin.kim@samsung.com>
Bastian Kauschke <bastian_kauschke@hotmail.de>
Barosl Lee <vcs@barosl.com> Barosl LEE <github@barosl.com>
Bastian Kauschke <bastian_kauschke@hotmail.de>
Ben Alpert <ben@benalpert.com> <spicyjalapeno@gmail.com>
Ben Sago <ogham@users.noreply.github.com> Ben S <ogham@bsago.me>
Ben Sago <ogham@users.noreply.github.com> Ben S <ogham@users.noreply.github.com>

@@ -46,22 +46,24 @@ Brian Anderson <banderson@mozilla.com> <banderson@mozilla.org>
Brian Dawn <brian.t.dawn@gmail.com>
Brian Leibig <brian@brianleibig.com> Brian Leibig <brian.leibig@gmail.com>
Carl-Anton Ingmarsson <mail@carlanton.se> <ca.ingmarsson@gmail.com>
Carol (Nichols || Goulding) <carol.nichols@gmail.com> <193874+carols10cents@users.noreply.github.com>
Carol (Nichols || Goulding) <carol.nichols@gmail.com> <carol.nichols@gmail.com>
Carol (Nichols || Goulding) <carol.nichols@gmail.com> <cnichols@thinkthroughmath.com>
Carol (Nichols || Goulding) <carol.nichols@gmail.com> Carol Nichols <carol.nichols@gmail.com>
Carol Willing <carolcode@willingconsulting.com>
Chris C Cerami <chrisccerami@users.noreply.github.com> Chris C Cerami <chrisccerami@gmail.com>
Chris Pressey <cpressey@gmail.com>
Chris Thorn <chris@thorn.co> Chris Thorn <thorn@thoughtbot.com>
Chris Vittal <christopher.vittal@gmail.com> Christopher Vittal <christopher.vittal@gmail.com>
Christian Poveda <christianpoveda@protonmail.com> <z1mvader@protonmail.com>
Christian Poveda <christianpoveda@protonmail.com> <cn.poveda.ruiz@gmail.com>
Christian Poveda <christianpoveda@protonmail.com> <z1mvader@protonmail.com>
Christian Poveda <christianpoveda@protonmail.com> <cpovedar@fnal.gov>
Clark Gaebel <cg.wowus.cg@gmail.com> <cgaebel@mozilla.com>
Clinton Ryan <clint.ryan3@gmail.com>
Corey Richardson <corey@octayn.net> Elaine "See More" Nemo <corey@octayn.net>
Cyryl Płotnicki <cyplo@cyplo.net>
Damien Schoof <damien.schoof@gmail.com>
Daniel Ramos <dan@daramos.com>
Daniel J Rollins <drollins@financialforce.com>
Daniel Ramos <dan@daramos.com>
David Klein <david.klein@baesystemsdetica.com>
David Manescu <david.manescu@gmail.com> <dman2626@uni.sydney.edu.au>
David Ross <daboross@daboross.net>

@@ -70,9 +72,9 @@ Diggory Hardy <diggory.hardy@gmail.com> Diggory Hardy <github@dhardy.name>
Dylan Braithwaite <dylanbraithwaite1@gmail.com> <mail@dylanb.me>
Dzmitry Malyshau <kvarkus@gmail.com>
E. Dunham <edunham@mozilla.com> edunham <edunham@mozilla.com>
Eduard-Mihai Burtescu <edy.burt@gmail.com>
Eduardo Bautista <me@eduardobautista.com> <=>
Eduardo Bautista <me@eduardobautista.com> <mail@eduardobautista.com>
Eduard-Mihai Burtescu <edy.burt@gmail.com>
Elliott Slaughter <elliottslaughter@gmail.com> <eslaughter@mozilla.com>
Elly Fong-Jones <elly@leptoquark.net>
Eric Holk <eric.holk@gmail.com> <eholk@cs.indiana.edu>

@@ -80,8 +82,8 @@ Eric Holk <eric.holk@gmail.com> <eholk@mozilla.com>
Eric Holmes <eric@ejholmes.net>
Eric Reed <ecreed@cs.washington.edu> <ereed@mozilla.com>
Erick Tryzelaar <erick.tryzelaar@gmail.com> <etryzelaar@iqt.org>
Esteban Küber <esteban@kuber.com.ar> <estebank@users.noreply.github.com>
Esteban Küber <esteban@kuber.com.ar> <esteban@commure.com>
Esteban Küber <esteban@kuber.com.ar> <estebank@users.noreply.github.com>
Esteban Küber <esteban@kuber.com.ar> <github@kuber.com.ar>
Evgeny Sologubov
Falco Hirschenberger <falco.hirschenberger@gmail.com> <hirschen@itwm.fhg.de>

@@ -102,9 +104,9 @@ Herman J. Radtke III <herman@hermanradtke.com> Herman J. Radtke III <hermanradtk
Ilyong Cho <ilyoan@gmail.com>
Ivan Ivaschenko <defuz.net@gmail.com>
J. J. Weber <jjweber@gmail.com>
Jakub Adam Wieczorek <jakub.adam.wieczorek@gmail.com> <jakub.bukaj@yahoo.com>
Jakub Adam Wieczorek <jakub.adam.wieczorek@gmail.com> <jakub@jakub.cc>
Jakub Adam Wieczorek <jakub.adam.wieczorek@gmail.com> <jakubw@jakubw.net>
Jakub Adam Wieczorek <jakub.adam.wieczorek@gmail.com> <jakub.bukaj@yahoo.com>
James Deng <cnjamesdeng@gmail.com> <cnJamesDeng@gmail.com>
James Miller <bladeon@gmail.com> <james@aatch.net>
James Perry <james.austin.perry@gmail.com>

@@ -119,6 +121,7 @@ Jethro Beekman <github@jbeekman.nl>
Jihyun Yu <j.yu@navercorp.com> <yjh0502@gmail.com>
Jihyun Yu <j.yu@navercorp.com> jihyun <jihyun@nablecomm.com>
Jihyun Yu <j.yu@navercorp.com> Jihyun Yu <jihyun@nclab.kaist.ac.kr>
João Oliveira <hello@jxs.pt> joaoxsouls <joaoxsouls@gmail.com>
Johann Hofmann <git@johann-hofmann.com> Johann <git@johann-hofmann.com>
John Clements <clements@racket-lang.org> <clements@brinckerhoff.org>
John Hodge <acessdev@gmail.com> John Hodge <tpg@mutabah.net>

@@ -129,13 +132,15 @@ Jonathan S <gereeter@gmail.com> Jonathan S <gereeter+code@gmail.com>
Jonathan Turner <probata@hotmail.com>
Jorge Aparicio <japaric@linux.com> <japaricious@gmail.com>
Joseph Martin <pythoner6@gmail.com>
João Oliveira <hello@jxs.pt> joaoxsouls <joaoxsouls@gmail.com>
Joseph T. Lyons <JosephTLyons@gmail.com> <josephtlyons@gmail.com>
Joseph T. Lyons <JosephTLyons@gmail.com> <JosephTLyons@users.noreply.github.com>
Junyoung Cho <june0.cho@samsung.com>
Jyun-Yan You <jyyou.tw@gmail.com> <jyyou@cs.nctu.edu.tw>
Kang Seonghoon <kang.seonghoon@mearie.org> <public+git@mearie.org>
Keegan McAllister <mcallister.keegan@gmail.com> <kmcallister@mozilla.com>
Kevin Butler <haqkrs@gmail.com>
Kyeongwoon Lee <kyeongwoon.lee@samsung.com>
Laurențiu Nicola <lnicola@dend.ro>
Lee Jeffery <leejeffery@gmail.com> Lee Jeffery <lee@leejeffery.co.uk>
Lee Wondong <wdlee91@gmail.com>
Lennart Kudling <github@kudling.de>

@@ -145,8 +150,6 @@ Lindsey Kuper <lindsey@composition.al> <lkuper@mozilla.com>
Luke Metz <luke.metz@students.olin.edu>
Luqman Aden <me@luqman.ca> <laden@csclub.uwaterloo.ca>
Luqman Aden <me@luqman.ca> <laden@mozilla.com>
NAKASHIMA, Makoto <makoto.nksm+github@gmail.com> <makoto.nksm@gmail.com>
NAKASHIMA, Makoto <makoto.nksm+github@gmail.com> <makoto.nksm+github@gmail.com>
Marcell Pardavi <marcell.pardavi@gmail.com>
Margaret Meyerhofer <mmeyerho@andrew.cmu.edu> <mmeyerho@andrew>
Mark Rousskov <mark.simulacrum@gmail.com>

@@ -164,15 +167,19 @@ Matthijs Hofstra <thiezz@gmail.com>
Melody Horn <melody@boringcactus.com> <mathphreak@gmail.com>
Michael Williams <m.t.williams@live.com>
Michael Woerister <michaelwoerister@posteo> <michaelwoerister@gmail>
Michael Woerister <michaelwoerister@posteo> <michaelwoerister@users.noreply.github.com>
Michael Woerister <michaelwoerister@posteo> <michaelwoerister@posteo.net>
Mickaël Raybaud-Roig <raybaudroigm@gmail.com> m-r-r <raybaudroigm@gmail.com>
Ms2ger <ms2ger@gmail.com> <Ms2ger@gmail.com>
Mukilan Thiagarajan <mukilanthiagarajan@gmail.com>
NAKASHIMA, Makoto <makoto.nksm+github@gmail.com> <makoto.nksm@gmail.com>
NAKASHIMA, Makoto <makoto.nksm+github@gmail.com> <makoto.nksm+github@gmail.com>
Nathan West <Lucretiel@gmail.com> <lucretiel@gmail.com>
Nathan Wilson <wilnathan@gmail.com>
Nathaniel Herman <nherman@post.harvard.edu> Nathaniel Herman <nherman@college.harvard.edu>
Neil Pankey <npankey@gmail.com> <neil@wire.im>
Nicole Mazzuca <npmazzuca@gmail.com>
Nick Platt <platt.nicholas@gmail.com>
Nicole Mazzuca <npmazzuca@gmail.com>
Nif Ward <nif.ward@gmail.com>
Oliver Schneider <oliver.schneider@kit.edu> oli-obk <github6541940@oli-obk.de>
Oliver Schneider <oliver.schneider@kit.edu> Oliver 'ker' Schneider <rust19446194516@oli-obk.de>

@@ -230,8 +237,8 @@ Tim JIANG <p90eri@gmail.com>
Tim Joseph Dumol <tim@timdumol.com>
Torsten Weber <TorstenWeber12@gmail.com> <torstenweber12@gmail.com>
Ty Overby <ty@pre-alpha.com>
Ulrik Sverdrup <bluss@users.noreply.github.com> bluss <bluss>
Ulrik Sverdrup <bluss@users.noreply.github.com> bluss <bluss@users.noreply.github.com>
Ulrik Sverdrup <bluss@users.noreply.github.com> bluss <bluss>
Ulrik Sverdrup <bluss@users.noreply.github.com> Ulrik Sverdrup <root@localhost>
Vadim Petrochenkov <vadim.petrochenkov@gmail.com>
Vadim Petrochenkov <vadim.petrochenkov@gmail.com> petrochenkov <vadim.petrochenkov@gmail.com>

.travis.yml (379 lines changed)

@@ -1,379 +0,0 @@
language: shell
sudo: required
dist: xenial
services:
  - docker
addons:
  apt:
    packages:
    - gdb

git:
  depth: 2
  submodules: false

env:
  global:
    - CI_JOB_NAME=$TRAVIS_JOB_NAME

matrix:
  fast_finish: true
  include:
    # Images used in testing PR and try-build should be run first.
    - env: IMAGE=x86_64-gnu-llvm-6.0 RUST_BACKTRACE=1
      name: x86_64-gnu-llvm-6.0
      if: type = pull_request OR branch = auto

    - env: IMAGE=dist-x86_64-linux DEPLOY=1
      name: dist-x86_64-linux
      if: branch = try OR branch = auto

    # "alternate" deployments, these are "nightlies" but have LLVM assertions
    # turned on, they're deployed to a different location primarily for
    # additional testing.
    - env: IMAGE=dist-x86_64-linux DEPLOY_ALT=1
      name: dist-x86_64-linux-alt
      if: branch = try OR branch = auto

    - env: >
        RUST_CHECK_TARGET=dist
        RUST_CONFIGURE_ARGS="--enable-extended --enable-profiler --enable-lldb --set rust.jemalloc"
        SRC=.
        DEPLOY_ALT=1
        RUSTC_RETRY_LINKER_ON_SEGFAULT=1
        MACOSX_DEPLOYMENT_TARGET=10.7
        NO_LLVM_ASSERTIONS=1
        NO_DEBUG_ASSERTIONS=1
      os: osx
      osx_image: xcode9.3-moar
      name: dist-x86_64-apple-alt
      if: branch = auto

    # macOS builders. These are placed near the beginning because they are very
    # slow to run.

    # OSX builders running tests, these run the full test suite.
    # NO_DEBUG_ASSERTIONS=1 to make them go faster, but also do have some
    # runners that run `//ignore-debug` tests.
    #
    # Note that the compiler is compiled to target 10.8 here because the Xcode
    # version that we're using, 8.2, cannot compile LLVM for OSX 10.7.
    - env: >
        RUST_CHECK_TARGET=check
        RUST_CONFIGURE_ARGS="--build=x86_64-apple-darwin --enable-sanitizers --enable-profiler --set rust.jemalloc"
        SRC=.
        RUSTC_RETRY_LINKER_ON_SEGFAULT=1
        MACOSX_DEPLOYMENT_TARGET=10.8
        MACOSX_STD_DEPLOYMENT_TARGET=10.7
        NO_LLVM_ASSERTIONS=1
        NO_DEBUG_ASSERTIONS=1
      os: osx
      osx_image: xcode9.3-moar
      name: x86_64-apple
      if: branch = auto

    - env: >
        RUST_CHECK_TARGET=check
        RUST_CONFIGURE_ARGS="--build=i686-apple-darwin --set rust.jemalloc"
        SRC=.
        RUSTC_RETRY_LINKER_ON_SEGFAULT=1
        MACOSX_DEPLOYMENT_TARGET=10.8
        MACOSX_STD_DEPLOYMENT_TARGET=10.7
        NO_LLVM_ASSERTIONS=1
        NO_DEBUG_ASSERTIONS=1
      os: osx
      osx_image: xcode9.3-moar
      name: i686-apple
      if: branch = auto

    # OSX builders producing releases. These do not run the full test suite and
    # just produce a bunch of artifacts.
    #
    # Note that these are running in the `xcode7` image instead of the
    # `xcode8.2` image as above. That's because we want to build releases for
    # OSX 10.7 and `xcode7` is the latest Xcode able to compile LLVM for 10.7.
    - env: >
        RUST_CHECK_TARGET=dist
        RUST_CONFIGURE_ARGS="--build=i686-apple-darwin --enable-full-tools --enable-profiler --enable-lldb --set rust.jemalloc"
        SRC=.
        DEPLOY=1
        RUSTC_RETRY_LINKER_ON_SEGFAULT=1
        MACOSX_DEPLOYMENT_TARGET=10.7
        NO_LLVM_ASSERTIONS=1
        NO_DEBUG_ASSERTIONS=1
        DIST_REQUIRE_ALL_TOOLS=1
      os: osx
      osx_image: xcode9.3-moar
      name: dist-i686-apple
      if: branch = auto

    - env: >
        RUST_CHECK_TARGET=dist
        RUST_CONFIGURE_ARGS="--target=aarch64-apple-ios,armv7-apple-ios,armv7s-apple-ios,i386-apple-ios,x86_64-apple-ios --enable-full-tools --enable-sanitizers --enable-profiler --enable-lldb --set rust.jemalloc"
        SRC=.
        DEPLOY=1
        RUSTC_RETRY_LINKER_ON_SEGFAULT=1
        MACOSX_DEPLOYMENT_TARGET=10.7
        NO_LLVM_ASSERTIONS=1
        NO_DEBUG_ASSERTIONS=1
        DIST_REQUIRE_ALL_TOOLS=1
      os: osx
      osx_image: xcode9.3-moar
      name: dist-x86_64-apple
      if: branch = auto

    # Linux builders, remaining docker images
    - env: IMAGE=arm-android
      name: arm-android
      if: branch = auto
    - env: IMAGE=armhf-gnu
      name: armhf-gnu
      if: branch = auto
    - env: IMAGE=dist-various-1 DEPLOY=1
      name: dist-various-1
      if: branch = auto
    - env: IMAGE=dist-various-2 DEPLOY=1
      name: dist-various-2
      if: branch = auto
    - env: IMAGE=dist-aarch64-linux DEPLOY=1
      name: dist-aarch64-linux
      if: branch = auto
    - env: IMAGE=dist-android DEPLOY=1
      name: dist-android
      if: branch = auto
    - env: IMAGE=dist-arm-linux DEPLOY=1
      name: dist-arm-linux
      if: branch = auto
    - env: IMAGE=dist-armhf-linux DEPLOY=1
      name: dist-armhf-linux
      if: branch = auto
    - env: IMAGE=dist-armv7-linux DEPLOY=1
      name: dist-armv7-linux
      if: branch = auto
    - env: IMAGE=dist-i586-gnu-i586-i686-musl DEPLOY=1
      name: dist-i586-gnu-i586-i686-musl
      if: branch = auto
    - env: IMAGE=dist-i686-freebsd DEPLOY=1
      name: dist-i686-freebsd
      if: branch = auto
    - env: IMAGE=dist-i686-linux DEPLOY=1
      name: dist-i686-linux
      if: branch = auto
    - env: IMAGE=dist-mips-linux DEPLOY=1
      name: dist-mips-linux
      if: branch = auto
    - env: IMAGE=dist-mips64-linux DEPLOY=1
      name: dist-mips64-linux
      if: branch = auto
    - env: IMAGE=dist-mips64el-linux DEPLOY=1
      name: dist-mips64el-linux
      if: branch = auto
    - env: IMAGE=dist-mipsel-linux DEPLOY=1
      name: dist-mipsel-linux
      if: branch = auto
    - env: IMAGE=dist-powerpc-linux DEPLOY=1
      name: dist-powerpc-linux
      if: branch = auto
    - env: IMAGE=dist-powerpc64-linux DEPLOY=1
      name: dist-powerpc64-linux
      if: branch = auto
    - env: IMAGE=dist-powerpc64le-linux DEPLOY=1
      name: dist-powerpc64le-linux
      if: branch = auto
    - env: IMAGE=dist-s390x-linux DEPLOY=1
      name: dist-s390x-linux
      if: branch = auto
    - env: IMAGE=dist-x86_64-freebsd DEPLOY=1
      name: dist-x86_64-freebsd
      if: branch = auto
    - env: IMAGE=dist-x86_64-musl DEPLOY=1
      name: dist-x86_64-musl
      if: branch = auto
    - env: IMAGE=dist-x86_64-netbsd DEPLOY=1
      name: dist-x86_64-netbsd
      if: branch = auto
    - env: IMAGE=asmjs
      name: asmjs
      if: branch = auto
    - env: IMAGE=i686-gnu
      name: i686-gnu
      if: branch = auto
    - env: IMAGE=i686-gnu-nopt
      name: i686-gnu-nopt
      if: branch = auto
    - env: IMAGE=test-various
      name: test-various
      if: branch = auto
    - env: IMAGE=x86_64-gnu
      name: x86_64-gnu
      if: branch = auto
    - env: IMAGE=x86_64-gnu-full-bootstrap
      name: x86_64-gnu-full-bootstrap
      if: branch = auto
    - env: IMAGE=x86_64-gnu-aux
      name: x86_64-gnu-aux
      if: branch = auto
    - env: IMAGE=x86_64-gnu-tools
      name: x86_64-gnu-tools
      if: branch = auto OR (type = pull_request AND commit_message =~ /(?i:^update.*\b(rls|rustfmt|clippy|miri|cargo)\b)/)
    - env: IMAGE=x86_64-gnu-debug
      name: x86_64-gnu-debug
      if: branch = auto
    - env: IMAGE=x86_64-gnu-nopt
      name: x86_64-gnu-nopt
      if: branch = auto
    - env: IMAGE=x86_64-gnu-distcheck
      name: x86_64-gnu-distcheck
      if: branch = auto
    - env: IMAGE=mingw-check
      name: mingw-check
      if: type = pull_request OR branch = auto

    - stage: publish toolstate
      if: branch = master AND type = push
      before_install: []
      install: []
      sudo: false
      script:
        MESSAGE_FILE=$(mktemp -t msg.XXXXXX);
        . src/ci/docker/x86_64-gnu-tools/repo.sh;
        commit_toolstate_change "$MESSAGE_FILE" "$TRAVIS_BUILD_DIR/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "$MESSAGE_FILE" "$TOOLSTATE_REPO_ACCESS_TOKEN";

before_install:
  # We'll use the AWS cli to download/upload cached docker layers as well as
  # push our deployments, so download that here.
  - pip install --user awscli; export PATH=$PATH:$HOME/.local/bin:$HOME/Library/Python/2.7/bin/
  - mkdir -p $HOME/rustsrc
  # FIXME(#46924): these two commands are required to enable IPv6,
  # they shouldn't exist, please revert once more official solutions appeared.
  # see https://github.com/travis-ci/travis-ci/issues/8891#issuecomment-353403729
  - if [ "$TRAVIS_OS_NAME" = linux ]; then
      echo '{"ipv6":true,"fixed-cidr-v6":"fd9a:8454:6789:13f7::/64"}' | sudo tee /etc/docker/daemon.json;
      sudo service docker restart;
    fi

install:
  - case "$TRAVIS_OS_NAME" in
      linux)
        travis_retry curl -fo $HOME/stamp https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2017-03-17-stamp-x86_64-unknown-linux-musl &&
        chmod +x $HOME/stamp &&
        export PATH=$PATH:$HOME
        ;;
      osx)
        if [[ "$RUST_CHECK_TARGET" == dist ]]; then
          travis_retry brew update &&
          travis_retry brew install xz &&
          travis_retry brew install swig;
        fi &&
        travis_retry curl -fo /usr/local/bin/sccache https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2018-04-02-sccache-x86_64-apple-darwin &&
        chmod +x /usr/local/bin/sccache &&
        travis_retry curl -fo /usr/local/bin/stamp https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2017-03-17-stamp-x86_64-apple-darwin &&
        chmod +x /usr/local/bin/stamp &&
        travis_retry curl -f http://releases.llvm.org/7.0.0/clang+llvm-7.0.0-x86_64-apple-darwin.tar.xz | tar xJf - &&
        export CC=`pwd`/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang &&
        export CXX=`pwd`/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang++ &&
        export AR=ar
        ;;
    esac

before_script:
  - >
    echo "#### Disk usage before running script:";
    df -h;
    du . | sort -nr | head -n100
  - >
    RUN_SCRIPT="src/ci/init_repo.sh . $HOME/rustsrc";
    if [ "$TRAVIS_OS_NAME" = "osx" ]; then
      export RUN_SCRIPT="$RUN_SCRIPT && src/ci/run.sh";
    else
      export RUN_SCRIPT="$RUN_SCRIPT && src/ci/docker/run.sh $IMAGE";
      # Enable core dump on Linux.
      sudo sh -c 'echo "/checkout/obj/cores/core.%p.%E" > /proc/sys/kernel/core_pattern';
    fi
  - >
    if [ "$IMAGE" = mingw-check ]; then
      # verify the publish_toolstate script works.
      git clone --depth=1 https://github.com/rust-lang-nursery/rust-toolstate.git;
      cd rust-toolstate;
      python2.7 "$TRAVIS_BUILD_DIR/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "" "";
      cd ..;
      rm -rf rust-toolstate;
    fi

# Log time information from this machine and an external machine for insight into possible
# clock drift. Timezones don't matter since relative deltas give all the necessary info.
script:
  - >
    date && (curl -fs --head https://google.com | grep ^Date: | sed 's/Date: //g' || true)
  - stamp sh -x -c "$RUN_SCRIPT"
  - >
    date && (curl -fs --head https://google.com | grep ^Date: | sed 's/Date: //g' || true)

after_success:
  - >
    echo "#### Build successful; Disk usage after running script:";
    df -h;
    du . | sort -nr | head -n100
  - >
    if [ "$DEPLOY$DEPLOY_ALT" == "1" ]; then
      mkdir -p deploy/$TRAVIS_COMMIT;
      if [ "$TRAVIS_OS_NAME" == "osx" ]; then
        rm -rf build/dist/doc &&
        cp -r build/dist/* deploy/$TRAVIS_COMMIT;
      else
        rm -rf obj/build/dist/doc &&
        cp -r obj/build/dist/* deploy/$TRAVIS_COMMIT;
      fi;
      ls -la deploy/$TRAVIS_COMMIT;
      deploy_dir=rustc-builds;
      if [ "$DEPLOY_ALT" == "1" ]; then
        deploy_dir=rustc-builds-alt;
      fi;
      travis_retry aws s3 cp --no-progress --recursive --acl public-read ./deploy s3://rust-lang-ci2/$deploy_dir
    fi

after_failure:
  - >
    echo "#### Build failed; Disk usage after running script:";
    df -h;
    du . | sort -nr | head -n100

  # Random attempt at debugging currently. Just poking around in here to see if
  # anything shows up.

  # Dump backtrace for macOS
  - ls -lat $HOME/Library/Logs/DiagnosticReports/
  - find $HOME/Library/Logs/DiagnosticReports
      -type f
      -name '*.crash'
      -not -name '*.stage2-*.crash'
      -not -name 'com.apple.CoreSimulator.CoreSimulatorService-*.crash'
      -exec printf travis_fold":start:crashlog\n\033[31;1m%s\033[0m\n" {} \;
      -exec head -750 {} \;
      -exec echo travis_fold":"end:crashlog \; || true

  # Dump backtrace for Linux
  - ln -s . checkout &&
    for CORE in obj/cores/core.*; do
      EXE=$(echo $CORE | sed 's|obj/cores/core\.[0-9]*\.!checkout!\(.*\)|\1|;y|!|/|');
      if [ -f "$EXE" ]; then
        printf travis_fold":start:crashlog\n\033[31;1m%s\033[0m\n" "$CORE";
        gdb --batch -q -c "$CORE" "$EXE"
          -iex 'set auto-load off'
          -iex 'dir src/'
          -iex 'set sysroot .'
          -ex bt
          -ex q;
        echo travis_fold":"end:crashlog;
      fi;
    done || true

  # see #50887
  - cat ./obj/build/x86_64-unknown-linux-gnu/native/asan/build/lib/asan/clang_rt.asan-dynamic-i386.vers || true

  # attempt to debug anything killed by the oom killer on linux, just to see if
  # it happened
  - dmesg | grep -i kill

notifications:
  email: false

@@ -179,7 +179,6 @@ Speaking of tests, Rust has a comprehensive test suite. More information about
it can be found [here][rctd].

### External Dependencies
[external-dependencies]: #external-dependencies

Currently building Rust will also build the following external projects:

@@ -209,7 +208,6 @@ Breakage is not allowed in the beta and stable channels, and must be addressed
before the PR is merged.

#### Breaking Tools Built With The Compiler
[breaking-tools-built-with-the-compiler]: #breaking-tools-built-with-the-compiler

Rust's build system builds a number of tools that make use of the
internals of the compiler. This includes

@@ -242,7 +240,7 @@ Here are those same steps in detail:
   `config.toml.example` in the root directory of the Rust repository.
   Set `submodules = false` in the `[build]` section. This will prevent `x.py`
   from resetting to the original branch after you make your changes. If you
   need to [update any submodules to their latest versions][updating-submodules],
   need to [update any submodules to their latest versions](#updating-submodules),
   see the section of this file about that for more information.
2. (optional) Run `./x.py test src/tools/rustfmt` (substituting the submodule
   that broke for `rustfmt`). Fix any errors in the submodule (and possibly others).

@@ -256,7 +254,6 @@ Here are those same steps in detail:
8. (optional) Send a PR to rust-lang/rust updating the submodule.

#### Updating submodules
[updating-submodules]: #updating-submodules

These instructions are specific to updating `rustfmt`, however they may apply
to the other submodules as well. Please help by improving these instructions

@@ -310,7 +307,6 @@ This should change the version listed in `Cargo.lock` to the new version you upd
the submodule to. Running `./x.py build` should work now.

## Writing Documentation
[writing-documentation]: #writing-documentation

Documentation improvements are very welcome. The source of `doc.rust-lang.org`
is located in `src/doc` in the tree, and standard API documentation is generated

@@ -337,7 +333,6 @@ tracker in that repo is also a great way to find things that need doing. There
are issues for beginners and advanced compiler devs alike!

## Issue Triage
[issue-triage]: #issue-triage

Sometimes, an issue will stay open, even though the bug has been fixed. And
sometimes, the original bug may go stale because something has changed in the

@@ -405,7 +400,6 @@ If you're looking for somewhere to start, check out the [E-easy][eeasy] tag.
[rfcbot]: https://github.com/anp/rfcbot-rs/

## Out-of-tree Contributions
[out-of-tree-contributions]: #out-of-tree-contributions

There are a number of other ways to contribute to Rust that don't deal with
this repository.

@@ -425,7 +419,6 @@ valuable!
[community-library]: https://github.com/rust-lang/rfcs/labels/A-community-library

## Helpful Links and Information
[helpful-info]: #helpful-info

For people new to Rust, and just starting to contribute, or even for
more seasoned developers, some useful places to look for information

Cargo.lock (2047 lines changed)
File diff suppressed because it is too large

91
README.md
91
README.md
|
|
@ -6,7 +6,6 @@ standard library, and documentation.
|
|||
[Rust]: https://www.rust-lang.org
|
||||
|
||||
## Quick Start
|
||||
[quick-start]: #quick-start
|
||||
|
||||
Read ["Installation"] from [The Book].
|
||||
|
||||
|
|
@ -14,11 +13,15 @@ Read ["Installation"] from [The Book].
|
|||
[The Book]: https://doc.rust-lang.org/book/index.html
|
||||
|
||||
## Installing from Source
|
||||
[building-from-source]: #building-from-source
|
||||
|
||||
_Note: If you wish to contribute to the compiler, you should read
|
||||
[this chapter](https://rust-lang.github.io/rustc-guide/how-to-build-and-run.html)
|
||||
of the rustc-guide instead._
|
||||
_Note: If you wish to contribute to the compiler, you should read [this
|
||||
chapter][rustcguidebuild] of the rustc-guide instead of this section._
|
||||
|
||||
The Rust build system has a Python script called `x.py` to bootstrap building
|
||||
the compiler. More information about it may be found by running `./x.py --help`
|
||||
or reading the [rustc guide][rustcguidebuild].
|
||||
|
||||
[rustcguidebuild]: https://rust-lang.github.io/rustc-guide/how-to-build-and-run.html
|
||||
|
||||
### Building on *nix
|
||||
1. Make sure you have installed the dependencies:
|
||||
|
|
@@ -39,43 +42,36 @@ of the rustc-guide instead._

   [source]: https://github.com/rust-lang/rust

3. Build and install:
3. Configure the build settings:

   The Rust build system uses a file named `config.toml` in the root of the
   source tree to determine various configuration settings for the build.
   Copy the default `config.toml.example` to `config.toml` to get started.

   ```sh
   $ ./x.py build && sudo ./x.py install
   $ cp config.toml.example config.toml
   ```

   If after running `sudo ./x.py install` you see an error message like
   It is recommended that if you plan to use the Rust build system to create
   an installation (using `./x.py install`) that you set the `prefix` value
   in the `[install]` section to a directory that you have write permissions.

   ```
   error: failed to load source for a dependency on 'cc'
   ```

4. Build and install:

   ```sh
   $ ./x.py build && ./x.py install
   ```

   then run these two commands and then try `sudo ./x.py install` again:

   ```
   $ cargo install cargo-vendor
   ```

   ```
   $ cargo vendor
   ```

   > ***Note:*** Install locations can be adjusted by copying the config file
   > from `./config.toml.example` to `./config.toml`, and
   > adjusting the `prefix` option under `[install]`. Various other options, such
   > as enabling debug information, are also supported, and are documented in
   > the config file.

   When complete, `sudo ./x.py install` will place several programs into
   `/usr/local/bin`: `rustc`, the Rust compiler, and `rustdoc`, the
   When complete, `./x.py install` will place several programs into
   `$PREFIX/bin`: `rustc`, the Rust compiler, and `rustdoc`, the
   API-documentation tool. This install does not include [Cargo],
   Rust's package manager, which you may also want to build.
   Rust's package manager. To build and install Cargo, you may
   run `./x.py install cargo` or set the `build.extended` key in
   `config.toml` to `true` to build and install all tools.

[Cargo]: https://github.com/rust-lang/cargo

### Building on Windows
[building-on-windows]: #building-on-windows

There are two prominent ABIs in use on Windows: the native (MSVC) ABI used by
Visual Studio, and the GNU ABI used by the GCC toolchain. Which version of Rust
@@ -85,7 +81,6 @@ for interop with GNU software built using the MinGW/MSYS2 toolchain use the GNU
build.

#### MinGW
[windows-mingw]: #windows-mingw

[MSYS2][msys2] can be used to easily build Rust on Windows:

@@ -126,17 +121,15 @@ build.
   ```

#### MSVC
[windows-msvc]: #windows-msvc

MSVC builds of Rust additionally require an installation of Visual Studio 2017
(or later) so `rustc` can use its linker. The simplest way is to get the
[Visual Studio Build Tools] and check the “C++ build tools” workload.
[Visual Studio], check the “C++ build tools” and “Windows 10 SDK” workload.

[Visual Studio Build Tools]: https://visualstudio.microsoft.com/downloads/#build-tools-for-visual-studio-2019
[Visual Studio]: https://visualstudio.microsoft.com/downloads/

At last check (cmake 3.14.3 and msvc 16.0.3) using the 2019 tools fails to
build the in-tree LLVM build with a CMake error, so use 2017 instead by
including the “MSVC v141 – VS 2017 C++ x64/x86 build tools (v14.16)” component.
(If you're installing cmake yourself, be careful that “C++ CMake tools for
Windows” doesn't get included under “Individual components”.)

With these dependencies installed, you can build the compiler in a `cmd.exe`
shell with:
@@ -151,12 +144,11 @@ then you may need to force rustbuild to use an older version. This can be done
by manually calling the appropriate vcvars file before running the bootstrap.

```batch
> CALL "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64\vcvars64.bat"
> CALL "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars64.bat"
> python x.py build
```

#### Specifying an ABI
[specifying-an-abi]: #specifying-an-abi

Each specific ABI can also be used from either environment (for example, using
the GNU ABI in PowerShell) by using an explicit build triple. The available
@@ -170,11 +162,10 @@ Windows build triples are:

The build triple can be specified by either specifying `--build=<triple>` when
invoking `x.py` commands, or by copying the `config.toml` file (as described
in Building From Source), and modifying the `build` option under the `[build]`
section.
in [Installing From Source](#installing-from-source)), and modifying the
`build` option under the `[build]` section.

### Configure and Make
[configure-and-make]: #configure-and-make

While it's not the recommended build system, this project also provides a
configure script and makefile (the latter of which just invokes `x.py`).
@@ -189,7 +180,6 @@ When using the configure script, the generated `config.mk` file may override the
`config.mk` file.

## Building Documentation
[building-documentation]: #building-documentation

If you’d like to build the documentation, it’s almost the same:

@@ -202,7 +192,6 @@ the ABI used. I.e., if the ABI was `x86_64-pc-windows-msvc`, the directory will
`build\x86_64-pc-windows-msvc\doc`.

## Notes
[notes]: #notes

Since the Rust compiler is written in Rust, it must be built by a
precompiled "snapshot" version of itself (made in an earlier stage of
@@ -211,11 +200,11 @@ fetch snapshots, and an OS that can execute the available snapshot binaries.

Snapshot binaries are currently built and tested on several platforms:

| Platform / Architecture  | x86 | x86_64 |
|--------------------------|-----|--------|
| Windows (7, 8, 10, ...)  | ✓   | ✓      |
| Linux (2.6.18 or later)  | ✓   | ✓      |
| OSX (10.7 Lion or later) | ✓   | ✓      |
| Platform / Architecture    | x86 | x86_64 |
|----------------------------|-----|--------|
| Windows (7, 8, 10, ...)    | ✓   | ✓      |
| Linux (2.6.18 or later)    | ✓   | ✓      |
| macOS (10.7 Lion or later) | ✓   | ✓      |

You may find that other platforms work, but these are our officially
supported build environments that are most likely to work.
@@ -225,7 +214,6 @@ There is more advice about hacking on Rust in [CONTRIBUTING.md].
[CONTRIBUTING.md]: https://github.com/rust-lang/rust/blob/master/CONTRIBUTING.md

## Getting Help
[getting-help]: #getting-help

The Rust community congregates in a few places:

@@ -238,7 +226,6 @@ The Rust community congregates in a few places:
[users.rust-lang.org]: https://users.rust-lang.org/

## Contributing
[contributing]: #contributing

To contribute to Rust, please see [CONTRIBUTING](CONTRIBUTING.md).

@@ -259,7 +246,6 @@ Also, you may find the [rustdocs for the compiler itself][rustdocs] useful.
[rustdocs]: https://doc.rust-lang.org/nightly/nightly-rustc/rustc/

## License
[license]: #license

Rust is primarily distributed under the terms of both the MIT license
and the Apache License (Version 2.0), with portions covered by various
@@ -269,7 +255,6 @@ See [LICENSE-APACHE](LICENSE-APACHE), [LICENSE-MIT](LICENSE-MIT), and
[COPYRIGHT](COPYRIGHT) for details.

## Trademark
[trademark]: #trademark

The Rust programming language is an open source, community project governed
by a core team. It is also sponsored by the Mozilla Foundation (“Mozilla”),

237  RELEASES.md
@@ -1,3 +1,222 @@
Version 1.37.0 (2019-08-15)
==========================

Language
--------
- `#[must_use]` will now warn if the type is contained in a [tuple][61100],
  [`Box`][62228], or an [array][62235] and unused.
- [You can now use the `cfg` and `cfg_attr` attributes on
  generic parameters.][61547]
- [You can now use enum variants through type alias.][61682] e.g. You can
  write the following:
  ```rust
  type MyOption = Option<u8>;

  fn increment_or_zero(x: MyOption) -> u8 {
      match x {
          MyOption::Some(y) => y + 1,
          MyOption::None => 0,
      }
  }
  ```
- [You can now use `_` as an identifier for consts.][61347] e.g. You can write
  `const _: u32 = 5;`.
- [You can now use `#[repr(align(X))]` on enums.][61229]
- [The `?` Kleene macro operator is now available in the
  2015 edition.][60932]

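The `?` Kleene operator mentioned above matches zero-or-one repetitions of a macro fragment. A minimal sketch of how that reads in practice (the macro and all names here are invented for illustration, not taken from the release notes):

```rust
// `$(, $doc:expr)?` accepts an optional trailing ", <expr>" argument.
macro_rules! maybe_flag {
    ($name:ident $(, $doc:expr)?) => {
        fn $name() -> &'static str {
            // If $doc was supplied, the tuple's first element is $doc;
            // otherwise it is the fallback string.
            ($($doc,)? "no description",).0
        }
    };
}

maybe_flag!(verbose, "print extra output");
maybe_flag!(quiet); // the optional fragment is omitted here

fn main() {
    assert_eq!(verbose(), "print extra output");
    assert_eq!(quiet(), "no description");
}
```

Before this release the zero-or-one case had to be written with `$(...)*` plus a manual "at most one" convention; `?` encodes it directly.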
Compiler
--------
- [You can now enable Profile-Guided Optimization with the `-C profile-generate`
  and `-C profile-use` flags.][61268] For more information on how to use profile
  guided optimization, please refer to the [rustc book][rustc-book-pgo].
- [The `rust-lldb` wrapper script should now work again.][61827]

Libraries
---------
- [`mem::MaybeUninit<T>` is now ABI-compatible with `T`.][61802]

Stabilized APIs
---------------
- [`BufReader::buffer`]
- [`BufWriter::buffer`]
- [`Cell::from_mut`]
- [`Cell<[T]>::as_slice_of_cells`][`Cell<slice>::as_slice_of_cells`]
- [`DoubleEndedIterator::nth_back`]
- [`Option::xor`]
- [`Wrapping::reverse_bits`]
- [`i128::reverse_bits`]
- [`i16::reverse_bits`]
- [`i32::reverse_bits`]
- [`i64::reverse_bits`]
- [`i8::reverse_bits`]
- [`isize::reverse_bits`]
- [`slice::copy_within`]
- [`u128::reverse_bits`]
- [`u16::reverse_bits`]
- [`u32::reverse_bits`]
- [`u64::reverse_bits`]
- [`u8::reverse_bits`]
- [`usize::reverse_bits`]

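A few of the APIs stabilized above can be exercised with a short sketch (the concrete values are chosen purely for illustration):

```rust
fn main() {
    // reverse_bits mirrors an integer's entire bit pattern.
    let x: u8 = 0b0000_0011;
    assert_eq!(x.reverse_bits(), 0b1100_0000);

    // slice::copy_within copies one range of a slice onto another
    // part of the same slice.
    let mut bytes = *b"Hello, World!";
    bytes.copy_within(1..5, 8); // copy "ello" over "orld"
    assert_eq!(&bytes, b"Hello, Wello!");

    // Option::xor yields Some only when exactly one side is Some.
    assert_eq!(Some(1).xor(None::<i32>), Some(1));
    assert_eq!(Some(1).xor(Some(2)), None);
}
```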
Cargo
-----
- [`Cargo.lock` files are now included by default when publishing crates
  with executables.][cargo/7026]
- [You can now specify `default-run="foo"` in `[package]` to specify the
  default executable to use for `cargo run`.][cargo/7056]

Misc
----

Compatibility Notes
-------------------
- [Using `...` for inclusive range patterns will now warn by default.][61342]
  Please transition your code to using the `..=` syntax for inclusive
  ranges instead.
- [Using a trait object without the `dyn` keyword will now warn by default.][61203]
  Please transition your code to use `dyn Trait` for trait objects instead.

[62228]: https://github.com/rust-lang/rust/pull/62228/
[62235]: https://github.com/rust-lang/rust/pull/62235/
[61802]: https://github.com/rust-lang/rust/pull/61802/
[61827]: https://github.com/rust-lang/rust/pull/61827/
[61547]: https://github.com/rust-lang/rust/pull/61547/
[61682]: https://github.com/rust-lang/rust/pull/61682/
[61268]: https://github.com/rust-lang/rust/pull/61268/
[61342]: https://github.com/rust-lang/rust/pull/61342/
[61347]: https://github.com/rust-lang/rust/pull/61347/
[61100]: https://github.com/rust-lang/rust/pull/61100/
[61203]: https://github.com/rust-lang/rust/pull/61203/
[61229]: https://github.com/rust-lang/rust/pull/61229/
[60932]: https://github.com/rust-lang/rust/pull/60932/
[cargo/7026]: https://github.com/rust-lang/cargo/pull/7026/
[cargo/7056]: https://github.com/rust-lang/cargo/pull/7056/
[`BufReader::buffer`]: https://doc.rust-lang.org/std/io/struct.BufReader.html#method.buffer
[`BufWriter::buffer`]: https://doc.rust-lang.org/std/io/struct.BufWriter.html#method.buffer
[`Cell::from_mut`]: https://doc.rust-lang.org/std/cell/struct.Cell.html#method.from_mut
[`Cell<slice>::as_slice_of_cells`]: https://doc.rust-lang.org/std/cell/struct.Cell.html#method.as_slice_of_cells
[`DoubleEndedIterator::nth_back`]: https://doc.rust-lang.org/std/iter/trait.DoubleEndedIterator.html#method.nth_back
[`Option::xor`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.xor
[`RefCell::try_borrow_unguarded`]: https://doc.rust-lang.org/std/cell/struct.RefCell.html#method.try_borrow_unguarded
[`Wrapping::reverse_bits`]: https://doc.rust-lang.org/std/num/struct.Wrapping.html#method.reverse_bits
[`i128::reverse_bits`]: https://doc.rust-lang.org/std/primitive.i128.html#method.reverse_bits
[`i16::reverse_bits`]: https://doc.rust-lang.org/std/primitive.i16.html#method.reverse_bits
[`i32::reverse_bits`]: https://doc.rust-lang.org/std/primitive.i32.html#method.reverse_bits
[`i64::reverse_bits`]: https://doc.rust-lang.org/std/primitive.i64.html#method.reverse_bits
[`i8::reverse_bits`]: https://doc.rust-lang.org/std/primitive.i8.html#method.reverse_bits
[`isize::reverse_bits`]: https://doc.rust-lang.org/std/primitive.isize.html#method.reverse_bits
[`slice::copy_within`]: https://doc.rust-lang.org/std/primitive.slice.html#method.copy_within
[`u128::reverse_bits`]: https://doc.rust-lang.org/std/primitive.u128.html#method.reverse_bits
[`u16::reverse_bits`]: https://doc.rust-lang.org/std/primitive.u16.html#method.reverse_bits
[`u32::reverse_bits`]: https://doc.rust-lang.org/std/primitive.u32.html#method.reverse_bits
[`u64::reverse_bits`]: https://doc.rust-lang.org/std/primitive.u64.html#method.reverse_bits
[`u8::reverse_bits`]: https://doc.rust-lang.org/std/primitive.u8.html#method.reverse_bits
[`usize::reverse_bits`]: https://doc.rust-lang.org/std/primitive.usize.html#method.reverse_bits
[rustc-book-pgo]: https://doc.rust-lang.org/rustc/profile-guided-optimization.html


Version 1.36.0 (2019-07-04)
==========================

Language
--------
- [Non-Lexical Lifetimes are now enabled on the 2015 edition.][59114]
- [The order of traits in trait objects no longer affects the semantics of that
  object.][59445] e.g. `dyn Send + fmt::Debug` is now equivalent to
  `dyn fmt::Debug + Send`, where this was previously not the case.

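The equivalence described above can be checked with a small sketch (the function and values here are invented for illustration): a value annotated with one trait order can be passed where the other order is written, because both now name the same type.

```rust
use std::fmt;

// Post-change, `dyn Send + fmt::Debug` and `dyn fmt::Debug + Send`
// denote the same trait-object type.
fn describe(value: Box<dyn Send + fmt::Debug>) -> String {
    format!("{:?}", value)
}

fn main() {
    // Note the reversed trait order in the annotation.
    let v: Box<dyn fmt::Debug + Send> = Box::new(vec![1, 2, 3]);
    assert_eq!(describe(v), "[1, 2, 3]");
}
```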
Libraries
---------
- [`HashMap`'s implementation has been replaced with the `hashbrown::HashMap`
  implementation.][58623]
- [`TryFromSliceError` now implements `From<Infallible>`.][60318]
- [`mem::needs_drop` is now available as a const fn.][60364]
- [`alloc::Layout::from_size_align_unchecked` is now available as a const fn.][60370]
- [`String` now implements `BorrowMut<str>`.][60404]
- [`io::Cursor` now implements `Default`.][60234]
- [Both `NonNull::{dangling, cast}` are now const fns.][60244]
- [The `alloc` crate is now stable.][59675] `alloc` allows you to use a subset
  of `std` (e.g. `Vec`, `Box`, `Arc`) in `#![no_std]` environments if the
  environment has access to heap memory allocation.
- [`String` now implements `From<&String>`.][59825]
- [You can now pass multiple arguments to the `dbg!` macro.][59826] `dbg!` will
  return a tuple of each argument when there are multiple arguments.
- [`Result::{is_err, is_ok}` are now `#[must_use]` and will produce a warning if
  not used.][59648]

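The multi-argument `dbg!` behavior above can be sketched as follows (values are illustrative; the macro also logs each value with file and line information to stderr):

```rust
fn main() {
    // With several arguments, dbg! evaluates each one, prints it to
    // stderr, and returns all of them as a tuple, so it can be dropped
    // into an existing expression without changing its shape.
    let (sum, len) = dbg!(2 + 2, "two".len());
    assert_eq!(sum, 4);
    assert_eq!(len, 3);
}
```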
Stabilized APIs
---------------
- [`VecDeque::rotate_left`]
- [`VecDeque::rotate_right`]
- [`Iterator::copied`]
- [`io::IoSlice`]
- [`io::IoSliceMut`]
- [`Read::read_vectored`]
- [`Write::write_vectored`]
- [`str::as_mut_ptr`]
- [`mem::MaybeUninit`]
- [`pointer::align_offset`]
- [`future::Future`]
- [`task::Context`]
- [`task::RawWaker`]
- [`task::RawWakerVTable`]
- [`task::Waker`]
- [`task::Poll`]

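Two of the APIs stabilized above, `VecDeque::rotate_left` and `Iterator::copied`, can be sketched together (values chosen for illustration):

```rust
use std::collections::VecDeque;

fn main() {
    let mut ring: VecDeque<i32> = (1..=5).collect();
    // rotate_left(2) moves the front two elements to the back in place.
    ring.rotate_left(2);
    assert_eq!(ring.iter().copied().collect::<Vec<_>>(), vec![3, 4, 5, 1, 2]);

    // Iterator::copied turns an iterator over &i32 into one over i32,
    // which composes naturally with adapters like sum.
    let total: i32 = ring.iter().copied().sum();
    assert_eq!(total, 15);
}
```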
Cargo
-----
- [Cargo will now produce an error if you attempt to use the name of a required dependency as a feature.][cargo/6860]
- [You can now pass the `--offline` flag to run cargo without accessing the network.][cargo/6934]

You can find further changes in [Cargo's 1.36.0 release notes][cargo-1-36-0].

Clippy
------
There have been numerous additions and fixes to Clippy; see [Clippy's 1.36.0 release notes][clippy-1-36-0] for more details.

Misc
----

Compatibility Notes
-------------------
- With the stabilisation of `mem::MaybeUninit`, `mem::uninitialized` use is no
  longer recommended, and will be deprecated in 1.39.0.

[60318]: https://github.com/rust-lang/rust/pull/60318/
[60364]: https://github.com/rust-lang/rust/pull/60364/
[60370]: https://github.com/rust-lang/rust/pull/60370/
[60404]: https://github.com/rust-lang/rust/pull/60404/
[60234]: https://github.com/rust-lang/rust/pull/60234/
[60244]: https://github.com/rust-lang/rust/pull/60244/
[58623]: https://github.com/rust-lang/rust/pull/58623/
[59648]: https://github.com/rust-lang/rust/pull/59648/
[59675]: https://github.com/rust-lang/rust/pull/59675/
[59825]: https://github.com/rust-lang/rust/pull/59825/
[59826]: https://github.com/rust-lang/rust/pull/59826/
[59445]: https://github.com/rust-lang/rust/pull/59445/
[59114]: https://github.com/rust-lang/rust/pull/59114/
[cargo/6860]: https://github.com/rust-lang/cargo/pull/6860/
[cargo/6934]: https://github.com/rust-lang/cargo/pull/6934/
[`VecDeque::rotate_left`]: https://doc.rust-lang.org/std/collections/struct.VecDeque.html#method.rotate_left
[`VecDeque::rotate_right`]: https://doc.rust-lang.org/std/collections/struct.VecDeque.html#method.rotate_right
[`Iterator::copied`]: https://doc.rust-lang.org/std/iter/trait.Iterator.html#tymethod.copied
[`io::IoSlice`]: https://doc.rust-lang.org/std/io/struct.IoSlice.html
[`io::IoSliceMut`]: https://doc.rust-lang.org/std/io/struct.IoSliceMut.html
[`Read::read_vectored`]: https://doc.rust-lang.org/std/io/trait.Read.html#method.read_vectored
[`Write::write_vectored`]: https://doc.rust-lang.org/std/io/trait.Write.html#method.write_vectored
[`str::as_mut_ptr`]: https://doc.rust-lang.org/std/primitive.str.html#method.as_mut_ptr
[`mem::MaybeUninit`]: https://doc.rust-lang.org/std/mem/union.MaybeUninit.html
[`pointer::align_offset`]: https://doc.rust-lang.org/std/primitive.pointer.html#method.align_offset
[`future::Future`]: https://doc.rust-lang.org/std/future/trait.Future.html
[`task::Context`]: https://doc.rust-lang.org/beta/std/task/struct.Context.html
[`task::RawWaker`]: https://doc.rust-lang.org/beta/std/task/struct.RawWaker.html
[`task::RawWakerVTable`]: https://doc.rust-lang.org/beta/std/task/struct.RawWakerVTable.html
[`task::Waker`]: https://doc.rust-lang.org/beta/std/task/struct.Waker.html
[`task::Poll`]: https://doc.rust-lang.org/beta/std/task/enum.Poll.html
[clippy-1-36-0]: https://github.com/rust-lang/rust-clippy/blob/master/CHANGELOG.md#rust-136
[cargo-1-36-0]: https://github.com/rust-lang/cargo/blob/master/CHANGELOG.md#cargo-136-2019-07-04


Version 1.35.0 (2019-05-23)
==========================

@@ -62,7 +281,7 @@ Cargo
- [You can now set `cargo:rustc-cdylib-link-arg` at build time to pass custom
  linker arguments when building a `cdylib`.][cargo/6298] Its usage is highly
  platform specific.


Misc
----
- [The Rust toolchain is now available natively for musl based distros.][58575]
@@ -157,7 +376,7 @@ Libraries
  produce a warning if their returning type is unused.
- [The methods `checked_pow`, `saturating_pow`, `wrapping_pow`, and
  `overflowing_pow` are now available for all numeric types.][57873] These are
  equivalvent to methods such as `wrapping_add` for the `pow` operation.
  equivalent to methods such as `wrapping_add` for the `pow` operation.


Stabilized APIs
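The four `pow` variants mentioned in the hunk above differ only in how they report overflow; a minimal sketch (values chosen for illustration):

```rust
fn main() {
    // checked_pow reports overflow as None instead of panicking.
    assert_eq!(3u8.checked_pow(4), Some(81));
    assert_eq!(3u8.checked_pow(6), None); // 729 does not fit in a u8

    // saturating_pow clamps at the type's bounds; wrapping_pow wraps
    // modulo 2^8; overflowing_pow also reports whether wrapping occurred.
    assert_eq!(3u8.saturating_pow(6), 255);
    assert_eq!(3u8.wrapping_pow(6), 217); // 729 % 256
    assert_eq!(3u8.overflowing_pow(6), (217, true));
}
```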
@@ -208,9 +427,9 @@ Misc

Compatibility Notes
-------------------
- [`Command::before_exec` is now deprecated in favor of the
  unsafe method `Command::pre_exec`.][58059]
- [Use of `ATOMIC_{BOOL, ISIZE, USIZE}_INIT` is now deprecated.][57425] As you
- [`Command::before_exec` is being replaced by the unsafe method
  `Command::pre_exec`][58059] and will be deprecated with Rust 1.37.0.
- [Use of `ATOMIC_{BOOL, ISIZE, USIZE}_INIT` is now deprecated][57425] as you
  can now use `const` functions in `static` variables.

[58370]: https://github.com/rust-lang/rust/pull/58370/
@@ -320,7 +539,7 @@ Compiler
--------
- [You can now set a linker flavor for `rustc` with the `-Clinker-flavor`
  command line argument.][56351]
- [The mininum required LLVM version has been bumped to 6.0.][56642]
- [The minimum required LLVM version has been bumped to 6.0.][56642]
- [Added support for the PowerPC64 architecture on FreeBSD.][57615]
- [The `x86_64-fortanix-unknown-sgx` target support has been upgraded to
  tier 2 support.][57130] Visit the [platform support][platform-support] page for
@@ -751,7 +970,7 @@ Compiler

Libraries
---------
- [You can now convert `num::NonZero*` types to their raw equivalvents using the
- [You can now convert `num::NonZero*` types to their raw equivalents using the
  `From` trait.][54240] E.g. `u8` now implements `From<NonZeroU8>`.
- [You can now convert a `&Option<T>` into `Option<&T>` and `&mut Option<T>`
  into `Option<&mut T>` using the `From` trait.][53218]
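The two `From` conversions in the hunk above can be sketched together (values are illustrative):

```rust
use std::num::NonZeroU8;

fn main() {
    // NonZeroU8 -> u8 via From.
    let nz = NonZeroU8::new(12).expect("12 is non-zero");
    let raw: u8 = u8::from(nz);
    assert_eq!(raw, 12);

    // &mut Option<T> -> Option<&mut T> via From, giving mutable access
    // to the contents without taking ownership of the Option.
    let mut maybe = Some(5);
    let inner: Option<&mut i32> = Option::from(&mut maybe);
    if let Some(v) = inner {
        *v += 1;
    }
    assert_eq!(maybe, Some(6));
}
```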
@@ -944,7 +1163,7 @@ Security Notes
  caused by an integer overflow. This has been fixed by deterministically
  panicking when an overflow happens.

Thank you to Scott McMurray for responsibily disclosing this vulnerability to
Thank you to Scott McMurray for responsibly disclosing this vulnerability to
us.

@@ -1216,7 +1435,7 @@ Security Notes
  given machine. This release fixes that vulnerability; you can read
  more about this on the [blog][rustdoc-sec]. The associated CVE is [CVE-2018-1000622].

Thank you to Red Hat for responsibily disclosing this vulnerability to us.
Thank you to Red Hat for responsibly disclosing this vulnerability to us.

Compatibility Notes
-------------------

275  appveyor.yml
@@ -1,275 +0,0 @@
environment:
  # This is required for at least an AArch64 compiler in one image, and is also
  # going to soon be required for compiling LLVM.
  APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2017 Preview

  # By default schannel checks revocation of certificates unlike some other SSL
  # backends, but we've historically had problems on CI where a revocation
  # server goes down presumably. See #43333 for more info
  CARGO_HTTP_CHECK_REVOKE: false

  matrix:
  # 32/64 bit MSVC tests
  - CI_JOB_NAME: x86_64-msvc
    MSYS_BITS: 64
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-profiler
    SCRIPT: python x.py test
    # FIXME(#59637)
    NO_DEBUG_ASSERTIONS: 1
    NO_LLVM_ASSERTIONS: 1
  - CI_JOB_NAME: i686-msvc-1
    MSYS_BITS: 32
    RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
    SCRIPT: make appveyor-subset-1
  - CI_JOB_NAME: i686-msvc-2
    MSYS_BITS: 32
    RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
    SCRIPT: make appveyor-subset-2

  # MSVC aux tests
  - CI_JOB_NAME: x86_64-msvc-aux
    MSYS_BITS: 64
    RUST_CHECK_TARGET: check-aux EXCLUDE_CARGO=1
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc
  - CI_JOB_NAME: x86_64-msvc-cargo
    MSYS_BITS: 64
    SCRIPT: python x.py test src/tools/cargotest src/tools/cargo
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc

  # MSVC tools tests
  - CI_JOB_NAME: x86_64-msvc-tools
    MSYS_BITS: 64
    SCRIPT: src/ci/docker/x86_64-gnu-tools/checktools.sh x.py /tmp/toolstates.json windows
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --save-toolstates=/tmp/toolstates.json --enable-test-miri

  # 32/64-bit MinGW builds.
  #
  # We are using MinGW with posix threads since LLVM does not compile with
  # the win32 threads version due to missing support for C++'s std::thread.
  #
  # Instead of relying on the MinGW version installed on appveyor we download
  # and install one ourselves so we won't be surprised by changes to appveyor's
  # build image.
  #
  # Finally, note that the downloads below are all in the `rust-lang-ci` S3
  # bucket, but they clearly didn't originate there! The downloads originally
  # came from the mingw-w64 SourceForge download site. Unfortunately
  # SourceForge is notoriously flaky, so we mirror it on our own infrastructure.
  - CI_JOB_NAME: i686-mingw-1
    MSYS_BITS: 32
    RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
    SCRIPT: make appveyor-subset-1
    MINGW_URL: https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
    MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
    MINGW_DIR: mingw32
    # FIXME(#59637)
    NO_DEBUG_ASSERTIONS: 1
    NO_LLVM_ASSERTIONS: 1
  - CI_JOB_NAME: i686-mingw-2
    MSYS_BITS: 32
    RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
    SCRIPT: make appveyor-subset-2
    MINGW_URL: https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
    MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
    MINGW_DIR: mingw32
  - CI_JOB_NAME: x86_64-mingw
    MSYS_BITS: 64
    SCRIPT: python x.py test
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
    MINGW_URL: https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
    MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
    MINGW_DIR: mingw64
    # FIXME(#59637)
    NO_DEBUG_ASSERTIONS: 1
    NO_LLVM_ASSERTIONS: 1

  # 32/64 bit MSVC and GNU deployment
  - CI_JOB_NAME: dist-x86_64-msvc
    RUST_CONFIGURE_ARGS: >
      --build=x86_64-pc-windows-msvc
      --target=x86_64-pc-windows-msvc,aarch64-pc-windows-msvc
      --enable-full-tools
      --enable-profiler
    SCRIPT: python x.py dist
    DIST_REQUIRE_ALL_TOOLS: 1
    DEPLOY: 1
  - CI_JOB_NAME: dist-i686-msvc
    RUST_CONFIGURE_ARGS: >
      --build=i686-pc-windows-msvc
      --target=i586-pc-windows-msvc
      --enable-full-tools
      --enable-profiler
    SCRIPT: python x.py dist
    DIST_REQUIRE_ALL_TOOLS: 1
    DEPLOY: 1
  - CI_JOB_NAME: dist-i686-mingw
    MSYS_BITS: 32
    RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu --enable-full-tools
    SCRIPT: python x.py dist
    MINGW_URL: https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
    MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
    MINGW_DIR: mingw32
    DIST_REQUIRE_ALL_TOOLS: 1
    DEPLOY: 1
  - CI_JOB_NAME: dist-x86_64-mingw
    MSYS_BITS: 64
    SCRIPT: python x.py dist
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu --enable-full-tools
    MINGW_URL: https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
    MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
    MINGW_DIR: mingw64
    DIST_REQUIRE_ALL_TOOLS: 1
    DEPLOY: 1

  # "alternate" deployment, see .travis.yml for more info
  - CI_JOB_NAME: dist-x86_64-msvc-alt
    MSYS_BITS: 64
    RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-extended --enable-profiler
    SCRIPT: python x.py dist
    DEPLOY_ALT: 1

matrix:
  fast_finish: true

clone_depth: 2
build: false

install:
  # Print which AppVeyor agent version we're running on.
  - appveyor version
  # If we need to download a custom MinGW, do so here and set the path
  # appropriately.
  #
  # Note that this *also* means that we're not using what is typically
  # /mingw32/bin/python2.7.exe, which is a "correct" python interpreter where
  # /usr/bin/python2.7.exe is not. To ensure we use the right interpreter we
  # move `C:\Python27` ahead in PATH and then also make sure the `python2.7.exe`
  # file exists in there (which it doesn't by default).
  - if defined MINGW_URL appveyor-retry appveyor DownloadFile %MINGW_URL%/%MINGW_ARCHIVE%
  - if defined MINGW_URL 7z x -y %MINGW_ARCHIVE% > nul
  - if defined MINGW_URL set PATH=%CD%\%MINGW_DIR%\bin;C:\msys64\usr\bin;%PATH%

  # If we're compiling for MSVC then we, like most other distribution builders,
  # switch to clang as the compiler. This'll allow us eventually to enable LTO
  # amongst LLVM and rustc. Note that we only do this on MSVC as I don't think
  # clang has an output mode compatible with MinGW that we need. If it does we
  # should switch to clang for MinGW as well!
  #
  # Note that the LLVM installer is an NSIS installer
  #
  # Original downloaded here came from
  # http://releases.llvm.org/8.0.0/LLVM-8.0.0-win64.exe
  - if NOT defined MINGW_URL appveyor-retry appveyor DownloadFile https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/LLVM-8.0.0-win64.exe
  - if NOT defined MINGW_URL .\LLVM-8.0.0-win64.exe /S /NCRC /D=C:\clang-rust
  - if NOT defined MINGW_URL set RUST_CONFIGURE_ARGS=%RUST_CONFIGURE_ARGS% --set llvm.clang-cl=C:\clang-rust\bin\clang-cl.exe

  # Here we do a pretty heinous thing which is to mangle the MinGW installation
  # we just had above. Currently, as of this writing, we're using MinGW-w64
  # builds of gcc, and that's currently at 6.3.0. We use 6.3.0 as it appears to
  # be the first version which contains a fix for #40546, builds randomly
  # failing during LLVM due to ar.exe/ranlib.exe failures.
  #
  # Unfortunately, though, 6.3.0 *also* is the first version of MinGW-w64 builds
  # to contain a regression in gdb (#40184). As a result if we were to use the
  # gdb provided (7.11.1) then we would fail all debuginfo tests.
  #
  # In order to fix spurious failures (pretty high priority) we use 6.3.0. To
  # avoid disabling gdb tests we download an *old* version of gdb, specifically
  # that found inside the 6.2.0 distribution. We then overwrite the 6.3.0 gdb
  # with the 6.2.0 gdb to get tests passing.
  #
  # Note that we don't literally overwrite the gdb.exe binary because it appears
  # to just use gdborig.exe, so that's the binary we deal with instead.
  - if defined MINGW_URL appveyor-retry appveyor DownloadFile %MINGW_URL%/2017-04-20-%MSYS_BITS%bit-gdborig.exe
  - if defined MINGW_URL mv 2017-04-20-%MSYS_BITS%bit-gdborig.exe %MINGW_DIR%\bin\gdborig.exe

  # Otherwise pull in the MinGW installed on appveyor
  - if NOT defined MINGW_URL set PATH=C:\msys64\mingw%MSYS_BITS%\bin;C:\msys64\usr\bin;%PATH%

  # Prefer the "native" Python as LLVM has trouble building with MSYS sometimes
  - copy C:\Python27\python.exe C:\Python27\python2.7.exe
  - set PATH=C:\Python27;%PATH%

  # Download and install sccache
  - appveyor-retry appveyor DownloadFile https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2018-04-26-sccache-x86_64-pc-windows-msvc
  - mv 2018-04-26-sccache-x86_64-pc-windows-msvc sccache.exe
  - set PATH=%PATH%;%CD%

  # Download and install ninja
  #
  # Note that this is originally from the github releases patch of Ninja
  - appveyor-retry appveyor DownloadFile https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2017-03-15-ninja-win.zip
  - 7z x 2017-03-15-ninja-win.zip
  - set RUST_CONFIGURE_ARGS=%RUST_CONFIGURE_ARGS% --enable-ninja
  # - set PATH=%PATH%;%CD% -- this already happens above for sccache

  # Install InnoSetup to get `iscc` used to produce installers
  - appveyor-retry appveyor DownloadFile https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2017-08-22-is.exe
  - 2017-08-22-is.exe /VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-
  - set PATH="C:\Program Files (x86)\Inno Setup 5";%PATH%

  # Help debug some handle issues on AppVeyor
  - appveyor-retry appveyor DownloadFile https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2017-05-15-Handle.zip
  - mkdir handle
  - 7z x -ohandle 2017-05-15-Handle.zip
  - set PATH=%PATH%;%CD%\handle
  - handle.exe -accepteula -help

test_script:
  - if not exist C:\cache\rustsrc\NUL mkdir C:\cache\rustsrc
  - sh src/ci/init_repo.sh . /c/cache/rustsrc
  - set SRC=.
  - set NO_CCACHE=1
  - sh src/ci/run.sh

on_failure:
  # Dump crash log
  - set PATH=%PATH%;"C:\Program Files (x86)\Windows Kits\10\Debuggers\X64"
  - if exist %LOCALAPPDATA%\CrashDumps for %%f in (%LOCALAPPDATA%\CrashDumps\*) do cdb -c "k;q" -G -z "%%f"

branches:
  only:
    - auto

before_deploy:
|
||||
- ps: |
|
||||
New-Item -Path deploy -ItemType directory
|
||||
Remove-Item -Recurse -Force build\dist\doc
|
||||
Get-ChildItem -Path build\dist | Move-Item -Destination deploy
|
||||
Get-ChildItem -Path deploy | Foreach-Object {
|
||||
Push-AppveyorArtifact $_.FullName -FileName ${env:APPVEYOR_REPO_COMMIT}/$_
|
||||
}
|
||||
|
||||
deploy:
|
||||
- provider: S3
|
||||
access_key_id: $(AWS_ACCESS_KEY_ID)
|
||||
secret_access_key: $(AWS_SECRET_ACCESS_KEY)
|
||||
bucket: rust-lang-ci2
|
||||
set_public: true
|
||||
region: us-west-1
|
||||
artifact: /.*/
|
||||
folder: rustc-builds
|
||||
on:
|
||||
branch: auto
|
||||
DEPLOY: 1
|
||||
max_error_retry: 5
|
||||
|
||||
# This provider is the same as the one above except that it has a slightly
|
||||
# different upload directory and a slightly different trigger
|
||||
- provider: S3
|
||||
access_key_id: $(AWS_ACCESS_KEY_ID)
|
||||
secret_access_key: $(AWS_SECRET_ACCESS_KEY)
|
||||
bucket: rust-lang-ci2
|
||||
set_public: true
|
||||
region: us-west-1
|
||||
artifact: /.*/
|
||||
folder: rustc-builds-alt
|
||||
on:
|
||||
branch: auto
|
||||
DEPLOY_ALT: 1
|
||||
max_error_retry: 5
|
||||
|
||||
# init:
|
||||
# - ps: iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
|
||||
# on_finish:
|
||||
# - ps: $blockRdp = $true; iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
|
||||
|
|
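The install steps above lean heavily on `appveyor-retry` to paper over flaky downloads: try the command, swallow transient failures, and only fail once the retry budget runs out. A minimal sketch of that pattern in Python (the function names, retry count, and demo download are illustrative assumptions, not part of the CI scripts):

```python
import time

def retry(action, attempts=5, delay=0.0):
    """Run `action` until it succeeds, retrying on failure.

    Mirrors the spirit of `appveyor-retry`: transient errors are
    swallowed until the attempt budget is exhausted, then the last
    error propagates to the caller.
    """
    last_exc = None
    for _ in range(attempts):
        try:
            return action()
        except Exception as exc:  # a real script would narrow this
            last_exc = exc
            time.sleep(delay)
    raise last_exc

# Demo with a hypothetical download that fails twice before succeeding.
calls = {"n": 0}

def flaky_download():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient network error")
    return "sccache.exe"

assert retry(flaky_download) == "sccache.exe"
assert calls["n"] == 3
```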
@@ -57,14 +57,13 @@
 # support. You'll need to write a target specification at least, and most
 # likely, teach rustc about the C ABI of the target. Get in touch with the
 # Rust team and file an issue if you need assistance in porting!
-#targets = "X86;ARM;AArch64;Mips;PowerPC;SystemZ;MSP430;Sparc;NVPTX;Hexagon"
+#targets = "AArch64;ARM;Hexagon;MSP430;Mips;NVPTX;PowerPC;RISCV;Sparc;SystemZ;WebAssembly;X86"
 
 # LLVM experimental targets to build support for. These targets are specified in
 # the same format as above, but since these targets are experimental, they are
 # not built by default and the experimental Rust compilation targets that depend
-# on them will not work unless the user opts in to building them. By default the
-# `WebAssembly` and `RISCV` targets are enabled when compiling LLVM from scratch.
-#experimental-targets = "WebAssembly;RISCV"
+# on them will not work unless the user opts in to building them.
+#experimental-targets = ""
 
 # Cap the number of parallel linker invocations when compiling LLVM.
 # This can be useful when building LLVM with debug info, which significantly

@@ -142,10 +141,10 @@
 # library and facade crates.
 #compiler-docs = false
 
-# Indicate whether submodules are managed and updated automatically.
+# Indicate whether git submodules are managed and updated automatically.
 #submodules = true
 
-# Update submodules only when the checked out commit in the submodules differs
+# Update git submodules only when the checked out commit in the submodules differs
 # from what is committed in the main rustc repo.
 #fast-submodules = true
 

@@ -301,20 +300,27 @@
 # library.
 #debug-assertions = false
 
-# Whether or not debuginfo is emitted
-#debuginfo = false
+# Debuginfo level for most of Rust code, corresponds to the `-C debuginfo=N` option of `rustc`.
+# `0` - no debug info
+# `1` - line tables only
+# `2` - full debug info with variable and type information
+# Can be overriden for specific subsets of Rust code (rustc, std or tools).
+# Debuginfo for tests run with compiletest is not controlled by this option
+# and needs to be enabled separately with `debuginfo-level-tests`.
+#debuginfo-level = if debug { 2 } else { 0 }
 
-# Whether or not line number debug information is emitted
-#debuginfo-lines = false
+# Debuginfo level for the compiler.
+#debuginfo-level-rustc = debuginfo-level
 
-# Whether or not to only build debuginfo for the standard library if enabled.
-# If enabled, this will not compile the compiler with debuginfo, just the
-# standard library.
-#debuginfo-only-std = false
+# Debuginfo level for the standard library.
+#debuginfo-level-std = debuginfo-level
 
-# Enable debuginfo for the extended tools: cargo, rls, rustfmt
-# Adding debuginfo makes them several times larger.
-#debuginfo-tools = false
+# Debuginfo level for the tools.
+#debuginfo-level-tools = debuginfo-level
+
+# Debuginfo level for the test suites run with compiletest.
+# FIXME(#61117): Some tests fail when this option is enabled.
+#debuginfo-level-tests = 0
 
 # Whether or not `panic!`s generate backtraces (RUST_BACKTRACE)
 #backtrace = true

@@ -345,10 +351,8 @@
 # harness are debuggable just from logfiles.
 #verbose-tests = false
 
-# Flag indicating whether tests are compiled with optimizations (the -O flag) or
-# with debuginfo (the -g flag)
+# Flag indicating whether tests are compiled with optimizations (the -O flag).
 #optimize-tests = true
-#debuginfo-tests = true
 
 # Flag indicating whether codegen tests will be run or not. If you get an error
 # saying that the FileCheck executable is missing, you may want to disable this.

@@ -364,10 +368,6 @@
 # When creating source tarballs whether or not to create a source tarball.
 #dist-src = false
 
-# Whether to also run the Miri tests suite when running tests.
-# As a side-effect also generates MIR for all libraries.
-#test-miri = false
-
 # After building or testing extended tools (e.g. clippy and rustfmt), append the
 # result (broken, compiling, testing) into this JSON file.
 #save-toolstates = "/path/to/toolstates.json"
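The new `debuginfo-level-*` options above cascade: each specific level (`rustc`, `std`, `tools`) falls back to the shared `debuginfo-level`, which itself defaults to 2 when `debug = true` and 0 otherwise, while `debuginfo-level-tests` defaults to 0. A small sketch of how such cascading defaults resolve (a hypothetical helper, not rustbuild's actual code):

```python
def resolve_debuginfo(config):
    """Resolve debuginfo levels the way the options above describe:
    each specific level falls back to `debuginfo-level`, which falls
    back to 2 when `debug = true` and 0 otherwise."""
    base = config.get("debuginfo-level")
    if base is None:
        base = 2 if config.get("debug", False) else 0
    return {
        "rustc": config.get("debuginfo-level-rustc", base),
        "std": config.get("debuginfo-level-std", base),
        "tools": config.get("debuginfo-level-tools", base),
        # Tests are special-cased: they default to 0, not to `base`.
        "tests": config.get("debuginfo-level-tests", 0),
    }

levels = resolve_debuginfo({"debug": True, "debuginfo-level-tools": 1})
assert levels == {"rustc": 2, "std": 2, "tools": 1, "tests": 0}
```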
46 src/.gitignore vendored
@@ -1,46 +0,0 @@
-*.a
-*.aux
-*.bc
-*.boot
-*.bz2
-*.cmi
-*.cmo
-*.cmx
-*.cp
-*.cps
-*.d
-*.dSYM
-*.def
-*.diff
-*.dll
-*.dylib
-*.elc
-*.epub
-*.exe
-*.fn
-*.html
-*.kdev4
-*.ky
-*.ll
-*.llvm
-*.log
-*.o
-*.orig
-*.out
-*.patch
-*.pdb
-*.pdf
-*.pg
-*.pot
-*.pyc
-*.rej
-*.rlib
-*.rustc
-*.so
-*.swo
-*.swp
-*.tmp
-*.toc
-*.tp
-*.vr
-*.x86
@@ -5,10 +5,7 @@ This directory contains the source code of the rust project, including:
 
 For more information on how various parts of the compiler work, see the [rustc guide].
 
-There is also useful content in the following READMEs, which are gradually being moved over to the guide:
-- https://github.com/rust-lang/rust/tree/master/src/librustc/ty/query
-- https://github.com/rust-lang/rust/tree/master/src/librustc/dep_graph
-- https://github.com/rust-lang/rust/tree/master/src/librustc/infer/higher_ranked
-- https://github.com/rust-lang/rust/tree/master/src/librustc/infer/lexical_region_resolve
+There is also useful content in this README:
+https://github.com/rust-lang/rust/tree/master/src/librustc/infer/lexical_region_resolve.
 
 [rustc guide]: https://rust-lang.github.io/rustc-guide/about-this-guide.html
@@ -55,11 +55,11 @@ The script accepts commands, flags, and arguments to determine what to do:
     # run all unit tests
     ./x.py test
 
-    # execute the run-pass test suite
-    ./x.py test src/test/run-pass
+    # execute the UI test suite
+    ./x.py test src/test/ui
 
-    # execute only some tests in the run-pass test suite
-    ./x.py test src/test/run-pass --test-args substring-of-test-name
+    # execute only some tests in the UI test suite
+    ./x.py test src/test/ui --test-args substring-of-test-name
 
     # execute tests in the standard library in stage0
     ./x.py test --stage 0 src/libstd

@@ -215,7 +215,7 @@ build/
 
   # Output for all compiletest-based test suites
   test/
-    run-pass/
+    ui/
     compile-fail/
     debuginfo/
     ...
@@ -5,7 +5,8 @@
 //! parent directory, and otherwise documentation can be found throughout the `build`
 //! directory in each respective module.
 
-#![deny(warnings)]
+// NO-RUSTC-WRAPPER
+#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
 
 use std::env;
@@ -15,7 +15,8 @@
 //! switching compilers for the bootstrap and for build scripts will probably
 //! never get replaced.
 
-#![deny(warnings)]
+// NO-RUSTC-WRAPPER
+#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
 
 use std::env;
 use std::ffi::OsString;

@@ -36,7 +37,7 @@ fn main() {
             let mut new = None;
             if let Some(current_as_str) = args[i].to_str() {
                 if (&*args[i - 1] == "-C" && current_as_str.starts_with("metadata")) ||
-                   current_as_str.starts_with("-Cmetadata") {
+                    current_as_str.starts_with("-Cmetadata") {
                     new = Some(format!("{}-{}", current_as_str, s));
                 }
             }

@@ -44,18 +45,6 @@ fn main() {
         }
     }
 
-    // Drop `--error-format json` because despite our desire for json messages
-    // from Cargo we don't want any from rustc itself.
-    if let Some(n) = args.iter().position(|n| n == "--error-format") {
-        args.remove(n);
-        args.remove(n);
-    }
-
-    if let Some(s) = env::var_os("RUSTC_ERROR_FORMAT") {
-        args.push("--error-format".into());
-        args.push(s);
-    }
-
     // Detect whether or not we're a build script depending on whether --target
     // is passed (a bit janky...)
     let target = args.windows(2)

@@ -89,11 +78,38 @@ fn main() {
 
     let mut cmd = Command::new(rustc);
     cmd.args(&args)
-        .arg("--cfg")
-        .arg(format!("stage{}", stage))
         .env(bootstrap::util::dylib_path_var(),
              env::join_paths(&dylib_path).unwrap());
-    let mut maybe_crate = None;
+
+    // Get the name of the crate we're compiling, if any.
+    let crate_name = args.windows(2)
+        .find(|args| args[0] == "--crate-name")
+        .and_then(|args| args[1].to_str());
+
+    if let Some(crate_name) = crate_name {
+        if let Some(target) = env::var_os("RUSTC_TIME") {
+            if target == "all" ||
+               target.into_string().unwrap().split(",").any(|c| c.trim() == crate_name)
+            {
+                cmd.arg("-Ztime");
+            }
+        }
+    }
+
+    // Non-zero stages must all be treated uniformly to avoid problems when attempting to uplift
+    // compiler libraries and such from stage 1 to 2.
+    //
+    // FIXME: the fact that core here is excluded is due to core_arch from our stdarch submodule
+    // being broken on the beta compiler with bootstrap passed, so this is a temporary workaround
+    // (we've just snapped, so there are no cfg(bootstrap) related annotations in core).
+    if stage == "0" {
+        if crate_name != Some("core") {
+            cmd.arg("--cfg").arg("bootstrap");
+        } else {
+            // NOTE(eddyb) see FIXME above, except now we need annotations again in core.
+            cmd.arg("--cfg").arg("boostrap_stdarch_ignore_this");
+        }
+    }
+
     // Print backtrace in case of ICE
     if env::var("RUSTC_BACKTRACE_ON_ICE").is_ok() && env::var("RUST_BACKTRACE").is_err() {

@@ -102,10 +118,30 @@ fn main() {
 
     cmd.env("RUSTC_BREAK_ON_ICE", "1");
 
+    if let Ok(debuginfo_level) = env::var("RUSTC_DEBUGINFO_LEVEL") {
+        cmd.arg(format!("-Cdebuginfo={}", debuginfo_level));
+    }
+
+    if env::var_os("RUSTC_DENY_WARNINGS").is_some() &&
+       env::var_os("RUSTC_EXTERNAL_TOOL").is_none() {
+        // When extending this list, search for `NO-RUSTC-WRAPPER` and add the new lints
+        // there as well, some code doesn't go through this `rustc` wrapper.
+        cmd.arg("-Dwarnings");
+        cmd.arg("-Drust_2018_idioms");
+        cmd.arg("-Dunused_lifetimes");
+        if use_internal_lints(crate_name) {
+            cmd.arg("-Zunstable-options");
+            cmd.arg("-Drustc::internal");
+        }
+    }
+
     if let Some(target) = target {
         // The stage0 compiler has a special sysroot distinct from what we
-        // actually downloaded, so we just always pass the `--sysroot` option.
-        cmd.arg("--sysroot").arg(&sysroot);
+        // actually downloaded, so we just always pass the `--sysroot` option,
+        // unless one is already set.
+        if !args.iter().any(|arg| arg == "--sysroot") {
+            cmd.arg("--sysroot").arg(&sysroot);
+        }
 
         cmd.arg("-Zexternal-macro-backtrace");

@@ -144,12 +180,6 @@ fn main() {
             cmd.arg(format!("-Clinker={}", target_linker));
         }
 
-        let crate_name = args.windows(2)
-            .find(|a| &*a[0] == "--crate-name")
-            .unwrap();
-        let crate_name = &*crate_name[1];
-        maybe_crate = Some(crate_name);
-
         // If we're compiling specifically the `panic_abort` crate then we pass
         // the `-C panic=abort` option. Note that we do not do this for any
         // other crate intentionally as this is the only crate for now that we

@@ -162,18 +192,13 @@ fn main() {
         // `compiler_builtins` are unconditionally compiled with panic=abort to
         // workaround undefined references to `rust_eh_unwind_resume` generated
         // otherwise, see issue https://github.com/rust-lang/rust/issues/43095.
-        if crate_name == "panic_abort" ||
-           crate_name == "compiler_builtins" && stage != "0" {
+        if crate_name == Some("panic_abort") ||
+           crate_name == Some("compiler_builtins") && stage != "0" {
             cmd.arg("-C").arg("panic=abort");
         }
 
         // Set various options from config.toml to configure how we're building
         // code.
-        if env::var("RUSTC_DEBUGINFO") == Ok("true".to_string()) {
-            cmd.arg("-g");
-        } else if env::var("RUSTC_DEBUGINFO_LINES") == Ok("true".to_string()) {
-            cmd.arg("-Cdebuginfo=1");
-        }
         let debug_assertions = match env::var("RUSTC_DEBUG_ASSERTIONS") {
             Ok(s) => if s == "true" { "y" } else { "n" },
             Err(..) => "n",

@@ -181,7 +206,7 @@ fn main() {
 
         // The compiler builtins are pretty sensitive to symbols referenced in
         // libcore and such, so we never compile them with debug assertions.
-        if crate_name == "compiler_builtins" {
+        if crate_name == Some("compiler_builtins") {
             cmd.arg("-C").arg("debug-assertions=no");
         } else {
             cmd.arg("-C").arg(format!("debug-assertions={}", debug_assertions));

@@ -256,24 +281,6 @@ fn main() {
                 cmd.arg("-C").arg("target-feature=-crt-static");
             }
         }
 
-        // When running miri tests, we need to generate MIR for all libraries
-        if env::var("TEST_MIRI").ok().map_or(false, |val| val == "true") {
-            // The flags here should be kept in sync with `add_miri_default_args`
-            // in miri's `src/lib.rs`.
-            cmd.arg("-Zalways-encode-mir");
-            cmd.arg("--cfg=miri");
-            // These options are preferred by miri, to be able to perform better validation,
-            // but the bootstrap compiler might not understand them.
-            if stage != "0" {
-                cmd.arg("-Zmir-emit-retag");
-                cmd.arg("-Zmir-opt-level=0");
-            }
-        }
-
-        if let Ok(map) = env::var("RUSTC_DEBUGINFO_MAP") {
-            cmd.arg("--remap-path-prefix").arg(&map);
-        }
     } else {
         // Override linker if necessary.
         if let Ok(host_linker) = env::var("RUSTC_HOST_LINKER") {

@@ -290,8 +297,9 @@ fn main() {
         }
     }
 
-    // This is required for internal lints.
-    cmd.arg("-Zunstable-options");
+    if let Ok(map) = env::var("RUSTC_DEBUGINFO_MAP") {
+        cmd.arg("--remap-path-prefix").arg(&map);
+    }
 
     // Force all crates compiled by this compiler to (a) be unstable and (b)
     // allow the `rustc_private` feature to link to other unstable crates

@@ -305,13 +313,6 @@ fn main() {
         cmd.arg("--cfg").arg("parallel_compiler");
     }
 
-    if env::var_os("RUSTC_DENY_WARNINGS").is_some() && env::var_os("RUSTC_EXTERNAL_TOOL").is_none()
-    {
-        cmd.arg("-Dwarnings");
-        cmd.arg("-Dbare_trait_objects");
-        cmd.arg("-Drust_2018_idioms");
-    }
-
     if verbose > 1 {
         eprintln!(
             "rustc command: {:?}={:?} {:?}",

@@ -334,7 +335,7 @@ fn main() {
     }
 
     if env::var_os("RUSTC_PRINT_STEP_TIMINGS").is_some() {
-        if let Some(krate) = maybe_crate {
+        if let Some(crate_name) = crate_name {
             let start = Instant::now();
             let status = cmd
                 .status()

@@ -343,7 +344,7 @@ fn main() {
 
             let is_test = args.iter().any(|a| a == "--test");
             eprintln!("[RUSTC-TIMING] {} test:{} {}.{:03}",
-                      krate.to_string_lossy(),
+                      crate_name,
                       is_test,
                       dur.as_secs(),
                       dur.subsec_nanos() / 1_000_000);

@@ -362,6 +363,14 @@ fn main() {
     std::process::exit(code);
 }
 
+// Rustc crates for which internal lints are in effect.
+fn use_internal_lints(crate_name: Option<&str>) -> bool {
+    crate_name.map_or(false, |crate_name| {
+        crate_name.starts_with("rustc") || crate_name.starts_with("syntax") ||
+        ["arena", "fmt_macros"].contains(&crate_name)
+    })
+}
+
 #[cfg(unix)]
 fn exec_cmd(cmd: &mut Command) -> io::Result<i32> {
     use std::os::unix::process::CommandExt;
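The `RUSTC_TIME` filter added to the wrapper accepts either the literal `all` or a comma-separated list of crate names, trimming whitespace around each entry before comparing. The same check, transcribed to Python as a sketch (the helper name is an assumption; the logic mirrors the Rust `split(",").any(...)` above):

```python
def should_time(rustc_time, crate_name):
    """Mirror the wrapper's check: time every crate when the value is
    "all", otherwise only crates named in the comma-separated list."""
    if rustc_time == "all":
        return True
    return any(c.strip() == crate_name for c in rustc_time.split(","))

assert should_time("all", "syntax")
assert should_time("core, alloc", "alloc")
assert not should_time("core,alloc", "std")
```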
@@ -2,7 +2,8 @@
 //!
 //! See comments in `src/bootstrap/rustc.rs` for more information.
 
-#![deny(warnings)]
+// NO-RUSTC-WRAPPER
+#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
 
 use std::env;
 use std::process::Command;
@@ -735,6 +735,47 @@ class RustBuild(object):
         """Set download URL for development environment"""
         self._download_url = 'https://dev-static.rust-lang.org'
 
+    def check_vendored_status(self):
+        """Check that vendoring is configured properly"""
+        vendor_dir = os.path.join(self.rust_root, 'vendor')
+        if 'SUDO_USER' in os.environ and not self.use_vendored_sources:
+            if os.environ.get('USER') != os.environ['SUDO_USER']:
+                self.use_vendored_sources = True
+                print('info: looks like you are running this command under `sudo`')
+                print('      and so in order to preserve your $HOME this will now')
+                print('      use vendored sources by default.')
+                if not os.path.exists(vendor_dir):
+                    print('error: vendoring required, but vendor directory does not exist.')
+                    print('       Run `cargo vendor` without sudo to initialize the '
+                          'vendor directory.')
+                    raise Exception("{} not found".format(vendor_dir))
+
+        if self.use_vendored_sources:
+            if not os.path.exists('.cargo'):
+                os.makedirs('.cargo')
+            with output('.cargo/config') as cargo_config:
+                cargo_config.write(
+                    "[source.crates-io]\n"
+                    "replace-with = 'vendored-sources'\n"
+                    "registry = 'https://example.com'\n"
+                    "\n"
+                    "[source.vendored-sources]\n"
+                    "directory = '{}/vendor'\n"
+                    .format(self.rust_root))
+        else:
+            if os.path.exists('.cargo'):
+                shutil.rmtree('.cargo')
+
+    def ensure_vendored(self):
+        """Ensure that the vendored sources are available if needed"""
+        vendor_dir = os.path.join(self.rust_root, 'vendor')
+        # Note that this does not handle updating the vendored dependencies if
+        # the rust git repository is updated. Normal development usually does
+        # not use vendoring, so hopefully this isn't too much of a problem.
+        if self.use_vendored_sources and not os.path.exists(vendor_dir):
+            run([self.cargo(), "vendor"],
+                verbose=self.verbose, cwd=self.rust_root)
+
 
 def bootstrap(help_triggered):
     """Configure, fetch, build and run the initial bootstrap"""

@@ -776,30 +817,7 @@ def bootstrap(help_triggered):
 
     build.use_locked_deps = '\nlocked-deps = true' in build.config_toml
 
-    if 'SUDO_USER' in os.environ and not build.use_vendored_sources:
-        if os.environ.get('USER') != os.environ['SUDO_USER']:
-            build.use_vendored_sources = True
-            print('info: looks like you are running this command under `sudo`')
-            print('      and so in order to preserve your $HOME this will now')
-            print('      use vendored sources by default. Note that if this')
-            print('      does not work you should run a normal build first')
-            print('      before running a command like `sudo ./x.py install`')
-
-    if build.use_vendored_sources:
-        if not os.path.exists('.cargo'):
-            os.makedirs('.cargo')
-        with output('.cargo/config') as cargo_config:
-            cargo_config.write("""
-[source.crates-io]
-replace-with = 'vendored-sources'
-registry = 'https://example.com'
-
-[source.vendored-sources]
-directory = '{}/vendor'
-""".format(build.rust_root))
-    else:
-        if os.path.exists('.cargo'):
-            shutil.rmtree('.cargo')
+    build.check_vendored_status()
 
     data = stage0_data(build.rust_root)
     build.date = data['date']

@@ -815,6 +833,7 @@ def bootstrap(help_triggered):
     build.build = args.build or build.build_triple()
     build.download_stage0()
     sys.stdout.flush()
+    build.ensure_vendored()
     build.build_bootstrap()
     sys.stdout.flush()
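The `.cargo/config` text that `check_vendored_status` writes can be rendered with the same format string; a standalone sketch of that rendering (the helper name is an assumption, but the output matches the config written above):

```python
def vendored_cargo_config(rust_root):
    """Render the source-replacement config that points Cargo at the
    vendored sources, matching the text written by check_vendored_status."""
    return (
        "[source.crates-io]\n"
        "replace-with = 'vendored-sources'\n"
        "registry = 'https://example.com'\n"
        "\n"
        "[source.vendored-sources]\n"
        "directory = '{}/vendor'\n"
        .format(rust_root)
    )

config = vendored_cargo_config("/checkout")
assert "directory = '/checkout/vendor'" in config
assert "replace-with = 'vendored-sources'" in config
```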
@ -59,7 +59,7 @@ pub trait Step: 'static + Clone + Debug + PartialEq + Eq + Hash {
|
|||
|
||||
const DEFAULT: bool = false;
|
||||
|
||||
/// Run this rule for all hosts without cross compiling.
|
||||
/// If true, then this rule should be skipped if --target was specified, but --host was not
|
||||
const ONLY_HOSTS: bool = false;
|
||||
|
||||
/// Primary function to execute this rule. Can call `builder.ensure()`
|
||||
|
|
@ -145,7 +145,7 @@ impl StepDescription {
|
|||
only_hosts: S::ONLY_HOSTS,
|
||||
should_run: S::should_run,
|
||||
make_run: S::make_run,
|
||||
name: unsafe { ::std::intrinsics::type_name::<S>() },
|
||||
name: std::any::type_name::<S>(),
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -163,7 +163,7 @@ impl StepDescription {
|
|||
|
||||
// Determine the targets participating in this rule.
|
||||
let targets = if self.only_hosts {
|
||||
if !builder.config.run_host_only {
|
||||
if builder.config.skip_only_host_steps {
|
||||
return; // don't run anything
|
||||
} else {
|
||||
&builder.hosts
|
||||
|
|
@ -318,6 +318,8 @@ impl<'a> ShouldRun<'a> {
|
|||
pub enum Kind {
|
||||
Build,
|
||||
Check,
|
||||
Clippy,
|
||||
Fix,
|
||||
Test,
|
||||
Bench,
|
||||
Dist,
|
||||
|
|
@ -359,7 +361,7 @@ impl<'a> Builder<'a> {
|
|||
tool::Miri,
|
||||
native::Lld
|
||||
),
|
||||
Kind::Check => describe!(
|
||||
Kind::Check | Kind::Clippy | Kind::Fix => describe!(
|
||||
check::Std,
|
||||
check::Test,
|
||||
check::Rustc,
|
||||
|
|
@ -369,7 +371,6 @@ impl<'a> Builder<'a> {
|
|||
Kind::Test => describe!(
|
||||
test::Tidy,
|
||||
test::Ui,
|
||||
test::RunPass,
|
||||
test::CompileFail,
|
||||
test::RunFail,
|
||||
test::RunPassValgrind,
|
||||
|
|
@ -380,10 +381,8 @@ impl<'a> Builder<'a> {
|
|||
test::Incremental,
|
||||
test::Debuginfo,
|
||||
test::UiFullDeps,
|
||||
test::RunPassFullDeps,
|
||||
test::Rustdoc,
|
||||
test::Pretty,
|
||||
test::RunPassPretty,
|
||||
test::RunFailPretty,
|
||||
test::RunPassValgrindPretty,
|
||||
test::Crate,
|
||||
|
|
@ -403,6 +402,7 @@ impl<'a> Builder<'a> {
|
|||
test::TheBook,
|
||||
test::UnstableBook,
|
||||
test::RustcBook,
|
||||
test::RustcGuide,
|
||||
test::EmbeddedBook,
|
||||
test::EditionGuide,
|
||||
test::Rustfmt,
|
||||
|
|
@ -520,6 +520,8 @@ impl<'a> Builder<'a> {
|
|||
let (kind, paths) = match build.config.cmd {
|
||||
Subcommand::Build { ref paths } => (Kind::Build, &paths[..]),
|
||||
Subcommand::Check { ref paths } => (Kind::Check, &paths[..]),
|
||||
Subcommand::Clippy { ref paths } => (Kind::Clippy, &paths[..]),
|
||||
Subcommand::Fix { ref paths } => (Kind::Fix, &paths[..]),
|
||||
Subcommand::Doc { ref paths } => (Kind::Doc, &paths[..]),
|
||||
Subcommand::Test { ref paths, .. } => (Kind::Test, &paths[..]),
|
||||
Subcommand::Bench { ref paths, .. } => (Kind::Bench, &paths[..]),
|
||||
|
|
@ -541,15 +543,6 @@ impl<'a> Builder<'a> {
|
|||
parent: Cell::new(None),
|
||||
};
|
||||
|
||||
if kind == Kind::Dist {
|
||||
assert!(
|
||||
!builder.config.test_miri,
|
||||
"Do not distribute with miri enabled.\n\
|
||||
The distributed libraries would include all MIR (increasing binary size).
|
||||
The distributed MIR would include validation statements."
|
||||
);
|
||||
}
|
||||
|
||||
builder
|
||||
}
|
||||
|
||||
|
|
@ -577,6 +570,30 @@ impl<'a> Builder<'a> {
|
|||
})
|
||||
}
|
||||
|
||||
/// Similar to `compiler`, except handles the full-bootstrap option to
|
||||
/// silently use the stage1 compiler instead of a stage2 compiler if one is
|
||||
/// requested.
|
||||
///
|
||||
/// Note that this does *not* have the side effect of creating
|
||||
/// `compiler(stage, host)`, unlike `compiler` above which does have such
|
||||
/// a side effect. The returned compiler here can only be used to compile
|
||||
/// new artifacts, it can't be used to rely on the presence of a particular
|
||||
/// sysroot.
|
||||
///
|
||||
/// See `force_use_stage1` for documentation on what each argument is.
|
||||
pub fn compiler_for(
|
||||
&self,
|
||||
stage: u32,
|
||||
host: Interned<String>,
|
||||
target: Interned<String>,
|
||||
) -> Compiler {
|
||||
if self.build.force_use_stage1(Compiler { stage, host }, target) {
|
||||
self.compiler(1, self.config.build)
|
||||
} else {
|
||||
self.compiler(stage, host)
|
||||
}
|
||||
}
|
||||
|
||||
pub fn sysroot(&self, compiler: Compiler) -> Interned<PathBuf> {
|
||||
self.ensure(compile::Sysroot { compiler })
|
||||
}
|
||||
|
|
@ -737,80 +754,20 @@ impl<'a> Builder<'a> {
|
|||
let mut cargo = Command::new(&self.initial_cargo);
|
||||
let out_dir = self.stage_out(compiler, mode);
|
||||
|
||||
// command specific path, we call clear_if_dirty with this
|
||||
let mut my_out = match cmd {
|
||||
"build" => self.cargo_out(compiler, mode, target),
|
||||
|
||||
// This is the intended out directory for crate documentation.
|
||||
"doc" | "rustdoc" => self.crate_doc_out(target),
|
||||
|
||||
_ => self.stage_out(compiler, mode),
|
||||
};
|
||||
|
||||
// This is for the original compiler, but if we're forced to use stage 1, then
|
||||
// std/test/rustc stamps won't exist in stage 2, so we need to get those from stage 1, since
|
||||
// we copy the libs forward.
|
||||
let cmp = if self.force_use_stage1(compiler, target) {
|
||||
self.compiler(1, compiler.host)
|
||||
} else {
|
||||
compiler
|
||||
};
|
||||
|
||||
let libstd_stamp = match cmd {
|
||||
"check" => check::libstd_stamp(self, cmp, target),
|
||||
_ => compile::libstd_stamp(self, cmp, target),
|
||||
};
|
||||
|
||||
let libtest_stamp = match cmd {
|
||||
"check" => check::libtest_stamp(self, cmp, target),
|
||||
_ => compile::libstd_stamp(self, cmp, target),
|
||||
};
|
||||
|
||||
let librustc_stamp = match cmd {
|
||||
"check" => check::librustc_stamp(self, cmp, target),
|
||||
_ => compile::librustc_stamp(self, cmp, target),
|
||||
};
|
||||
// Codegen backends are not yet tracked by -Zbinary-dep-depinfo,
|
||||
// so we need to explicitly clear out if they've been updated.
|
||||
for backend in self.codegen_backends(compiler) {
|
||||
self.clear_if_dirty(&out_dir, &backend);
|
||||
}
|
||||
|
||||
if cmd == "doc" || cmd == "rustdoc" {
|
||||
if mode == Mode::Rustc || mode == Mode::ToolRustc || mode == Mode::Codegen {
|
||||
let my_out = match mode {
|
||||
// This is the intended out directory for compiler documentation.
|
||||
my_out = self.compiler_doc_out(target);
|
||||
}
|
||||
Mode::Rustc | Mode::ToolRustc | Mode::Codegen => self.compiler_doc_out(target),
|
||||
_ => self.crate_doc_out(target),
|
||||
};
|
||||
let rustdoc = self.rustdoc(compiler);
|
||||
self.clear_if_dirty(&my_out, &rustdoc);
|
||||
} else if cmd != "test" {
|
||||
match mode {
|
||||
Mode::Std => {
|
||||
self.clear_if_dirty(&my_out, &self.rustc(compiler));
|
||||
for backend in self.codegen_backends(compiler) {
|
||||
self.clear_if_dirty(&my_out, &backend);
|
||||
}
|
||||
},
|
||||
Mode::Test => {
|
||||
self.clear_if_dirty(&my_out, &libstd_stamp);
|
||||
},
|
||||
Mode::Rustc => {
|
||||
self.clear_if_dirty(&my_out, &self.rustc(compiler));
|
||||
self.clear_if_dirty(&my_out, &libstd_stamp);
|
||||
self.clear_if_dirty(&my_out, &libtest_stamp);
|
||||
},
|
||||
Mode::Codegen => {
|
||||
self.clear_if_dirty(&my_out, &librustc_stamp);
|
||||
},
|
||||
Mode::ToolBootstrap => { },
|
||||
Mode::ToolStd => {
|
||||
self.clear_if_dirty(&my_out, &libstd_stamp);
|
||||
},
|
||||
Mode::ToolTest => {
|
||||
self.clear_if_dirty(&my_out, &libstd_stamp);
|
||||
self.clear_if_dirty(&my_out, &libtest_stamp);
|
||||
},
|
||||
Mode::ToolRustc => {
|
||||
self.clear_if_dirty(&my_out, &libstd_stamp);
|
||||
self.clear_if_dirty(&my_out, &libtest_stamp);
|
||||
self.clear_if_dirty(&my_out, &librustc_stamp);
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
cargo
|
||||
|
|
@ -831,9 +788,9 @@ impl<'a> Builder<'a> {
|
|||
assert_eq!(target, compiler.host);
|
||||
}
|
||||
|
||||
// Set a flag for `check` so that certain build scripts can do less work
|
||||
// (e.g., not building/requiring LLVM).
|
||||
if cmd == "check" {
|
||||
// Set a flag for `check`/`clippy`/`fix`, so that certain build
|
||||
// scripts can do less work (e.g. not building/requiring LLVM).
|
||||
if cmd == "check" || cmd == "clippy" || cmd == "fix" {
|
||||
cargo.env("RUST_CHECK", "1");
|
||||
}
|
||||
|
||||
|
|
@ -848,6 +805,19 @@ impl<'a> Builder<'a> {
|
|||
},
|
||||
}
|
||||
|
||||
// This tells Cargo (and in turn, rustc) to output more complete
|
||||
// dependency information. Most importantly for rustbuild, this
|
||||
// includes sysroot artifacts, like libstd, which means that we don't
|
||||
// need to track those in rustbuild (an error prone process!). This
|
||||
// feature is currently unstable as there may be some bugs and such, but
|
||||
// it represents a big improvement in rustbuild's reliability on
|
||||
// rebuilds, so we're using it here.
|
||||
//
|
||||
// For some additional context, see #63470 (the PR originally adding
|
||||
// this), as well as #63012 which is the tracking issue for this
|
||||
// feature on the rustc side.
|
||||
cargo.arg("-Zbinary-dep-depinfo");
|
||||
|
||||
cargo.arg("-j").arg(self.jobs().to_string());
|
||||
// Remove make-related flags to ensure Cargo can correctly set things up
|
||||
cargo.env_remove("MAKEFLAGS");
|
||||
|
|
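The `-Zbinary-dep-depinfo` flag added above makes rustc list binary dependencies, including sysroot artifacts such as the libstd rlib, in the Makefile-style dep-info files Cargo already emits. A rough sketch of how a build system could consume such a file; the parser and the sample line are illustrative assumptions, not rustbuild code or real compiler output:

```rust
// Parse Makefile-style dep-info lines of the form "target: dep1 dep2 ...".
// With -Zbinary-dep-depinfo, the deps also include binary artifacts like
// sysroot rlibs, so changes to libstd trigger rebuilds automatically.
fn parse_depinfo(text: &str) -> Vec<(String, Vec<String>)> {
    text.lines()
        .filter(|l| !l.starts_with('#'))
        .filter_map(|l| l.split_once(':'))
        .map(|(target, deps)| {
            (
                target.trim().to_string(),
                deps.split_whitespace().map(str::to_string).collect(),
            )
        })
        .collect()
}

fn main() {
    // Hypothetical dep-info line, not output captured from a real build.
    let sample = "foo.rmeta: src/lib.rs /sysroot/lib/libstd-abc123.rlib\n";
    let parsed = parse_depinfo(sample);
    println!("{:?}", parsed[0].1);
}
```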
@@ -898,6 +868,11 @@ impl<'a> Builder<'a> {
             extra_args.push_str(&s);
         }

+        if cmd == "clippy" {
+            extra_args.push_str("-Zforce-unstable-if-unmarked -Zunstable-options \
+                --json-rendered=termcolor");
+        }
+
         if !extra_args.is_empty() {
             cargo.env(
                 "RUSTFLAGS",
|
@ -954,7 +929,6 @@ impl<'a> Builder<'a> {
|
|||
PathBuf::from("/path/to/nowhere/rustdoc/not/required")
|
||||
},
|
||||
)
|
||||
.env("TEST_MIRI", self.config.test_miri.to_string())
|
||||
.env("RUSTC_ERROR_METADATA_DST", self.extended_error_dir());
|
||||
|
||||
if let Some(host_linker) = self.linker(compiler.host) {
|
||||
|
|
@ -963,29 +937,19 @@ impl<'a> Builder<'a> {
|
|||
if let Some(target_linker) = self.linker(target) {
|
||||
cargo.env("RUSTC_TARGET_LINKER", target_linker);
|
||||
}
|
||||
if let Some(ref error_format) = self.config.rustc_error_format {
|
||||
cargo.env("RUSTC_ERROR_FORMAT", error_format);
|
||||
}
|
||||
if cmd != "build" && cmd != "check" && cmd != "rustc" && want_rustdoc {
|
||||
if !(["build", "check", "clippy", "fix", "rustc"].contains(&cmd)) && want_rustdoc {
|
||||
cargo.env("RUSTDOC_LIBDIR", self.rustc_libdir(compiler));
|
||||
}
|
||||
|
||||
if mode.is_tool() {
|
||||
// Tools like cargo and rls don't get debuginfo by default right now, but this can be
|
||||
// enabled in the config. Adding debuginfo makes them several times larger.
|
||||
if self.config.rust_debuginfo_tools {
|
||||
cargo.env("RUSTC_DEBUGINFO", self.config.rust_debuginfo.to_string());
|
||||
cargo.env(
|
||||
"RUSTC_DEBUGINFO_LINES",
|
||||
self.config.rust_debuginfo_lines.to_string(),
|
||||
);
|
||||
}
|
||||
} else {
|
||||
cargo.env("RUSTC_DEBUGINFO", self.config.rust_debuginfo.to_string());
|
||||
cargo.env(
|
||||
"RUSTC_DEBUGINFO_LINES",
|
||||
self.config.rust_debuginfo_lines.to_string(),
|
||||
);
|
||||
let debuginfo_level = match mode {
|
||||
Mode::Rustc | Mode::Codegen => self.config.rust_debuginfo_level_rustc,
|
||||
Mode::Std | Mode::Test => self.config.rust_debuginfo_level_std,
|
||||
Mode::ToolBootstrap | Mode::ToolStd |
|
||||
Mode::ToolTest | Mode::ToolRustc => self.config.rust_debuginfo_level_tools,
|
||||
};
|
||||
cargo.env("RUSTC_DEBUGINFO_LEVEL", debuginfo_level.to_string());
|
||||
|
||||
if !mode.is_tool() {
|
||||
cargo.env("RUSTC_FORCE_UNSTABLE", "1");
|
||||
|
||||
// Currently the compiler depends on crates from crates.io, and
|
||||
|
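The new per-mode selection above replaces the old boolean debuginfo toggles with one level per artifact family. A self-contained sketch of the same dispatch, with a simplified `Mode` enum and hard-coded level values standing in for the `Config` fields (the numbers are assumptions for illustration, not rustbuild's defaults):

```rust
#[allow(dead_code)]
#[derive(Clone, Copy)]
enum Mode { Std, Test, Rustc, Codegen, ToolBootstrap, ToolStd, ToolTest, ToolRustc }

// Mirror of the match in the hunk: compiler crates, std crates, and tools
// each get their own debuginfo level from the build configuration.
fn debuginfo_level(mode: Mode) -> u32 {
    let (level_rustc, level_std, level_tools) = (2, 1, 0); // assumed config values
    match mode {
        Mode::Rustc | Mode::Codegen => level_rustc,
        Mode::Std | Mode::Test => level_std,
        Mode::ToolBootstrap | Mode::ToolStd | Mode::ToolTest | Mode::ToolRustc => level_tools,
    }
}

fn main() {
    // rustbuild exports the chosen level via the RUSTC_DEBUGINFO_LEVEL env var.
    println!("{}", debuginfo_level(Mode::Codegen));
}
```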
@@ -1305,654 +1269,4 @@ impl<'a> Builder<'a> {
}

#[cfg(test)]
mod __test {
    use super::*;
    use crate::config::Config;
    use std::thread;

    use pretty_assertions::assert_eq;

    fn configure(host: &[&str], target: &[&str]) -> Config {
        let mut config = Config::default_opts();
        // don't save toolstates
        config.save_toolstates = None;
        config.run_host_only = true;
        config.dry_run = true;
        // try to avoid spurious failures in dist where we create/delete each others file
        let dir = config.out.join("tmp-rustbuild-tests").join(
            &thread::current()
                .name()
                .unwrap_or("unknown")
                .replace(":", "-"),
        );
        t!(fs::create_dir_all(&dir));
        config.out = dir;
        config.build = INTERNER.intern_str("A");
        config.hosts = vec![config.build]
            .clone()
            .into_iter()
            .chain(host.iter().map(|s| INTERNER.intern_str(s)))
            .collect::<Vec<_>>();
        config.targets = config
            .hosts
            .clone()
            .into_iter()
            .chain(target.iter().map(|s| INTERNER.intern_str(s)))
            .collect::<Vec<_>>();
        config
    }

    fn first<A, B>(v: Vec<(A, B)>) -> Vec<A> {
        v.into_iter().map(|(a, _)| a).collect::<Vec<_>>()
    }

    #[test]
    fn dist_baseline() {
        let build = Build::new(configure(&[], &[]));
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

        let a = INTERNER.intern_str("A");

        assert_eq!(
            first(builder.cache.all::<dist::Docs>()),
            &[dist::Docs { stage: 2, host: a },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Mingw>()),
            &[dist::Mingw { host: a },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Rustc>()),
            &[dist::Rustc {
                compiler: Compiler { host: a, stage: 2 }
            },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Std>()),
            &[dist::Std {
                compiler: Compiler { host: a, stage: 2 },
                target: a,
            },]
        );
        assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
    }

    #[test]
    fn dist_with_targets() {
        let build = Build::new(configure(&[], &["B"]));
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");

        assert_eq!(
            first(builder.cache.all::<dist::Docs>()),
            &[
                dist::Docs { stage: 2, host: a },
                dist::Docs { stage: 2, host: b },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Mingw>()),
            &[dist::Mingw { host: a }, dist::Mingw { host: b },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Rustc>()),
            &[dist::Rustc {
                compiler: Compiler { host: a, stage: 2 }
            },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Std>()),
            &[
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
            ]
        );
        assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
    }

    #[test]
    fn dist_with_hosts() {
        let build = Build::new(configure(&["B"], &[]));
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");

        assert_eq!(
            first(builder.cache.all::<dist::Docs>()),
            &[
                dist::Docs { stage: 2, host: a },
                dist::Docs { stage: 2, host: b },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Mingw>()),
            &[dist::Mingw { host: a }, dist::Mingw { host: b },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Rustc>()),
            &[
                dist::Rustc {
                    compiler: Compiler { host: a, stage: 2 }
                },
                dist::Rustc {
                    compiler: Compiler { host: b, stage: 2 }
                },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Std>()),
            &[
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
            ]
        );
        assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
    }

    #[test]
    fn dist_with_targets_and_hosts() {
        let build = Build::new(configure(&["B"], &["C"]));
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");
        let c = INTERNER.intern_str("C");

        assert_eq!(
            first(builder.cache.all::<dist::Docs>()),
            &[
                dist::Docs { stage: 2, host: a },
                dist::Docs { stage: 2, host: b },
                dist::Docs { stage: 2, host: c },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Mingw>()),
            &[
                dist::Mingw { host: a },
                dist::Mingw { host: b },
                dist::Mingw { host: c },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Rustc>()),
            &[
                dist::Rustc {
                    compiler: Compiler { host: a, stage: 2 }
                },
                dist::Rustc {
                    compiler: Compiler { host: b, stage: 2 }
                },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Std>()),
            &[
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: c,
                },
            ]
        );
        assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
    }

    #[test]
    fn dist_with_target_flag() {
        let mut config = configure(&["B"], &["C"]);
        config.run_host_only = false; // as-if --target=C was passed
        let build = Build::new(config);
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");
        let c = INTERNER.intern_str("C");

        assert_eq!(
            first(builder.cache.all::<dist::Docs>()),
            &[
                dist::Docs { stage: 2, host: a },
                dist::Docs { stage: 2, host: b },
                dist::Docs { stage: 2, host: c },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Mingw>()),
            &[
                dist::Mingw { host: a },
                dist::Mingw { host: b },
                dist::Mingw { host: c },
            ]
        );
        assert_eq!(first(builder.cache.all::<dist::Rustc>()), &[]);
        assert_eq!(
            first(builder.cache.all::<dist::Std>()),
            &[
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: c,
                },
            ]
        );
        assert_eq!(first(builder.cache.all::<dist::Src>()), &[]);
    }

    #[test]
    fn dist_with_same_targets_and_hosts() {
        let build = Build::new(configure(&["B"], &["B"]));
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");

        assert_eq!(
            first(builder.cache.all::<dist::Docs>()),
            &[
                dist::Docs { stage: 2, host: a },
                dist::Docs { stage: 2, host: b },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Mingw>()),
            &[dist::Mingw { host: a }, dist::Mingw { host: b },]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Rustc>()),
            &[
                dist::Rustc {
                    compiler: Compiler { host: a, stage: 2 }
                },
                dist::Rustc {
                    compiler: Compiler { host: b, stage: 2 }
                },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<dist::Std>()),
            &[
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                dist::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
            ]
        );
        assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
        assert_eq!(
            first(builder.cache.all::<compile::Std>()),
            &[
                compile::Std {
                    compiler: Compiler { host: a, stage: 0 },
                    target: a,
                },
                compile::Std {
                    compiler: Compiler { host: a, stage: 1 },
                    target: a,
                },
                compile::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                compile::Std {
                    compiler: Compiler { host: a, stage: 1 },
                    target: b,
                },
                compile::Std {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<compile::Test>()),
            &[
                compile::Test {
                    compiler: Compiler { host: a, stage: 0 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 1 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 1 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<compile::Assemble>()),
            &[
                compile::Assemble {
                    target_compiler: Compiler { host: a, stage: 0 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: a, stage: 1 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: a, stage: 2 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: b, stage: 2 },
                },
            ]
        );
    }

    #[test]
    fn build_default() {
        let build = Build::new(configure(&["B"], &["C"]));
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Build), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");
        let c = INTERNER.intern_str("C");

        assert!(!builder.cache.all::<compile::Std>().is_empty());
        assert!(!builder.cache.all::<compile::Assemble>().is_empty());
        assert_eq!(
            first(builder.cache.all::<compile::Rustc>()),
            &[
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 0 },
                    target: a,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 1 },
                    target: a,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                compile::Rustc {
                    compiler: Compiler { host: b, stage: 2 },
                    target: a,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 0 },
                    target: b,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 1 },
                    target: b,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
                compile::Rustc {
                    compiler: Compiler { host: b, stage: 2 },
                    target: b,
                },
            ]
        );

        assert_eq!(
            first(builder.cache.all::<compile::Test>()),
            &[
                compile::Test {
                    compiler: Compiler { host: a, stage: 0 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 1 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: b, stage: 2 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 0 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 1 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: b, stage: 2 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: c,
                },
                compile::Test {
                    compiler: Compiler { host: b, stage: 2 },
                    target: c,
                },
            ]
        );
    }

    #[test]
    fn build_with_target_flag() {
        let mut config = configure(&["B"], &["C"]);
        config.run_host_only = false;
        let build = Build::new(config);
        let mut builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Build), &[]);

        let a = INTERNER.intern_str("A");
        let b = INTERNER.intern_str("B");
        let c = INTERNER.intern_str("C");

        assert!(!builder.cache.all::<compile::Std>().is_empty());
        assert_eq!(
            first(builder.cache.all::<compile::Assemble>()),
            &[
                compile::Assemble {
                    target_compiler: Compiler { host: a, stage: 0 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: a, stage: 1 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: b, stage: 1 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: a, stage: 2 },
                },
                compile::Assemble {
                    target_compiler: Compiler { host: b, stage: 2 },
                },
            ]
        );
        assert_eq!(
            first(builder.cache.all::<compile::Rustc>()),
            &[
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 0 },
                    target: a,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 1 },
                    target: a,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 0 },
                    target: b,
                },
                compile::Rustc {
                    compiler: Compiler { host: a, stage: 1 },
                    target: b,
                },
            ]
        );

        assert_eq!(
            first(builder.cache.all::<compile::Test>()),
            &[
                compile::Test {
                    compiler: Compiler { host: a, stage: 0 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 1 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: b, stage: 2 },
                    target: a,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 0 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 1 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: b, stage: 2 },
                    target: b,
                },
                compile::Test {
                    compiler: Compiler { host: a, stage: 2 },
                    target: c,
                },
                compile::Test {
                    compiler: Compiler { host: b, stage: 2 },
                    target: c,
                },
            ]
        );
    }

    #[test]
    fn test_with_no_doc_stage0() {
        let mut config = configure(&[], &[]);
        config.stage = Some(0);
        config.cmd = Subcommand::Test {
            paths: vec!["src/libstd".into()],
            test_args: vec![],
            rustc_args: vec![],
            fail_fast: true,
            doc_tests: DocTests::No,
            bless: false,
            compare_mode: None,
            rustfix_coverage: false,
        };

        let build = Build::new(config);
        let mut builder = Builder::new(&build);

        let host = INTERNER.intern_str("A");

        builder.run_step_descriptions(
            &[StepDescription::from::<test::Crate>()],
            &["src/libstd".into()],
        );

        // Ensure we don't build any compiler artifacts.
        assert!(!builder.cache.contains::<compile::Rustc>());
        assert_eq!(
            first(builder.cache.all::<test::Crate>()),
            &[test::Crate {
                compiler: Compiler { host, stage: 0 },
                target: host,
                mode: Mode::Std,
                test_kind: test::TestKind::Test,
                krate: INTERNER.intern_str("std"),
            },]
        );
    }

    #[test]
    fn test_exclude() {
        let mut config = configure(&[], &[]);
        config.exclude = vec![
            "src/test/run-pass".into(),
            "src/tools/tidy".into(),
        ];
        config.cmd = Subcommand::Test {
            paths: Vec::new(),
            test_args: Vec::new(),
            rustc_args: Vec::new(),
            fail_fast: true,
            doc_tests: DocTests::No,
            bless: false,
            compare_mode: None,
            rustfix_coverage: false,
        };

        let build = Build::new(config);
        let builder = Builder::new(&build);
        builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Test), &[]);

        // Ensure we have really excluded run-pass & tidy
        assert!(!builder.cache.contains::<test::RunPass>());
        assert!(!builder.cache.contains::<test::Tidy>());

        // Ensure other tests are not affected.
        assert!(builder.cache.contains::<test::RunPassFullDeps>());
        assert!(builder.cache.contains::<test::RustdocUi>());
    }
}
mod tests;

655  src/bootstrap/builder/tests.rs  (new file)

@@ -0,0 +1,655 @@
use super::*;
|
||||
use crate::config::Config;
|
||||
use std::thread;
|
||||
|
||||
use pretty_assertions::assert_eq;
|
||||
|
||||
fn configure(host: &[&str], target: &[&str]) -> Config {
|
||||
let mut config = Config::default_opts();
|
||||
// don't save toolstates
|
||||
config.save_toolstates = None;
|
||||
config.skip_only_host_steps = false;
|
||||
config.dry_run = true;
|
||||
// try to avoid spurious failures in dist where we create/delete each others file
|
||||
let dir = config.out.join("tmp-rustbuild-tests").join(
|
||||
&thread::current()
|
||||
.name()
|
||||
.unwrap_or("unknown")
|
||||
.replace(":", "-"),
|
||||
);
|
||||
t!(fs::create_dir_all(&dir));
|
||||
config.out = dir;
|
||||
config.build = INTERNER.intern_str("A");
|
||||
config.hosts = vec![config.build]
|
||||
.clone()
|
||||
.into_iter()
|
||||
.chain(host.iter().map(|s| INTERNER.intern_str(s)))
|
||||
.collect::<Vec<_>>();
|
||||
config.targets = config
|
||||
.hosts
|
||||
.clone()
|
||||
.into_iter()
|
||||
.chain(target.iter().map(|s| INTERNER.intern_str(s)))
|
||||
.collect::<Vec<_>>();
|
||||
config
|
||||
}
|
||||
|
||||
fn first<A, B>(v: Vec<(A, B)>) -> Vec<A> {
|
||||
v.into_iter().map(|(a, _)| a).collect::<Vec<_>>()
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_baseline() {
|
||||
let build = Build::new(configure(&[], &[]));
|
||||
let mut builder = Builder::new(&build);
|
||||
builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);
|
||||
|
||||
let a = INTERNER.intern_str("A");
|
||||
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Docs>()),
|
||||
&[dist::Docs { host: a },]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Mingw>()),
|
||||
&[dist::Mingw { host: a },]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Rustc>()),
|
||||
&[dist::Rustc {
|
||||
compiler: Compiler { host: a, stage: 2 }
|
||||
},]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Std>()),
|
||||
&[dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: a,
|
||||
},]
|
||||
);
|
||||
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_with_targets() {
|
||||
let build = Build::new(configure(&[], &["B"]));
|
||||
let mut builder = Builder::new(&build);
|
||||
builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);
|
||||
|
||||
let a = INTERNER.intern_str("A");
|
||||
let b = INTERNER.intern_str("B");
|
||||
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Docs>()),
|
||||
&[
|
||||
dist::Docs { host: a },
|
||||
dist::Docs { host: b },
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Mingw>()),
|
||||
&[dist::Mingw { host: a }, dist::Mingw { host: b },]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Rustc>()),
|
||||
&[dist::Rustc {
|
||||
compiler: Compiler { host: a, stage: 2 }
|
||||
},]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Std>()),
|
||||
&[
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: a,
|
||||
},
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 2 },
|
||||
target: b,
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_with_hosts() {
|
||||
let build = Build::new(configure(&["B"], &[]));
|
||||
let mut builder = Builder::new(&build);
|
||||
builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);
|
||||
|
||||
let a = INTERNER.intern_str("A");
|
||||
let b = INTERNER.intern_str("B");
|
||||
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Docs>()),
|
||||
&[
|
||||
dist::Docs { host: a },
|
||||
dist::Docs { host: b },
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Mingw>()),
|
||||
&[dist::Mingw { host: a }, dist::Mingw { host: b },]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Rustc>()),
|
||||
&[
|
||||
dist::Rustc {
|
||||
compiler: Compiler { host: a, stage: 2 }
|
||||
},
|
||||
dist::Rustc {
|
||||
compiler: Compiler { host: b, stage: 2 }
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Std>()),
|
||||
&[
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: a,
|
||||
},
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: b,
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_only_cross_host() {
|
||||
let a = INTERNER.intern_str("A");
|
||||
let b = INTERNER.intern_str("B");
|
||||
let mut build = Build::new(configure(&["B"], &[]));
|
||||
build.config.docs = false;
|
||||
build.config.extended = true;
|
||||
build.hosts = vec![b];
|
||||
let mut builder = Builder::new(&build);
|
||||
builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);
|
||||
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Rustc>()),
|
||||
&[
|
||||
dist::Rustc {
|
||||
compiler: Compiler { host: b, stage: 2 }
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<compile::Rustc>()),
|
||||
&[
|
||||
compile::Rustc {
|
||||
compiler: Compiler { host: a, stage: 0 },
|
||||
target: a,
|
||||
},
|
||||
compile::Rustc {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: b,
|
||||
},
|
||||
]
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_with_targets_and_hosts() {
|
||||
let build = Build::new(configure(&["B"], &["C"]));
|
||||
let mut builder = Builder::new(&build);
|
||||
builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);
|
||||
|
||||
let a = INTERNER.intern_str("A");
|
||||
let b = INTERNER.intern_str("B");
|
||||
let c = INTERNER.intern_str("C");
|
||||
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Docs>()),
|
||||
&[
|
||||
dist::Docs { host: a },
|
||||
dist::Docs { host: b },
|
||||
dist::Docs { host: c },
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Mingw>()),
|
||||
&[
|
||||
dist::Mingw { host: a },
|
||||
dist::Mingw { host: b },
|
||||
dist::Mingw { host: c },
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Rustc>()),
|
||||
&[
|
||||
dist::Rustc {
|
||||
compiler: Compiler { host: a, stage: 2 }
|
||||
},
|
||||
dist::Rustc {
|
||||
compiler: Compiler { host: b, stage: 2 }
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Std>()),
|
||||
&[
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: a,
|
||||
},
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: b,
|
||||
},
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 2 },
|
||||
target: c,
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_with_target_flag() {
|
||||
let mut config = configure(&["B"], &["C"]);
|
||||
config.skip_only_host_steps = true; // as-if --target=C was passed
|
||||
let build = Build::new(config);
|
||||
let mut builder = Builder::new(&build);
|
||||
builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);
|
||||
|
||||
let a = INTERNER.intern_str("A");
|
||||
let b = INTERNER.intern_str("B");
|
||||
let c = INTERNER.intern_str("C");
|
||||
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Docs>()),
|
||||
&[
|
||||
dist::Docs { host: a },
|
||||
dist::Docs { host: b },
|
||||
dist::Docs { host: c },
|
||||
]
|
||||
);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Mingw>()),
|
||||
&[
|
||||
dist::Mingw { host: a },
|
||||
dist::Mingw { host: b },
|
||||
dist::Mingw { host: c },
|
||||
]
|
||||
);
|
||||
assert_eq!(first(builder.cache.all::<dist::Rustc>()), &[]);
|
||||
assert_eq!(
|
||||
first(builder.cache.all::<dist::Std>()),
|
||||
&[
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: a,
|
||||
},
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 1 },
|
||||
target: b,
|
||||
},
|
||||
dist::Std {
|
||||
compiler: Compiler { host: a, stage: 2 },
|
||||
target: c,
|
||||
},
|
||||
]
|
||||
);
|
||||
assert_eq!(first(builder.cache.all::<dist::Src>()), &[]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn dist_with_same_targets_and_hosts() {
|
||||
let build = Build::new(configure(&["B"], &["B"]));
|
||||
let mut builder = Builder::new(&build);
|
||||
    builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Dist), &[]);

    let a = INTERNER.intern_str("A");
    let b = INTERNER.intern_str("B");

    assert_eq!(
        first(builder.cache.all::<dist::Docs>()),
        &[
            dist::Docs { host: a },
            dist::Docs { host: b },
        ]
    );
    assert_eq!(
        first(builder.cache.all::<dist::Mingw>()),
        &[dist::Mingw { host: a }, dist::Mingw { host: b },]
    );
    assert_eq!(
        first(builder.cache.all::<dist::Rustc>()),
        &[
            dist::Rustc {
                compiler: Compiler { host: a, stage: 2 }
            },
            dist::Rustc {
                compiler: Compiler { host: b, stage: 2 }
            },
        ]
    );
    assert_eq!(
        first(builder.cache.all::<dist::Std>()),
        &[
            dist::Std {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            dist::Std {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
        ]
    );
    assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
    assert_eq!(
        first(builder.cache.all::<compile::Std>()),
        &[
            compile::Std {
                compiler: Compiler { host: a, stage: 0 },
                target: a,
            },
            compile::Std {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            compile::Std {
                compiler: Compiler { host: a, stage: 2 },
                target: a,
            },
            compile::Std {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
        ]
    );
    assert_eq!(
        first(builder.cache.all::<compile::Test>()),
        &[
            compile::Test {
                compiler: Compiler { host: a, stage: 0 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
        ]
    );
    assert_eq!(
        first(builder.cache.all::<compile::Assemble>()),
        &[
            compile::Assemble {
                target_compiler: Compiler { host: a, stage: 0 },
            },
            compile::Assemble {
                target_compiler: Compiler { host: a, stage: 1 },
            },
            compile::Assemble {
                target_compiler: Compiler { host: a, stage: 2 },
            },
            compile::Assemble {
                target_compiler: Compiler { host: b, stage: 2 },
            },
        ]
    );
}

#[test]
fn build_default() {
    let build = Build::new(configure(&["B"], &["C"]));
    let mut builder = Builder::new(&build);
    builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Build), &[]);

    let a = INTERNER.intern_str("A");
    let b = INTERNER.intern_str("B");
    let c = INTERNER.intern_str("C");

    assert!(!builder.cache.all::<compile::Std>().is_empty());
    assert!(!builder.cache.all::<compile::Assemble>().is_empty());
    assert_eq!(
        first(builder.cache.all::<compile::Rustc>()),
        &[
            compile::Rustc {
                compiler: Compiler { host: a, stage: 0 },
                target: a,
            },
            compile::Rustc {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            compile::Rustc {
                compiler: Compiler { host: a, stage: 2 },
                target: a,
            },
            compile::Rustc {
                compiler: Compiler { host: b, stage: 2 },
                target: a,
            },
            compile::Rustc {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
            compile::Rustc {
                compiler: Compiler { host: a, stage: 2 },
                target: b,
            },
            compile::Rustc {
                compiler: Compiler { host: b, stage: 2 },
                target: b,
            },
        ]
    );

    assert_eq!(
        first(builder.cache.all::<compile::Test>()),
        &[
            compile::Test {
                compiler: Compiler { host: a, stage: 0 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: b, stage: 2 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: b,
            },
            compile::Test {
                compiler: Compiler { host: b, stage: 2 },
                target: b,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: c,
            },
            compile::Test {
                compiler: Compiler { host: b, stage: 2 },
                target: c,
            },
        ]
    );
}

#[test]
fn build_with_target_flag() {
    let mut config = configure(&["B"], &["C"]);
    config.skip_only_host_steps = true;
    let build = Build::new(config);
    let mut builder = Builder::new(&build);
    builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Build), &[]);

    let a = INTERNER.intern_str("A");
    let b = INTERNER.intern_str("B");
    let c = INTERNER.intern_str("C");

    assert!(!builder.cache.all::<compile::Std>().is_empty());
    assert_eq!(
        first(builder.cache.all::<compile::Assemble>()),
        &[
            compile::Assemble {
                target_compiler: Compiler { host: a, stage: 0 },
            },
            compile::Assemble {
                target_compiler: Compiler { host: a, stage: 1 },
            },
            compile::Assemble {
                target_compiler: Compiler { host: a, stage: 2 },
            },
            compile::Assemble {
                target_compiler: Compiler { host: b, stage: 2 },
            },
        ]
    );
    assert_eq!(
        first(builder.cache.all::<compile::Rustc>()),
        &[
            compile::Rustc {
                compiler: Compiler { host: a, stage: 0 },
                target: a,
            },
            compile::Rustc {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            compile::Rustc {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
        ]
    );

    assert_eq!(
        first(builder.cache.all::<compile::Test>()),
        &[
            compile::Test {
                compiler: Compiler { host: a, stage: 0 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 1 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: b, stage: 2 },
                target: a,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 1 },
                target: b,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: b,
            },
            compile::Test {
                compiler: Compiler { host: b, stage: 2 },
                target: b,
            },
            compile::Test {
                compiler: Compiler { host: a, stage: 2 },
                target: c,
            },
            compile::Test {
                compiler: Compiler { host: b, stage: 2 },
                target: c,
            },
        ]
    );
}

#[test]
fn test_with_no_doc_stage0() {
    let mut config = configure(&[], &[]);
    config.stage = Some(0);
    config.cmd = Subcommand::Test {
        paths: vec!["src/libstd".into()],
        test_args: vec![],
        rustc_args: vec![],
        fail_fast: true,
        doc_tests: DocTests::No,
        bless: false,
        compare_mode: None,
        rustfix_coverage: false,
        pass: None,
    };

    let build = Build::new(config);
    let mut builder = Builder::new(&build);

    let host = INTERNER.intern_str("A");

    builder.run_step_descriptions(
        &[StepDescription::from::<test::Crate>()],
        &["src/libstd".into()],
    );

    // Ensure we don't build any compiler artifacts.
    assert!(!builder.cache.contains::<compile::Rustc>());
    assert_eq!(
        first(builder.cache.all::<test::Crate>()),
        &[test::Crate {
            compiler: Compiler { host, stage: 0 },
            target: host,
            mode: Mode::Std,
            test_kind: test::TestKind::Test,
            krate: INTERNER.intern_str("std"),
        },]
    );
}

#[test]
fn test_exclude() {
    let mut config = configure(&[], &[]);
    config.exclude = vec![
        "src/tools/tidy".into(),
    ];
    config.cmd = Subcommand::Test {
        paths: Vec::new(),
        test_args: Vec::new(),
        rustc_args: Vec::new(),
        fail_fast: true,
        doc_tests: DocTests::No,
        bless: false,
        compare_mode: None,
        rustfix_coverage: false,
        pass: None,
    };

    let build = Build::new(config);
    let builder = Builder::new(&build);
    builder.run_step_descriptions(&Builder::get_step_descriptions(Kind::Test), &[]);

    // Ensure we have really excluded tidy
    assert!(!builder.cache.contains::<test::Tidy>());

    // Ensure other tests are not affected.
    assert!(builder.cache.contains::<test::RustdocUi>());
}

@@ -266,8 +266,10 @@ impl Cache {
            .expect("invalid type mapped");
        stepcache.get(step).cloned()
    }
}

#[cfg(test)]
#[cfg(test)]
impl Cache {
    pub fn all<S: Ord + Copy + Step>(&mut self) -> Vec<(S, S::Output)> {
        let cache = self.0.get_mut();
        let type_id = TypeId::of::<S>();

@@ -279,7 +281,6 @@ impl Cache {
        v
    }

    #[cfg(test)]
    pub fn contains<S: Step>(&self) -> bool {
        self.0.borrow().contains_key(&TypeId::of::<S>())
    }

@@ -45,6 +45,8 @@ fn cc2ar(cc: &Path, target: &str) -> Option<PathBuf> {
        Some(PathBuf::from("ar"))
    } else if target.contains("openbsd") {
        Some(PathBuf::from("ar"))
    } else if target.contains("vxworks") {
        Some(PathBuf::from("wr-ar"))
    } else {
        let parent = cc.parent().unwrap();
        let file = cc.file_name().unwrap().to_str().unwrap();

@@ -95,30 +97,40 @@ pub fn find(build: &mut Build) {
        };

        build.cc.insert(target, compiler);
        let cflags = build.cflags(target, GitRepo::Rustc);

        // If we use llvm-libunwind, we will need a C++ compiler as well for all targets
        // We'll need one anyways if the target triple is also a host triple
        let mut cfg = cc::Build::new();
        cfg.cargo_metadata(false).opt_level(2).warnings(false).debug(false).cpp(true)
            .target(&target).host(&build.build);

        let cxx_configured = if let Some(cxx) = config.and_then(|c| c.cxx.as_ref()) {
            cfg.compiler(cxx);
            true
        } else if build.hosts.contains(&target) || build.build == target {
            set_compiler(&mut cfg, Language::CPlusPlus, target, config, build);
            true
        } else {
            false
        };

        if cxx_configured {
            let compiler = cfg.get_compiler();
            build.cxx.insert(target, compiler);
        }

        build.verbose(&format!("CC_{} = {:?}", &target, build.cc(target)));
        build.verbose(&format!("CFLAGS_{} = {:?}", &target, build.cflags(target, GitRepo::Rustc)));
        build.verbose(&format!("CFLAGS_{} = {:?}", &target, cflags));
        if let Ok(cxx) = build.cxx(target) {
            build.verbose(&format!("CXX_{} = {:?}", &target, cxx));
            build.verbose(&format!("CXXFLAGS_{} = {:?}", &target, cflags));
        }
        if let Some(ar) = ar {
            build.verbose(&format!("AR_{} = {:?}", &target, ar));
            build.ar.insert(target, ar);
        }
    }

    // For all host triples we need to find a C++ compiler as well
    let hosts = build.hosts.iter().cloned().chain(iter::once(build.build)).collect::<HashSet<_>>();
    for host in hosts.into_iter() {
        let mut cfg = cc::Build::new();
        cfg.cargo_metadata(false).opt_level(2).warnings(false).debug(false).cpp(true)
            .target(&host).host(&build.build);
        let config = build.config.target_config.get(&host);
        if let Some(cxx) = config.and_then(|c| c.cxx.as_ref()) {
            cfg.compiler(cxx);
        } else {
            set_compiler(&mut cfg, Language::CPlusPlus, host, config, build);
        }
        let compiler = cfg.get_compiler();
        build.verbose(&format!("CXX_{} = {:?}", host, compiler.path()));
        build.cxx.insert(host, compiler);
    }
}

fn set_compiler(cfg: &mut cc::Build,

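The `cc2ar` hunk above adds a vxworks branch to the target-to-archiver mapping. A minimal standalone sketch of that substring-dispatch pattern (the function name and the `None` fallback here are simplifications, not bootstrap's real code, which derives the archiver name from the C compiler path in the final branch):

```rust
use std::path::PathBuf;

// Sketch of the cc2ar-style target → archiver lookup: vxworks targets get
// "wr-ar", BSD-style targets keep plain "ar".
fn ar_for_target(target: &str) -> Option<PathBuf> {
    if target.contains("openbsd") {
        Some(PathBuf::from("ar"))
    } else if target.contains("vxworks") {
        Some(PathBuf::from("wr-ar"))
    } else {
        // The real code inspects the C compiler's file name here instead.
        None
    }
}

fn main() {
    assert_eq!(ar_for_target("armv7-wrs-vxworks"), Some(PathBuf::from("wr-ar")));
    assert_eq!(ar_for_target("x86_64-unknown-openbsd"), Some(PathBuf::from("ar")));
    assert_eq!(ar_for_target("x86_64-unknown-linux-gnu"), None);
}
```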
@@ -13,7 +13,7 @@ use build_helper::output;
use crate::Build;

// The version number
pub const CFG_RELEASE_NUM: &str = "1.36.0";
pub const CFG_RELEASE_NUM: &str = "1.39.0";

pub struct GitInfo {
    inner: Option<Info>,

@@ -1,8 +1,8 @@
//! Implementation of compiling the compiler and standard library, in "check" mode.
//! Implementation of compiling the compiler and standard library, in "check"-based modes.

use crate::compile::{run_cargo, std_cargo, test_cargo, rustc_cargo, rustc_cargo_env,
                     add_to_sysroot};
use crate::builder::{RunConfig, Builder, ShouldRun, Step};
use crate::builder::{RunConfig, Builder, Kind, ShouldRun, Step};
use crate::tool::{prepare_tool_cargo, SourceType};
use crate::{Compiler, Mode};
use crate::cache::{INTERNER, Interned};

@@ -13,6 +13,22 @@ pub struct Std {
    pub target: Interned<String>,
}

fn args(kind: Kind) -> Vec<String> {
    match kind {
        Kind::Clippy => vec!["--".to_owned(), "--cap-lints".to_owned(), "warn".to_owned()],
        _ => Vec::new()
    }
}

fn cargo_subcommand(kind: Kind) -> &'static str {
    match kind {
        Kind::Check => "check",
        Kind::Clippy => "clippy",
        Kind::Fix => "fix",
        _ => unreachable!()
    }
}

impl Step for Std {
    type Output = ();
    const DEFAULT: bool = true;

@@ -31,13 +47,13 @@ impl Step for Std {
        let target = self.target;
        let compiler = builder.compiler(0, builder.config.build);

        let mut cargo = builder.cargo(compiler, Mode::Std, target, "check");
        let mut cargo = builder.cargo(compiler, Mode::Std, target, cargo_subcommand(builder.kind));
        std_cargo(builder, &compiler, target, &mut cargo);

        let _folder = builder.fold_output(|| format!("stage{}-std", compiler.stage));
        builder.info(&format!("Checking std artifacts ({} -> {})", &compiler.host, target));
        run_cargo(builder,
                  &mut cargo,
                  args(builder.kind),
                  &libstd_stamp(builder, compiler, target),
                  true);

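The new `args` and `cargo_subcommand` helpers above map the builder's `Kind` to a cargo subcommand plus trailing arguments. A self-contained sketch of that dispatch pattern (the `Kind` enum here is a simplified stand-in with only the three check-like variants, not bootstrap's full enum):

```rust
// Simplified stand-in for bootstrap's builder::Kind (assumption: the real
// enum has more variants, hence the `_` arm in the diff).
#[derive(Clone, Copy, Debug)]
enum Kind {
    Check,
    Clippy,
    Fix,
}

// Trailing arguments appended after `--`; clippy gets `--cap-lints warn`
// so lints do not fail the run outright.
fn args(kind: Kind) -> Vec<String> {
    match kind {
        Kind::Clippy => vec!["--".to_owned(), "--cap-lints".to_owned(), "warn".to_owned()],
        _ => Vec::new(),
    }
}

// The cargo subcommand each check-like mode maps to.
fn cargo_subcommand(kind: Kind) -> &'static str {
    match kind {
        Kind::Check => "check",
        Kind::Clippy => "clippy",
        Kind::Fix => "fix",
    }
}

fn main() {
    assert_eq!(cargo_subcommand(Kind::Check), "check");
    assert!(args(Kind::Check).is_empty());
    println!("{} {}", cargo_subcommand(Kind::Clippy), args(Kind::Clippy).join(" "));
}
```

This keeps all check-like steps on one code path, with the mode decided once per invocation.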
@@ -78,13 +94,14 @@ impl Step for Rustc {

        builder.ensure(Test { target });

        let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "check");
        let mut cargo = builder.cargo(compiler, Mode::Rustc, target,
                                      cargo_subcommand(builder.kind));
        rustc_cargo(builder, &mut cargo);

        let _folder = builder.fold_output(|| format!("stage{}-rustc", compiler.stage));
        builder.info(&format!("Checking compiler artifacts ({} -> {})", &compiler.host, target));
        run_cargo(builder,
                  &mut cargo,
                  args(builder.kind),
                  &librustc_stamp(builder, compiler, target),
                  true);

@@ -127,15 +144,16 @@ impl Step for CodegenBackend {

        builder.ensure(Rustc { target });

        let mut cargo = builder.cargo(compiler, Mode::Codegen, target, "check");
        let mut cargo = builder.cargo(compiler, Mode::Codegen, target,
                                      cargo_subcommand(builder.kind));
        cargo.arg("--manifest-path").arg(builder.src.join("src/librustc_codegen_llvm/Cargo.toml"));
        rustc_cargo_env(builder, &mut cargo);

        // We won't build LLVM if it's not available, as it shouldn't affect `check`.

        let _folder = builder.fold_output(|| format!("stage{}-rustc_codegen_llvm", compiler.stage));
        run_cargo(builder,
                  &mut cargo,
                  args(builder.kind),
                  &codegen_backend_stamp(builder, compiler, target, backend),
                  true);
    }

@@ -166,13 +184,13 @@ impl Step for Test {

        builder.ensure(Std { target });

        let mut cargo = builder.cargo(compiler, Mode::Test, target, "check");
        let mut cargo = builder.cargo(compiler, Mode::Test, target, cargo_subcommand(builder.kind));
        test_cargo(builder, &compiler, target, &mut cargo);

        let _folder = builder.fold_output(|| format!("stage{}-test", compiler.stage));
        builder.info(&format!("Checking test artifacts ({} -> {})", &compiler.host, target));
        run_cargo(builder,
                  &mut cargo,
                  args(builder.kind),
                  &libtest_stamp(builder, compiler, target),
                  true);

@@ -212,22 +230,21 @@ impl Step for Rustdoc {
            compiler,
            Mode::ToolRustc,
            target,
            "check",
            cargo_subcommand(builder.kind),
            "src/tools/rustdoc",
            SourceType::InTree,
            &[]);

        let _folder = builder.fold_output(|| format!("stage{}-rustdoc", compiler.stage));
        println!("Checking rustdoc artifacts ({} -> {})", &compiler.host, target);
        run_cargo(builder,
                  &mut cargo,
                  args(builder.kind),
                  &rustdoc_stamp(builder, compiler, target),
                  true);

        let libdir = builder.sysroot_libdir(compiler, target);
        let hostdir = builder.sysroot_libdir(compiler, compiler.host);
        add_to_sysroot(&builder, &libdir, &hostdir, &rustdoc_stamp(builder, compiler, target));
        builder.cargo(compiler, Mode::ToolRustc, target, "clean");
    }
}

@@ -15,7 +15,7 @@ use std::path::{Path, PathBuf};
use std::process::{Command, Stdio, exit};
use std::str;

use build_helper::{output, mtime, t, up_to_date};
use build_helper::{output, t, up_to_date};
use filetime::FileTime;
use serde::Deserialize;
use serde_json;

@@ -70,20 +70,20 @@ impl Step for Std {

        builder.ensure(StartupObjects { compiler, target });

        if builder.force_use_stage1(compiler, target) {
            let from = builder.compiler(1, builder.config.build);
        let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
        if compiler_to_use != compiler {
            builder.ensure(Std {
                compiler: from,
                compiler: compiler_to_use,
                target,
            });
            builder.info(&format!("Uplifting stage1 std ({} -> {})", from.host, target));
            builder.info(&format!("Uplifting stage1 std ({} -> {})", compiler_to_use.host, target));

            // Even if we're not building std this stage, the new sysroot must
            // still contain the third party objects needed by various targets.
            copy_third_party_objects(builder, &compiler, target);

            builder.ensure(StdLink {
                compiler: from,
                compiler: compiler_to_use,
                target_compiler: compiler,
                target,
            });

@@ -95,11 +95,11 @@ impl Step for Std {
        let mut cargo = builder.cargo(compiler, Mode::Std, target, "build");
        std_cargo(builder, &compiler, target, &mut cargo);

        let _folder = builder.fold_output(|| format!("stage{}-std", compiler.stage));
        builder.info(&format!("Building stage{} std artifacts ({} -> {})", compiler.stage,
            &compiler.host, target));
        run_cargo(builder,
                  &mut cargo,
                  vec![],
                  &libstd_stamp(builder, compiler, target),
                  false);

@@ -161,7 +161,33 @@ pub fn std_cargo(builder: &Builder<'_>,
        cargo.env("MACOSX_DEPLOYMENT_TARGET", target);
    }

    // Determine if we're going to compile in optimized C intrinsics to
    // the `compiler-builtins` crate. These intrinsics live in LLVM's
    // `compiler-rt` repository, but our `src/llvm-project` submodule isn't
    // always checked out, so we need to conditionally look for this. (e.g. if
    // an external LLVM is used we skip the LLVM submodule checkout).
    //
    // Note that this shouldn't affect the correctness of `compiler-builtins`,
    // but only its speed. Some intrinsics in C haven't been translated to Rust
    // yet but that's pretty rare. Other intrinsics have optimized
    // implementations in C which have only had slower versions ported to Rust,
    // so we favor the C version where we can, but it's not critical.
    //
    // If `compiler-rt` is available ensure that the `c` feature of the
    // `compiler-builtins` crate is enabled and it's configured to learn where
    // `compiler-rt` is located.
    let compiler_builtins_root = builder.src.join("src/llvm-project/compiler-rt");
    let compiler_builtins_c_feature = if compiler_builtins_root.exists() {
        cargo.env("RUST_COMPILER_RT_ROOT", &compiler_builtins_root);
        " compiler-builtins-c".to_string()
    } else {
        String::new()
    };

    if builder.no_std(target) == Some(true) {
        let mut features = "compiler-builtins-mem".to_string();
        features.push_str(&compiler_builtins_c_feature);

        // for no-std targets we only compile a few no_std crates
        cargo
            .args(&["-p", "alloc"])

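The hunk above composes the cargo feature string conditionally: the optional `compiler-builtins-c` fragment is appended only when the `compiler-rt` sources exist on disk. A small standalone sketch of that pattern (function names and the example path are illustrative assumptions, not bootstrap's API):

```rust
use std::path::Path;

// Optional feature fragment, enabled only when the compiler-rt checkout
// is present (mirrors the leading-space concatenation style in the diff).
fn builtins_c_feature(compiler_rt_root: &Path) -> String {
    if compiler_rt_root.exists() {
        " compiler-builtins-c".to_string()
    } else {
        String::new()
    }
}

// Base features plus the optional fragment, as passed to `--features`.
fn std_features(base: &str, compiler_rt_root: &Path) -> String {
    let mut features = base.to_string();
    features.push_str(&builtins_c_feature(compiler_rt_root));
    features
}

fn main() {
    // A path assumed not to exist, so the optional feature is omitted.
    let missing = Path::new("/nonexistent/llvm-project/compiler-rt");
    assert_eq!(
        std_features("compiler-builtins-mem", missing),
        "compiler-builtins-mem"
    );
    println!("{}", std_features("compiler-builtins-mem", missing));
}
```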
@@ -170,7 +196,8 @@ pub fn std_cargo(builder: &Builder<'_>,
            .arg("--features")
            .arg("compiler-builtins-mem compiler-builtins-c");
    } else {
        let features = builder.std_features();
        let mut features = builder.std_features();
        features.push_str(&compiler_builtins_c_feature);

        if compiler.stage != 0 && builder.config.sanitizers {
            // This variable is used by the sanitizer runtime crates, e.g.

@@ -247,8 +274,6 @@ impl Step for StdLink {
            // for reason why the sanitizers are not built in stage0.
            copy_apple_sanitizer_dylibs(builder, &builder.native_dir(target), "osx", &libdir);
        }

        builder.cargo(target_compiler, Mode::ToolStd, target, "clean");
    }
}

@@ -298,7 +323,7 @@ impl Step for StartupObjects {
    fn run(self, builder: &Builder<'_>) {
        let for_compiler = self.compiler;
        let target = self.target;
        if !target.contains("pc-windows-gnu") {
        if !target.contains("windows-gnu") {
            return
        }

@@ -313,7 +338,7 @@ impl Step for StartupObjects {
            if !up_to_date(src_file, dst_file) {
                let mut cmd = Command::new(&builder.initial_rustc);
                builder.run(cmd.env("RUSTC_BOOTSTRAP", "1")
                            .arg("--cfg").arg("stage0")
                            .arg("--cfg").arg("bootstrap")
                            .arg("--target").arg(target)
                            .arg("--emit=obj")
                            .arg("-o").arg(dst_file)

@@ -375,15 +400,16 @@ impl Step for Test {
            return;
        }

        if builder.force_use_stage1(compiler, target) {
        let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
        if compiler_to_use != compiler {
            builder.ensure(Test {
                compiler: builder.compiler(1, builder.config.build),
                compiler: compiler_to_use,
                target,
            });
            builder.info(
                &format!("Uplifting stage1 test ({} -> {})", builder.config.build, target));
            builder.ensure(TestLink {
                compiler: builder.compiler(1, builder.config.build),
                compiler: compiler_to_use,
                target_compiler: compiler,
                target,
            });

@@ -393,11 +419,11 @@ impl Step for Test {
        let mut cargo = builder.cargo(compiler, Mode::Test, target, "build");
        test_cargo(builder, &compiler, target, &mut cargo);

        let _folder = builder.fold_output(|| format!("stage{}-test", compiler.stage));
        builder.info(&format!("Building stage{} test artifacts ({} -> {})", compiler.stage,
            &compiler.host, target));
        run_cargo(builder,
                  &mut cargo,
                  vec![],
                  &libtest_stamp(builder, compiler, target),
                  false);

@@ -452,8 +478,6 @@ impl Step for TestLink {
            &builder.sysroot_libdir(target_compiler, compiler.host),
            &libtest_stamp(builder, compiler, target)
        );

        builder.cargo(target_compiler, Mode::ToolTest, target, "clean");
    }
}

@@ -500,15 +524,16 @@ impl Step for Rustc {
            return;
        }

        if builder.force_use_stage1(compiler, target) {
        let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
        if compiler_to_use != compiler {
            builder.ensure(Rustc {
                compiler: builder.compiler(1, builder.config.build),
                compiler: compiler_to_use,
                target,
            });
            builder.info(&format!("Uplifting stage1 rustc ({} -> {})",
                builder.config.build, target));
            builder.ensure(RustcLink {
                compiler: builder.compiler(1, builder.config.build),
                compiler: compiler_to_use,
                target_compiler: compiler,
                target,
            });

@@ -524,11 +549,11 @@ impl Step for Rustc {
        let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "build");
        rustc_cargo(builder, &mut cargo);

        let _folder = builder.fold_output(|| format!("stage{}-rustc", compiler.stage));
        builder.info(&format!("Building stage{} compiler artifacts ({} -> {})",
            compiler.stage, &compiler.host, target));
        run_cargo(builder,
                  &mut cargo,
                  vec![],
                  &librustc_stamp(builder, compiler, target),
                  false);

@@ -559,13 +584,6 @@ pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Command) {
    let libdir_relative = builder.config.libdir_relative().unwrap_or(Path::new("lib"));
    cargo.env("CFG_LIBDIR_RELATIVE", libdir_relative);

    // If we're not building a compiler with debugging information then remove
    // these two env vars which would be set otherwise.
    if builder.config.rust_debuginfo_only_std {
        cargo.env_remove("RUSTC_DEBUGINFO");
        cargo.env_remove("RUSTC_DEBUGINFO_LINES");
    }

    if let Some(ref ver_date) = builder.rust_info.commit_date() {
        cargo.env("CFG_VER_DATE", ver_date);
    }

@@ -617,7 +635,6 @@ impl Step for RustcLink {
            &builder.sysroot_libdir(target_compiler, compiler.host),
            &librustc_stamp(builder, compiler, target)
        );
        builder.cargo(target_compiler, Mode::ToolRustc, target, "clean");
    }
}

@@ -664,9 +681,10 @@ impl Step for CodegenBackend {
            return;
        }

        if builder.force_use_stage1(compiler, target) {
        let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
        if compiler_to_use != compiler {
            builder.ensure(CodegenBackend {
                compiler: builder.compiler(1, builder.config.build),
                compiler: compiler_to_use,
                target,
                backend,
            });

@@ -684,9 +702,9 @@ impl Step for CodegenBackend {

        let tmp_stamp = out_dir.join(".tmp.stamp");

        let _folder = builder.fold_output(|| format!("stage{}-rustc_codegen_llvm", compiler.stage));
        let files = run_cargo(builder,
                              cargo.arg("--features").arg(features),
                              vec![],
                              &tmp_stamp,
                              false);
        if builder.config.dry_run {

@@ -748,6 +766,10 @@ pub fn build_codegen_backend(builder: &Builder<'_>,
                cargo.env("CFG_LLVM_ROOT", s);
            }
        }
        // Some LLVM linker flags (-L and -l) may be needed to link librustc_llvm.
        if let Some(ref s) = builder.config.llvm_ldflags {
            cargo.env("LLVM_LINKER_FLAGS", s);
        }
        // Building with a static libstdc++ is only supported on linux right now,
        // not for MSVC or macOS
        if builder.config.llvm_static_stdcpp &&

@@ -768,6 +790,9 @@ pub fn build_codegen_backend(builder: &Builder<'_>,
            if builder.config.llvm_use_libcxx {
                cargo.env("LLVM_USE_LIBCXX", "1");
            }
            if builder.config.llvm_optimize && !builder.config.llvm_release_debuginfo {
                cargo.env("LLVM_NDEBUG", "1");
            }
        }
        _ => panic!("unknown backend: {}", backend),
    }

@@ -1057,6 +1082,7 @@ pub fn add_to_sysroot(

pub fn run_cargo(builder: &Builder<'_>,
                 cargo: &mut Command,
                 tail_args: Vec<String>,
                 stamp: &Path,
                 is_check: bool)
                 -> Vec<PathBuf>

@@ -1079,7 +1105,7 @@ pub fn run_cargo(builder: &Builder<'_>,
    // files we need to probe for later.
    let mut deps = Vec::new();
    let mut toplevel = Vec::new();
    let ok = stream_cargo(builder, cargo, &mut |msg| {
    let ok = stream_cargo(builder, cargo, tail_args, &mut |msg| {
        let (filenames, crate_types) = match msg {
            CargoMessage::CompilerArtifact {
                filenames,

@@ -1094,6 +1120,7 @@ pub fn run_cargo(builder: &Builder<'_>,
            // Skip files like executables
            if !filename.ends_with(".rlib") &&
               !filename.ends_with(".lib") &&
               !filename.ends_with(".a") &&
               !is_dylib(&filename) &&
               !(is_check && filename.ends_with(".rmeta")) {
                continue;

@@ -1173,41 +1200,13 @@ pub fn run_cargo(builder: &Builder<'_>,
        deps.push((path_to_add.into(), false));
    }

    // Now we want to update the contents of the stamp file, if necessary. First
    // we read off the previous contents along with its mtime. If our new
    // contents (the list of files to copy) is different or if any dep's mtime
    // is newer then we rewrite the stamp file.
    deps.sort();
    let stamp_contents = fs::read(stamp);
    let stamp_mtime = mtime(&stamp);
    let mut new_contents = Vec::new();
    let mut max = None;
    let mut max_path = None;
    for (dep, proc_macro) in deps.iter() {
        let mtime = mtime(dep);
        if Some(mtime) > max {
            max = Some(mtime);
            max_path = Some(dep.clone());
        }
        new_contents.extend(if *proc_macro { b"h" } else { b"t" });
        new_contents.extend(dep.to_str().unwrap().as_bytes());
        new_contents.extend(b"\0");
    }
    let max = max.unwrap();
    let max_path = max_path.unwrap();
    let contents_equal = stamp_contents
        .map(|contents| contents == new_contents)
        .unwrap_or_default();
    if contents_equal && max <= stamp_mtime {
        builder.verbose(&format!("not updating {:?}; contents equal and {:?} <= {:?}",
                stamp, max, stamp_mtime));
        return deps.into_iter().map(|(d, _)| d).collect()
    }
    if max > stamp_mtime {
        builder.verbose(&format!("updating {:?} as {:?} changed", stamp, max_path));
    } else {
        builder.verbose(&format!("updating {:?} as deps changed", stamp));
    }
    t!(fs::write(&stamp, &new_contents));
    deps.into_iter().map(|(d, _)| d).collect()
}

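The stamp-file logic being removed above rewrites the stamp only when the serialized dependency list changed or some dependency's mtime is newer than the stamp. A minimal sketch of that freshness decision, with times and contents passed in directly so it is testable without the filesystem (an assumption of this sketch; the real code reads and writes the stamp file):

```rust
use std::time::{Duration, SystemTime};

// Returns true when the stamp must be rewritten: either the previous stamp
// is missing/different, or the newest dependency postdates the stamp.
fn needs_update(
    stamp_contents: Option<&[u8]>,
    new_contents: &[u8],
    stamp_mtime: SystemTime,
    newest_dep_mtime: SystemTime,
) -> bool {
    let contents_equal = stamp_contents.map(|c| c == new_contents).unwrap_or(false);
    !(contents_equal && newest_dep_mtime <= stamp_mtime)
}

fn main() {
    let t0 = SystemTime::UNIX_EPOCH;
    let t1 = t0 + Duration::from_secs(10);
    // Same contents, deps older than the stamp: no rewrite needed.
    assert!(!needs_update(Some(b"a\0"), b"a\0", t1, t0));
    // Contents changed: rewrite.
    assert!(needs_update(Some(b"a\0"), b"b\0", t1, t0));
    // A dependency is newer than the stamp: rewrite.
    assert!(needs_update(Some(b"a\0"), b"a\0", t0, t1));
}
```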
@@ -1215,6 +1214,7 @@ pub fn run_cargo(builder: &Builder<'_>,
pub fn stream_cargo(
    builder: &Builder<'_>,
    cargo: &mut Command,
    tail_args: Vec<String>,
    cb: &mut dyn FnMut(CargoMessage<'_>),
) -> bool {
    if builder.config.dry_run {

@@ -1222,8 +1222,16 @@ pub fn stream_cargo(
    }
    // Instruct Cargo to give us json messages on stdout, critically leaving
    // stderr as piped so we can get those pretty colors.
    cargo.arg("--message-format").arg("json")
         .stdout(Stdio::piped());
    let mut message_format = String::from("json-render-diagnostics");
    if let Some(s) = &builder.config.rustc_error_format {
        message_format.push_str(",json-diagnostic-");
        message_format.push_str(s);
    }
    cargo.arg("--message-format").arg(message_format).stdout(Stdio::piped());

    for arg in tail_args {
        cargo.arg(arg);
    }

    builder.verbose(&format!("running: {:?}", cargo));
    let mut child = match cargo.spawn() {

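The hunk above builds Cargo's `--message-format` value from a base of `json-render-diagnostics` plus an optional `json-diagnostic-<style>` modifier when a rustc error format was configured. The string construction in isolation (the free function and `Option<&str>` parameter are conveniences of this sketch, not bootstrap's signature):

```rust
// Compose the --message-format value: JSON messages with cargo rendering
// the diagnostics, optionally forcing a diagnostic style such as "short".
fn message_format(rustc_error_format: Option<&str>) -> String {
    let mut message_format = String::from("json-render-diagnostics");
    if let Some(s) = rustc_error_format {
        message_format.push_str(",json-diagnostic-");
        message_format.push_str(s);
    }
    message_format
}

fn main() {
    assert_eq!(message_format(None), "json-render-diagnostics");
    assert_eq!(
        message_format(Some("short")),
        "json-render-diagnostics,json-diagnostic-short"
    );
    println!("{}", message_format(Some("short")));
}
```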
@@ -1271,5 +1279,5 @@ pub enum CargoMessage<'a> {
    },
    BuildScriptExecuted {
        package_id: Cow<'a, str>,
    }
    },
}

@ -11,7 +11,6 @@ use std::process;
|
|||
use std::cmp;
|
||||
|
||||
use build_helper::t;
|
||||
use num_cpus;
|
||||
use toml;
|
||||
use serde::Deserialize;
|
||||
use crate::cache::{INTERNER, Interned};
|
||||
|
|
@ -52,7 +51,7 @@ pub struct Config {
|
|||
pub test_compare_mode: bool,
|
||||
pub llvm_libunwind: bool,
|
||||
|
||||
pub run_host_only: bool,
|
||||
pub skip_only_host_steps: bool,
|
||||
|
||||
pub on_fail: Option<String>,
|
||||
pub stage: Option<u32>,
|
||||
|
|
@ -76,7 +75,7 @@ pub struct Config {
|
|||
pub llvm_link_shared: bool,
|
||||
pub llvm_clang_cl: Option<String>,
|
||||
pub llvm_targets: Option<String>,
|
||||
pub llvm_experimental_targets: String,
|
||||
pub llvm_experimental_targets: Option<String>,
|
||||
pub llvm_link_jobs: Option<u32>,
|
||||
pub llvm_version_suffix: Option<String>,
|
||||
pub llvm_use_linker: Option<String>,
|
||||
|
|
@@ -96,15 +95,14 @@ pub struct Config {
    pub rust_codegen_units: Option<u32>,
    pub rust_codegen_units_std: Option<u32>,
    pub rust_debug_assertions: bool,
    pub rust_debuginfo: bool,
    pub rust_debuginfo_lines: bool,
    pub rust_debuginfo_only_std: bool,
    pub rust_debuginfo_tools: bool,
    pub rust_debuginfo_level_rustc: u32,
    pub rust_debuginfo_level_std: u32,
    pub rust_debuginfo_level_tools: u32,
    pub rust_debuginfo_level_tests: u32,
    pub rust_rpath: bool,
    pub rustc_parallel: bool,
    pub rustc_default_linker: Option<String>,
    pub rust_optimize_tests: bool,
    pub rust_debuginfo_tests: bool,
    pub rust_dist_src: bool,
    pub rust_codegen_backends: Vec<Interned<String>>,
    pub rust_codegen_backends_dir: String,

@@ -130,7 +128,6 @@ pub struct Config {
    pub low_priority: bool,
    pub channel: String,
    pub verbose_tests: bool,
    pub test_miri: bool,
    pub save_toolstates: Option<PathBuf>,
    pub print_step_timings: bool,
    pub missing_tools: bool,

@@ -300,10 +297,11 @@ struct Rust {
    codegen_units: Option<u32>,
    codegen_units_std: Option<u32>,
    debug_assertions: Option<bool>,
    debuginfo: Option<bool>,
    debuginfo_lines: Option<bool>,
    debuginfo_only_std: Option<bool>,
    debuginfo_tools: Option<bool>,
    debuginfo_level: Option<u32>,
    debuginfo_level_rustc: Option<u32>,
    debuginfo_level_std: Option<u32>,
    debuginfo_level_tools: Option<u32>,
    debuginfo_level_tests: Option<u32>,
    parallel_compiler: Option<bool>,
    backtrace: Option<bool>,
    default_linker: Option<String>,

@@ -311,13 +309,11 @@ struct Rust {
    musl_root: Option<String>,
    rpath: Option<bool>,
    optimize_tests: Option<bool>,
    debuginfo_tests: Option<bool>,
    codegen_tests: Option<bool>,
    ignore_git: Option<bool>,
    debug: Option<bool>,
    dist_src: Option<bool>,
    verbose_tests: Option<bool>,
    test_miri: Option<bool>,
    incremental: Option<bool>,
    save_toolstates: Option<String>,
    codegen_backends: Option<Vec<String>>,

@@ -377,7 +373,6 @@ impl Config {
        config.codegen_tests = true;
        config.ignore_git = false;
        config.rust_dist_src = true;
        config.test_miri = false;
        config.rust_codegen_backends = vec![INTERNER.intern_str("llvm")];
        config.rust_codegen_backends_dir = "codegen-backends".to_owned();
        config.deny_warnings = true;

@@ -402,12 +397,12 @@ impl Config {
        config.rustc_error_format = flags.rustc_error_format;
        config.on_fail = flags.on_fail;
        config.stage = flags.stage;
        config.jobs = flags.jobs;
        config.jobs = flags.jobs.map(threads_from_config);
        config.cmd = flags.cmd;
        config.incremental = flags.incremental;
        config.dry_run = flags.dry_run;
        config.keep_stage = flags.keep_stage;
        if let Some(value) = flags.warnings {
        if let Some(value) = flags.deny_warnings {
            config.deny_warnings = value;
        }

@@ -418,7 +413,9 @@ impl Config {
        }

        // If --target was specified but --host wasn't specified, don't run any host-only tests.
        config.run_host_only = !(flags.host.is_empty() && !flags.target.is_empty());
        let has_hosts = !flags.host.is_empty();
        let has_targets = !flags.target.is_empty();
        config.skip_only_host_steps = !has_hosts && has_targets;

        let toml = file.map(|file| {
            let contents = t!(fs::read_to_string(&file));

@@ -495,12 +492,13 @@ impl Config {
        // Store off these values as options because if they're not provided
        // we'll infer default values for them later
        let mut llvm_assertions = None;
        let mut debuginfo_lines = None;
        let mut debuginfo_only_std = None;
        let mut debuginfo_tools = None;
        let mut debug = None;
        let mut debuginfo = None;
        let mut debug_assertions = None;
        let mut debuginfo_level = None;
        let mut debuginfo_level_rustc = None;
        let mut debuginfo_level_std = None;
        let mut debuginfo_level_tools = None;
        let mut debuginfo_level_tests = None;
        let mut optimize = None;
        let mut ignore_git = None;

@@ -523,8 +521,7 @@ impl Config {
            set(&mut config.llvm_static_stdcpp, llvm.static_libstdcpp);
            set(&mut config.llvm_link_shared, llvm.link_shared);
            config.llvm_targets = llvm.targets.clone();
            config.llvm_experimental_targets = llvm.experimental_targets.clone()
                .unwrap_or_else(|| "WebAssembly;RISCV".to_string());
            config.llvm_experimental_targets = llvm.experimental_targets.clone();
            config.llvm_link_jobs = llvm.link_jobs;
            config.llvm_version_suffix = llvm.version_suffix.clone();
            config.llvm_clang_cl = llvm.clang_cl.clone();

@@ -540,14 +537,14 @@ impl Config {
        if let Some(ref rust) = toml.rust {
            debug = rust.debug;
            debug_assertions = rust.debug_assertions;
            debuginfo = rust.debuginfo;
            debuginfo_lines = rust.debuginfo_lines;
            debuginfo_only_std = rust.debuginfo_only_std;
            debuginfo_tools = rust.debuginfo_tools;
            debuginfo_level = rust.debuginfo_level;
            debuginfo_level_rustc = rust.debuginfo_level_rustc;
            debuginfo_level_std = rust.debuginfo_level_std;
            debuginfo_level_tools = rust.debuginfo_level_tools;
            debuginfo_level_tests = rust.debuginfo_level_tests;
            optimize = rust.optimize;
            ignore_git = rust.ignore_git;
            set(&mut config.rust_optimize_tests, rust.optimize_tests);
            set(&mut config.rust_debuginfo_tests, rust.debuginfo_tests);
            set(&mut config.codegen_tests, rust.codegen_tests);
            set(&mut config.rust_rpath, rust.rpath);
            set(&mut config.jemalloc, rust.jemalloc);

@@ -557,7 +554,6 @@ impl Config {
            set(&mut config.channel, rust.channel.clone());
            set(&mut config.rust_dist_src, rust.dist_src);
            set(&mut config.verbose_tests, rust.verbose_tests);
            set(&mut config.test_miri, rust.test_miri);
            // in the case "false" is set explicitly, do not overwrite the command line args
            if let Some(true) = rust.incremental {
                config.incremental = true;

@@ -570,7 +566,7 @@ impl Config {
            config.rustc_default_linker = rust.default_linker.clone();
            config.musl_root = rust.musl_root.clone().map(PathBuf::from);
            config.save_toolstates = rust.save_toolstates.clone().map(PathBuf::from);
            set(&mut config.deny_warnings, rust.deny_warnings.or(flags.warnings));
            set(&mut config.deny_warnings, flags.deny_warnings.or(rust.deny_warnings));
            set(&mut config.backtrace_on_ice, rust.backtrace_on_ice);
            set(&mut config.rust_verify_llvm_ir, rust.verify_llvm_ir);
            set(&mut config.rust_remap_debuginfo, rust.remap_debuginfo);

@@ -583,13 +579,8 @@ impl Config {

            set(&mut config.rust_codegen_backends_dir, rust.codegen_backends_dir.clone());

            match rust.codegen_units {
                Some(0) => config.rust_codegen_units = Some(num_cpus::get() as u32),
                Some(n) => config.rust_codegen_units = Some(n),
                None => {}
            }

            config.rust_codegen_units_std = rust.codegen_units_std;
            config.rust_codegen_units = rust.codegen_units.map(threads_from_config);
            config.rust_codegen_units_std = rust.codegen_units_std.map(threads_from_config);
        }

        if let Some(ref t) = toml.target {

@@ -639,18 +630,19 @@ impl Config {
        let default = true;
        config.rust_optimize = optimize.unwrap_or(default);

        let default = match &config.channel[..] {
            "stable" | "beta" | "nightly" => true,
            _ => false,
        };
        config.rust_debuginfo_lines = debuginfo_lines.unwrap_or(default);
        config.rust_debuginfo_only_std = debuginfo_only_std.unwrap_or(default);
        config.rust_debuginfo_tools = debuginfo_tools.unwrap_or(false);

        let default = debug == Some(true);
        config.rust_debuginfo = debuginfo.unwrap_or(default);
        config.rust_debug_assertions = debug_assertions.unwrap_or(default);

        let with_defaults = |debuginfo_level_specific: Option<u32>| {
            debuginfo_level_specific
                .or(debuginfo_level)
                .unwrap_or(if debug == Some(true) { 2 } else { 0 })
        };
        config.rust_debuginfo_level_rustc = with_defaults(debuginfo_level_rustc);
        config.rust_debuginfo_level_std = with_defaults(debuginfo_level_std);
        config.rust_debuginfo_level_tools = with_defaults(debuginfo_level_tools);
        config.rust_debuginfo_level_tests = debuginfo_level_tests.unwrap_or(0);

        let default = config.channel == "dev";
        config.ignore_git = ignore_git.unwrap_or(default);

@@ -687,3 +679,10 @@ fn set<T>(field: &mut T, val: Option<T>) {
        *field = v;
    }
}

fn threads_from_config(v: u32) -> u32 {
    match v {
        0 => num_cpus::get() as u32,
        n => n,
    }
}

@@ -36,8 +36,6 @@ o("docs", "build.docs", "build standard library documentation")
o("compiler-docs", "build.compiler-docs", "build compiler documentation")
o("optimize-tests", "rust.optimize-tests", "build tests with optimizations")
o("parallel-compiler", "rust.parallel-compiler", "build a multi-threaded rustc")
o("test-miri", "rust.test-miri", "run miri's test suite")
o("debuginfo-tests", "rust.debuginfo-tests", "build tests with debugger metadata")
o("verbose-tests", "rust.verbose-tests", "enable verbose output when running tests")
o("ccache", "llvm.ccache", "invoke gcc/clang via ccache to reuse object files between builds")
o("sccache", None, "invoke gcc/clang via sccache to reuse object files between builds")

@@ -77,10 +75,11 @@ o("optimize-llvm", "llvm.optimize", "build optimized LLVM")
o("llvm-assertions", "llvm.assertions", "build LLVM with assertions")
o("debug-assertions", "rust.debug-assertions", "build with debugging assertions")
o("llvm-release-debuginfo", "llvm.release-debuginfo", "build LLVM with debugger metadata")
o("debuginfo", "rust.debuginfo", "build with debugger metadata")
o("debuginfo-lines", "rust.debuginfo-lines", "build with line number debugger metadata")
o("debuginfo-only-std", "rust.debuginfo-only-std", "build only libstd with debugging information")
o("debuginfo-tools", "rust.debuginfo-tools", "build extended tools with debugging information")
v("debuginfo-level", "rust.debuginfo-level", "debuginfo level for Rust code")
v("debuginfo-level-rustc", "rust.debuginfo-level-rustc", "debuginfo level for the compiler")
v("debuginfo-level-std", "rust.debuginfo-level-std", "debuginfo level for the standard library")
v("debuginfo-level-tools", "rust.debuginfo-level-tools", "debuginfo level for the tools")
v("debuginfo-level-tests", "rust.debuginfo-level-tests", "debuginfo level for the test suites run with compiletest")
v("save-toolstates", "rust.save-toolstates", "save build and test status of external tools into this file")

v("prefix", "install.prefix", "set installation prefix")

@@ -125,7 +124,9 @@ v("musl-root-armhf", "target.arm-unknown-linux-musleabihf.musl-root",
  "arm-unknown-linux-musleabihf install directory")
v("musl-root-armv5te", "target.armv5te-unknown-linux-musleabi.musl-root",
  "armv5te-unknown-linux-musleabi install directory")
v("musl-root-armv7", "target.armv7-unknown-linux-musleabihf.musl-root",
v("musl-root-armv7", "target.armv7-unknown-linux-musleabi.musl-root",
  "armv7-unknown-linux-musleabi install directory")
v("musl-root-armv7hf", "target.armv7-unknown-linux-musleabihf.musl-root",
  "armv7-unknown-linux-musleabihf install directory")
v("musl-root-aarch64", "target.aarch64-unknown-linux-musl.musl-root",
  "aarch64-unknown-linux-musl install directory")

@@ -68,7 +68,6 @@ fn missing_tool(tool_name: &str, skip: bool) {

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Docs {
    pub stage: u32,
    pub host: Interned<String>,
}

@@ -82,7 +81,6 @@ impl Step for Docs {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Docs {
            stage: run.builder.top_stage,
            host: run.target,
        });
    }

@@ -130,7 +128,6 @@ impl Step for Docs {

#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct RustcDocs {
    pub stage: u32,
    pub host: Interned<String>,
}

@@ -144,7 +141,6 @@ impl Step for RustcDocs {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(RustcDocs {
            stage: run.builder.top_stage,
            host: run.target,
        });
    }

@@ -647,7 +643,11 @@ impl Step for Std {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Std {
            compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

@@ -737,7 +737,14 @@ impl Step for Analysis {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Analysis {
            compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
            // Find the actual compiler (handling the full bootstrap option) which
            // produced the save-analysis data because that data isn't copied
            // through the sysroot uplifting.
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

@@ -757,14 +764,6 @@ impl Step for Analysis {

        builder.ensure(Std { compiler, target });

        // Package save-analysis from stage1 if not doing a full bootstrap, as the
        // stage2 artifacts is simply copied from stage1 in that case.
        let compiler = if builder.force_use_stage1(compiler, target) {
            builder.compiler(1, compiler.host)
        } else {
            compiler.clone()
        };

        let image = tmpdir(builder).join(format!("{}-{}-image", name, target));

        let src = builder.stage_out(compiler, Mode::Std)

@@ -805,6 +804,7 @@ fn copy_src_dirs(builder: &Builder<'_>, src_dirs: &[&str], exclude_dirs: &[&str]

    const LLVM_PROJECTS: &[&str] = &[
        "llvm-project/clang", "llvm-project\\clang",
        "llvm-project/libunwind", "llvm-project\\libunwind",
        "llvm-project/lld", "llvm-project\\lld",
        "llvm-project/lldb", "llvm-project\\lldb",
        "llvm-project/llvm", "llvm-project\\llvm",

@@ -903,7 +903,7 @@ impl Step for Src {
        "src/libtest",
        "src/libterm",
        "src/libprofiler_builtins",
        "src/stdsimd",
        "src/stdarch",
        "src/libproc_macro",
        "src/tools/rustc-std-workspace-core",
        "src/tools/rustc-std-workspace-alloc",

@@ -935,8 +935,6 @@ impl Step for Src {
    }
}

const CARGO_VENDOR_VERSION: &str = "0.1.22";

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct PlainSourceTarball;

@@ -998,26 +996,6 @@ impl Step for PlainSourceTarball {

        // If we're building from git sources, we need to vendor a complete distribution.
        if builder.rust_info.is_git() {
            // Get cargo-vendor installed, if it isn't already.
            let mut has_cargo_vendor = false;
            let mut cmd = Command::new(&builder.initial_cargo);
            for line in output(cmd.arg("install").arg("--list")).lines() {
                has_cargo_vendor |= line.starts_with("cargo-vendor ");
            }
            if !has_cargo_vendor {
                let mut cmd = builder.cargo(
                    builder.compiler(0, builder.config.build),
                    Mode::ToolBootstrap,
                    builder.config.build,
                    "install"
                );
                cmd.arg("--force")
                    .arg("--debug")
                    .arg("--vers").arg(CARGO_VENDOR_VERSION)
                    .arg("cargo-vendor");
                builder.run(&mut cmd);
            }

            // Vendor all Cargo dependencies
            let mut cmd = Command::new(&builder.initial_cargo);
            cmd.arg("vendor")

@@ -1066,7 +1044,7 @@ pub fn sanitize_sh(path: &Path) -> String {

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Cargo {
    pub stage: u32,
    pub compiler: Compiler,
    pub target: Interned<String>,
}

@@ -1080,16 +1058,20 @@ impl Step for Cargo {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Cargo {
            stage: run.builder.top_stage,
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

    fn run(self, builder: &Builder<'_>) -> PathBuf {
        let stage = self.stage;
        let compiler = self.compiler;
        let target = self.target;

        builder.info(&format!("Dist cargo stage{} ({})", stage, target));
        builder.info(&format!("Dist cargo stage{} ({})", compiler.stage, target));
        let src = builder.src.join("src/tools/cargo");
        let etc = src.join("src/etc");
        let release_num = builder.release_num("cargo");

@@ -1104,10 +1086,7 @@ impl Step for Cargo {
        // Prepare the image directory
        builder.create_dir(&image.join("share/zsh/site-functions"));
        builder.create_dir(&image.join("etc/bash_completion.d"));
        let cargo = builder.ensure(tool::Cargo {
            compiler: builder.compiler(stage, builder.config.build),
            target
        });
        let cargo = builder.ensure(tool::Cargo { compiler, target });
        builder.install(&cargo, &image.join("bin"), 0o755);
        for man in t!(etc.join("man").read_dir()) {
            let man = t!(man);

@@ -1152,7 +1131,7 @@ impl Step for Cargo {

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Rls {
    pub stage: u32,
    pub compiler: Compiler,
    pub target: Interned<String>,
}

@@ -1166,17 +1145,21 @@ impl Step for Rls {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Rls {
            stage: run.builder.top_stage,
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

    fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
        let stage = self.stage;
        let compiler = self.compiler;
        let target = self.target;
        assert!(builder.config.extended);

        builder.info(&format!("Dist RLS stage{} ({})", stage, target));
        builder.info(&format!("Dist RLS stage{} ({})", compiler.stage, target));
        let src = builder.src.join("src/tools/rls");
        let release_num = builder.release_num("rls");
        let name = pkgname(builder, "rls");

@@ -1191,8 +1174,9 @@ impl Step for Rls {
        // We expect RLS to build, because we've exited this step above if tool
        // state for RLS isn't testing.
        let rls = builder.ensure(tool::Rls {
            compiler: builder.compiler(stage, builder.config.build),
            target, extra_features: Vec::new()
            compiler,
            target,
            extra_features: Vec::new(),
        }).or_else(|| { missing_tool("RLS", builder.build.config.missing_tools); None })?;

        builder.install(&rls, &image.join("bin"), 0o755);

@@ -1231,7 +1215,7 @@ impl Step for Rls {

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Clippy {
    pub stage: u32,
    pub compiler: Compiler,
    pub target: Interned<String>,
}

@@ -1245,17 +1229,21 @@ impl Step for Clippy {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Clippy {
            stage: run.builder.top_stage,
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

    fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
        let stage = self.stage;
        let compiler = self.compiler;
        let target = self.target;
        assert!(builder.config.extended);

        builder.info(&format!("Dist clippy stage{} ({})", stage, target));
        builder.info(&format!("Dist clippy stage{} ({})", compiler.stage, target));
        let src = builder.src.join("src/tools/clippy");
        let release_num = builder.release_num("clippy");
        let name = pkgname(builder, "clippy");

@@ -1270,11 +1258,12 @@ impl Step for Clippy {
        // We expect clippy to build, because we've exited this step above if tool
        // state for clippy isn't testing.
        let clippy = builder.ensure(tool::Clippy {
            compiler: builder.compiler(stage, builder.config.build),
            target, extra_features: Vec::new()
            compiler,
            target,
            extra_features: Vec::new(),
        }).or_else(|| { missing_tool("clippy", builder.build.config.missing_tools); None })?;
        let cargoclippy = builder.ensure(tool::CargoClippy {
            compiler: builder.compiler(stage, builder.config.build),
            compiler,
            target, extra_features: Vec::new()
        }).or_else(|| { missing_tool("cargo clippy", builder.build.config.missing_tools); None })?;

@@ -1315,7 +1304,7 @@ impl Step for Clippy {

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Miri {
    pub stage: u32,
    pub compiler: Compiler,
    pub target: Interned<String>,
}

@@ -1329,17 +1318,21 @@ impl Step for Miri {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Miri {
            stage: run.builder.top_stage,
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

    fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
        let stage = self.stage;
        let compiler = self.compiler;
        let target = self.target;
        assert!(builder.config.extended);

        builder.info(&format!("Dist miri stage{} ({})", stage, target));
        builder.info(&format!("Dist miri stage{} ({})", compiler.stage, target));
        let src = builder.src.join("src/tools/miri");
        let release_num = builder.release_num("miri");
        let name = pkgname(builder, "miri");

@@ -1354,12 +1347,14 @@ impl Step for Miri {
        // We expect miri to build, because we've exited this step above if tool
        // state for miri isn't testing.
        let miri = builder.ensure(tool::Miri {
            compiler: builder.compiler(stage, builder.config.build),
            target, extra_features: Vec::new()
            compiler,
            target,
            extra_features: Vec::new(),
        }).or_else(|| { missing_tool("miri", builder.build.config.missing_tools); None })?;
        let cargomiri = builder.ensure(tool::CargoMiri {
            compiler: builder.compiler(stage, builder.config.build),
            target, extra_features: Vec::new()
            compiler,
            target,
            extra_features: Vec::new()
        }).or_else(|| { missing_tool("cargo miri", builder.build.config.missing_tools); None })?;

        builder.install(&miri, &image.join("bin"), 0o755);

@@ -1399,7 +1394,7 @@ impl Step for Miri {

#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Rustfmt {
    pub stage: u32,
    pub compiler: Compiler,
    pub target: Interned<String>,
}

@@ -1413,16 +1408,20 @@ impl Step for Rustfmt {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Rustfmt {
            stage: run.builder.top_stage,
            compiler: run.builder.compiler_for(
                run.builder.top_stage,
                run.builder.config.build,
                run.target,
            ),
            target: run.target,
        });
    }

    fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
        let stage = self.stage;
        let compiler = self.compiler;
        let target = self.target;

        builder.info(&format!("Dist Rustfmt stage{} ({})", stage, target));
        builder.info(&format!("Dist Rustfmt stage{} ({})", compiler.stage, target));
        let src = builder.src.join("src/tools/rustfmt");
        let release_num = builder.release_num("rustfmt");
        let name = pkgname(builder, "rustfmt");

@@ -1435,12 +1434,14 @@ impl Step for Rustfmt {

        // Prepare the image directory
        let rustfmt = builder.ensure(tool::Rustfmt {
            compiler: builder.compiler(stage, builder.config.build),
            target, extra_features: Vec::new()
            compiler,
            target,
            extra_features: Vec::new(),
        }).or_else(|| { missing_tool("Rustfmt", builder.build.config.missing_tools); None })?;
        let cargofmt = builder.ensure(tool::Cargofmt {
            compiler: builder.compiler(stage, builder.config.build),
            target, extra_features: Vec::new()
            compiler,
            target,
            extra_features: Vec::new(),
        }).or_else(|| { missing_tool("Cargofmt", builder.build.config.missing_tools); None })?;

        builder.install(&rustfmt, &image.join("bin"), 0o755);

@@ -1505,30 +1506,28 @@ impl Step for Extended {

    /// Creates a combined installer for the specified target in the provided stage.
    fn run(self, builder: &Builder<'_>) {
        let stage = self.stage;
        let target = self.target;
        let stage = self.stage;
        let compiler = builder.compiler_for(self.stage, self.host, self.target);

        builder.info(&format!("Dist extended stage{} ({})", stage, target));
        builder.info(&format!("Dist extended stage{} ({})", compiler.stage, target));

        let rustc_installer = builder.ensure(Rustc {
            compiler: builder.compiler(stage, target),
        });
        let cargo_installer = builder.ensure(Cargo { stage, target });
        let rustfmt_installer = builder.ensure(Rustfmt { stage, target });
        let rls_installer = builder.ensure(Rls { stage, target });
        let llvm_tools_installer = builder.ensure(LlvmTools { stage, target });
        let clippy_installer = builder.ensure(Clippy { stage, target });
        let miri_installer = builder.ensure(Miri { stage, target });
        let cargo_installer = builder.ensure(Cargo { compiler, target });
        let rustfmt_installer = builder.ensure(Rustfmt { compiler, target });
        let rls_installer = builder.ensure(Rls { compiler, target });
        let llvm_tools_installer = builder.ensure(LlvmTools { target });
        let clippy_installer = builder.ensure(Clippy { compiler, target });
        let miri_installer = builder.ensure(Miri { compiler, target });
        let lldb_installer = builder.ensure(Lldb { target });
        let mingw_installer = builder.ensure(Mingw { host: target });
        let analysis_installer = builder.ensure(Analysis {
            compiler: builder.compiler(stage, self.host),
            target
        });
        let analysis_installer = builder.ensure(Analysis { compiler, target });

        let docs_installer = builder.ensure(Docs { stage, host: target, });
        let docs_installer = builder.ensure(Docs { host: target, });
        let std_installer = builder.ensure(Std {
            compiler: builder.compiler(stage, self.host),
            compiler: builder.compiler(stage, target),
            target,
        });

@@ -2076,7 +2075,6 @@ pub fn maybe_install_llvm_dylib(builder: &Builder<'_>,

#[derive(Clone, Debug, Eq, Hash, PartialEq)]
pub struct LlvmTools {
    pub stage: u32,
    pub target: Interned<String>,
}

@@ -2090,26 +2088,24 @@ impl Step for LlvmTools {

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(LlvmTools {
            stage: run.builder.top_stage,
            target: run.target,
        });
    }

    fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
        let stage = self.stage;
        let target = self.target;
        assert!(builder.config.extended);

        /* run only if llvm-config isn't used */
        if let Some(config) = builder.config.target_config.get(&target) {
            if let Some(ref _s) = config.llvm_config {
                builder.info(&format!("Skipping LlvmTools stage{} ({}): external LLVM",
                    stage, target));
                builder.info(&format!("Skipping LlvmTools ({}): external LLVM",
                    target));
                return None;
            }
        }

        builder.info(&format!("Dist LlvmTools stage{} ({})", stage, target));
        builder.info(&format!("Dist LlvmTools ({})", target));
        let src = builder.src.join("src/llvm-project/llvm");
        let name = pkgname(builder, "llvm-tools");

@@ -23,7 +23,7 @@ use crate::cache::{INTERNER, Interned};
use crate::config::Config;

macro_rules! book {
    ($($name:ident, $path:expr, $book_name:expr, $book_ver:expr;)+) => {
    ($($name:ident, $path:expr, $book_name:expr;)+) => {
        $(
            #[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
            pub struct $name {

@@ -46,10 +46,10 @@ macro_rules! book {
            }

            fn run(self, builder: &Builder<'_>) {
                builder.ensure(Rustbook {
                builder.ensure(RustbookSrc {
                    target: self.target,
                    name: INTERNER.intern_str($book_name),
                    version: $book_ver,
                    src: doc_src(builder),
                })
            }
        }

@@ -60,50 +60,17 @@ macro_rules! book {
// NOTE: When adding a book here, make sure to ALSO build the book by
// adding a build step in `src/bootstrap/builder.rs`!
book!(
    EditionGuide, "src/doc/edition-guide", "edition-guide", RustbookVersion::MdBook2;
    EmbeddedBook, "src/doc/embedded-book", "embedded-book", RustbookVersion::MdBook2;
    Nomicon, "src/doc/nomicon", "nomicon", RustbookVersion::MdBook1;
    Reference, "src/doc/reference", "reference", RustbookVersion::MdBook1;
    RustByExample, "src/doc/rust-by-example", "rust-by-example", RustbookVersion::MdBook1;
    RustcBook, "src/doc/rustc", "rustc", RustbookVersion::MdBook1;
    RustdocBook, "src/doc/rustdoc", "rustdoc", RustbookVersion::MdBook1;
    EditionGuide, "src/doc/edition-guide", "edition-guide";
    EmbeddedBook, "src/doc/embedded-book", "embedded-book";
    Nomicon, "src/doc/nomicon", "nomicon";
    Reference, "src/doc/reference", "reference";
    RustByExample, "src/doc/rust-by-example", "rust-by-example";
    RustcBook, "src/doc/rustc", "rustc";
    RustdocBook, "src/doc/rustdoc", "rustdoc";
);

#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
enum RustbookVersion {
    MdBook1,
    MdBook2,
}

#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
struct Rustbook {
    target: Interned<String>,
    name: Interned<String>,
    version: RustbookVersion,
}

impl Step for Rustbook {
    type Output = ();

    // rustbook is never directly called, and only serves as a shim for the nomicon and the
    // reference.
    fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
        run.never()
    }

    /// Invoke `rustbook` for `target` for the doc book `name`.
    ///
    /// This will not actually generate any documentation if the documentation has
    /// already been generated.
    fn run(self, builder: &Builder<'_>) {
        let src = builder.src.join("src/doc");
        builder.ensure(RustbookSrc {
            target: self.target,
            name: self.name,
            src: INTERNER.intern_path(src),
            version: self.version,
        });
    }
fn doc_src(builder: &Builder<'_>) -> Interned<PathBuf> {
    INTERNER.intern_path(builder.src.join("src/doc"))
}

#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]

@@ -134,7 +101,6 @@ impl Step for UnstableBook {
            target: self.target,
            name: INTERNER.intern_str("unstable-book"),
            src: builder.md_doc_out(self.target),
            version: RustbookVersion::MdBook1,
        })
    }
}

@ -188,7 +154,6 @@ struct RustbookSrc {
|
|||
target: Interned<String>,
|
||||
name: Interned<String>,
|
||||
src: Interned<PathBuf>,
|
||||
version: RustbookVersion,
|
||||
}
|
||||
|
||||
impl Step for RustbookSrc {
|
||||
|
|
@ -220,18 +185,11 @@ impl Step for RustbookSrc {
|
|||
builder.info(&format!("Rustbook ({}) - {}", target, name));
let _ = fs::remove_dir_all(&out);

let vers = match self.version {
RustbookVersion::MdBook1 => "1",
RustbookVersion::MdBook2 => "2",
};

builder.run(rustbook_cmd
.arg("build")
.arg(&src)
.arg("-d")
.arg(out)
.arg("-m")
.arg(vers));
.arg(out));
}
}

@@ -274,33 +232,33 @@ impl Step for TheBook {
let name = self.name;

// build book
builder.ensure(Rustbook {
builder.ensure(RustbookSrc {
target,
name: INTERNER.intern_string(name.to_string()),
version: RustbookVersion::MdBook2,
src: doc_src(builder),
});

// building older edition redirects

let source_name = format!("{}/first-edition", name);
builder.ensure(Rustbook {
builder.ensure(RustbookSrc {
target,
name: INTERNER.intern_string(source_name),
version: RustbookVersion::MdBook2,
src: doc_src(builder),
});

let source_name = format!("{}/second-edition", name);
builder.ensure(Rustbook {
builder.ensure(RustbookSrc {
target,
name: INTERNER.intern_string(source_name),
version: RustbookVersion::MdBook2,
src: doc_src(builder),
});

let source_name = format!("{}/2018-edition", name);
builder.ensure(Rustbook {
builder.ensure(RustbookSrc {
target,
name: INTERNER.intern_string(source_name),
version: RustbookVersion::MdBook2,
src: doc_src(builder),
});

// build the version info page and CSS

@@ -475,12 +433,7 @@ impl Step for Std {
builder.info(&format!("Documenting stage{} std ({})", stage, target));
let out = builder.doc_out(target);
t!(fs::create_dir_all(&out));
let compiler = builder.compiler(stage, builder.config.build);
let compiler = if builder.force_use_stage1(compiler, target) {
builder.compiler(1, compiler.host)
} else {
compiler
};
let compiler = builder.compiler_for(stage, builder.config.build, target);

builder.ensure(compile::Std { compiler, target });
let out_dir = builder.stage_out(compiler, Mode::Std)

@@ -563,12 +516,7 @@ impl Step for Test {
builder.info(&format!("Documenting stage{} test ({})", stage, target));
let out = builder.doc_out(target);
t!(fs::create_dir_all(&out));
let compiler = builder.compiler(stage, builder.config.build);
let compiler = if builder.force_use_stage1(compiler, target) {
builder.compiler(1, compiler.host)
} else {
compiler
};
let compiler = builder.compiler_for(stage, builder.config.build, target);

// Build libstd docs so that we generate relative links
builder.ensure(Std { stage, target });

@@ -632,12 +580,7 @@ impl Step for WhitelistedRustc {
builder.info(&format!("Documenting stage{} whitelisted compiler ({})", stage, target));
let out = builder.doc_out(target);
t!(fs::create_dir_all(&out));
let compiler = builder.compiler(stage, builder.config.build);
let compiler = if builder.force_use_stage1(compiler, target) {
builder.compiler(1, compiler.host)
} else {
compiler
};
let compiler = builder.compiler_for(stage, builder.config.build, target);

// Build libstd docs so that we generate relative links
builder.ensure(Std { stage, target });

@@ -706,12 +649,7 @@ impl Step for Rustc {
t!(fs::create_dir_all(&out));

// Get the correct compiler for this stage.
let compiler = builder.compiler(stage, builder.config.build);
let compiler = if builder.force_use_stage1(compiler, target) {
builder.compiler(1, compiler.host)
} else {
compiler
};
let compiler = builder.compiler_for(stage, builder.config.build, target);

if !builder.config.compiler_docs {
builder.info("\tskipping - compiler/librustdoc docs disabled");

@@ -728,7 +666,7 @@ impl Step for Rustc {

// Build cargo command.
let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "doc");
cargo.env("RUSTDOCFLAGS", "--document-private-items");
cargo.env("RUSTDOCFLAGS", "--document-private-items --passes strip-hidden");
compile::rustc_cargo(builder, &mut cargo);

// Only include compiler crates, no dependencies of those, such as `libc`.

@@ -807,12 +745,7 @@ impl Step for Rustdoc {
t!(fs::create_dir_all(&out));

// Get the correct compiler for this stage.
let compiler = builder.compiler(stage, builder.config.build);
let compiler = if builder.force_use_stage1(compiler, target) {
builder.compiler(1, compiler.host)
} else {
compiler
};
let compiler = builder.compiler_for(stage, builder.config.build, target);

if !builder.config.compiler_docs {
builder.info("\tskipping - compiler/librustdoc docs disabled");

@@ -923,11 +856,6 @@ impl Step for UnstableBookGen {
fn run(self, builder: &Builder<'_>) {
let target = self.target;

builder.ensure(compile::Std {
compiler: builder.compiler(builder.top_stage, builder.config.build),
target,
});

builder.info(&format!("Generating unstable book md files ({})", target));
let out = builder.md_doc_out(target).join("unstable-book");
builder.create_dir(&out);

@@ -33,8 +33,11 @@ pub struct Flags {
pub rustc_error_format: Option<String>,
pub dry_run: bool,

// true => deny
pub warnings: Option<bool>,
// This overrides the deny-warnings configuation option,
// which passes -Dwarnings to the compiler invocations.
//
// true => deny, false => allow
pub deny_warnings: Option<bool>,
}

pub enum Subcommand {

@@ -44,6 +47,12 @@ pub enum Subcommand {
Check {
paths: Vec<PathBuf>,
},
Clippy {
paths: Vec<PathBuf>,
},
Fix {
paths: Vec<PathBuf>,
},
Doc {
paths: Vec<PathBuf>,
},

@@ -52,6 +61,7 @@ pub enum Subcommand {
/// Whether to automatically update stderr/stdout files
bless: bool,
compare_mode: Option<String>,
pass: Option<String>,
test_args: Vec<String>,
rustc_args: Vec<String>,
fail_fast: bool,

@@ -90,6 +100,8 @@ Usage: x.py <subcommand> [options] [<paths>...]
Subcommands:
build Compile either the compiler or libraries
check Compile either the compiler or libraries, using cargo check
clippy Run clippy
fix Run cargo fix
test Build and run some test suites
bench Build and run some benchmarks
doc Build documentation

@@ -146,6 +158,8 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
let subcommand = args.iter().find(|&s| {
(s == "build")
|| (s == "check")
|| (s == "clippy")
|| (s == "fix")
|| (s == "test")
|| (s == "bench")
|| (s == "doc")

@@ -189,6 +203,12 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
"mode describing what file the actual ui output will be compared to",
"COMPARE MODE",
);
opts.optopt(
"",
"pass",
"force {check,build,run}-pass tests to this mode.",
"check | build | run"
);
opts.optflag(
"",
"rustfix-coverage",

@@ -281,6 +301,28 @@ Arguments:
the compiler.",
);
}
"clippy" => {
subcommand_help.push_str(
"\n
Arguments:
This subcommand accepts a number of paths to directories to the crates
and/or artifacts to run clippy against. For example:

./x.py clippy src/libcore
./x.py clippy src/libcore src/libproc_macro",
);
}
"fix" => {
subcommand_help.push_str(
"\n
Arguments:
This subcommand accepts a number of paths to directories to the crates
and/or artifacts to run `cargo fix` against. For example:

./x.py fix src/libcore
./x.py fix src/libcore src/libproc_macro",
);
}
"test" => {
subcommand_help.push_str(
"\n

@@ -288,7 +330,7 @@ Arguments:
This subcommand accepts a number of paths to directories to tests that
should be compiled and run. For example:

./x.py test src/test/run-pass
./x.py test src/test/ui
./x.py test src/libstd --test-args hash_map
./x.py test src/libstd --stage 0 --no-doc
./x.py test src/test/ui --bless

@@ -363,10 +405,13 @@ Arguments:
let cmd = match subcommand.as_str() {
"build" => Subcommand::Build { paths },
"check" => Subcommand::Check { paths },
"clippy" => Subcommand::Clippy { paths },
"fix" => Subcommand::Fix { paths },
"test" => Subcommand::Test {
paths,
bless: matches.opt_present("bless"),
compare_mode: matches.opt_str("compare-mode"),
pass: matches.opt_str("pass"),
test_args: matches.opt_strs("test-args"),
rustc_args: matches.opt_strs("rustc-args"),
fail_fast: !matches.opt_present("no-fail-fast"),

@@ -426,7 +471,7 @@ Arguments:
.into_iter()
.map(|p| p.into())
.collect::<Vec<_>>(),
warnings: matches.opt_str("warnings").map(|v| v == "deny"),
deny_warnings: parse_deny_warnings(&matches),
}
}
}

@@ -490,6 +535,15 @@ impl Subcommand {
_ => None,
}
}

pub fn pass(&self) -> Option<&str> {
match *self {
Subcommand::Test {
ref pass, ..
} => pass.as_ref().map(|s| &s[..]),
_ => None,
}
}
}

fn split(s: &[String]) -> Vec<String> {

@@ -498,3 +552,18 @@ fn split(s: &[String]) -> Vec<String> {
.map(|s| s.to_string())
.collect()
}

fn parse_deny_warnings(matches: &getopts::Matches) -> Option<bool> {
match matches.opt_str("warnings").as_ref().map(|v| v.as_str()) {
Some("deny") => Some(true),
Some("allow") => Some(false),
Some(value) => {
eprintln!(
r#"invalid value for --warnings: {:?}, expected "allow" or "deny""#,
value,
);
process::exit(1);
},
None => None,
}
}

@@ -5,12 +5,13 @@
use std::env;
use std::fs;
use std::path::{Path, PathBuf, Component};
use std::path::{Component, Path, PathBuf};
use std::process::Command;

use build_helper::t;

use crate::dist::{self, pkgname, sanitize_sh, tmpdir};
use crate::Compiler;

use crate::builder::{Builder, RunConfig, ShouldRun, Step};
use crate::cache::Interned;

@@ -58,7 +59,7 @@ fn install_sh(
package: &str,
name: &str,
stage: u32,
host: Option<Interned<String>>
host: Option<Interned<String>>,
) {
builder.info(&format!("Install {} stage{} ({:?})", package, stage, host));

@@ -144,9 +145,8 @@ macro_rules! install {
$(
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct $name {
pub stage: u32,
pub compiler: Compiler,
pub target: Interned<String>,
pub host: Interned<String>,
}

impl $name {

@@ -175,9 +175,8 @@ macro_rules! install {

fn make_run(run: RunConfig<'_>) {
run.builder.ensure($name {
stage: run.builder.top_stage,
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
host: run.builder.config.build,
});
}

@@ -190,67 +189,81 @@ macro_rules! install {

install!((self, builder, _config),
Docs, "src/doc", _config.docs, only_hosts: false, {
builder.ensure(dist::Docs { stage: self.stage, host: self.target });
install_docs(builder, self.stage, self.target);
builder.ensure(dist::Docs { host: self.target });
install_docs(builder, self.compiler.stage, self.target);
};
Std, "src/libstd", true, only_hosts: true, {
for target in &builder.targets {
builder.ensure(dist::Std {
compiler: builder.compiler(self.stage, self.host),
compiler: self.compiler,
target: *target
});
install_std(builder, self.stage, *target);
install_std(builder, self.compiler.stage, *target);
}
};
Cargo, "cargo", Self::should_build(_config), only_hosts: true, {
builder.ensure(dist::Cargo { stage: self.stage, target: self.target });
install_cargo(builder, self.stage, self.target);
builder.ensure(dist::Cargo { compiler: self.compiler, target: self.target });
install_cargo(builder, self.compiler.stage, self.target);
};
Rls, "rls", Self::should_build(_config), only_hosts: true, {
if builder.ensure(dist::Rls { stage: self.stage, target: self.target }).is_some() ||
if builder.ensure(dist::Rls { compiler: self.compiler, target: self.target }).is_some() ||
Self::should_install(builder) {
install_rls(builder, self.stage, self.target);
install_rls(builder, self.compiler.stage, self.target);
} else {
builder.info(&format!("skipping Install RLS stage{} ({})", self.stage, self.target));
builder.info(
&format!("skipping Install RLS stage{} ({})", self.compiler.stage, self.target),
);
}
};
Clippy, "clippy", Self::should_build(_config), only_hosts: true, {
if builder.ensure(dist::Clippy { stage: self.stage, target: self.target }).is_some() ||
Self::should_install(builder) {
install_clippy(builder, self.stage, self.target);
if builder.ensure(dist::Clippy {
compiler: self.compiler,
target: self.target,
}).is_some() || Self::should_install(builder) {
install_clippy(builder, self.compiler.stage, self.target);
} else {
builder.info(&format!("skipping Install clippy stage{} ({})", self.stage, self.target));
builder.info(
&format!("skipping Install clippy stage{} ({})", self.compiler.stage, self.target),
);
}
};
Miri, "miri", Self::should_build(_config), only_hosts: true, {
if builder.ensure(dist::Miri { stage: self.stage, target: self.target }).is_some() ||
if builder.ensure(dist::Miri { compiler: self.compiler, target: self.target }).is_some() ||
Self::should_install(builder) {
install_miri(builder, self.stage, self.target);
install_miri(builder, self.compiler.stage, self.target);
} else {
builder.info(&format!("skipping Install miri stage{} ({})", self.stage, self.target));
builder.info(
&format!("skipping Install miri stage{} ({})", self.compiler.stage, self.target),
);
}
};
Rustfmt, "rustfmt", Self::should_build(_config), only_hosts: true, {
if builder.ensure(dist::Rustfmt { stage: self.stage, target: self.target }).is_some() ||
Self::should_install(builder) {
install_rustfmt(builder, self.stage, self.target);
if builder.ensure(dist::Rustfmt {
compiler: self.compiler,
target: self.target
}).is_some() || Self::should_install(builder) {
install_rustfmt(builder, self.compiler.stage, self.target);
} else {
builder.info(
&format!("skipping Install Rustfmt stage{} ({})", self.stage, self.target));
&format!("skipping Install Rustfmt stage{} ({})", self.compiler.stage, self.target),
);
}
};
Analysis, "analysis", Self::should_build(_config), only_hosts: false, {
builder.ensure(dist::Analysis {
compiler: builder.compiler(self.stage, self.host),
// Find the actual compiler (handling the full bootstrap option) which
// produced the save-analysis data because that data isn't copied
// through the sysroot uplifting.
compiler: builder.compiler_for(builder.top_stage, builder.config.build, self.target),
target: self.target
});
install_analysis(builder, self.stage, self.target);
install_analysis(builder, self.compiler.stage, self.target);
};
Rustc, "src/librustc", true, only_hosts: true, {
builder.ensure(dist::Rustc {
compiler: builder.compiler(self.stage, self.target),
compiler: self.compiler,
});
install_rustc(builder, self.stage, self.target);
install_rustc(builder, self.compiler.stage, self.target);
};
);

@@ -266,15 +279,12 @@ impl Step for Src {

fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let config = &run.builder.config;
let cond = config.extended &&
config.tools.as_ref().map_or(true, |t| t.contains("src"));
let cond = config.extended && config.tools.as_ref().map_or(true, |t| t.contains("src"));
run.path("src").default_condition(cond)
}

fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Src {
stage: run.builder.top_stage,
});
run.builder.ensure(Src { stage: run.builder.top_stage });
}

fn run(self, builder: &Builder<'_>) {

@@ -32,6 +32,7 @@
use std::env;
use std::io;
use std::mem;
use std::ptr;
use crate::Build;

type HANDLE = *mut u8;

@@ -118,8 +119,8 @@ pub unsafe fn setup(build: &mut Build) {
SetErrorMode(mode & !SEM_NOGPFAULTERRORBOX);

// Create a new job object for us to use
let job = CreateJobObjectW(0 as *mut _, 0 as *const _);
assert!(job != 0 as *mut _, "{}", io::Error::last_os_error());
let job = CreateJobObjectW(ptr::null_mut(), ptr::null());
assert!(!job.is_null(), "{}", io::Error::last_os_error());

// Indicate that when all handles to the job object are gone that all
// process in the object should be killed. Note that this includes our

@@ -166,8 +167,8 @@ pub unsafe fn setup(build: &mut Build) {
};

let parent = OpenProcess(PROCESS_DUP_HANDLE, FALSE, pid.parse().unwrap());
assert!(parent != 0 as *mut _, "{}", io::Error::last_os_error());
let mut parent_handle = 0 as *mut _;
assert!(!parent.is_null(), "{}", io::Error::last_os_error());
let mut parent_handle = ptr::null_mut();
let r = DuplicateHandle(GetCurrentProcess(), job,
parent, &mut parent_handle,
0, FALSE, DUPLICATE_SAME_ACCESS);

@@ -103,8 +103,9 @@
//! More documentation can be found in each respective module below, and you can
//! also check out the `src/bootstrap/README.md` file for more information.

#![deny(rust_2018_idioms)]
#![deny(warnings)]
// NO-RUSTC-WRAPPER
#![deny(warnings, rust_2018_idioms, unused_lifetimes)]

#![feature(core_intrinsics)]
#![feature(drain_filter)]

@@ -124,11 +125,11 @@ use std::os::unix::fs::symlink as symlink_file;
use std::os::windows::fs::symlink_file;

use build_helper::{
mtime, output, run_silent, run_suppressed, t, try_run_silent, try_run_suppressed,
mtime, output, run, run_suppressed, t, try_run, try_run_suppressed,
};
use filetime::FileTime;

use crate::util::{exe, libdir, OutputFolder, CiEnv};
use crate::util::{exe, libdir, CiEnv};

mod cc_detect;
mod channel;

@@ -197,11 +198,11 @@ pub struct Compiler {

#[derive(PartialEq, Eq, Copy, Clone, Debug)]
pub enum DocTests {
// Default, run normal tests and doc tests.
/// Run normal tests and doc tests (default).
Yes,
// Do not run any doc tests.
/// Do not run any doc tests.
No,
// Only run doc tests.
/// Only run doc tests.
Only,
}

@@ -221,10 +222,10 @@ pub enum GitRepo {
/// methods specifically on this structure itself (to make it easier to
/// organize).
pub struct Build {
// User-specified configuration via config.toml
/// User-specified configuration from `config.toml`.
config: Config,

// Derived properties from the above two configurations
// Properties derived from the above configuration
src: PathBuf,
out: PathBuf,
rust_info: channel::GitInfo,

@@ -240,12 +241,12 @@ pub struct Build {
doc_tests: DocTests,
verbosity: usize,

// Targets for which to build.
// Targets for which to build
build: Interned<String>,
hosts: Vec<Interned<String>>,
targets: Vec<Interned<String>>,

// Stage 0 (downloaded) compiler and cargo or their local rust equivalents.
// Stage 0 (downloaded) compiler and cargo or their local rust equivalents
initial_rustc: PathBuf,
initial_cargo: PathBuf,

@@ -255,7 +256,7 @@ pub struct Build {
cxx: HashMap<Interned<String>, cc::Tool>,
ar: HashMap<Interned<String>, PathBuf>,
ranlib: HashMap<Interned<String>, PathBuf>,
// Misc
// Miscellaneous
crates: HashMap<Interned<String>, Crate>,
is_sudo: bool,
ci_env: CiEnv,

@@ -270,14 +271,9 @@ pub struct Build {
#[derive(Debug)]
struct Crate {
name: Interned<String>,
version: String,
deps: HashSet<Interned<String>>,
id: String,
path: PathBuf,
doc_step: String,
build_step: String,
test_step: String,
bench_step: String,
}

impl Crate {

@@ -544,9 +540,7 @@ impl Build {
Mode::Rustc => "-rustc",
Mode::Codegen => "-codegen",
Mode::ToolBootstrap => "-bootstrap-tools",
Mode::ToolStd => "-tools",
Mode::ToolTest => "-tools",
Mode::ToolRustc => "-tools",
Mode::ToolStd | Mode::ToolTest | Mode::ToolRustc => "-tools",
};
self.out.join(&*compiler.host)
.join(format!("stage{}{}", compiler.stage, suffix))

@@ -686,7 +680,7 @@ impl Build {
fn run(&self, cmd: &mut Command) {
if self.config.dry_run { return; }
self.verbose(&format!("running: {:?}", cmd));
run_silent(cmd)
run(cmd)
}

/// Runs a command, printing out nice contextual information if it fails.

@@ -702,7 +696,7 @@ impl Build {
fn try_run(&self, cmd: &mut Command) -> bool {
if self.config.dry_run { return true; }
self.verbose(&format!("running: {:?}", cmd));
try_run_silent(cmd)
try_run(cmd)
}

/// Runs a command, printing out nice contextual information if it fails.

@@ -1097,19 +1091,6 @@ impl Build {
}
}

/// Fold the output of the commands after this method into a group. The fold
/// ends when the returned object is dropped. Folding can only be used in
/// the Travis CI environment.
pub fn fold_output<D, F>(&self, name: F) -> Option<OutputFolder>
where D: Into<String>, F: FnOnce() -> D
{
if !self.config.dry_run && self.ci_env == CiEnv::Travis {
Some(OutputFolder::new(name().into()))
} else {
None
}
}

/// Updates the actual toolstate of a tool.
///
/// The toolstates are saved to the file specified by the key

@@ -1214,8 +1195,7 @@ impl Build {
/// when this function is called.
pub fn cp_r(&self, src: &Path, dst: &Path) {
if self.config.dry_run { return; }
for f in t!(fs::read_dir(src)) {
let f = t!(f);
for f in self.read_dir(src) {
let path = f.path();
let name = path.file_name().unwrap();
let dst = dst.join(name);

@@ -1331,7 +1311,7 @@ fn chmod(path: &Path, perms: u32) {
fn chmod(_path: &Path, _perms: u32) {}


impl<'a> Compiler {
impl Compiler {
pub fn with_stage(mut self, stage: u32) -> Compiler {
self.stage = stage;
self

@@ -20,7 +20,6 @@ struct Output {
struct Package {
id: String,
name: String,
version: String,
source: Option<String>,
manifest_path: String,
}

@@ -84,12 +83,7 @@ fn build_krate(features: &str, build: &mut Build, resolves: &mut Vec<ResolveNode
let mut path = PathBuf::from(package.manifest_path);
path.pop();
build.crates.insert(name, Crate {
build_step: format!("build-crate-{}", name),
doc_step: format!("doc-crate-{}", name),
test_step: format!("test-crate-{}", name),
bench_step: format!("bench-crate-{}", name),
name,
version: package.version,
id: package.id,
deps: HashSet::new(),
path,

@@ -48,10 +48,8 @@ check:
$(Q)$(BOOTSTRAP) test $(BOOTSTRAP_ARGS)
check-aux:
$(Q)$(BOOTSTRAP) test \
src/test/run-pass/pretty \
src/test/run-fail/pretty \
src/test/run-pass-valgrind/pretty \
src/test/run-pass-fulldeps/pretty \
$(AUX_ARGS) \
$(BOOTSTRAP_ARGS)
check-bootstrap:

@@ -75,15 +73,22 @@ check-stage2-T-x86_64-unknown-linux-musl-H-x86_64-unknown-linux-gnu:

TESTS_IN_2 := \
src/test/ui \
src/test/run-pass \
src/test/compile-fail \
src/test/run-pass-fulldeps \
src/tools/linkchecker

appveyor-subset-1:
ci-subset-1:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_2:%=--exclude %)
appveyor-subset-2:
ci-subset-2:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_2)

TESTS_IN_MINGW_2 := \
src/test/ui \
src/test/compile-fail

ci-mingw-subset-1:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_MINGW_2:%=--exclude %)
ci-mingw-subset-2:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_MINGW_2)


.PHONY: dist

@@ -104,7 +104,6 @@ impl Step for Llvm {
}
}

let _folder = builder.fold_output(|| "llvm");
let descriptor = if emscripten { "Emscripten " } else { "" };
builder.info(&format!("Building {}LLVM for {}", descriptor, target));
let _time = util::timeit(&builder);

@@ -126,14 +125,18 @@ impl Step for Llvm {
} else {
match builder.config.llvm_targets {
Some(ref s) => s,
None => "X86;ARM;AArch64;Mips;PowerPC;SystemZ;MSP430;Sparc;NVPTX;Hexagon",
None => "AArch64;ARM;Hexagon;MSP430;Mips;NVPTX;PowerPC;RISCV;\
Sparc;SystemZ;WebAssembly;X86",
}
};

let llvm_exp_targets = if self.emscripten {
""
} else {
&builder.config.llvm_experimental_targets[..]
match builder.config.llvm_experimental_targets {
Some(ref s) => s,
None => "",
}
};

let assertions = if builder.config.llvm_assertions {"ON"} else {"OFF"};

@@ -151,6 +154,7 @@ impl Step for Llvm {
.define("WITH_POLLY", "OFF")
.define("LLVM_ENABLE_TERMINFO", "OFF")
.define("LLVM_ENABLE_LIBEDIT", "OFF")
.define("LLVM_ENABLE_Z3_SOLVER", "OFF")
.define("LLVM_PARALLEL_COMPILE_JOBS", builder.jobs().to_string())
.define("LLVM_TARGET_ARCH", target.split('-').next().unwrap())
.define("LLVM_DEFAULT_TARGET_TRIPLE", target);

@@ -203,8 +207,16 @@ impl Step for Llvm {
cfg.define("LLVM_BUILD_32_BITS", "ON");
}

let mut enabled_llvm_projects = Vec::new();

if util::forcing_clang_based_tests() {
enabled_llvm_projects.push("clang");
enabled_llvm_projects.push("compiler-rt");
}

if want_lldb {
cfg.define("LLVM_ENABLE_PROJECTS", "clang;lldb");
enabled_llvm_projects.push("clang");
enabled_llvm_projects.push("lldb");
// For the time being, disable code signing.
cfg.define("LLDB_CODESIGN_IDENTITY", "");
cfg.define("LLDB_NO_DEBUGSERVER", "ON");

@@ -214,6 +226,12 @@ impl Step for Llvm {
cfg.define("LLVM_ENABLE_LIBXML2", "OFF");
}

if enabled_llvm_projects.len() > 0 {
enabled_llvm_projects.sort();
enabled_llvm_projects.dedup();
cfg.define("LLVM_ENABLE_PROJECTS", enabled_llvm_projects.join(";"));
}

if let Some(num_linkers) = builder.config.llvm_link_jobs {
if num_linkers > 0 {
cfg.define("LLVM_PARALLEL_LINK_JOBS", num_linkers.to_string());

@@ -402,7 +420,7 @@ fn configure_cmake(builder: &Builder<'_>,

cfg.build_arg("-j").build_arg(builder.jobs().to_string());
let mut cflags = builder.cflags(target, GitRepo::Llvm).join(" ");
if let Some(ref s) = builder.config.llvm_cxxflags {
if let Some(ref s) = builder.config.llvm_cflags {
cflags.push_str(&format!(" {}", s));
}
cfg.define("CMAKE_C_FLAGS", cflags);

@@ -479,7 +497,6 @@ impl Step for Lld {
return out_dir
|
||||
}
|
||||
|
||||
let _folder = builder.fold_output(|| "lld");
|
||||
builder.info(&format!("Building LLD for {}", target));
|
||||
let _time = util::timeit(&builder);
|
||||
t!(fs::create_dir_all(&out_dir));
|
||||
|
|
@ -534,7 +551,7 @@ impl Step for TestHelpers {
|
|||
}
|
||||
|
||||
/// Compiles the `rust_test_helpers.c` library which we used in various
|
||||
/// `run-pass` test suites for ABI testing.
|
||||
/// `run-pass` tests for ABI testing.
|
||||
fn run(self, builder: &Builder<'_>) {
|
||||
if builder.config.dry_run {
|
||||
return;
|
||||
|
|
@ -546,7 +563,6 @@ impl Step for TestHelpers {
|
|||
return
|
||||
}
|
||||
|
||||
let _folder = builder.fold_output(|| "build_test_helpers");
|
||||
builder.info("Building test helpers");
|
||||
t!(fs::create_dir_all(&dst));
|
||||
let mut cfg = cc::Build::new();
|
||||
|
|
|
|||
|
|
@@ -78,8 +78,11 @@ pub fn check(build: &mut Build) {

     // We need cmake, but only if we're actually building LLVM or sanitizers.
     let building_llvm = build.hosts.iter()
-        .filter_map(|host| build.config.target_config.get(host))
-        .any(|config| config.llvm_config.is_none());
+        .map(|host| build.config.target_config
+            .get(host)
+            .map(|config| config.llvm_config.is_none())
+            .unwrap_or(true))
+        .any(|build_llvm_ourselves| build_llvm_ourselves);
     if building_llvm || build.config.sanitizers {
         cmd_finder.must_have("cmake");
     }

@@ -106,6 +109,14 @@ pub fn check(build: &mut Build) {
             build.config.ninja = true;
         }
     }

+    if build.config.lldb_enabled {
+        cmd_finder.must_have("swig");
+        let out = output(Command::new("swig").arg("-version"));
+        if !out.contains("SWIG Version 3") && !out.contains("SWIG Version 4") {
+            panic!("Ensure that Swig 3.x.x or 4.x.x is installed.");
+        }
+    }
+
     build.config.python = build.config.python.take().map(|p| cmd_finder.must_have(p))

@@ -191,10 +202,6 @@ pub fn check(build: &mut Build) {
                 panic!("couldn't find libc.a in musl dir: {}",
                        root.join("lib").display());
             }
-            if fs::metadata(root.join("lib/libunwind.a")).is_err() {
-                panic!("couldn't find libunwind.a in musl dir: {}",
-                       root.join("lib").display());
-            }
         }
         None => {
             panic!("when targeting MUSL either the rust.musl-root \
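The `building_llvm` change above replaces a `filter_map` (which silently skipped hosts that had no `[target]` entry at all) with a `map`/`unwrap_or(true)` chain that treats a missing entry as "we build LLVM ourselves". A minimal standalone sketch of that logic, with a plain `HashMap` standing in for the bootstrap's `target_config` (the real types are bootstrap-specific; the names here just mirror the fields):

```rust
use std::collections::HashMap;

// `target_config` maps a host triple to its configured `llvm_config` path,
// if any. A host with no entry, or an entry without `llvm_config`, means we
// must build LLVM ourselves, so the cmake check has to run.
fn building_llvm(hosts: &[&str], target_config: &HashMap<&str, Option<&str>>) -> bool {
    hosts.iter()
        .map(|host| target_config
            .get(host)
            .map(|llvm_config| llvm_config.is_none())
            .unwrap_or(true)) // no [target] entry at all: build LLVM ourselves
        .any(|build_llvm_ourselves| build_llvm_ourselves)
}

fn main() {
    let mut cfg = HashMap::new();
    cfg.insert("x86_64-unknown-linux-gnu", Some("/usr/bin/llvm-config"));
    // Configured host pointing at an external llvm-config: no LLVM build.
    assert!(!building_llvm(&["x86_64-unknown-linux-gnu"], &cfg));
    // Host absent from target_config: the old filter_map skipped it, the
    // fixed version correctly reports that LLVM must be built.
    assert!(building_llvm(&["aarch64-unknown-linux-gnu"], &cfg));
}
```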
@@ -229,6 +229,9 @@ impl Step for Cargo {
         cargo.env("CFG_DISABLE_CROSS_TESTS", "1");
         // Disable a test that has issues with mingw.
         cargo.env("CARGO_TEST_DISABLE_GIT_CLI", "1");
+        // Forcibly disable tests using nightly features since any changes to
+        // those features won't be able to land.
+        cargo.env("CARGO_TEST_DISABLE_NIGHTLY", "1");

         try_run(
             builder,

@@ -360,11 +363,9 @@ pub struct Miri {
 impl Step for Miri {
     type Output = ();
     const ONLY_HOSTS: bool = true;
-    const DEFAULT: bool = true;

     fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
-        let test_miri = run.builder.config.test_miri;
-        run.path("src/tools/miri").default_condition(test_miri)
+        run.path("src/tools/miri")
     }

     fn make_run(run: RunConfig<'_>) {
|
|||
extra_features: Vec::new(),
|
||||
});
|
||||
if let Some(miri) = miri {
|
||||
let mut cargo = tool::prepare_tool_cargo(builder,
|
||||
compiler,
|
||||
Mode::ToolRustc,
|
||||
host,
|
||||
"test",
|
||||
"src/tools/miri",
|
||||
SourceType::Submodule,
|
||||
&[]);
|
||||
// # Run `cargo miri setup`.
|
||||
// As a side-effect, this will install xargo.
|
||||
let mut cargo = tool::prepare_tool_cargo(
|
||||
builder,
|
||||
compiler,
|
||||
Mode::ToolRustc,
|
||||
host,
|
||||
"run",
|
||||
"src/tools/miri",
|
||||
SourceType::Submodule,
|
||||
&[],
|
||||
);
|
||||
cargo
|
||||
.arg("--bin")
|
||||
.arg("cargo-miri")
|
||||
.arg("--")
|
||||
.arg("miri")
|
||||
.arg("setup");
|
||||
|
||||
// Tell `cargo miri` not to worry about the sysroot mismatch (we built with
|
||||
// stage1 but run with stage2).
|
||||
cargo.env("MIRI_SKIP_SYSROOT_CHECK", "1");
|
||||
// Tell `cargo miri setup` where to find the sources.
|
||||
cargo.env("XARGO_RUST_SRC", builder.src.join("src"));
|
||||
// Debug things.
|
||||
cargo.env("RUST_BACKTRACE", "1");
|
||||
// Configure `cargo install` path, and let cargo-miri know that that's where
|
||||
// xargo ends up.
|
||||
cargo.env("CARGO_INSTALL_ROOT", &builder.out); // cargo adds a `bin/`
|
||||
cargo.env("XARGO", builder.out.join("bin").join("xargo"));
|
||||
|
||||
if !try_run(builder, &mut cargo) {
|
||||
return;
|
||||
}
|
||||
|
||||
// # Determine where Miri put its sysroot.
|
||||
// To this end, we run `cargo miri setup --env` and capture the output.
|
||||
// (We do this separately from the above so that when the setup actually
|
||||
// happens we get some output.)
|
||||
// We re-use the `cargo` from above.
|
||||
cargo.arg("--env");
|
||||
|
||||
// FIXME: Is there a way in which we can re-use the usual `run` helpers?
|
||||
let miri_sysroot = if builder.config.dry_run {
|
||||
String::new()
|
||||
} else {
|
||||
builder.verbose(&format!("running: {:?}", cargo));
|
||||
let out = cargo.output()
|
||||
.expect("We already ran `cargo miri setup` before and that worked");
|
||||
assert!(out.status.success(), "`cargo miri setup` returned with non-0 exit code");
|
||||
// Output is "MIRI_SYSROOT=<str>\n".
|
||||
let stdout = String::from_utf8(out.stdout)
|
||||
.expect("`cargo miri setup` stdout is not valid UTF-8");
|
||||
let stdout = stdout.trim();
|
||||
builder.verbose(&format!("`cargo miri setup --env` returned: {:?}", stdout));
|
||||
let sysroot = stdout.splitn(2, '=')
|
||||
.nth(1).expect("`cargo miri setup` stdout did not contain '='");
|
||||
sysroot.to_owned()
|
||||
};
|
||||
|
||||
// # Run `cargo test`.
|
||||
let mut cargo = tool::prepare_tool_cargo(
|
||||
builder,
|
||||
compiler,
|
||||
Mode::ToolRustc,
|
||||
host,
|
||||
"test",
|
||||
"src/tools/miri",
|
||||
SourceType::Submodule,
|
||||
&[],
|
||||
);
|
||||
|
||||
// miri tests need to know about the stage sysroot
|
||||
cargo.env("MIRI_SYSROOT", builder.sysroot(compiler));
|
||||
cargo.env("MIRI_SYSROOT", miri_sysroot);
|
||||
cargo.env("RUSTC_TEST_SUITE", builder.rustc(compiler));
|
||||
cargo.env("RUSTC_LIB_PATH", builder.rustc_libdir(compiler));
|
||||
cargo.env("MIRI_PATH", miri);
|
||||
|
||||
builder.add_rustc_lib_path(compiler, &mut cargo);
|
||||
|
||||
if try_run(builder, &mut cargo) {
|
||||
builder.save_toolstate("miri", ToolState::TestPass);
|
||||
if !try_run(builder, &mut cargo) {
|
||||
return;
|
||||
}
|
||||
|
||||
// # Done!
|
||||
builder.save_toolstate("miri", ToolState::TestPass);
|
||||
} else {
|
||||
eprintln!("failed to test miri: could not build");
|
||||
}
|
||||
|
|
@ -683,7 +750,7 @@ impl Step for RustdocUi {
|
|||
target: self.target,
|
||||
mode: "ui",
|
||||
suite: "rustdoc-ui",
|
||||
path: None,
|
||||
path: Some("src/test/rustdoc-ui"),
|
||||
compare_mode: None,
|
||||
})
|
||||
}
|
||||
|
|
@ -709,11 +776,10 @@ impl Step for Tidy {
|
|||
if !builder.config.vendor {
|
||||
cmd.arg("--no-vendor");
|
||||
}
|
||||
if !builder.config.verbose_tests {
|
||||
cmd.arg("--quiet");
|
||||
if builder.is_verbose() {
|
||||
cmd.arg("--verbose");
|
||||
}
|
||||
|
||||
let _folder = builder.fold_output(|| "tidy");
|
||||
builder.info("tidy check");
|
||||
try_run(builder, &mut cmd);
|
||||
}
|
||||
|
|
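The `cargo miri setup --env` output handling above expects a single `MIRI_SYSROOT=<str>` line on stdout and splits on the first `=` only, so an `=` inside the path survives. A minimal sketch of just that parsing step (plain std only, no bootstrap types; the build code `expect`s where this returns `None`):

```rust
// Parse "MIRI_SYSROOT=<path>\n" into the path. `splitn(2, '=')` splits at
// the first '=' at most, keeping any '=' inside the path itself intact.
fn parse_sysroot(stdout: &str) -> Option<String> {
    let stdout = stdout.trim();
    stdout.splitn(2, '=').nth(1).map(|s| s.to_owned())
}

fn main() {
    assert_eq!(
        parse_sysroot("MIRI_SYSROOT=/home/user/.cache/miri/HOST\n"),
        Some("/home/user/.cache/miri/HOST".to_owned())
    );
    // An '=' in the path survives because only the first '=' splits.
    assert_eq!(parse_sysroot("MIRI_SYSROOT=/tmp/a=b"), Some("/tmp/a=b".to_owned()));
    // No '=' at all yields None (the build code panics via `expect` instead).
    assert_eq!(parse_sysroot("garbage"), None);
}
```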
@@ -821,13 +887,6 @@ default_test_with_compare_mode!(Ui {
     compare_mode: "nll"
 });

-default_test_with_compare_mode!(RunPass {
-    path: "src/test/run-pass",
-    mode: "run-pass",
-    suite: "run-pass",
-    compare_mode: "nll"
-});
-
 default_test!(CompileFail {
     path: "src/test/compile-fail",
     mode: "compile-fail",

@@ -882,12 +941,6 @@ host_test!(UiFullDeps {
     suite: "ui-fulldeps"
 });

-host_test!(RunPassFullDeps {
-    path: "src/test/run-pass-fulldeps",
-    mode: "run-pass",
-    suite: "run-pass-fulldeps"
-});
-
 host_test!(Rustdoc {
     path: "src/test/rustdoc",
     mode: "rustdoc",

@@ -899,13 +952,6 @@ host_test!(Pretty {
     mode: "pretty",
     suite: "pretty"
 });
-test!(RunPassPretty {
-    path: "src/test/run-pass/pretty",
-    mode: "pretty",
-    suite: "run-pass",
-    default: false,
-    host: true
-});
 test!(RunFailPretty {
     path: "src/test/run-fail/pretty",
     mode: "pretty",

@@ -976,14 +1022,10 @@ impl Step for Compiletest {
         }

-        if suite == "debuginfo" {
-            // Skip debuginfo tests on MSVC
-            if builder.config.build.contains("msvc") {
-                return;
-            }
-
-            if mode == "debuginfo" {
-                return builder.ensure(Compiletest {
-                    mode: "debuginfo-both",
+        let msvc = builder.config.build.contains("msvc");
+        if mode == "debuginfo" {
+            return builder.ensure(Compiletest {
+                mode: if msvc { "debuginfo-cdb" } else { "debuginfo-gdb+lldb" },
                 ..self
             });
         }
@@ -1069,6 +1111,11 @@ impl Step for Compiletest {
             }
         });

+        if let Some(ref pass) = builder.config.cmd.pass() {
+            cmd.arg("--pass");
+            cmd.arg(pass);
+        }
+
         if let Some(ref nodejs) = builder.config.nodejs {
             cmd.arg("--nodejs").arg(nodejs);
         }

@@ -1082,10 +1129,8 @@ impl Step for Compiletest {
         if builder.config.rust_optimize_tests {
             flags.push("-O".to_string());
         }
-        if builder.config.rust_debuginfo_tests {
-            flags.push("-g".to_string());
-        }
         }
+        flags.push(format!("-Cdebuginfo={}", builder.config.rust_debuginfo_level_tests));
         flags.push("-Zunstable-options".to_string());
         flags.push(builder.config.cmd.rustc_args().join(" "));

@@ -1149,24 +1194,9 @@ impl Step for Compiletest {
             }
         }

-        if let Some(var) = env::var_os("RUSTBUILD_FORCE_CLANG_BASED_TESTS") {
-            match &var.to_string_lossy().to_lowercase()[..] {
-                "1" | "yes" | "on" => {
-                    assert!(builder.config.lldb_enabled,
-                            "RUSTBUILD_FORCE_CLANG_BASED_TESTS needs Clang/LLDB to \
-                             be built.");
-                    let clang_exe = builder.llvm_out(target).join("bin").join("clang");
-                    cmd.arg("--run-clang-based-tests-with").arg(clang_exe);
-                }
-                "0" | "no" | "off" => {
-                    // Nothing to do.
-                }
-                other => {
-                    // Let's make sure typos don't get unnoticed
-                    panic!("Unrecognized option '{}' set in \
-                            RUSTBUILD_FORCE_CLANG_BASED_TESTS", other);
-                }
-            }
+        if util::forcing_clang_based_tests() {
+            let clang_exe = builder.llvm_out(target).join("bin").join("clang");
+            cmd.arg("--run-clang-based-tests-with").arg(clang_exe);
         }

         // Get paths from cmd args
@@ -1326,7 +1356,6 @@ impl Step for Compiletest {

         builder.ci_env.force_coloring_in_ci(&mut cmd);

-        let _folder = builder.fold_output(|| format!("test_{}", suite));
         builder.info(&format!(
             "Check compiletest suite={} mode={} ({} -> {})",
             suite, mode, &compiler.host, target

@@ -1336,7 +1365,6 @@ impl Step for Compiletest {

         if let Some(compare_mode) = compare_mode {
             cmd.arg("--compare-mode").arg(compare_mode);
-            let _folder = builder.fold_output(|| format!("test_{}_{}", suite, compare_mode));
             builder.info(&format!(
                 "Check compiletest suite={} mode={} compare_mode={} ({} -> {})",
                 suite, mode, compare_mode, &compiler.host, target

@@ -1380,7 +1408,6 @@ impl Step for DocTest {
         // tests for all files that end in `*.md`
         let mut stack = vec![builder.src.join(self.path)];
         let _time = util::timeit(&builder);
-        let _folder = builder.fold_output(|| format!("test_{}", self.name));

         let mut files = Vec::new();
         while let Some(p) = stack.pop() {

@@ -1511,10 +1538,9 @@ impl Step for ErrorIndex {
             .env("CFG_BUILD", &builder.config.build)
             .env("RUSTC_ERROR_METADATA_DST", builder.extended_error_dir());

-        let _folder = builder.fold_output(|| "test_error_index");
         builder.info(&format!("Testing error-index stage{}", compiler.stage));
         let _time = util::timeit(&builder);
-        builder.run(&mut tool);
+        builder.run_quiet(&mut tool);
         markdown_test(builder, compiler, &output);
     }
 }
@@ -1546,6 +1572,34 @@ fn markdown_test(builder: &Builder<'_>, compiler: Compiler, markdown: &Path) ->
     }
 }

+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct RustcGuide;
+
+impl Step for RustcGuide {
+    type Output = ();
+    const DEFAULT: bool = false;
+    const ONLY_HOSTS: bool = true;
+
+    fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
+        run.path("src/doc/rustc-guide")
+    }
+
+    fn make_run(run: RunConfig<'_>) {
+        run.builder.ensure(RustcGuide);
+    }
+
+    fn run(self, builder: &Builder<'_>) {
+        let src = builder.src.join("src/doc/rustc-guide");
+        let mut rustbook_cmd = builder.tool_cmd(Tool::Rustbook);
+        let toolstate = if try_run(builder, rustbook_cmd.arg("linkcheck").arg(&src)) {
+            ToolState::TestPass
+        } else {
+            ToolState::TestFail
+        };
+        builder.save_toolstate("rustc-guide", toolstate);
+    }
+}
+
 #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
 pub struct CrateLibrustc {
     compiler: Compiler,

@@ -1711,15 +1765,11 @@ impl Step for Crate {
         builder.ensure(compile::Test { compiler, target });
         builder.ensure(RemoteCopyLibs { compiler, target });

-        // If we're not doing a full bootstrap but we're testing a stage2 version of
-        // libstd, then what we're actually testing is the libstd produced in
-        // stage1. Reflect that here by updating the compiler that we're working
-        // with automatically.
-        let compiler = if builder.force_use_stage1(compiler, target) {
-            builder.compiler(1, compiler.host)
-        } else {
-            compiler.clone()
-        };
+        // If we're not doing a full bootstrap but we're testing a stage2
+        // version of libstd, then what we're actually testing is the libstd
+        // produced in stage1. Reflect that here by updating the compiler that
+        // we're working with automatically.
+        let compiler = builder.compiler_for(compiler.stage, compiler.host, target);

         let mut cargo = builder.cargo(compiler, mode, target, test_kind.subcommand());
         match mode {

@@ -1814,14 +1864,6 @@ impl Step for Crate {
             );
         }

-        let _folder = builder.fold_output(|| {
-            format!(
-                "{}_stage{}-{}",
-                test_kind.subcommand(),
-                compiler.stage,
-                krate
-            )
-        });
         builder.info(&format!(
             "{} {} stage{} ({} -> {})",
             test_kind, krate, compiler.stage, &compiler.host, target

@@ -1889,8 +1931,6 @@ impl Step for CrateRustdoc {
             cargo.arg("--quiet");
         }

-        let _folder = builder
-            .fold_output(|| format!("{}_stage{}-rustdoc", test_kind.subcommand(), compiler.stage));
         builder.info(&format!(
             "{} rustdoc stage{} ({} -> {})",
             test_kind, compiler.stage, &compiler.host, target
@@ -9,7 +9,7 @@ use build_helper::t;
 use crate::Mode;
 use crate::Compiler;
 use crate::builder::{Step, RunConfig, ShouldRun, Builder};
-use crate::util::{exe, add_lib_path};
+use crate::util::{exe, add_lib_path, CiEnv};
 use crate::compile;
 use crate::channel::GitInfo;
 use crate::channel;

@@ -74,16 +74,16 @@ impl Step for ToolBuild {
             &self.extra_features,
         );

-        let _folder = builder.fold_output(|| format!("stage{}-{}", compiler.stage, tool));
         builder.info(&format!("Building stage{} tool {} ({})", compiler.stage, tool, target));
         let mut duplicates = Vec::new();
-        let is_expected = compile::stream_cargo(builder, &mut cargo, &mut |msg| {
+        let is_expected = compile::stream_cargo(builder, &mut cargo, vec![], &mut |msg| {
             // Only care about big things like the RLS/Cargo for now
             match tool {
                 | "rls"
                 | "cargo"
                 | "clippy-driver"
                 | "miri"
                 | "rustfmt"
                 => {}

                 _ => return,
@@ -108,36 +108,63 @@ impl Step for ToolBuild {
                     continue
                 }

-                // Don't worry about libs that turn out to be host dependencies
-                // or build scripts, we only care about target dependencies that
-                // are in `deps`.
-                if let Some(maybe_target) = val.1
-                    .parent() // chop off file name
-                    .and_then(|p| p.parent()) // chop off `deps`
-                    .and_then(|p| p.parent()) // chop off `release`
-                    .and_then(|p| p.file_name())
-                    .and_then(|p| p.to_str())
-                {
-                    if maybe_target != &*target {
-                        continue
+                // Don't worry about compiles that turn out to be host
+                // dependencies or build scripts. To skip these we look for
+                // anything that goes in `.../release/deps` but *doesn't* go in
+                // `$target/release/deps`. This ensure that outputs in
+                // `$target/release` are still considered candidates for
+                // deduplication.
+                if let Some(parent) = val.1.parent() {
+                    if parent.ends_with("release/deps") {
+                        let maybe_target = parent
+                            .parent()
+                            .and_then(|p| p.parent())
+                            .and_then(|p| p.file_name())
+                            .and_then(|p| p.to_str())
+                            .unwrap();
+                        if maybe_target != &*target {
+                            continue;
+                        }
                     }
                 }

                 // Record that we've built an artifact for `id`, and if one was
                 // already listed then we need to see if we reused the same
                 // artifact or produced a duplicate.
                 let mut artifacts = builder.tool_artifacts.borrow_mut();
                 let prev_artifacts = artifacts
                     .entry(target)
                     .or_default();
-                if let Some(prev) = prev_artifacts.get(&*id) {
-                    if prev.1 != val.1 {
-                        duplicates.push((
-                            id.to_string(),
-                            val,
-                            prev.clone(),
-                        ));
+                let prev = match prev_artifacts.get(&*id) {
+                    Some(prev) => prev,
+                    None => {
+                        prev_artifacts.insert(id.to_string(), val);
+                        return;
                     }
-                    return
-                }
-                prev_artifacts.insert(id.to_string(), val);
+                };
+                if prev.1 == val.1 {
+                    return; // same path, same artifact
+                }
+
+                // If the paths are different and one of them *isn't* inside of
+                // `release/deps`, then it means it's probably in
+                // `$target/release`, or it's some final artifact like
+                // `libcargo.rlib`. In these situations Cargo probably just
+                // copied it up from `$target/release/deps/libcargo-xxxx.rlib`,
+                // so if the features are equal we can just skip it.
+                let prev_no_hash = prev.1.parent().unwrap().ends_with("release/deps");
+                let val_no_hash = val.1.parent().unwrap().ends_with("release/deps");
+                if prev.2 == val.2 || !prev_no_hash || !val_no_hash {
+                    return;
+                }
+
+                // ... and otherwise this looks like we duplicated some sort of
+                // compilation, so record it to generate an error later.
+                duplicates.push((
+                    id.to_string(),
+                    val,
+                    prev.clone(),
+                ));
             });
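The dedup rewrite above leans on `Path::ends_with("release/deps")`, which matches whole path components, and then walks two parents up to recover the target triple. A standalone sketch of that path test (the function name and the simplified signature are illustrative, not bootstrap API):

```rust
use std::path::Path;

// Returns true when `artifact` sits in `<...>/<target>/release/deps`.
// `Path::ends_with` compares whole components, so a directory literally
// named "release/depsfoo" can never match.
fn in_target_deps(artifact: &Path, target: &str) -> bool {
    match artifact.parent() {
        Some(parent) if parent.ends_with("release/deps") => parent
            .parent()                 // chop off `deps`
            .and_then(|p| p.parent()) // chop off `release`
            .and_then(|p| p.file_name())
            .and_then(|p| p.to_str())
            .map(|t| t == target)
            .unwrap_or(false),
        _ => false,
    }
}

fn main() {
    let lib = Path::new("build/x86_64-unknown-linux-gnu/release/deps/libfoo-abc123.rlib");
    assert!(in_target_deps(lib, "x86_64-unknown-linux-gnu"));
    assert!(!in_target_deps(lib, "aarch64-unknown-linux-gnu"));
    // Host-side outputs (no target-triple component above release/) don't qualify.
    assert!(!in_target_deps(
        Path::new("build/release/deps/libfoo.rlib"),
        "x86_64-unknown-linux-gnu"
    ));
}
```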
@@ -252,11 +279,26 @@ pub fn prepare_tool_cargo(
     cargo
 }

+fn rustbook_features() -> Vec<String> {
+    let mut features = Vec::new();
+
+    // Due to CI budget and risk of spurious failures we want to limit jobs running this check.
+    // At the same time local builds should run it regardless of the platform.
+    // `CiEnv::None` means it's a local build and `CHECK_LINKS` is defined in x86_64-gnu-tools to
+    // explicitly enable it on a single job.
+    if CiEnv::current() == CiEnv::None || env::var("CHECK_LINKS").is_ok() {
+        features.push("linkcheck".to_string());
+    }
+
+    features
+}
+
 macro_rules! bootstrap_tool {
     ($(
         $name:ident, $path:expr, $tool_name:expr
         $(,llvm_tools = $llvm:expr)*
         $(,is_external_tool = $external:expr)*
+        $(,features = $features:expr)*
         ;
     )+) => {
         #[derive(Copy, PartialEq, Eq, Clone)]

@@ -267,10 +309,6 @@ macro_rules! bootstrap_tool {
     }

     impl Tool {
-        pub fn get_mode(&self) -> Mode {
-            Mode::ToolBootstrap
-        }
-
         /// Whether this tool requires LLVM to run
         pub fn uses_llvm_tools(&self) -> bool {
             match self {

@@ -327,7 +365,12 @@ macro_rules! bootstrap_tool {
                 } else {
                     SourceType::InTree
                 },
-                extra_features: Vec::new(),
+                extra_features: {
+                    // FIXME(#60643): avoid this lint by using `_`
+                    let mut _tmp = Vec::new();
+                    $(_tmp.extend($features);)*
+                    _tmp
+                },
             }).expect("expected to build -- essential tool")
         }
     }

@@ -336,7 +379,7 @@ macro_rules! bootstrap_tool {
 }

 bootstrap_tool!(
-    Rustbook, "src/tools/rustbook", "rustbook";
+    Rustbook, "src/tools/rustbook", "rustbook", features = rustbook_features();
     UnstableBookGen, "src/tools/unstable-book-gen", "unstable-book-gen";
     Tidy, "src/tools/tidy", "tidy";
     Linkchecker, "src/tools/linkchecker", "linkchecker";
@@ -485,11 +528,6 @@ impl Step for Rustdoc {
             &[],
         );

-        // Most tools don't get debuginfo, but rustdoc should.
-        cargo.env("RUSTC_DEBUGINFO", builder.config.rust_debuginfo.to_string())
-             .env("RUSTC_DEBUGINFO_LINES", builder.config.rust_debuginfo_lines.to_string());
-
-        let _folder = builder.fold_output(|| format!("stage{}-rustdoc", target_compiler.stage));
         builder.info(&format!("Building rustdoc for stage{} ({})",
             target_compiler.stage, target_compiler.host));
         builder.run(&mut cargo);

@@ -539,9 +577,9 @@ impl Step for Cargo {
     }

     fn run(self, builder: &Builder<'_>) -> PathBuf {
-        // Cargo depends on procedural macros, which requires a full host
-        // compiler to be available, so we need to depend on that.
-        builder.ensure(compile::Rustc {
+        // Cargo depends on procedural macros, so make sure the host
+        // libstd/libproc_macro is available.
+        builder.ensure(compile::Test {
             compiler: self.compiler,
             target: builder.config.build,
         });

@@ -613,26 +651,26 @@ macro_rules! tool_extended {
 tool_extended!((self, builder),
     Cargofmt, rustfmt, "src/tools/rustfmt", "cargo-fmt", {};
     CargoClippy, clippy, "src/tools/clippy", "cargo-clippy", {
-        // Clippy depends on procedural macros (serde), which requires a full host
-        // compiler to be available, so we need to depend on that.
-        builder.ensure(compile::Rustc {
+        // Clippy depends on procedural macros, so make sure that's built for
+        // the compiler itself.
+        builder.ensure(compile::Test {
             compiler: self.compiler,
             target: builder.config.build,
         });
     };
     Clippy, clippy, "src/tools/clippy", "clippy-driver", {
-        // Clippy depends on procedural macros (serde), which requires a full host
-        // compiler to be available, so we need to depend on that.
-        builder.ensure(compile::Rustc {
+        // Clippy depends on procedural macros, so make sure that's built for
+        // the compiler itself.
+        builder.ensure(compile::Test {
             compiler: self.compiler,
             target: builder.config.build,
         });
     };
     Miri, miri, "src/tools/miri", "miri", {};
     CargoMiri, miri, "src/tools/miri", "cargo-miri", {
-        // Miri depends on procedural macros (serde), which requires a full host
-        // compiler to be available, so we need to depend on that.
-        builder.ensure(compile::Rustc {
+        // Miri depends on procedural macros, so make sure that's built for
+        // the compiler itself.
+        builder.ensure(compile::Test {
             compiler: self.compiler,
             target: builder.config.build,
         });
@@ -646,9 +684,9 @@ tool_extended!((self, builder),
         if clippy.is_some() {
             self.extra_features.push("clippy".to_owned());
         }
-        // RLS depends on procedural macros, which requires a full host
-        // compiler to be available, so we need to depend on that.
-        builder.ensure(compile::Rustc {
+        // RLS depends on procedural macros, so make sure that's built for
+        // the compiler itself.
+        builder.ensure(compile::Test {
             compiler: self.compiler,
             target: builder.config.build,
         });

@@ -662,23 +700,14 @@ impl<'a> Builder<'a> {
     pub fn tool_cmd(&self, tool: Tool) -> Command {
         let mut cmd = Command::new(self.tool_exe(tool));
         let compiler = self.compiler(0, self.config.build);
-        self.prepare_tool_cmd(compiler, tool, &mut cmd);
-        cmd
-    }
-
-    /// Prepares the `cmd` provided to be able to run the `compiler` provided.
-    ///
-    /// Notably this munges the dynamic library lookup path to point to the
-    /// right location to run `compiler`.
-    fn prepare_tool_cmd(&self, compiler: Compiler, tool: Tool, cmd: &mut Command) {
-        let host = &compiler.host;
+        let host = &compiler.host;
+        // Prepares the `cmd` provided to be able to run the `compiler` provided.
+        //
+        // Notably this munges the dynamic library lookup path to point to the
+        // right location to run `compiler`.
         let mut lib_paths: Vec<PathBuf> = vec![
-            if compiler.stage == 0 {
-                self.build.rustc_snapshot_libdir()
-            } else {
-                PathBuf::from(&self.sysroot_libdir(compiler, compiler.host))
-            },
-            self.cargo_out(compiler, tool.get_mode(), *host).join("deps"),
+            self.build.rustc_snapshot_libdir(),
+            self.cargo_out(compiler, Mode::ToolBootstrap, *host).join("deps"),
         ];

@@ -699,6 +728,7 @@ impl<'a> Builder<'a> {
             }
         }

-        add_lib_path(lib_paths, cmd);
+        add_lib_path(lib_paths, &mut cmd);
+        cmd
     }
 }
@@ -6,10 +6,10 @@
 use std::env;
 use std::str;
 use std::fs;
-use std::io::{self, Write};
+use std::io;
 use std::path::{Path, PathBuf};
 use std::process::Command;
-use std::time::{SystemTime, Instant};
+use std::time::Instant;

 use build_helper::t;

@@ -209,7 +209,7 @@ pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
         let h = CreateFileW(path.as_ptr(),
                             GENERIC_WRITE,
                             FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
-                            0 as *mut _,
+                            ptr::null_mut(),
                             OPEN_EXISTING,
                             FILE_FLAG_OPEN_REPARSE_POINT | FILE_FLAG_BACKUP_SEMANTICS,
                             ptr::null_mut());
@@ -254,87 +254,21 @@ pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
     }
 }

-/// An RAII structure that indicates all output until this instance is dropped
-/// is part of the same group.
-///
-/// On Travis CI, these output will be folded by default, together with the
-/// elapsed time in this block. This reduces noise from unnecessary logs,
-/// allowing developers to quickly identify the error.
-///
-/// Travis CI supports folding by printing `travis_fold:start:<name>` and
-/// `travis_fold:end:<name>` around the block. Time elapsed is recognized
-/// similarly with `travis_time:[start|end]:<name>`. These are undocumented, but
-/// can easily be deduced from source code of the [Travis build commands].
-///
-/// [Travis build commands]:
-/// https://github.com/travis-ci/travis-build/blob/f603c0089/lib/travis/build/templates/header.sh
-pub struct OutputFolder {
-    name: String,
-    start_time: SystemTime, // we need SystemTime to get the UNIX timestamp.
-}
-
-impl OutputFolder {
-    /// Creates a new output folder with the given group name.
-    pub fn new(name: String) -> OutputFolder {
-        // "\r" moves the cursor to the beginning of the line, and "\x1b[0K" is
-        // the ANSI escape code to clear from the cursor to end of line.
-        // Travis seems to have trouble when _not_ using "\r\x1b[0K", that will
-        // randomly put lines to the top of the webpage.
-        print!("travis_fold:start:{0}\r\x1b[0Ktravis_time:start:{0}\r\x1b[0K", name);
-        OutputFolder {
-            name,
-            start_time: SystemTime::now(),
-        }
-    }
-}
-
-impl Drop for OutputFolder {
-    fn drop(&mut self) {
-        use std::time::*;
-        use std::u64;
-
-        fn to_nanos(duration: Result<Duration, SystemTimeError>) -> u64 {
-            match duration {
-                Ok(d) => d.as_secs() * 1_000_000_000 + d.subsec_nanos() as u64,
-                Err(_) => u64::MAX,
-            }
-        }
-
-        let end_time = SystemTime::now();
-        let duration = end_time.duration_since(self.start_time);
-        let start = self.start_time.duration_since(UNIX_EPOCH);
-        let finish = end_time.duration_since(UNIX_EPOCH);
-        println!(
-            "travis_fold:end:{0}\r\x1b[0K\n\
-             travis_time:end:{0}:start={1},finish={2},duration={3}\r\x1b[0K",
-            self.name,
-            to_nanos(start),
-            to_nanos(finish),
-            to_nanos(duration)
-        );
-        io::stdout().flush().unwrap();
-    }
-}
-
 /// The CI environment rustbuild is running in. This mainly affects how the logs
 /// are printed.
 #[derive(Copy, Clone, PartialEq, Eq, Debug)]
 pub enum CiEnv {
     /// Not a CI environment.
     None,
-    /// The Travis CI environment, for Linux (including Docker) and macOS builds.
-    Travis,
-    /// The AppVeyor environment, for Windows builds.
-    AppVeyor,
+    /// The Azure Pipelines environment, for Linux (including Docker), Windows, and macOS builds.
+    AzurePipelines,
 }

 impl CiEnv {
     /// Obtains the current CI environment.
     pub fn current() -> CiEnv {
-        if env::var("TRAVIS").ok().map_or(false, |e| &*e == "true") {
-            CiEnv::Travis
-        } else if env::var("APPVEYOR").ok().map_or(false, |e| &*e == "True") {
-            CiEnv::AppVeyor
+        if env::var("TF_BUILD").ok().map_or(false, |e| &*e == "True") {
+            CiEnv::AzurePipelines
         } else {
             CiEnv::None
         }
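The new `CiEnv::current` above keys off Azure Pipelines' `TF_BUILD=True` marker (capital T), where the old Travis check compared against a lowercase `"true"`. A small sketch of just the comparison, factored to take the value as a parameter so it can be exercised without mutating the process environment (the helper name is illustrative):

```rust
// Azure Pipelines sets TF_BUILD to the exact string "True"; anything else
// (including different casing, or the variable being absent) means we are
// not running under that CI.
fn is_azure(tf_build: Option<&str>) -> bool {
    tf_build.map_or(false, |e| e == "True")
}

fn main() {
    assert!(is_azure(Some("True")));
    assert!(!is_azure(Some("true"))); // case-sensitive, unlike Travis's "true"
    assert!(!is_azure(None));
}
```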
@@ -352,3 +286,19 @@ impl CiEnv {
         }
     }
 }

+pub fn forcing_clang_based_tests() -> bool {
+    if let Some(var) = env::var_os("RUSTBUILD_FORCE_CLANG_BASED_TESTS") {
+        match &var.to_string_lossy().to_lowercase()[..] {
+            "1" | "yes" | "on" => true,
+            "0" | "no" | "off" => false,
+            other => {
+                // Let's make sure typos don't go unnoticed
+                panic!("Unrecognized option '{}' set in \
+                        RUSTBUILD_FORCE_CLANG_BASED_TESTS", other)
+            }
+        }
+    } else {
+        false
+    }
+}
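The `forcing_clang_based_tests` helper above accepts a small tri-state vocabulary and deliberately panics on anything else so a typo can't silently disable the feature. A standalone sketch of that parse, taking the value as a parameter instead of reading the environment (the function name here is illustrative):

```rust
// "1"/"yes"/"on" force the clang-based tests, "0"/"no"/"off" (or an unset
// variable, modeled as None) leave them off, and any other spelling panics
// so typos surface immediately.
fn parse_force_flag(var: Option<&str>) -> bool {
    match var {
        None => false,
        Some(v) => match &v.to_lowercase()[..] {
            "1" | "yes" | "on" => true,
            "0" | "no" | "off" => false,
            other => panic!(
                "Unrecognized option '{}' set in RUSTBUILD_FORCE_CLANG_BASED_TESTS",
                other
            ),
        },
    }
}

fn main() {
    assert!(parse_force_flag(Some("YES"))); // lowercased before matching
    assert!(!parse_force_flag(Some("off")));
    assert!(!parse_force_flag(None));
}
```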
@@ -1,4 +1,5 @@
-#![deny(rust_2018_idioms)]
+// NO-RUSTC-WRAPPER
+#![deny(warnings, rust_2018_idioms, unused_lifetimes)]

 use std::fs::File;
 use std::path::{Path, PathBuf};

@@ -44,18 +45,19 @@ pub fn restore_library_path() {
     }
 }

-pub fn run(cmd: &mut Command) {
+/// Run the command, printing what we are running.
+pub fn run_verbose(cmd: &mut Command) {
     println!("running: {:?}", cmd);
-    run_silent(cmd);
+    run(cmd);
 }

-pub fn run_silent(cmd: &mut Command) {
-    if !try_run_silent(cmd) {
+pub fn run(cmd: &mut Command) {
+    if !try_run(cmd) {
         std::process::exit(1);
     }
 }

-pub fn try_run_silent(cmd: &mut Command) -> bool {
+pub fn try_run(cmd: &mut Command) -> bool {
     let status = match cmd.status() {
         Ok(status) => status,
         Err(e) => fail(&format!(
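The rename above makes `try_run` the fallible primitive (report success as a bool) and `run` the exit-on-failure wrapper. A minimal sketch of that split using only std (the real versions also emit a detailed failure message before exiting):

```rust
use std::process::Command;

// Fallible primitive: spawn the command and report whether it succeeded.
fn try_run(cmd: &mut Command) -> bool {
    match cmd.status() {
        Ok(status) => status.success(),
        Err(_) => false, // the real version prints why the spawn failed
    }
}

// Hard wrapper: any failure aborts the whole process, like the build does.
fn run(cmd: &mut Command) {
    if !try_run(cmd) {
        std::process::exit(1);
    }
}

fn main() {
    // `true` and `false` here are the POSIX utilities; their availability is
    // an assumption about the host, not part of the build code above.
    assert!(try_run(&mut Command::new("true")));
    assert!(!try_run(&mut Command::new("false")));
    run(&mut Command::new("true")); // would exit(1) on failure
}
```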
@@ -288,9 +290,9 @@ pub fn sanitizer_lib_boilerplate(sanitizer_name: &str)
     } else {
         format!("static={}", link_name)
     };
-    // The source for `compiler-rt` comes from the `compiler-builtins` crate, so
-    // load our env var set by cargo to find the source code.
-    let dir = env::var_os("DEP_COMPILER_RT_COMPILER_RT").unwrap();
+    // This env var is provided by rustbuild to tell us where `compiler-rt`
+    // lives.
+    let dir = env::var_os("RUST_COMPILER_RT_ROOT").unwrap();
     let lib = native_lib_boilerplate(
         dir.as_ref(),
         sanitizer_name,
350
src/ci/azure-pipelines/auto.yml
Normal file
350
src/ci/azure-pipelines/auto.yml
Normal file
|
|
@ -0,0 +1,350 @@
|
|||
#
# Azure Pipelines "auto" branch build for Rust on Linux, macOS, and Windows.
#

pr: none
trigger:
  - auto

variables:
- group: real-prod-credentials

jobs:
- job: Linux
  timeoutInMinutes: 600
  pool:
    vmImage: ubuntu-16.04
  steps:
  - template: steps/run.yml
  strategy:
    matrix:
      x86_64-gnu-llvm-6.0:
        IMAGE: x86_64-gnu-llvm-6.0
        RUST_BACKTRACE: 1

      dist-x86_64-linux:
        IMAGE: dist-x86_64-linux
        DEPLOY: 1

      # "alternate" deployments; these are "nightlies" but have LLVM assertions
      # turned on. They're deployed to a different location, primarily for
      # additional testing.
      dist-x86_64-linux-alt:
        IMAGE: dist-x86_64-linux
        DEPLOY_ALT: 1

      # Linux builders, remaining docker images
      arm-android:
        IMAGE: arm-android

      armhf-gnu:
        IMAGE: armhf-gnu

      dist-various-1:
        IMAGE: dist-various-1
        DEPLOY: 1

      dist-various-2:
        IMAGE: dist-various-2
        DEPLOY: 1

      dist-aarch64-linux:
        IMAGE: dist-aarch64-linux
        DEPLOY: 1

      dist-android:
        IMAGE: dist-android
        DEPLOY: 1

      dist-arm-linux:
        IMAGE: dist-arm-linux
        DEPLOY: 1

      dist-armhf-linux:
        IMAGE: dist-armhf-linux
        DEPLOY: 1

      dist-armv7-linux:
        IMAGE: dist-armv7-linux
        DEPLOY: 1

      dist-i586-gnu-i586-i686-musl:
        IMAGE: dist-i586-gnu-i586-i686-musl
        DEPLOY: 1

      dist-i686-freebsd:
        IMAGE: dist-i686-freebsd
        DEPLOY: 1

      dist-i686-linux:
        IMAGE: dist-i686-linux
        DEPLOY: 1

      dist-mips-linux:
        IMAGE: dist-mips-linux
        DEPLOY: 1

      dist-mips64-linux:
        IMAGE: dist-mips64-linux
        DEPLOY: 1

      dist-mips64el-linux:
        IMAGE: dist-mips64el-linux
        DEPLOY: 1

      dist-mipsel-linux:
        IMAGE: dist-mipsel-linux
        DEPLOY: 1

      dist-powerpc-linux:
        IMAGE: dist-powerpc-linux
        DEPLOY: 1

      dist-powerpc64-linux:
        IMAGE: dist-powerpc64-linux
        DEPLOY: 1

      dist-powerpc64le-linux:
        IMAGE: dist-powerpc64le-linux
        DEPLOY: 1

      dist-s390x-linux:
        IMAGE: dist-s390x-linux
        DEPLOY: 1

      dist-x86_64-freebsd:
        IMAGE: dist-x86_64-freebsd
        DEPLOY: 1

      dist-x86_64-musl:
        IMAGE: dist-x86_64-musl
        DEPLOY: 1

      dist-x86_64-netbsd:
        IMAGE: dist-x86_64-netbsd
        DEPLOY: 1

      asmjs:
        IMAGE: asmjs
      i686-gnu:
        IMAGE: i686-gnu
      i686-gnu-nopt:
        IMAGE: i686-gnu-nopt
      test-various:
        IMAGE: test-various
      x86_64-gnu:
        IMAGE: x86_64-gnu
      x86_64-gnu-full-bootstrap:
        IMAGE: x86_64-gnu-full-bootstrap
      x86_64-gnu-aux:
        IMAGE: x86_64-gnu-aux
      x86_64-gnu-tools:
        IMAGE: x86_64-gnu-tools
      x86_64-gnu-debug:
        IMAGE: x86_64-gnu-debug
      x86_64-gnu-nopt:
        IMAGE: x86_64-gnu-nopt
      x86_64-gnu-distcheck:
        IMAGE: x86_64-gnu-distcheck
      mingw-check:
        IMAGE: mingw-check

- job: macOS
  timeoutInMinutes: 600
  pool:
    vmImage: macos-10.13
  steps:
  - template: steps/run.yml
  strategy:
    matrix:
      # OSX builders running tests; these run the full test suite.
      # NO_DEBUG_ASSERTIONS=1 makes them go faster, but we also have some
      # runners that run `//ignore-debug` tests.
      #
      # Note that the compiler is compiled to target 10.8 here because the Xcode
      # version that we're using, 8.2, cannot compile LLVM for OSX 10.7.
      x86_64-apple:
        SCRIPT: ./x.py test
        RUST_CONFIGURE_ARGS: --build=x86_64-apple-darwin --enable-sanitizers --enable-profiler --set rust.jemalloc
        RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
        MACOSX_DEPLOYMENT_TARGET: 10.8
        MACOSX_STD_DEPLOYMENT_TARGET: 10.7
        NO_LLVM_ASSERTIONS: 1
        NO_DEBUG_ASSERTIONS: 1

      dist-x86_64-apple:
        SCRIPT: ./x.py dist
        RUST_CONFIGURE_ARGS: --target=aarch64-apple-ios,armv7-apple-ios,armv7s-apple-ios,i386-apple-ios,x86_64-apple-ios --enable-full-tools --enable-sanitizers --enable-profiler --set rust.jemalloc
        DEPLOY: 1
        RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
        MACOSX_DEPLOYMENT_TARGET: 10.7
        NO_LLVM_ASSERTIONS: 1
        NO_DEBUG_ASSERTIONS: 1
        DIST_REQUIRE_ALL_TOOLS: 1

      dist-x86_64-apple-alt:
        SCRIPT: ./x.py dist
        RUST_CONFIGURE_ARGS: --enable-extended --enable-profiler --set rust.jemalloc
        DEPLOY_ALT: 1
        RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
        MACOSX_DEPLOYMENT_TARGET: 10.7
        NO_LLVM_ASSERTIONS: 1
        NO_DEBUG_ASSERTIONS: 1

      i686-apple:
        SCRIPT: ./x.py test
        RUST_CONFIGURE_ARGS: --build=i686-apple-darwin --set rust.jemalloc
        RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
        MACOSX_DEPLOYMENT_TARGET: 10.8
        MACOSX_STD_DEPLOYMENT_TARGET: 10.7
        NO_LLVM_ASSERTIONS: 1
        NO_DEBUG_ASSERTIONS: 1

      dist-i686-apple:
        SCRIPT: ./x.py dist
        RUST_CONFIGURE_ARGS: --build=i686-apple-darwin --enable-full-tools --enable-profiler --set rust.jemalloc
        DEPLOY: 1
        RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
        MACOSX_DEPLOYMENT_TARGET: 10.7
        NO_LLVM_ASSERTIONS: 1
        NO_DEBUG_ASSERTIONS: 1
        DIST_REQUIRE_ALL_TOOLS: 1

- job: Windows
  timeoutInMinutes: 600
  pool:
    vmImage: 'vs2017-win2016'
  steps:
  - template: steps/run.yml
  strategy:
    matrix:
      # 32/64-bit MSVC tests
      x86_64-msvc-1:
        MSYS_BITS: 64
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-profiler
        SCRIPT: make ci-subset-1
        # FIXME(#59637)
        NO_DEBUG_ASSERTIONS: 1
        NO_LLVM_ASSERTIONS: 1
      x86_64-msvc-2:
        MSYS_BITS: 64
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-profiler
        SCRIPT: make ci-subset-2
      i686-msvc-1:
        MSYS_BITS: 32
        RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
        SCRIPT: make ci-subset-1
      i686-msvc-2:
        MSYS_BITS: 32
        RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
        SCRIPT: make ci-subset-2
      # MSVC aux tests
      x86_64-msvc-aux:
        MSYS_BITS: 64
        RUST_CHECK_TARGET: check-aux EXCLUDE_CARGO=1
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc
      x86_64-msvc-cargo:
        MSYS_BITS: 64
        SCRIPT: python x.py test src/tools/cargotest src/tools/cargo
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc
        VCVARS_BAT: vcvars64.bat
      # MSVC tools tests
      x86_64-msvc-tools:
        MSYS_BITS: 64
        SCRIPT: src/ci/docker/x86_64-gnu-tools/checktools.sh x.py /tmp/toolstates.json windows
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --save-toolstates=/tmp/toolstates.json

      # 32/64-bit MinGW builds.
      #
      # We are using MinGW with posix threads since LLVM does not compile with
      # the win32 threads version due to missing support for C++'s std::thread.
      #
      # Instead of relying on the MinGW version installed on appveyor we download
      # and install one ourselves so we won't be surprised by changes to appveyor's
      # build image.
      #
      # Finally, note that the downloads below are all in the `rust-lang-ci` S3
      # bucket, but they clearly didn't originate there! The downloads originally
      # came from the mingw-w64 SourceForge download site. Unfortunately
      # SourceForge is notoriously flaky, so we mirror it on our own infrastructure.
      i686-mingw-1:
        MSYS_BITS: 32
        RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
        SCRIPT: make ci-mingw-subset-1
        MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
        MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
        MINGW_DIR: mingw32
        # FIXME(#59637)
        NO_DEBUG_ASSERTIONS: 1
        NO_LLVM_ASSERTIONS: 1
      i686-mingw-2:
        MSYS_BITS: 32
        RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
        SCRIPT: make ci-mingw-subset-2
        MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
        MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
        MINGW_DIR: mingw32
      x86_64-mingw-1:
        MSYS_BITS: 64
        SCRIPT: make ci-mingw-subset-1
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
        MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
        MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
        MINGW_DIR: mingw64
        # FIXME(#59637)
        NO_DEBUG_ASSERTIONS: 1
        NO_LLVM_ASSERTIONS: 1
      x86_64-mingw-2:
        MSYS_BITS: 64
        SCRIPT: make ci-mingw-subset-2
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
        MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
        MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
        MINGW_DIR: mingw64

      # 32/64-bit MSVC and GNU deployment
      dist-x86_64-msvc:
        RUST_CONFIGURE_ARGS: >
          --build=x86_64-pc-windows-msvc
          --target=x86_64-pc-windows-msvc,aarch64-pc-windows-msvc
          --enable-full-tools
          --enable-profiler
        SCRIPT: python x.py dist
        DIST_REQUIRE_ALL_TOOLS: 1
        DEPLOY: 1
      dist-i686-msvc:
        RUST_CONFIGURE_ARGS: >
          --build=i686-pc-windows-msvc
          --target=i586-pc-windows-msvc
          --enable-full-tools
          --enable-profiler
        SCRIPT: python x.py dist
        DIST_REQUIRE_ALL_TOOLS: 1
        DEPLOY: 1
      dist-i686-mingw:
        MSYS_BITS: 32
        RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu --enable-full-tools --enable-profiler
        SCRIPT: python x.py dist
        MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
        MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
        MINGW_DIR: mingw32
        DIST_REQUIRE_ALL_TOOLS: 1
        DEPLOY: 1
      dist-x86_64-mingw:
        MSYS_BITS: 64
        SCRIPT: python x.py dist
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu --enable-full-tools --enable-profiler
        MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
        MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
        MINGW_DIR: mingw64
        DIST_REQUIRE_ALL_TOOLS: 1
        DEPLOY: 1

      # "alternate" deployment, see .travis.yml for more info
      dist-x86_64-msvc-alt:
        MSYS_BITS: 64
        RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-extended --enable-profiler
        SCRIPT: python x.py dist
        DEPLOY_ALT: 1
25  src/ci/azure-pipelines/master.yml  Normal file
@@ -0,0 +1,25 @@
#
# Azure Pipelines job to publish toolstate. Only triggers on pushes to master.
#

pr: none
trigger:
  - master

variables:
- group: real-prod-credentials

pool:
  vmImage: ubuntu-16.04

steps:
- checkout: self
  fetchDepth: 2

- script: |
    export MESSAGE_FILE=$(mktemp -t msg.XXXXXX)
    . src/ci/docker/x86_64-gnu-tools/repo.sh
    commit_toolstate_change "$MESSAGE_FILE" "$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "$MESSAGE_FILE" "$TOOLSTATE_REPO_ACCESS_TOKEN"
  displayName: Publish toolstate
  env:
    TOOLSTATE_REPO_ACCESS_TOKEN: $(TOOLSTATE_REPO_ACCESS_TOKEN)
35  src/ci/azure-pipelines/pr.yml  Normal file
@@ -0,0 +1,35 @@
#
# Azure Pipelines pull request build for Rust
#

trigger: none
pr:
- master

variables:
- group: public-credentials

jobs:
- job: Linux
  timeoutInMinutes: 600
  pool:
    vmImage: ubuntu-16.04
  steps:
  - template: steps/run.yml
  strategy:
    matrix:
      x86_64-gnu-llvm-6.0:
        IMAGE: x86_64-gnu-llvm-6.0
      mingw-check:
        IMAGE: mingw-check

- job: LinuxTools
  timeoutInMinutes: 600
  pool:
    vmImage: ubuntu-16.04
  steps:
  - template: steps/run.yml
    parameters:
      only_on_updated_submodules: 'yes'
  variables:
    IMAGE: x86_64-gnu-tools
46  src/ci/azure-pipelines/steps/install-clang.yml  Normal file
@@ -0,0 +1,46 @@
steps:

- bash: |
    set -e
    curl -f http://releases.llvm.org/7.0.0/clang+llvm-7.0.0-x86_64-apple-darwin.tar.xz | tar xJf -

    export CC=`pwd`/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang
    echo "##vso[task.setvariable variable=CC]$CC"

    export CXX=`pwd`/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang++
    echo "##vso[task.setvariable variable=CXX]$CXX"

    # Configure `AR` specifically so rustbuild doesn't try to infer it as
    # `clang-ar` by accident.
    echo "##vso[task.setvariable variable=AR]ar"
  displayName: Install clang (OSX)
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Darwin'))
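The `##vso[task.setvariable ...]` lines above are Azure Pipelines "logging commands": a step writes a specially formatted line to stdout, the agent parses it, and the value becomes an environment variable for *subsequent* steps (not the step that emitted it). A minimal sketch, using a hypothetical `MY_TOOL` variable name purely for illustration:

```shell
# Emit an Azure Pipelines setvariable logging command. The agent scans stdout
# for lines of this exact shape; later steps then see MY_TOOL in their
# environment. Locally this just prints the line, which is enough to show
# the format.
emit_setvariable() {
    # $1 = variable name, $2 = value
    echo "##vso[task.setvariable variable=$1]$2"
}

emit_setvariable MY_TOOL "/usr/local/bin/clang"
```

This is why the step above both `export`s `CC` (for its own use) and emits the logging command (so later steps inherit it).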

# If we're compiling for MSVC then we, like most other distribution builders,
# switch to clang as the compiler. This'll allow us eventually to enable LTO
# amongst LLVM and rustc. Note that we only do this on MSVC as I don't think
# clang has an output mode compatible with MinGW that we need. If it does we
# should switch to clang for MinGW as well!
#
# Note that the LLVM installer is an NSIS installer.
#
# The original download came from
# http://releases.llvm.org/7.0.0/LLVM-7.0.0-win64.exe
# That installer was run through `wine` on Linux and then the resulting
# installation directory (found in `$HOME/.wine/drive_c/Program Files/LLVM`) was
# packaged up into a tarball. We've had issues otherwise where the installer will
# randomly hang, provide not a lot of useful information, pollute global state,
# etc. In general the tarball is just more confined and easier to deal with when
# working with various CI environments.
- bash: |
    set -e
    mkdir -p citools
    cd citools
    curl -f https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/LLVM-7.0.0-win64.tar.gz | tar xzf -
    echo "##vso[task.setvariable variable=RUST_CONFIGURE_ARGS]$RUST_CONFIGURE_ARGS --set llvm.clang-cl=`pwd`/clang-rust/bin/clang-cl.exe"
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), eq(variables['MINGW_URL'],''))
  displayName: Install clang (Windows)

# Note that we don't install clang on Linux since its compiler story is just so
# different. Each container has its own toolchain configured appropriately
# already.
21  src/ci/azure-pipelines/steps/install-sccache.yml  Normal file
@@ -0,0 +1,21 @@
steps:

- bash: |
    set -e
    curl -fo /usr/local/bin/sccache https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2018-04-02-sccache-x86_64-apple-darwin
    chmod +x /usr/local/bin/sccache
  displayName: Install sccache (OSX)
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Darwin'))

- script: |
    md sccache
    powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf sccache\sccache.exe https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2018-04-26-sccache-x86_64-pc-windows-msvc"
    echo ##vso[task.prependpath]%CD%\sccache
  displayName: Install sccache (Windows)
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))

# Note that we don't install sccache on Linux since it's installed elsewhere
# through all the containers.
#
# FIXME: we should probably install sccache outside the containers and then
# mount it inside the containers so we can centralize all installation here.
119  src/ci/azure-pipelines/steps/install-windows-build-deps.yml  Normal file
@@ -0,0 +1,119 @@
steps:
# We use the WIX toolset to create combined installers for Windows, and these
# binaries are downloaded from
# https://github.com/wixtoolset/wix3 originally
- bash: |
    set -e
    curl -O https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/wix311-binaries.zip
    echo "##vso[task.setvariable variable=WIX]`pwd`/wix"
    mkdir -p wix/bin
    cd wix/bin
    7z x ../../wix311-binaries.zip
  displayName: Install wix
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))

# We use InnoSetup and its `iscc` program to also create combined installers.
# Honestly at this point WIX above and `iscc` are just holdovers from
# oh-so-long-ago and are required for creating installers on Windows. I think
# one is for MSI installers and one is for EXE, but they're not used so
# frequently at this point anyway so perhaps it's a wash!
- script: |
    powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf is-install.exe https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-08-22-is.exe"
    is-install.exe /VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-
    echo ##vso[task.prependpath]C:\Program Files (x86)\Inno Setup 5
  displayName: Install InnoSetup
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))

# We've had issues with the default drive in use running out of space during a
# build, and it looks like the `C:` drive has more space than the default `D:`
# drive. We should probably confirm this with the azure pipelines team at some
# point, but this seems to fix our "disk space full" problems.
- script: |
    mkdir c:\MORE_SPACE
    mklink /J build c:\MORE_SPACE
  displayName: "Ensure build happens on C:/ instead of D:/"
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))

- bash: git config --replace-all --global core.autocrlf false
  displayName: "Disable git automatic line ending conversion (on C:/)"

# Download and install MSYS2, needed primarily for the test suite (run-make) but
# also used by the MinGW toolchain for assembling things.
#
# FIXME: we should probe the default azure image and see if we can use the MSYS2
# toolchain there (if there's even one there). For now though this gets the job
# done.
- script: |
    set MSYS_PATH=%CD%\citools\msys64
    choco install msys2 --params="/InstallDir:%MSYS_PATH% /NoPath" -y
    set PATH=%MSYS_PATH%\usr\bin;%PATH%
    pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar
    IF "%MINGW_URL%"=="" (
      IF "%MSYS_BITS%"=="32" pacman -S --noconfirm --needed mingw-w64-i686-toolchain mingw-w64-i686-cmake mingw-w64-i686-gcc mingw-w64-i686-python2
      IF "%MSYS_BITS%"=="64" pacman -S --noconfirm --needed mingw-w64-x86_64-toolchain mingw-w64-x86_64-cmake mingw-w64-x86_64-gcc mingw-w64-x86_64-python2
    )
    where rev
    rev --help
    where make

    echo ##vso[task.setvariable variable=MSYS_PATH]%MSYS_PATH%
    echo ##vso[task.prependpath]%MSYS_PATH%\usr\bin
  displayName: Install msys2
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))

# If we need to download a custom MinGW, do so here and set the path
# appropriately.
#
# Here we also do a pretty heinous thing which is to mangle the MinGW
# installation we just downloaded. Currently, as of this writing, we're using
# MinGW-w64 builds of gcc, and that's currently at 6.3.0. We use 6.3.0 as it
# appears to be the first version which contains a fix for #40546, builds
# randomly failing during LLVM due to ar.exe/ranlib.exe failures.
#
# Unfortunately, though, 6.3.0 *also* is the first version of MinGW-w64 builds
# to contain a regression in gdb (#40184). As a result if we were to use the
# gdb provided (7.11.1) then we would fail all debuginfo tests.
#
# In order to fix spurious failures (pretty high priority) we use 6.3.0. To
# avoid disabling gdb tests we download an *old* version of gdb, specifically
# the one found inside the 6.2.0 distribution. We then overwrite the 6.3.0 gdb
# with the 6.2.0 gdb to get tests passing.
#
# Note that we don't literally overwrite the gdb.exe binary because it appears
# to just use gdborig.exe, so that's the binary we deal with instead.
- script: |
    powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf %MINGW_ARCHIVE% %MINGW_URL%/%MINGW_ARCHIVE%"
    7z x -y %MINGW_ARCHIVE% > nul
    powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf 2017-04-20-%MSYS_BITS%bit-gdborig.exe %MINGW_URL%/2017-04-20-%MSYS_BITS%bit-gdborig.exe"
    mv 2017-04-20-%MSYS_BITS%bit-gdborig.exe %MINGW_DIR%\bin\gdborig.exe
    echo ##vso[task.prependpath]%CD%\%MINGW_DIR%\bin
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), ne(variables['MINGW_URL'],''))
  displayName: Download custom MinGW

# Otherwise pull in the MinGW installed with MSYS2 above
- script: |
    echo ##vso[task.prependpath]%MSYS_PATH%\mingw%MSYS_BITS%\bin
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), eq(variables['MINGW_URL'],''))
  displayName: Add MinGW to path

# Make sure we use the native python interpreter instead of some msys equivalent
# one way or another. The msys interpreters seem to have weird path conversions
# baked in which break LLVM's build system one way or another, so let's use the
# native version which keeps everything as native as possible.
- script: |
    copy C:\Python27amd64\python.exe C:\Python27amd64\python2.7.exe
    echo ##vso[task.prependpath]C:\Python27amd64
  displayName: Prefer the "native" Python as LLVM has trouble building with MSYS sometimes
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))

# Note that this binary originally came from the GitHub releases page of Ninja
- script: |
    md ninja
    powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf 2017-03-15-ninja-win.zip https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-03-15-ninja-win.zip"
    7z x -oninja 2017-03-15-ninja-win.zip
    del 2017-03-15-ninja-win.zip
    set RUST_CONFIGURE_ARGS=%RUST_CONFIGURE_ARGS% --enable-ninja
    echo ##vso[task.setvariable variable=RUST_CONFIGURE_ARGS]%RUST_CONFIGURE_ARGS%
    echo ##vso[task.prependpath]%CD%\ninja
  displayName: Download and install ninja
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
207  src/ci/azure-pipelines/steps/run.yml  Normal file
@@ -0,0 +1,207 @@
# FIXME(linux): need to configure core dumps, enable them, and then dump
# backtraces on failure from all core dumps:
#
# - bash: sudo apt install gdb
# - bash: sudo sh -c 'echo "/checkout/obj/cores/core.%p.%E" > /proc/sys/kernel/core_pattern'
#
# Check the travis config for the `gdb --batch` command to print all crash logs

parameters:
  # When this parameter is set to anything other than an empty string the tests
  # will only be executed when the commit updates submodules
  only_on_updated_submodules: ''

steps:

# Disable automatic line ending conversion, which is enabled by default on
# Azure's Windows image. Having the conversion enabled caused regressions both
# in our test suite (it broke miri tests) and in the ecosystem, since we
# started shipping install scripts with CRLF endings instead of the old LF.
#
# Note that we do this a couple of times during the build as the PATH and
# current user/directory change, e.g. when mingw is enabled.
- bash: git config --global core.autocrlf false
  displayName: "Disable git automatic line ending conversion"

- checkout: self
  fetchDepth: 2

# Set the SKIP_JOB environment variable if this job is supposed to only run
# when submodules are updated and they were not. The following time-consuming
# tasks will be skipped when the environment variable is present.
- ${{ if parameters.only_on_updated_submodules }}:
  - bash: |
      set -e
      # Submodule pseudo-files inside git have the 160000 permissions, so when
      # those files are present in the diff a submodule was updated.
      if git diff HEAD^ | grep "^index .* 160000" >/dev/null 2>&1; then
        echo "Executing the job since submodules are updated"
      else
        echo "Not executing this job since no submodules were updated"
        echo "##vso[task.setvariable variable=SKIP_JOB;]1"
      fi
    displayName: Decide whether to run this job
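The submodule check above works because git records a submodule entry (a "gitlink") with the special file mode 160000, so that mode showing up on an `index` line of a diff means a submodule pointer moved. A small sketch of the same grep, run against two hand-written diff fragments (hypothetical hashes) rather than a real repository:

```shell
# Returns 0 (success) when the diff text on stdin contains a gitlink change,
# i.e. an "index <old>..<new> 160000" line, exactly as the CI step greps for.
submodule_updated() {
    grep "^index .* 160000" >/dev/null 2>&1
}

# A diff fragment where a submodule pointer moved (mode 160000) matches:
printf 'index 1111111..2222222 160000\n' | submodule_updated && echo "submodule change"

# An ordinary blob change (mode 100644) does not:
printf 'index 1111111..2222222 100644\n' | submodule_updated || echo "no submodule change"
```

In the pipeline the input is `git diff HEAD^`, which is why the job needs `fetchDepth: 2` in the checkout step: the parent commit must be available for the diff.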

# Spawn a background process to collect CPU usage statistics which we'll upload
# at the end of the build. See the comments in the script here for more
# information.
- bash: python src/ci/cpu-usage-over-time.py &> cpu-usage.csv &
  displayName: "Collect CPU-usage statistics in the background"

- bash: printenv | sort
  displayName: Show environment variables

- bash: |
    set -e
    df -h
    du . | sort -nr | head -n100
  displayName: Show disk usage
  # FIXME: this hasn't been tested, but maybe it works on Windows? Should test!
  condition: and(succeeded(), ne(variables['Agent.OS'], 'Windows_NT'))

- template: install-sccache.yml
- template: install-clang.yml

# Switch to Xcode 9.3 on OSX since it seems to be the last version that supports
# i686-apple-darwin. We'll eventually want to upgrade this and it will probably
# force us to drop i686-apple-darwin, but let's keep the wheels turning for now.
- bash: |
    set -e
    sudo xcode-select --switch /Applications/Xcode_9.3.app
  displayName: Switch to Xcode 9.3 (OSX)
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Darwin'))

- template: install-windows-build-deps.yml

# Looks like docker containers have IPv6 disabled by default, so let's turn it
# on since libstd tests require it
- bash: |
    set -e
    sudo mkdir -p /etc/docker
    echo '{"ipv6":true,"fixed-cidr-v6":"fd9a:8454:6789:13f7::/64"}' | sudo tee /etc/docker/daemon.json
    sudo service docker restart
  displayName: Enable IPv6
  condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['Agent.OS'], 'Linux'))

# Disable automatic line ending conversion (again). On Windows, when we're
# installing dependencies, something switches the git configuration directory or
# re-enables autocrlf. We've not tracked down the exact cause -- and there may
# be multiple -- but this should ensure submodules are checked out with the
# appropriate line endings.
- bash: git config --replace-all --global core.autocrlf false
  displayName: "Disable git automatic line ending conversion"

# Check out all our submodules, but more quickly than using git by using one of
# our custom scripts
- bash: |
    set -e
    mkdir -p $HOME/rustsrc
    $BUILD_SOURCESDIRECTORY/src/ci/init_repo.sh . $HOME/rustsrc
  condition: and(succeeded(), not(variables.SKIP_JOB), ne(variables['Agent.OS'], 'Windows_NT'))
  displayName: Check out submodules (Unix)
- script: |
    if not exist C:\cache\rustsrc\NUL mkdir C:\cache\rustsrc
    sh src/ci/init_repo.sh . /c/cache/rustsrc
  condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['Agent.OS'], 'Windows_NT'))
  displayName: Check out submodules (Windows)

# See also the disable for autocrlf above; this just checks that it worked.
#
# We check both in rust-lang/rust and in a submodule to make sure both are
# accurate. Submodules are checked out significantly later than the main
# repository in this script, so settings can (and do!) change between then.
#
# Linux (and maybe macOS) builders don't currently have dos2unix so only
# run this step on Windows.
- bash: |
    set -x
    # print out the git configuration so we can better investigate failures in
    # the following
    git config --list --show-origin
    dos2unix -ih Cargo.lock src/tools/rust-installer/install-template.sh
    endings=$(dos2unix -ic Cargo.lock src/tools/rust-installer/install-template.sh)
    # if endings has non-zero length, error out
    if [ -n "$endings" ]; then exit 1 ; fi
  condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
  displayName: Verify line endings are LF
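The verification step above uses `dos2unix -ic`, which lists files that still contain DOS (CRLF) line endings; a non-empty list fails the build. Where dos2unix isn't installed, the same check can be approximated with grep for carriage-return characters. A sketch, using throwaway sample files created on the spot:

```shell
# Portable approximation of the CRLF check: a file is "dirty" if any of its
# lines contain a carriage return. The sample files here are hypothetical,
# created only to exercise the check.
tmp=$(mktemp -d)
printf 'lf-only line\n' > "$tmp/good.txt"
printf 'crlf line\r\n'  > "$tmp/bad.txt"

has_crlf() {
    # exits 0 if the file contains any CR characters
    grep -q "$(printf '\r')" "$1"
}

has_crlf "$tmp/bad.txt"  && echo "bad.txt has CRLF endings"
has_crlf "$tmp/good.txt" || echo "good.txt is LF-only"
```

Failing fast here matters because CRLF endings in shipped shell scripts (like `install-template.sh`) break them for end users.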

# Ensure the `aws` CLI is installed so we can deploy later on, cache docker
# images, etc.
- bash: src/ci/install-awscli.sh
  env:
    AGENT_OS: $(Agent.OS)
  condition: and(succeeded(), not(variables.SKIP_JOB))
  displayName: Install awscli

# Configure our CI_JOB_NAME variable which log analyzers can use for the main
# step to see what's going on.
- bash: |
    builder=$(echo $AGENT_JOBNAME | cut -d ' ' -f 2)
    echo "##vso[task.setvariable variable=CI_JOB_NAME]$builder"
  displayName: Configure Job Name
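The step above extracts the builder name as the second whitespace-separated field of the agent's job name. A minimal sketch with a hypothetical sample value for `AGENT_JOBNAME` (the real value is set by the Azure agent):

```shell
# Hypothetical sample of the agent-provided job name: "<job> <matrix leg>".
# The matrix leg (the builder) is the second space-separated field.
AGENT_JOBNAME="Linux dist-x86_64-linux"
builder=$(echo $AGENT_JOBNAME | cut -d ' ' -f 2)
echo "$builder"
```

Note this relies on the job-name format staying "word, space, builder"; a renamed job with spaces in it would pick the wrong field.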
|
||||
|
||||
# As a quick smoke check on the otherwise very fast mingw-check linux builder
|
||||
# check our own internal scripts.
|
||||
- bash: |
|
||||
set -e
|
||||
git clone --depth=1 https://github.com/rust-lang-nursery/rust-toolstate.git
|
||||
cd rust-toolstate
|
||||
    python2.7 "$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "" ""
    cd ..
    rm -rf rust-toolstate
  condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['IMAGE'], 'mingw-check'))
  displayName: Verify the publish_toolstate script works

- bash: |
    set -e
    # Remove any preexisting rustup installation since it can interfere
    # with the cargotest step and its auto-detection of things like Clippy in
    # the environment
    rustup self uninstall -y || true
    if [ "$IMAGE" = "" ]; then
      src/ci/run.sh
    else
      src/ci/docker/run.sh $IMAGE
    fi
  #timeoutInMinutes: 180
  timeoutInMinutes: 600
  env:
    CI: true
    SRC: .
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
    TOOLSTATE_REPO_ACCESS_TOKEN: $(TOOLSTATE_REPO_ACCESS_TOKEN)
  condition: and(succeeded(), not(variables.SKIP_JOB))
  displayName: Run build

# If we're a deploy builder, use the `aws` command to publish everything to our
# bucket.
- bash: |
    set -e
    source src/ci/shared.sh
    if [ "$AGENT_OS" = "Linux" ]; then
      rm -rf obj/build/dist/doc
      upload_dir=obj/build/dist
    else
      rm -rf build/dist/doc
      upload_dir=build/dist
    fi
    ls -la $upload_dir
    deploy_dir=rustc-builds
    if [ "$DEPLOY_ALT" == "1" ]; then
      deploy_dir=rustc-builds-alt
    fi
    retry aws s3 cp --no-progress --recursive --acl public-read ./$upload_dir s3://$DEPLOY_BUCKET/$deploy_dir/$BUILD_SOURCEVERSION
  env:
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
  condition: and(succeeded(), not(variables.SKIP_JOB), or(eq(variables.DEPLOY, '1'), eq(variables.DEPLOY_ALT, '1')))
  displayName: Upload artifacts

# Upload CPU usage statistics that we've been gathering this whole time. Always
# execute this step in case we want to inspect failed builds, but don't let
# errors here ever fail the build since this is just informational.
- bash: aws s3 cp --acl public-read cpu-usage.csv s3://$DEPLOY_BUCKET/rustc-builds/$BUILD_SOURCEVERSION/cpu-$CI_JOB_NAME.csv
  env:
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
  condition: variables['AWS_SECRET_ACCESS_KEY']
  continueOnError: true
  displayName: Upload CPU usage statistics
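The upload step above relies on a `retry` helper sourced from `src/ci/shared.sh`. A minimal sketch of what such a wrapper can look like (the attempt count and delay below are illustrative assumptions, not the values the real script uses):

```shell
# Re-run a command until it succeeds, giving up after a fixed number of
# attempts. Used around flaky network operations like `aws s3 cp`.
retry() {
  attempts=3
  delay=1
  n=1
  while ! "$@"; do
    if [ "$n" -ge "$attempts" ]; then
      echo "failed after $attempts attempts: $*" >&2
      return 1
    fi
    echo "attempt $n failed, retrying in ${delay}s..." >&2
    n=$((n + 1))
    sleep "$delay"
  done
}

retry true
```

Wrapping only the network command (rather than the whole step) keeps a transient S3 hiccup from failing an otherwise finished build.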
78	src/ci/azure-pipelines/try.yml	Normal file
@@ -0,0 +1,78 @@
pr: none
trigger:
  - try

variables:
- group: real-prod-credentials

jobs:
- job: Linux
  timeoutInMinutes: 600
  pool:
    vmImage: ubuntu-16.04
  steps:
  - template: steps/run.yml
  strategy:
    matrix:
      dist-x86_64-linux:
        IMAGE: dist-x86_64-linux
        DEPLOY: 1

      dist-x86_64-linux-alt:
        IMAGE: dist-x86_64-linux
        DEPLOY_ALT: 1

# The macOS and Windows builds here are currently disabled due to them not being
# overly necessary on `try` builds. We also don't actually have anything that
# consumes the artifacts currently. Perhaps one day we can reenable, but for now
# it helps free up capacity on Azure.
# - job: macOS
#   timeoutInMinutes: 600
#   pool:
#     vmImage: macos-10.13
#   steps:
#   - template: steps/run.yml
#   strategy:
#     matrix:
#       dist-x86_64-apple:
#         SCRIPT: ./x.py dist
#         RUST_CONFIGURE_ARGS: --target=aarch64-apple-ios,armv7-apple-ios,armv7s-apple-ios,i386-apple-ios,x86_64-apple-ios --enable-full-tools --enable-sanitizers --enable-profiler --set rust.jemalloc
#         DEPLOY: 1
#         RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
#         MACOSX_DEPLOYMENT_TARGET: 10.7
#         NO_LLVM_ASSERTIONS: 1
#         NO_DEBUG_ASSERTIONS: 1
#         DIST_REQUIRE_ALL_TOOLS: 1
#
#       dist-x86_64-apple-alt:
#         SCRIPT: ./x.py dist
#         RUST_CONFIGURE_ARGS: --enable-extended --enable-profiler --set rust.jemalloc
#         DEPLOY_ALT: 1
#         RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
#         MACOSX_DEPLOYMENT_TARGET: 10.7
#         NO_LLVM_ASSERTIONS: 1
#         NO_DEBUG_ASSERTIONS: 1
#
# - job: Windows
#   timeoutInMinutes: 600
#   pool:
#     vmImage: 'vs2017-win2016'
#   steps:
#   - template: steps/run.yml
#   strategy:
#     matrix:
#       dist-x86_64-msvc:
#         RUST_CONFIGURE_ARGS: >
#           --build=x86_64-pc-windows-msvc
#           --target=x86_64-pc-windows-msvc,aarch64-pc-windows-msvc
#           --enable-full-tools
#           --enable-profiler
#         SCRIPT: python x.py dist
#         DIST_REQUIRE_ALL_TOOLS: 1
#         DEPLOY: 1
#
#       dist-x86_64-msvc-alt:
#         MSYS_BITS: 64
#         RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-extended --enable-profiler
#         SCRIPT: python x.py dist
#         DEPLOY_ALT: 1
160	src/ci/cpu-usage-over-time.py	Normal file
@@ -0,0 +1,160 @@
#!/usr/bin/env python
# ignore-tidy-linelength

# This is a small script that we use on CI to collect CPU usage statistics of
# our builders. By seeing graphs of CPU usage over time we hope to correlate
# that with possible improvements to Rust's own build system, ideally diagnosing
# that either builders are always fully using their CPU resources or they're
# idle for long stretches of time.
#
# This script is relatively simple, but it's platform specific. Each platform
# (OSX/Windows/Linux) has a different way of calculating the current state of
# CPU at a point in time. We then compare two captured states to determine the
# percentage of time spent in one state versus another. The state capturing is
# all platform-specific but the loop at the bottom is the cross platform part
# that executes everywhere.
#
# # Viewing statistics
#
# All builders will upload their CPU statistics as CSV files to our S3 buckets.
# These URLs look like:
#
#   https://$bucket.s3.amazonaws.com/rustc-builds/$commit/cpu-$builder.csv
#
# for example
#
#   https://rust-lang-ci2.s3.amazonaws.com/rustc-builds/68baada19cd5340f05f0db15a3e16d6671609bcc/cpu-x86_64-apple.csv
#
# Each CSV file has two columns. The first is the timestamp of the measurement
# and the second column is the % of idle cpu time in that time slice. Ideally
# the second column is always zero.
#
# Once you've downloaded a file there's various ways to plot it and visualize
# it. For command line usage you can use the `src/etc/cpu-usage-over-time-plot.sh`
# script in this repository.

import datetime
import sys
import time

if sys.platform == 'linux2':
    class State:
        def __init__(self):
            with open('/proc/stat', 'r') as file:
                data = file.readline().split()
            if data[0] != 'cpu':
                raise Exception('did not start with "cpu"')
            self.user = int(data[1])
            self.nice = int(data[2])
            self.system = int(data[3])
            self.idle = int(data[4])
            self.iowait = int(data[5])
            self.irq = int(data[6])
            self.softirq = int(data[7])
            self.steal = int(data[8])
            self.guest = int(data[9])
            self.guest_nice = int(data[10])

        def idle_since(self, prev):
            user = self.user - prev.user
            nice = self.nice - prev.nice
            system = self.system - prev.system
            idle = self.idle - prev.idle
            iowait = self.iowait - prev.iowait
            irq = self.irq - prev.irq
            softirq = self.softirq - prev.softirq
            steal = self.steal - prev.steal
            guest = self.guest - prev.guest
            guest_nice = self.guest_nice - prev.guest_nice
            total = user + nice + system + idle + iowait + irq + softirq + steal + guest + guest_nice
            return float(idle) / float(total) * 100

elif sys.platform == 'win32':
    from ctypes.wintypes import DWORD
    from ctypes import Structure, windll, WinError, GetLastError, byref

    class FILETIME(Structure):
        _fields_ = [
            ("dwLowDateTime", DWORD),
            ("dwHighDateTime", DWORD),
        ]

    class State:
        def __init__(self):
            idle, kernel, user = FILETIME(), FILETIME(), FILETIME()

            success = windll.kernel32.GetSystemTimes(
                byref(idle),
                byref(kernel),
                byref(user),
            )

            assert success, WinError(GetLastError())[1]

            self.idle = (idle.dwHighDateTime << 32) | idle.dwLowDateTime
            self.kernel = (kernel.dwHighDateTime << 32) | kernel.dwLowDateTime
            self.user = (user.dwHighDateTime << 32) | user.dwLowDateTime

        def idle_since(self, prev):
            idle = self.idle - prev.idle
            user = self.user - prev.user
            kernel = self.kernel - prev.kernel
            return float(idle) / float(user + kernel) * 100

elif sys.platform == 'darwin':
    from ctypes import *
    libc = cdll.LoadLibrary('/usr/lib/libc.dylib')

    PROESSOR_CPU_LOAD_INFO = c_int(2)
    CPU_STATE_USER = 0
    CPU_STATE_SYSTEM = 1
    CPU_STATE_IDLE = 2
    CPU_STATE_NICE = 3
    c_int_p = POINTER(c_int)

    class State:
        def __init__(self):
            num_cpus_u = c_uint(0)
            cpu_info = c_int_p()
            cpu_info_cnt = c_int(0)
            err = libc.host_processor_info(
                libc.mach_host_self(),
                PROESSOR_CPU_LOAD_INFO,
                byref(num_cpus_u),
                byref(cpu_info),
                byref(cpu_info_cnt),
            )
            assert err == 0
            self.user = 0
            self.system = 0
            self.idle = 0
            self.nice = 0
            cur = 0
            while cur < cpu_info_cnt.value:
                self.user += cpu_info[cur + CPU_STATE_USER]
                self.system += cpu_info[cur + CPU_STATE_SYSTEM]
                self.idle += cpu_info[cur + CPU_STATE_IDLE]
                self.nice += cpu_info[cur + CPU_STATE_NICE]
                cur += num_cpus_u.value

        def idle_since(self, prev):
            user = self.user - prev.user
            system = self.system - prev.system
            idle = self.idle - prev.idle
            nice = self.nice - prev.nice
            return float(idle) / float(user + system + idle + nice) * 100.0

else:
    print('unknown platform', sys.platform)
    sys.exit(1)

cur_state = State()
print("Time,Idle")
while True:
    time.sleep(1)
    next_state = State()
    now = datetime.datetime.utcnow().isoformat()
    idle = next_state.idle_since(cur_state)
    print("%s,%s" % (now, idle))
    sys.stdout.flush()
    cur_state = next_state
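The script's header comment describes the two-column `Time,Idle` CSV that builders upload. A small standalone sketch that parses such a file and reports the mean idle percentage (the sample rows here are made up for illustration):

```python
import csv
import io

# Sample data in the "Time,Idle" format produced by cpu-usage-over-time.py.
sample = """Time,Idle
2019-07-25T12:00:00,3.5
2019-07-25T12:00:01,0.0
2019-07-25T12:00:02,96.5
"""

def mean_idle(fileobj):
    """Return the average of the Idle column of a cpu-usage CSV."""
    reader = csv.DictReader(fileobj)
    values = [float(row["Idle"]) for row in reader]
    return sum(values) / len(values)

print(mean_idle(io.StringIO(sample)))
```

The same function works on a downloaded `cpu-$builder.csv` opened with `open(path)`; a mean far from zero suggests the builder spent long stretches idle.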
@@ -20,7 +20,7 @@ Images will output artifacts in an `obj` dir at the root of a repository.

- Each directory, excluding `scripts` and `disabled`, corresponds to a docker image
- `scripts` contains files shared by docker images
- `disabled` contains images that are not built on travis
- `disabled` contains images that are not built on CI

## Docker Toolbox on Windows

@@ -72,7 +72,7 @@ RUN arm-linux-gnueabihf-gcc addentropy.c -o rootfs/addentropy -static

# TODO: What is this?!
# Source of the file: https://github.com/vfdev-5/qemu-rpi2-vexpress/raw/master/vexpress-v2p-ca15-tc1.dtb
RUN curl -O https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/vexpress-v2p-ca15-tc1.dtb
RUN curl -O https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/vexpress-v2p-ca15-tc1.dtb

COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
@@ -32,8 +32,16 @@ ENV TARGETS=asmjs-unknown-emscripten
ENV RUST_CONFIGURE_ARGS --enable-emscripten --disable-optimize-tests

ENV SCRIPT python2.7 ../x.py test --target $TARGETS \
  src/test/run-pass \
  src/test/ui \
  src/test/run-fail \
  src/libstd \
  src/liballoc \
  src/libcore

# Debug assertions in rustc are largely covered by other builders, and LLVM
# assertions cause this builder to slow down by quite a large amount and don't
# buy us a huge amount over other builders (not sure if we've ever seen an
# asmjs-specific backend assertion trip), so disable assertions for these
# tests.
ENV NO_LLVM_ASSERTIONS=1
ENV NO_DEBUG_ASSERTIONS=1

@@ -36,7 +36,3 @@ ENV HOSTS=powerpc-unknown-linux-gnu

ENV RUST_CONFIGURE_ARGS --enable-extended --disable-docs
ENV SCRIPT python2.7 ../x.py dist --host $HOSTS --target $HOSTS

# FIXME(#36150) this will fail the bootstrap. Probably means something bad is
# happening!
ENV NO_LLVM_ASSERTIONS 1
@@ -64,7 +64,7 @@ RUN env \
env \
    CC=arm-linux-gnueabihf-gcc CFLAGS="-march=armv7-a" \
    CXX=arm-linux-gnueabihf-g++ CXXFLAGS="-march=armv7-a" \
    bash musl.sh armv7 && \
    bash musl.sh armv7hf && \
env \
    CC=aarch64-linux-gnu-gcc \
    CXX=aarch64-linux-gnu-g++ \

@@ -104,7 +104,9 @@ ENV TARGETS=$TARGETS,armv5te-unknown-linux-musleabi
ENV TARGETS=$TARGETS,armv7-unknown-linux-musleabihf
ENV TARGETS=$TARGETS,aarch64-unknown-linux-musl
ENV TARGETS=$TARGETS,sparc64-unknown-linux-gnu
ENV TARGETS=$TARGETS,x86_64-unknown-redox
# FIXME: temporarily disable the redox builder,
# see: https://github.com/rust-lang/rust/issues/63160
# ENV TARGETS=$TARGETS,x86_64-unknown-redox
ENV TARGETS=$TARGETS,thumbv6m-none-eabi
ENV TARGETS=$TARGETS,thumbv7m-none-eabi
ENV TARGETS=$TARGETS,thumbv7em-none-eabi

@@ -112,6 +114,7 @@ ENV TARGETS=$TARGETS,thumbv7em-none-eabihf
ENV TARGETS=$TARGETS,thumbv8m.base-none-eabi
ENV TARGETS=$TARGETS,thumbv8m.main-none-eabi
ENV TARGETS=$TARGETS,thumbv8m.main-none-eabihf
ENV TARGETS=$TARGETS,riscv32i-unknown-none-elf
ENV TARGETS=$TARGETS,riscv32imc-unknown-none-elf
ENV TARGETS=$TARGETS,riscv32imac-unknown-none-elf
ENV TARGETS=$TARGETS,riscv64imac-unknown-none-elf

@@ -126,7 +129,6 @@ ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
    CC_mips_unknown_linux_musl=mips-openwrt-linux-gcc \
    CC_sparc64_unknown_linux_gnu=sparc64-linux-gnu-gcc \
    CC_x86_64_unknown_redox=x86_64-unknown-redox-gcc \
    CC_armebv7r_none_eabi=arm-none-eabi-gcc \
    CC_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-gcc \
    AR_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-ar \
    CXX_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-g++

@@ -135,7 +137,7 @@ ENV RUST_CONFIGURE_ARGS \
    --musl-root-armv5te=/musl-armv5te \
    --musl-root-arm=/musl-arm \
    --musl-root-armhf=/musl-armhf \
    --musl-root-armv7=/musl-armv7 \
    --musl-root-armv7hf=/musl-armv7hf \
    --musl-root-aarch64=/musl-aarch64 \
    --musl-root-mips=/musl-mips \
    --musl-root-mipsel=/musl-mipsel \
@@ -5,7 +5,7 @@ mkdir /usr/local/mips-linux-musl
# originally from
# https://downloads.openwrt.org/snapshots/trunk/ar71xx/generic/
# OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2
URL="https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror"
URL="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc"
FILE="OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mips-linux-musl --strip-components=2

@@ -5,7 +5,7 @@ mkdir /usr/local/mipsel-linux-musl
# Note that this originally came from:
# https://downloads.openwrt.org/snapshots/trunk/malta/generic/
# OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2
URL="https://s3-us-west-1.amazonaws.com/rust-lang-ci2/libc"
URL="https://rust-lang-ci2.s3.amazonaws.com/libc"
FILE="OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mipsel-linux-musl --strip-components=2
@@ -8,7 +8,8 @@ RUN sed -i 's/^# deb-src/deb-src/' /etc/apt/sources.list

RUN apt-get update && apt-get build-dep -y clang llvm && apt-get install -y --no-install-recommends \
  build-essential \
  gcc-multilib \
  # gcc-multilib can not be installed together with gcc-arm-linux-gnueabi
  gcc-7-multilib \
  libedit-dev \
  libgmp-dev \
  libisl-dev \

@@ -21,10 +22,19 @@ RUN apt-get update && apt-get build-dep -y clang llvm && apt-get install -y --no
  unzip \
  # Needed for apt-key to work:
  dirmngr \
  gpg-agent
  gpg-agent \
  g++-7-arm-linux-gnueabi

RUN apt-key adv --batch --yes --keyserver keyserver.ubuntu.com --recv-keys 74DA7924C5513486
RUN add-apt-repository -y 'deb http://apt.dilos.org/dilos dilos2-testing main'
RUN add-apt-repository -y 'deb http://apt.dilos.org/dilos dilos2 main'

WORKDIR /build
COPY scripts/musl.sh /build
RUN env \
    CC=arm-linux-gnueabi-gcc-7 CFLAGS="-march=armv7-a" \
    CXX=arm-linux-gnueabi-g++-7 CXXFLAGS="-march=armv7-a" \
    bash musl.sh armv7 && \
    rm -rf /build/*

WORKDIR /tmp
COPY dist-various-2/shared.sh /tmp/

@@ -58,7 +68,11 @@ ENV \
    CXX_sparcv9_sun_solaris=sparcv9-sun-solaris2.10-g++ \
    AR_x86_64_sun_solaris=x86_64-sun-solaris2.10-ar \
    CC_x86_64_sun_solaris=x86_64-sun-solaris2.10-gcc \
    CXX_x86_64_sun_solaris=x86_64-sun-solaris2.10-g++
    CXX_x86_64_sun_solaris=x86_64-sun-solaris2.10-g++ \
    CC_armv7_unknown_linux_gnueabi=arm-linux-gnueabi-gcc-7 \
    CXX_armv7_unknown_linux_gnueabi=arm-linux-gnueabi-g++-7 \
    CC=gcc-7 \
    CXX=g++-7

ENV CARGO_TARGET_X86_64_FUCHSIA_AR /usr/local/bin/llvm-ar
ENV CARGO_TARGET_X86_64_FUCHSIA_RUSTFLAGS \

@@ -73,17 +87,27 @@ ENV CARGO_TARGET_AARCH64_FUCHSIA_RUSTFLAGS \

ENV TARGETS=x86_64-fuchsia
ENV TARGETS=$TARGETS,aarch64-fuchsia
ENV TARGETS=$TARGETS,sparcv9-sun-solaris
ENV TARGETS=$TARGETS,wasm32-unknown-unknown
ENV TARGETS=$TARGETS,wasm32-wasi
ENV TARGETS=$TARGETS,sparcv9-sun-solaris
ENV TARGETS=$TARGETS,x86_64-sun-solaris
ENV TARGETS=$TARGETS,x86_64-unknown-linux-gnux32
ENV TARGETS=$TARGETS,x86_64-unknown-cloudabi
ENV TARGETS=$TARGETS,x86_64-fortanix-unknown-sgx
ENV TARGETS=$TARGETS,nvptx64-nvidia-cuda
ENV TARGETS=$TARGETS,armv7-unknown-linux-gnueabi
ENV TARGETS=$TARGETS,armv7-unknown-linux-musleabi

ENV X86_FORTANIX_SGX_LIBS="/x86_64-fortanix-unknown-sgx/lib/"

# As per https://bugs.launchpad.net/ubuntu/+source/gcc-defaults/+bug/1300211
# we need asm in the search path for gcc-7 (for gnux32) but not in the search path of the
# cross compilers.
# Luckily one of the folders is /usr/local/include so symlink /usr/include/asm-generic there
RUN ln -s /usr/include/asm-generic /usr/local/include/asm

ENV RUST_CONFIGURE_ARGS --enable-extended --enable-lld --disable-docs \
    --set target.wasm32-wasi.wasi-root=/wasm32-wasi
    --set target.wasm32-wasi.wasi-root=/wasm32-wasi \
    --musl-root-armv7=/musl-armv7

ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
@@ -5,7 +5,7 @@
set -ex

# Originally from https://releases.llvm.org/8.0.0/clang+llvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz
curl https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/clang%2Bllvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz | \
curl https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/clang%2Bllvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz | \
  tar xJf -
export PATH=`pwd`/clang+llvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04/bin:$PATH

@@ -4,7 +4,7 @@ set -ex
source shared.sh

VERSION=1.0.2k
URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/openssl-$VERSION.tar.gz
URL=https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/openssl-$VERSION.tar.gz

curl $URL | tar xzf -

@@ -25,7 +25,7 @@ cd netbsd

mkdir -p /x-tools/x86_64-unknown-netbsd/sysroot

URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
URL=https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc

# Originally from ftp://ftp.netbsd.org/pub/NetBSD/NetBSD-$BSD/source/sets/*.tgz
curl $URL/2018-03-01-netbsd-src.tgz | tar xzf -
@@ -18,4 +18,4 @@ COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh

ENV RUST_CONFIGURE_ARGS --build=i686-unknown-linux-gnu --disable-optimize-tests
ENV RUST_CHECK_TARGET check
ENV SCRIPT python2.7 ../x.py test
@@ -17,9 +17,6 @@ dist=$objdir/build/dist

source "$ci_dir/shared.sh"

travis_fold start build_docker
travis_time_start

if [ -f "$docker_dir/$image/Dockerfile" ]; then
  if [ "$CI" != "" ]; then
    hash_key=/tmp/.docker-hash-key.txt

@@ -40,9 +37,12 @@ if [ -f "$docker_dir/$image/Dockerfile" ]; then
    docker --version >> $hash_key
    cksum=$(sha512sum $hash_key | \
      awk '{print $1}')

    s3url="s3://$SCCACHE_BUCKET/docker/$cksum"
    url="https://s3-us-west-1.amazonaws.com/$SCCACHE_BUCKET/docker/$cksum"
    echo "Attempting to download $s3url"
    url="https://$SCCACHE_BUCKET.s3.amazonaws.com/docker/$cksum"
    upload="aws s3 cp - $s3url"

    echo "Attempting to download $url"
    rm -f /tmp/rustci_docker_cache
    set +e
    retry curl -y 30 -Y 10 --connect-timeout 30 -f -L -C - -o /tmp/rustci_docker_cache "$url"
@@ -65,17 +65,17 @@ if [ -f "$docker_dir/$image/Dockerfile" ]; then
      -f "$dockerfile" \
      "$context"

  if [ "$s3url" != "" ]; then
  if [ "$upload" != "" ]; then
    digest=$(docker inspect rust-ci --format '{{.Id}}')
    echo "Built container $digest"
    if ! grep -q "$digest" <(echo "$loaded_images"); then
      echo "Uploading finished image to $s3url"
      echo "Uploading finished image to $url"
      set +e
      docker history -q rust-ci | \
        grep -v missing | \
        xargs docker save | \
        gzip | \
        aws s3 cp - $s3url
        $upload
      set -e
    else
      echo "Looks like docker image is the same as before, not uploading"

@@ -87,11 +87,10 @@ if [ -f "$docker_dir/$image/Dockerfile" ]; then
    echo "$digest" >>"$info"
  fi
elif [ -f "$docker_dir/disabled/$image/Dockerfile" ]; then
  if [ -n "$TRAVIS_OS_NAME" ]; then
    echo Cannot run disabled images on travis!
  if isCI; then
    echo Cannot run disabled images on CI!
    exit 1
  fi
  # retry messes with the pipe from tar to docker. Not needed on non-travis
  # Transform changes the context of disabled Dockerfiles to match the enabled ones
  tar --transform 's#^./disabled/#./#' -C $docker_dir -c . | docker \
    build \

@@ -104,9 +103,6 @@ else
  exit 1
fi

travis_fold end build_docker
travis_time_finish

mkdir -p $HOME/.cargo
mkdir -p $objdir/tmp
mkdir -p $objdir/cores
@@ -129,24 +125,61 @@ fi
# goes ahead and sets it for all builders.
args="$args --privileged"

exec docker \
# Things get a little weird if this script is already running in a docker
# container. If we're already in a docker container then we assume it's set up
# to do docker-in-docker where we have access to a working `docker` command.
#
# If this is the case (we check via the presence of `/.dockerenv`)
# then we can't actually use the `--volume` argument. Typically we use
# `--volume` to efficiently share the build and source directory between this
# script and the container we're about to spawn. If we're inside docker already
# though the `--volume` argument maps the *host's* folder to the container we're
# about to spawn, when in fact we want the folder in this container itself. To
# work around this we use a recipe cribbed from
# https://circleci.com/docs/2.0/building-docker-images/#mounting-folders to
# create a temporary container with a volume. We then copy the entire source
# directory into this container, and then use that copy in the container we're
# about to spawn. Finally after the build finishes we re-extract the object
# directory.
#
# Note that none of this is necessary if we're *not* in a docker-in-docker
# scenario. If this script is run on a bare metal host then we share a bunch of
# data directories to share as much data as possible. Note that we also use
# `LOCAL_USER_ID` (recognized in `src/ci/run.sh`) to ensure that files are all
# read/written as the same user as the bare-metal user.
if [ -f /.dockerenv ]; then
  docker create -v /checkout --name checkout alpine:3.4 /bin/true
  docker cp . checkout:/checkout
  args="$args --volumes-from checkout"
else
  args="$args --volume $root_dir:/checkout:ro"
  args="$args --volume $objdir:/checkout/obj"
  args="$args --volume $HOME/.cargo:/cargo"
  args="$args --volume $HOME/rustsrc:$HOME/rustsrc"
  args="$args --env LOCAL_USER_ID=`id -u`"
fi

docker \
  run \
  --volume "$root_dir:/checkout:ro" \
  --volume "$objdir:/checkout/obj" \
  --workdir /checkout/obj \
  --env SRC=/checkout \
  $args \
  --env CARGO_HOME=/cargo \
  --env DEPLOY \
  --env DEPLOY_ALT \
  --env LOCAL_USER_ID=`id -u` \
  --env TRAVIS \
  --env TRAVIS_BRANCH \
  --env CI \
  --env TF_BUILD \
  --env BUILD_SOURCEBRANCHNAME \
  --env TOOLSTATE_REPO_ACCESS_TOKEN \
  --env TOOLSTATE_REPO \
  --env TOOLSTATE_PUBLISH \
  --env CI_JOB_NAME="${CI_JOB_NAME-$IMAGE}" \
  --volume "$HOME/.cargo:/cargo" \
  --volume "$HOME/rustsrc:$HOME/rustsrc" \
  --init \
  --rm \
  rust-ci \
  /checkout/src/ci/run.sh

if [ -f /.dockerenv ]; then
  rm -rf $objdir
  docker cp checkout:/checkout/obj $objdir
fi
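The earlier hunks of `src/ci/docker/run.sh` cache built images in S3 under a key derived from hashing the image's inputs. A compressed sketch of that key computation (the file contents and bucket URL here are placeholders, not the script's real inputs):

```shell
# Everything that should invalidate the cache gets appended to one file,
# and the SHA-512 of that file becomes the cache key.
hash_key=$(mktemp)
echo "FROM ubuntu:16.04" >> "$hash_key"     # stands in for the Dockerfile
echo "Docker version 18.09" >> "$hash_key"  # stands in for `docker --version`
cksum=$(sha512sum "$hash_key" | awk '{print $1}')
rm -f "$hash_key"

# The same key names both the download URL and the upload target.
echo "would fetch https://example-bucket.s3.amazonaws.com/docker/$cksum"
```

Because the docker version is part of the hash input, upgrading docker on the builders naturally invalidates every cached image.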
@@ -23,8 +23,9 @@ REPOSITORIES = [
HOST_OS = "linux"

# Mirroring options
MIRROR_BUCKET = "rust-lang-ci2"
MIRROR_BASE_DIR = "rust-ci-mirror/android/"
MIRROR_BUCKET = "rust-lang-ci-mirrors"
MIRROR_BUCKET_REGION = "us-west-1"
MIRROR_BASE_DIR = "rustc/android/"

import argparse
import hashlib

@@ -144,7 +145,8 @@ def cli_install(args):
    lockfile = Lockfile(args.lockfile)
    for package in lockfile.packages.values():
        # Download the file from the mirror into a temp file
        url = "https://" + MIRROR_BUCKET + ".s3.amazonaws.com/" + MIRROR_BASE_DIR
        url = "https://" + MIRROR_BUCKET + ".s3-" + MIRROR_BUCKET_REGION + \
            ".amazonaws.com/" + MIRROR_BASE_DIR
        downloaded = package.download(url)
        # Extract the file in a temporary directory
        extract_dir = tempfile.mkdtemp()
@@ -18,7 +18,7 @@ exit 1
}

cd /
curl -fL https://s3.amazonaws.com/mozilla-games/emscripten/releases/emsdk-portable.tar.gz | \
curl -fL https://mozilla-games.s3.amazonaws.com/emscripten/releases/emsdk-portable.tar.gz | \
  tar -xz

cd /emsdk-portable

@@ -59,7 +59,7 @@ done

# Originally downloaded from:
# https://download.freebsd.org/ftp/releases/${freebsd_arch}/${freebsd_version}-RELEASE/base.txz
URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2019-04-04-freebsd-${freebsd_arch}-${freebsd_version}-RELEASE-base.txz
URL=https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2019-04-04-freebsd-${freebsd_arch}-${freebsd_version}-RELEASE-base.txz
curl "$URL" | tar xJf - -C "$sysroot" --wildcards "${files_to_extract[@]}"

# Fix up absolute symlinks from the system image. This can be removed
@@ -3,7 +3,7 @@
#
# Versions of the toolchain components are configurable in `musl-cross-make/Makefile` and
# musl unlike GLIBC is forward compatible so upgrading it shouldn't break old distributions.
# Right now we have: Binutils 2.27, GCC 6.3.0, musl 1.1.18
# Right now we have: Binutils 2.27, GCC 6.4.0, musl 1.1.22.
set -ex

hide_output() {

@@ -33,7 +33,7 @@ shift
# Apparently applying `-fPIC` everywhere allows them to link successfully.
export CFLAGS="-fPIC $CFLAGS"

git clone https://github.com/richfelker/musl-cross-make -b v0.9.7
git clone https://github.com/richfelker/musl-cross-make -b v0.9.8
cd musl-cross-make

hide_output make -j$(nproc) TARGET=$TARGET

@@ -54,29 +54,3 @@ if [ "$REPLACE_CC" = "1" ]; then
    ln -s $TARGET-g++ /usr/local/bin/$exec
  done
fi

export CC=$TARGET-gcc
export CXX=$TARGET-g++

LLVM=70

# may have been downloaded in a previous run
if [ ! -d libunwind-release_$LLVM ]; then
  curl -L https://github.com/llvm-mirror/llvm/archive/release_$LLVM.tar.gz | tar xzf -
  curl -L https://github.com/llvm-mirror/libunwind/archive/release_$LLVM.tar.gz | tar xzf -
fi

# fixme(mati865): Replace it with https://github.com/rust-lang/rust/pull/59089
mkdir libunwind-build
cd libunwind-build
cmake ../libunwind-release_$LLVM \
  -DLLVM_PATH=/build/llvm-release_$LLVM \
  -DLIBUNWIND_ENABLE_SHARED=0 \
  -DCMAKE_C_COMPILER=$CC \
  -DCMAKE_CXX_COMPILER=$CXX \
  -DCMAKE_C_FLAGS="$CFLAGS" \
  -DCMAKE_CXX_FLAGS="$CXXFLAGS"

hide_output make -j$(nproc)
cp lib/libunwind.a $OUTPUT/$TARGET/lib
cd - && rm -rf libunwind-build
@@ -20,9 +20,11 @@ exit 1
TAG=$1
shift

# Ancient binutils versions don't understand debug symbols produced by more recent tools.
# Apparently applying `-fPIC` everywhere allows them to link successfully.
export CFLAGS="-fPIC $CFLAGS"

MUSL=musl-1.1.20
MUSL=musl-1.1.22

# may have been downloaded in a previous run
if [ ! -d $MUSL ]; then

@@ -38,27 +40,3 @@ else
fi
hide_output make install
hide_output make clean

cd ..

LLVM=70

# may have been downloaded in a previous run
if [ ! -d libunwind-release_$LLVM ]; then
  curl -L https://github.com/llvm-mirror/llvm/archive/release_$LLVM.tar.gz | tar xzf -
  curl -L https://github.com/llvm-mirror/libunwind/archive/release_$LLVM.tar.gz | tar xzf -
fi

mkdir libunwind-build
cd libunwind-build
cmake ../libunwind-release_$LLVM \
  -DLLVM_PATH=/build/llvm-release_$LLVM \
  -DLIBUNWIND_ENABLE_SHARED=0 \
  -DCMAKE_C_COMPILER=$CC \
  -DCMAKE_CXX_COMPILER=$CXX \
  -DCMAKE_C_FLAGS="$CFLAGS" \
  -DCMAKE_CXX_FLAGS="$CXXFLAGS"

hide_output make -j$(nproc)
cp lib/libunwind.a /musl-$TAG/lib
cd ../ && rm -rf libunwind-build
@@ -1,6 +1,6 @@
set -ex

curl -fo /usr/local/bin/sccache \
  https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2018-04-02-sccache-x86_64-unknown-linux-musl
  https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2018-04-02-sccache-x86_64-unknown-linux-musl

chmod +x /usr/local/bin/sccache
@@ -11,14 +11,12 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
  cmake \
  sudo \
  gdb \
  libssl-dev \
  pkg-config \
  xz-utils \
  wget \
  patch

# FIXME: build the `ptx-linker` instead.
RUN curl -sL https://github.com/denzp/rust-ptx-linker/releases/download/v0.9.0-alpha.2/rust-ptx-linker.linux64.tar.gz | \
  tar -xzvC /usr/bin

RUN curl -sL https://nodejs.org/dist/v9.2.0/node-v9.2.0-linux-x64.tar.xz | \
  tar -xJ

@@ -45,7 +43,6 @@ ENV WASM_TARGETS=wasm32-unknown-unknown
ENV WASM_SCRIPT python2.7 /checkout/x.py test --target $WASM_TARGETS \
  src/test/run-make \
  src/test/ui \
  src/test/run-pass \
  src/test/compile-fail \
  src/test/mir-opt \
  src/test/codegen-units \
@@ -17,6 +17,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
  cmake \
  sudo \
  gdb \
  libssl-dev \
  pkg-config \
  xz-utils \
  lld \
  clang

@@ -31,7 +33,6 @@ ENV RUST_CONFIGURE_ARGS \
  --build=x86_64-unknown-linux-gnu \
  --enable-debug \
  --enable-lld \
  --enable-lldb \
  --enable-optimize \
  --set llvm.use-linker=lld \
  --set target.x86_64-unknown-linux-gnu.linker=clang \
@@ -21,3 +21,10 @@ RUN sh /scripts/sccache.sh
ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu --set rust.ignore-git=false
ENV SCRIPT python2.7 ../x.py test distcheck
ENV DIST_SRC 1

# The purpose of this builder is to test that we can `./x.py test` successfully
# from a tarball, not to test LLVM/rustc's own set of assertions. These cause a
# significant hit to CI compile time (over a half hour as observed in #61185),
# so disable assertions for this builder.
ENV NO_LLVM_ASSERTIONS=1
ENV NO_DEBUG_ASSERTIONS=1
@@ -11,6 +11,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
   cmake \
   sudo \
   gdb \
   libssl-dev \
   pkg-config \
   xz-utils
 
 COPY scripts/sccache.sh /scripts/
@@ -20,3 +22,9 @@ ENV RUST_CONFIGURE_ARGS \
   --build=x86_64-unknown-linux-gnu \
   --enable-full-bootstrap
 ENV SCRIPT python2.7 ../x.py build
+
+# In general this just slows down the build and we're just a smoke test that
+# a full bootstrap works in general, so there's not much need to take this
+# penalty in build times.
+ENV NO_LLVM_ASSERTIONS 1
+ENV NO_DEBUG_ASSERTIONS 1
@@ -13,6 +13,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
   gdb \
   llvm-6.0-tools \
   libedit-dev \
   libssl-dev \
   pkg-config \
   zlib1g-dev \
   xz-utils
@@ -24,4 +26,10 @@ ENV RUST_CONFIGURE_ARGS \
   --build=x86_64-unknown-linux-gnu \
   --llvm-root=/usr/lib/llvm-6.0 \
   --enable-llvm-link-shared
-ENV RUST_CHECK_TARGET check
+ENV SCRIPT python2.7 ../x.py test src/tools/tidy && python2.7 ../x.py test
+
+# The purpose of this container isn't to test with debug assertions and
+# this is run on all PRs, so let's get speedier builds by disabling these extra
+# checks.
+ENV NO_DEBUG_ASSERTIONS=1
+ENV NO_LLVM_ASSERTIONS=1
@@ -11,6 +11,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
   cmake \
   sudo \
   gdb \
   libssl-dev \
   pkg-config \
   xz-utils
 
 COPY scripts/sccache.sh /scripts/
@@ -19,4 +21,4 @@ RUN sh /scripts/sccache.sh
 ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu \
   --disable-optimize-tests \
   --set rust.test-compare-mode
-ENV RUST_CHECK_TARGET check
+ENV SCRIPT python2.7 ../x.py test
@@ -21,8 +21,10 @@ COPY x86_64-gnu-tools/checkregression.py /tmp/
 COPY x86_64-gnu-tools/checktools.sh /tmp/
+COPY x86_64-gnu-tools/repo.sh /tmp/
 
 # Run rustbook with `linkcheck` feature enabled
 ENV CHECK_LINKS 1
 
 ENV RUST_CONFIGURE_ARGS \
   --build=x86_64-unknown-linux-gnu \
   --enable-test-miri \
   --save-toolstates=/tmp/toolstates.json
 ENV SCRIPT /tmp/checktools.sh ../x.py /tmp/toolstates.json linux
@@ -1,9 +1,18 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 
+## This script has two purposes: detect any tool that *regressed*, which is used
+## during the week before the beta branches to reject PRs; and detect any tool
+## that *changed* to see if we need to update the toolstate repo.
+
 import sys
 import json
 
+# Regressions for these tools during the beta cutoff week do not cause failure.
+# See `status_check` in `checktools.sh` for tools that have to pass on the
+# beta/stable branches.
+REGRESSION_OK = ["rustc-guide", "miri", "embedded-book"]
+
 if __name__ == '__main__':
     os_name = sys.argv[1]
     toolstate_file = sys.argv[2]
@@ -21,12 +30,7 @@ if __name__ == '__main__':
         state = cur[os_name]
         new_state = toolstate.get(tool, '')
         if verb == 'regressed':
-            if tool == 'rls':
-                # Temporary override until
-                # https://github.com/rust-lang/rust/issues/60848 is fixed.
-                updated = False
-            else:
-                updated = new_state < state
+            updated = new_state < state
         elif verb == 'changed':
             updated = new_state != state
         else:
@@ -37,7 +41,8 @@ if __name__ == '__main__':
                 'The state of "{}" has {} from "{}" to "{}"'
                 .format(tool, verb, state, new_state)
             )
-            regressed = True
+            if not (verb == 'regressed' and tool in REGRESSION_OK):
+                regressed = True
 
     if regressed:
         sys.exit(1)
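The `regressed` check above leans on the fact that the toolstate strings sort lexicographically from worst to best ("build-fail" < "test-fail" < "test-pass"), so a plain string comparison is enough. A minimal standalone sketch of that comparison (sample states assumed, not the full toolstate format):

```python
# The three toolstate values happen to sort lexicographically from worst
# to best, which is what the `new_state < state` comparison relies on.
states = ["build-fail", "test-fail", "test-pass"]
assert sorted(states) == states  # the ordering the script depends on

def regressed(old_state, new_state):
    # A tool regressed when its new state sorts below its old one.
    return new_state < old_state

print(regressed("test-pass", "test-fail"))   # True: pass -> fail is a regression
print(regressed("build-fail", "test-pass"))  # False: the tool improved
```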
@@ -25,6 +25,7 @@ python2.7 "$X_PY" test --no-fail-fast \
     src/doc/rust-by-example \
     src/doc/embedded-book \
+    src/doc/edition-guide \
     src/doc/rustc-guide \
     src/tools/clippy \
     src/tools/rls \
     src/tools/rustfmt \
@@ -35,12 +36,17 @@ set -e
 cat "$TOOLSTATE_FILE"
 echo
 
+# This function checks if a particular tool is *not* in status "test-pass".
+check_tool_failed() {
+    grep -vq '"'"$1"'":"test-pass"' "$TOOLSTATE_FILE"
+}
+
 # This function checks that if a tool's submodule changed, the tool's state must improve
-verify_status() {
+verify_submodule_changed() {
     echo "Verifying status of $1..."
     if echo "$CHANGED_FILES" | grep -q "^M[[:blank:]]$2$"; then
         echo "This PR updated '$2', verifying if status is 'test-pass'..."
-        if grep -vq '"'"$1"'":"test-pass"' "$TOOLSTATE_FILE"; then
+        if check_tool_failed "$1"; then
             echo
             echo "⚠️ We detected that this PR updated '$1', but its tests failed."
             echo
@@ -55,34 +61,43 @@ verify_status() {
     fi
 }
 
-# deduplicates the submodule check and the assertion that on beta some tools MUST be passing
+# deduplicates the submodule check and the assertion that on beta some tools MUST be passing.
+# $1 should be "submodule_changed" to only check tools that got changed by this PR,
+# or "beta_required" to check all tools that have $2 set to "beta".
 check_dispatch() {
     if [ "$1" = submodule_changed ]; then
         # ignore $2 (branch id)
-        verify_status $3 $4
+        verify_submodule_changed $3 $4
     elif [ "$2" = beta ]; then
         echo "Requiring test passing for $3..."
-        if grep -q '"'"$3"'":"\(test\|build\)-fail"' "$TOOLSTATE_FILE"; then
+        if check_tool_failed "$3"; then
             exit 4
         fi
     fi
 }
 
-# list all tools here
+# List all tools here.
+# This function gets called with "submodule_changed" for each PR that changed a submodule,
+# and with "beta_required" for each PR that lands on beta/stable.
+# The purpose of this function is to *reject* PRs if a tool is not "test-pass" and
+# (a) the tool's submodule has been updated, or (b) we landed on beta/stable and the
+# tool has to "test-pass" on that branch.
 status_check() {
     check_dispatch $1 beta book src/doc/book
     check_dispatch $1 beta nomicon src/doc/nomicon
     check_dispatch $1 beta reference src/doc/reference
     check_dispatch $1 beta rust-by-example src/doc/rust-by-example
-    # Temporarily disabled until
-    # https://github.com/rust-lang/rust/issues/60459 is fixed.
-    # check_dispatch $1 beta edition-guide src/doc/edition-guide
+    check_dispatch $1 beta edition-guide src/doc/edition-guide
     check_dispatch $1 beta rls src/tools/rls
     check_dispatch $1 beta rustfmt src/tools/rustfmt
     check_dispatch $1 beta clippy-driver src/tools/clippy
-    # these tools are not required for beta to successfully branch
+    # These tools are not required on the beta/stable branches, but they *do* cause
+    # PRs to fail if a submodule update does not fix them.
+    # They will still cause failure during the beta cutoff week, unless `checkregression.py`
+    # exempts them from that.
     check_dispatch $1 nightly miri src/tools/miri
     check_dispatch $1 nightly embedded-book src/doc/embedded-book
     check_dispatch $1 nightly rustc-guide src/doc/rustc-guide
 }
 
 # If this PR is intended to update one of these tools, do not let the build pass
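The grep in `check_tool_failed` can be exercised against a hand-written toolstate file; this is a sketch with a fabricated two-tool JSON line, not the real CI data:

```shell
# Fabricated toolstate file for illustration only.
TOOLSTATE_FILE=$(mktemp)
printf '{"miri":"test-fail","clippy-driver":"test-pass"}\n' > "$TOOLSTATE_FILE"

# Succeeds (exit 0) when the tool is NOT recorded as "test-pass":
# grep -v selects lines lacking the "test-pass" entry, -q only sets the exit code.
check_tool_failed() {
    grep -vq '"'"$1"'":"test-pass"' "$TOOLSTATE_FILE"
}

check_tool_failed miri && echo "miri is failing"
check_tool_failed clippy-driver || echo "clippy-driver is passing"
```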
@@ -91,12 +106,14 @@ status_check() {
 status_check "submodule_changed"
 
 CHECK_NOT="$(readlink -f "$(dirname $0)/checkregression.py")"
+# This callback is called by `commit_toolstate_change`, see `repo.sh`.
 change_toolstate() {
     # only update the history
     if python2.7 "$CHECK_NOT" "$OS" "$TOOLSTATE_FILE" "_data/latest.json" changed; then
         echo 'Toolstate is not changed. Not updating.'
     else
         if [ $SIX_WEEK_CYCLE -ge 35 ]; then
+            # Reject any regressions during the week before beta cutoff.
             python2.7 "$CHECK_NOT" "$OS" "$TOOLSTATE_FILE" "_data/latest.json" regressed
         fi
         sed -i "1 a\\
@@ -106,7 +123,7 @@ $COMMIT\t$(cat "$TOOLSTATE_FILE")
 }
 
 if [ "$RUST_RELEASE_CHANNEL" = nightly ]; then
-    if [ -n "${TOOLSTATE_REPO_ACCESS_TOKEN+is_set}" ]; then
+    if [ -n "${TOOLSTATE_PUBLISH+is_set}" ]; then
         . "$(dirname $0)/repo.sh"
         MESSAGE_FILE=$(mktemp -t msg.XXXXXX)
         echo "($OS CI update)" > "$MESSAGE_FILE"
@@ -5,8 +5,8 @@
 #
 # The function relies on a GitHub bot user, which should have a Personal access
 # token defined in the environment variable $TOOLSTATE_REPO_ACCESS_TOKEN. If for
-# some reason you need to change the token, please update `.travis.yml` and
-# `appveyor.yml`:
+# some reason you need to change the token, please update the Azure Pipelines
+# variable group.
 #
 # 1. Generate a new Personal access token:
 #
@@ -18,28 +18,9 @@
 #     Save it somewhere secure, as the token would be gone once you leave
 #     the page.
 #
-# 2. Encrypt the token for Travis CI
+# 2. Update the variable group in Azure Pipelines
 #
-#    * Install the `travis` tool locally (`gem install travis`).
-#    * Encrypt the token:
-#      ```
-#      travis -r rust-lang/rust encrypt \
-#        TOOLSTATE_REPO_ACCESS_TOKEN=aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
-#      ```
-#    * Copy output to replace the existing one in `.travis.yml`.
-#    * Details of this step can be found in
-#      <https://docs.travis-ci.com/user/encryption-keys/>
-#
-# 3. Encrypt the token for AppVeyor
-#
-#    * Login to AppVeyor using your main account, and login as the rust-lang
-#      organization.
-#    * Open the ["Encrypt data" tool](https://ci.appveyor.com/tools/encrypt)
-#    * Paste the 40-digit token into the "Value to encrypt" box, then click
-#      "Encrypt"
-#    * Copy the output to replace the existing one in `appveyor.yml`.
-#    * Details of this step can be found in
-#      <https://www.appveyor.com/docs/how-to/git-push/>
+#    * Ping a member of the infrastructure team to do this.
 #
 # 4. Replace the email address below if the bot account identity is changed
 #
@@ -55,13 +36,20 @@ commit_toolstate_change() {
     git config --global credential.helper store
     printf 'https://%s:x-oauth-basic@github.com\n' "$TOOLSTATE_REPO_ACCESS_TOKEN" \
         > "$HOME/.git-credentials"
-    git clone --depth=1 https://github.com/rust-lang-nursery/rust-toolstate.git
+    git clone --depth=1 $TOOLSTATE_REPO
 
     cd rust-toolstate
     FAILURE=1
     MESSAGE_FILE="$1"
     shift
     for RETRY_COUNT in 1 2 3 4 5; do
+        # Call the callback.
+        # - If we are in the `auto` branch (pre-landing), this is called from `checktools.sh` and
+        #   the callback is `change_toolstate` in that file. The purpose of this is to publish the
+        #   test results (the new commit-to-toolstate mapping) in the toolstate repo.
+        # - If we are in the `master` branch (post-landing), this is called by the CI pipeline
+        #   and the callback is `src/tools/publish_toolstate.py`. The purpose is to publish
+        #   the new "current" toolstate in the toolstate repo.
         "$@"
         # `git commit` failing means nothing to commit.
         FAILURE=0
@@ -11,6 +11,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
   cmake \
   sudo \
   gdb \
   libssl-dev \
   pkg-config \
   xz-utils
 
 COPY scripts/sccache.sh /scripts/
@@ -1,5 +1,9 @@
 #!/usr/bin/env bash
 
+# FIXME(61301): we need to debug spurious failures with this on Windows on
+# Azure, so let's print more information in the logs.
+set -x
+
 set -o errexit
 set -o pipefail
 set -o nounset
@@ -7,9 +11,6 @@ set -o nounset
 ci_dir=$(cd $(dirname $0) && pwd)
 . "$ci_dir/shared.sh"
 
-travis_fold start init_repo
-travis_time_start
-
 REPO_DIR="$1"
 CACHE_DIR="$2"
@@ -69,5 +70,3 @@ retry sh -c "git submodule deinit -f $use_git && \
     git submodule sync && \
     git submodule update -j 16 --init --recursive $use_git"
 wait
-travis_fold end init_repo
-travis_time_finish
src/ci/install-awscli.sh (new executable file, 35 lines)
@@ -0,0 +1,35 @@
+#!/bin/bash
+# This script downloads and installs awscli from the packages mirrored in our
+# own S3 bucket. This follows the recommendations at:
+#
+#    https://packaging.python.org/guides/index-mirrors-and-caches/#caching-with-pip
+#
+# To create a new mirrored copy you can run the command:
+#
+#    pip wheel awscli
+#
+# Before compressing please make sure all the wheels end with `-none-any.whl`.
+# If that's not the case you'll need to remove the non-cross-platform ones and
+# replace them with the .tar.gz downloaded from https://pypi.org. Also make
+# sure it's possible to call this script with both Python 2 and Python 3.
+
+set -euo pipefail
+IFS=$'\n\t'
+
+MIRROR="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2019-07-27-awscli.tar"
+DEPS_DIR="/tmp/awscli-deps"
+
+pip="pip"
+pipflags=""
+if [[ "${AGENT_OS}" == "Linux" ]]; then
+    pip="pip3"
+    pipflags="--user"
+
+    sudo apt-get install -y python3-setuptools
+    echo "##vso[task.prependpath]$HOME/.local/bin"
+fi
+
+mkdir -p "${DEPS_DIR}"
+curl "${MIRROR}" | tar xf - -C "${DEPS_DIR}"
+"${pip}" install ${pipflags} --no-index "--find-links=${DEPS_DIR}" awscli
+rm -rf "${DEPS_DIR}"
@@ -23,7 +23,9 @@ fi
 ci_dir=`cd $(dirname $0) && pwd`
 source "$ci_dir/shared.sh"
 
-if [ "$TRAVIS" != "true" ] || [ "$TRAVIS_BRANCH" == "auto" ]; then
+branch_name=$(getCIBranch)
+
+if [ ! isCI ] || [ "$branch_name" = "auto" ] || [ "$branch_name" = "try" ]; then
     RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --set build.print-step-timings --enable-verbose-tests"
 fi
@@ -44,10 +46,11 @@ fi
 # FIXME: need a scheme for changing this `nightly` value to `beta` and `stable`
 # either automatically or manually.
 export RUST_RELEASE_CHANNEL=nightly
-if [ "$DEPLOY$DEPLOY_ALT" != "" ]; then
+if [ "$DEPLOY$DEPLOY_ALT" = "1" ]; then
   RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --release-channel=$RUST_RELEASE_CHANNEL"
   RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-llvm-static-stdcpp"
   RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --set rust.remap-debuginfo"
+  RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --debuginfo-level-std=1"
 
   if [ "$NO_LLVM_ASSERTIONS" = "1" ]; then
     RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --disable-llvm-assertions"
@@ -75,6 +78,21 @@ if [ "$RUST_RELEASE_CHANNEL" = "nightly" ] || [ "$DIST_REQUIRE_ALL_TOOLS" = "" ]
     RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-missing-tools"
 fi
 
+# Print the date from the local machine and the date from an external source to
+# check for clock drifts. An HTTP URL is used instead of HTTPS since on Azure
+# Pipelines it happened that the certificates were marked as expired.
+datecheck() {
+  echo "== clock drift check =="
+  echo -n "  local time: "
+  date
+  echo -n "  network time: "
+  curl -fs --head http://detectportal.firefox.com/success.txt | grep ^Date: \
+      | sed 's/Date: //g' || true
+  echo "== end clock drift check =="
+}
+datecheck
+trap datecheck EXIT
+
 # We've had problems in the past of shell scripts leaking fds into the sccache
 # server (#48192) which causes Cargo to erroneously think that a build script
 # hasn't finished yet. Try to solve that problem by starting a very long-lived
@@ -88,28 +106,15 @@ if [ "$RUN_CHECK_WITH_PARALLEL_QUERIES" != "" ]; then
     rm -rf build
 fi
 
-travis_fold start configure
-travis_time_start
 $SRC/configure $RUST_CONFIGURE_ARGS
-travis_fold end configure
-travis_time_finish
 
-travis_fold start make-prepare
-travis_time_start
 retry make prepare
-travis_fold end make-prepare
-travis_time_finish
 
-travis_fold start check-bootstrap
-travis_time_start
 make check-bootstrap
-travis_fold end check-bootstrap
-travis_time_finish
 
 # Display the CPU and memory information. This helps us know why the CI timing
 # is fluctuating.
-travis_fold start log-system-info
-if [ "$TRAVIS_OS_NAME" = "osx" ]; then
+if isOSX; then
     system_profiler SPHardwareDataType || true
     sysctl hw || true
     ncpus=$(sysctl -n hw.ncpu)
@@ -118,23 +123,18 @@ else
     cat /proc/meminfo || true
     ncpus=$(grep processor /proc/cpuinfo | wc -l)
 fi
-travis_fold end log-system-info
 
 if [ ! -z "$SCRIPT" ]; then
   sh -x -c "$SCRIPT"
 else
   do_make() {
-    travis_fold start "make-$1"
-    travis_time_start
     echo "make -j $ncpus $1"
     make -j $ncpus $1
     local retval=$?
-    travis_fold end "make-$1"
-    travis_time_finish
     return $retval
   }
 
   do_make tidy
   do_make all
   do_make "$RUST_CHECK_TARGET"
 fi
 
 sccache --show-stats || true
@@ -24,36 +24,14 @@ function retry {
   done
 }
 
-if ! declare -F travis_fold; then
-  if [ "${TRAVIS-false}" = 'true' ]; then
-    # This is a trimmed down copy of
-    # https://github.com/travis-ci/travis-build/blob/master/lib/travis/build/templates/header.sh
-    travis_fold() {
-      echo -en "travis_fold:$1:$2\r\033[0K"
-    }
-    travis_time_start() {
-      travis_timer_id=$(printf %08x $(( RANDOM * RANDOM )))
-      travis_start_time=$(travis_nanoseconds)
-      echo -en "travis_time:start:$travis_timer_id\r\033[0K"
-    }
-    travis_time_finish() {
-      travis_end_time=$(travis_nanoseconds)
-      local duration=$(($travis_end_time-$travis_start_time))
-      local msg="travis_time:end:$travis_timer_id"
-      echo -en "\n$msg:start=$travis_start_time,finish=$travis_end_time,duration=$duration\r\033[0K"
-    }
-    if [ $(uname) = 'Darwin' ]; then
-      travis_nanoseconds() {
-        date -u '+%s000000000'
-      }
-    else
-      travis_nanoseconds() {
-        date -u '+%s%N'
-      }
-    fi
-  else
-    travis_fold() { return 0; }
-    travis_time_start() { return 0; }
-    travis_time_finish() { return 0; }
-  fi
-fi
+function isCI {
+  [ "$CI" = "true" ] || [ "$TF_BUILD" = "True" ]
+}
+
+function isOSX {
+  [ "$AGENT_OS" = "Darwin" ]
+}
+
+function getCIBranch {
+  echo "$BUILD_SOURCEBRANCHNAME"
+}
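`shared.sh` also defines the `retry` helper used throughout these scripts (e.g. `retry make prepare`, `retry sh -c "git submodule ..."`). A simplified sketch of such a helper, not the exact upstream implementation:

```shell
retry() {
    # Re-run "$@" until it succeeds, up to 3 attempts with a short pause.
    local n=1
    while ! "$@"; do
        if [ "$n" -ge 3 ]; then
            echo "failed after $n attempts: $*" >&2
            return 1
        fi
        n=$((n + 1))
        sleep 1
    done
}

retry true && echo "succeeded"
```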
@@ -1 +1 @@
-Subproject commit 29fe982990e43b9367be0ff47abc82fb2123fd03
+Subproject commit 7ddc46460f09a5cd9bd2a620565bdc20b3315ea9
@@ -1 +1 @@
-Subproject commit 581c6cccfaf995394ea9dcac362dc8e731c18558
+Subproject commit e58bc4ca104e890ac56af846877c874c432a64b5
@@ -1 +1 @@
-Subproject commit 9858872bd1b7dbba5ec27dc30d34eba00acd7ef9
+Subproject commit c5da1e11915d3f28266168baaf55822f7e3fe999
@@ -1 +1 @@
-Subproject commit c656171b749b7307f21371dd0d3278efee5573b8
+Subproject commit 8a7d05615e5bc0a7fb961b4919c44f5221ee54da
@@ -1 +1 @@
-Subproject commit 862b669c395822bb0938781d74f860e5762ad4fb
+Subproject commit b4b3536839042a6743fc76f0d9ad2a812020aeaa
@@ -1 +1 @@
-Subproject commit 811c697b232c611ed754d279ed20643a0c4096f6
+Subproject commit f2c15ba5ee89ae9469a2cf60494977749901d764
@@ -1 +1 @@
-Subproject commit 3cb727b62b953d59b4360d39aa68b6dc8f157655
+Subproject commit 6f4ba673ff9d4613e98415bc095347a6a0031e9c
@@ -13,5 +13,6 @@
     - [Targets](targets/index.md)
         - [Built-in Targets](targets/built-in.md)
         - [Custom Targets](targets/custom.md)
+- [Profile-guided Optimization](profile-guided-optimization.md)
 - [Linker-plugin based LTO](linker-plugin-lto.md)
 - [Contributing to `rustc`](contributing.md)
@@ -214,3 +214,20 @@ This option lets you control what happens when the code panics.
 ## incremental
 
 This flag allows you to enable incremental compilation.
+
+## profile-generate
+
+This flag allows for creating instrumented binaries that will collect
+profiling data for use with profile-guided optimization (PGO). The flag takes
+an optional argument which is the path to a directory into which the
+instrumented binary will emit the collected data. See the chapter on
+[profile-guided optimization] for more information.
+
+## profile-use
+
+This flag specifies the profiling data file to be used for profile-guided
+optimization (PGO). The flag takes a mandatory argument which is the path
+to a valid `.profdata` file. See the chapter on
+[profile-guided optimization] for more information.
+
+[profile-guided optimization]: ../profile-guided-optimization.md
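The two new flags combine into the usual PGO workflow; a sketch of the command sequence (the crate name and profile paths are made up, and `llvm-profdata` must match the LLVM version rustc was built against):

```shell
# 1. Build an instrumented binary; raw profiles will land in /tmp/pgo-data.
rustc -Cprofile-generate=/tmp/pgo-data -O main.rs

# 2. Run the program on representative workloads to collect profiles.
./main

# 3. Merge the raw profiles into a single .profdata file.
llvm-profdata merge -o /tmp/pgo-data/merged.profdata /tmp/pgo-data

# 4. Rebuild using the collected profile data.
rustc -Cprofile-use=/tmp/pgo-data/merged.profdata -O main.rs
```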
@@ -161,11 +161,11 @@ of print values are:
 
 ## `-g`: include debug information
 
-A synonym for `-C debuginfo=2`, for more see [here](codegen-options/index.html#debuginfo).
+A synonym for `-C debuginfo=2`, for more see [here](codegen-options/index.md#debuginfo).
 
 ## `-O`: optimize your code
 
-A synonym for `-C opt-level=2`, for more see [here](codegen-options/index.html#opt-level).
+A synonym for `-C opt-level=2`, for more see [here](codegen-options/index.md#opt-level).
 
 ## `-o`: filename of the output
 
@@ -188,23 +188,23 @@ and instead produce a test harness.
 
 ## `--target`: select a target triple to build
 
-This controls which [target](targets/index.html) to produce.
+This controls which [target](targets/index.md) to produce.
 
 ## `-W`: set lint warnings
 
-This flag will set which lints should be set to the [warn level](lints/levels.html#warn).
+This flag will set which lints should be set to the [warn level](lints/levels.md#warn).
 
 ## `-A`: set lint allowed
 
-This flag will set which lints should be set to the [allow level](lints/levels.html#allow).
+This flag will set which lints should be set to the [allow level](lints/levels.md#allow).
 
 ## `-D`: set lint denied
 
-This flag will set which lints should be set to the [deny level](lints/levels.html#deny).
+This flag will set which lints should be set to the [deny level](lints/levels.md#deny).
 
 ## `-F`: set lint forbidden
 
-This flag will set which lints should be set to the [forbid level](lints/levels.html#forbid).
+This flag will set which lints should be set to the [forbid level](lints/levels.md#forbid).
 
 ## `-Z`: set unstable options
 
@@ -215,11 +215,11 @@ run: `rustc -Z help`.
 
 ## `--cap-lints`: set the most restrictive lint level
 
-This flag lets you 'cap' lints, for more, [see here](lints/levels.html#capping-lints).
+This flag lets you 'cap' lints, for more, [see here](lints/levels.md#capping-lints).
 
 ## `-C`/`--codegen`: code generation options
 
-This flag will allow you to set [codegen options](codegen-options/index.html).
+This flag will allow you to set [codegen options](codegen-options/index.md).
 
 ## `-V`/`--version`: print a version
 
@@ -271,3 +271,36 @@ current directory out of pathnames emitted into the object files. The
 replacement is purely textual, with no consideration of the current system's
 pathname syntax. For example `--remap-path-prefix foo=bar` will match
 `foo/lib.rs` but not `./foo/lib.rs`.
+
+## `--json`: configure json messages printed by the compiler
+
+When the `--error-format=json` option is passed to rustc then all of the
+compiler's diagnostic output will be emitted in the form of JSON blobs. The
+`--json` argument can be used in conjunction with `--error-format=json` to
+configure what the JSON blobs contain as well as which ones are emitted.
+
+With `--error-format=json` the compiler will always emit any compiler errors as
+a JSON blob, but the following options are also available to the `--json` flag
+to customize the output:
+
+- `diagnostic-short` - json blobs for diagnostic messages should use the "short"
+  rendering instead of the normal "human" default. This means that the output of
+  `--error-format=short` will be embedded into the JSON diagnostics instead of
+  the default `--error-format=human`.
+
+- `diagnostic-rendered-ansi` - by default JSON blobs in their `rendered` field
+  will contain a plain text rendering of the diagnostic. This option instead
+  indicates that the diagnostic should have embedded ANSI color codes intended
+  to be used to colorize the message in the manner rustc typically already does
+  for terminal outputs. Note that this is usefully combined with crates like
+  `fwdansi` to translate these ANSI codes on Windows to console commands or
+  `strip-ansi-escapes` if you'd like to optionally remove the ansi colors
+  afterwards.
+
+- `artifacts` - this instructs rustc to emit a JSON blob for each artifact that
+  is emitted. An artifact corresponds to a request from the `--emit` CLI
+  argument, and as soon as the artifact is available on the filesystem a
+  notification will be emitted.
+
+Note that it is invalid to combine the `--json` argument with the `--color`
+argument, and it is required to combine `--json` with `--error-format=json`.
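A consumer of these blobs just parses each emitted line as JSON and dispatches on its fields; a sketch with an abbreviated, hand-written diagnostic blob (the real output carries many more keys):

```python
import json

# Hand-written, abbreviated stand-in for one line of rustc's JSON output.
line = ('{"message": "unused variable: `x`", "level": "warning",'
        ' "rendered": "warning: unused variable: `x`\\n"}')

diag = json.loads(line)
if diag["level"] in ("warning", "error"):
    # The `rendered` field holds the human-readable text ("short" or "human"
    # rendering, depending on `--json diagnostic-short`).
    print(diag["rendered"].strip())
```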
@@ -100,9 +100,10 @@ LLVM. However, the approximation is usually reliable.
 
 The following table shows known good combinations of toolchain versions.
 
-|           | Clang 7   | Clang 8   |
+|           | Clang 7   | Clang 8   |
 |-----------|-----------|-----------|
 | Rust 1.34 | ✗         | ✓         |
-| Rust 1.35 | ✗         | ✓(?)      |
+| Rust 1.35 | ✗         | ✓         |
+| Rust 1.36 | ✗         | ✓         |
 
 Note that the compatibility policy for this feature might change in the future.
@@ -19,7 +19,7 @@ warning: unused variable: `x`
 2 | let x = 5;
   |     ^
   |
-  = note: #[warn(unused_variables)] on by default
+  = note: `#[warn(unused_variables)]` on by default
   = note: to avoid this warning, consider using `_x` instead
 ```
 
@@ -53,7 +53,7 @@ warning: unused variable: `x`
 2 | let x = 5;
   |     ^
   |
-  = note: #[warn(unused_variables)] on by default
+  = note: `#[warn(unused_variables)]` on by default
   = note: to avoid this warning, consider using `_x` instead
 ```
 
@@ -76,7 +76,7 @@ error: bitshift exceeds the type's number of bits
 2 | 100u8 << 10;
   | ^^^^^^^^^^^
   |
-  = note: #[deny(exceeding_bitshifts)] on by default
+  = note: `#[deny(exceeding_bitshifts)]` on by default
 ```
 
 What's the difference between an error from a lint and a regular old error?
@@ -236,7 +236,7 @@ warning: bitshift exceeds the type's number of bits
 2 | 100u8 << 10;
   | ^^^^^^^^^^^
   |
-  = note: #[warn(exceeding_bitshifts)] on by default
+  = note: `#[warn(exceeding_bitshifts)]` on by default
 
 warning: this expression will panic at run-time
  --> lib.rs:2:5
@@ -165,7 +165,7 @@ pub struct Foo;
 When set to 'deny', this will produce:
 
 ```text
-error: type does not implement `fmt::Debug`; consider adding #[derive(Debug)] or a manual implementation
+error: type does not implement `fmt::Debug`; consider adding `#[derive(Debug)]` or a manual implementation
  --> src/main.rs:3:1
   |
 3 | pub struct Foo;
@@ -40,7 +40,7 @@ error: defaults for type parameters are only allowed in `struct`, `enum`, `type`
 4 | fn foo<T=i32>(t: T) {}
   |         ^
   |
-  = note: #[deny(invalid_type_param_default)] on by default
+  = note: `#[deny(invalid_type_param_default)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
   = note: for more information, see issue #36887 <https://github.com/rust-lang/rust/issues/36887>
 ```
@@ -74,7 +74,7 @@ error: private struct constructors are not usable through re-exports in outer mo
 5 | ::S;
   | ^^^
   |
-  = note: #[deny(legacy_constructor_visibility)] on by default
+  = note: `#[deny(legacy_constructor_visibility)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
   = note: for more information, see issue #39207 <https://github.com/rust-lang/rust/issues/39207>
 ```
@@ -84,9 +84,9 @@ error: private struct constructors are not usable through re-exports in outer mo
 
 The legacy_directory_ownership warning is issued when
 
-* There is a non-inline module with a #[path] attribute (e.g. #[path = "foo.rs"] mod bar;),
+* There is a non-inline module with a `#[path]` attribute (e.g. `#[path = "foo.rs"] mod bar;`),
 * The module's file ("foo.rs" in the above example) is not named "mod.rs", and
-* The module's file contains a non-inline child module without a #[path] attribute.
+* The module's file contains a non-inline child module without a `#[path]` attribute.
 
 The warning can be fixed by renaming the parent module to "mod.rs" and moving
 it into its own directory if appropriate.
@@ -139,7 +139,7 @@ const FOO: i32 = 5;
 This will produce:
 
 ```text
-error: const items should never be #[no_mangle]
+error: const items should never be `#[no_mangle]`
  --> src/main.rs:3:1
   |
 3 | const FOO: i32 = 5;
@@ -187,7 +187,7 @@ error: parenthesized parameters may only be used with a trait
 2 | let x = 5 as usize();
   |              ^^
   |
-  = note: #[deny(parenthesized_params_in_types_and_modules)] on by default
+  = note: `#[deny(parenthesized_params_in_types_and_modules)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
   = note: for more information, see issue #42238 <https://github.com/rust-lang/rust/issues/42238>
 ```

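The fix here is mechanical; as an illustrative sketch (not part of the diff), the trailing parentheses are simply dropped because `usize` is a type, not a call:

```rust
fn main() {
    // `usize` is a type, so no parentheses are needed after the cast.
    let x = 5 as usize;
    println!("{}", x);
}
```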
@@ -90,7 +90,7 @@ warning: floating-point literals cannot be used in patterns
 4 |     5.0 => {},
   |     ^^^
   |
-  = note: #[warn(illegal_floating_point_literal_pattern)] on by default
+  = note: `#[warn(illegal_floating_point_literal_pattern)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
   = note: for more information, see issue #41620 <https://github.com/rust-lang/rust/issues/41620>
 ```

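A common way to silence this warning, sketched here as a hypothetical example, is to replace the float literal pattern with a match guard:

```rust
fn main() {
    let x = 5.0_f64;
    // A guard compares the value instead of pattern-matching the literal.
    let label = match x {
        y if y == 5.0 => "five",
        _ => "other",
    };
    println!("{}", label);
}
```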
@@ -109,7 +109,7 @@ extern "C" {
 This will produce:
 
 ```text
-warning: found struct without foreign-function-safe representation annotation in foreign module, consider adding a #[repr(C)] attribute to the type
+warning: found struct without foreign-function-safe representation annotation in foreign module, consider adding a `#[repr(C)]` attribute to the type
  --> src/main.rs:2:20
   |
 2 |     static STATIC: String;

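A minimal sketch of the suggested fix, with a hypothetical `Pair` type standing in for the real one: give the struct a C-compatible layout before naming it in an `extern` block.

```rust
// With #[repr(C)] the field order and padding follow the C rules,
// so the type is safe to name across an FFI boundary.
#[repr(C)]
pub struct Pair {
    pub a: i32,
    pub b: u8,
}

fn main() {
    // C layout: `a` at offset 0, `b` at offset 4, size rounded up to 8.
    println!("{}", std::mem::size_of::<Pair>());
}
```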
@@ -146,7 +146,7 @@ warning: cannot specify lifetime arguments explicitly if late bound lifetime par
 8 |     S.late::<'static>(&0, &0);
   |            ^^^^^^^
   |
-  = note: #[warn(late_bound_lifetime_arguments)] on by default
+  = note: `#[warn(late_bound_lifetime_arguments)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
   = note: for more information, see issue #42868 <https://github.com/rust-lang/rust/issues/42868>
 ```

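The usual fix is to omit the explicit lifetime arguments and let the compiler infer them; the `S`/`late` shape below mirrors the diagnostic but is otherwise illustrative:

```rust
struct S;

impl S {
    // Both lifetimes are late-bound: they appear only in argument position.
    fn late<'a, 'b>(self, a: &'a u8, b: &'b u8) -> u8 {
        *a + *b
    }
}

fn main() {
    // No turbofish: the late-bound lifetimes are inferred at the call site.
    println!("{}", S.late(&1, &2));
}
```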
@@ -327,7 +327,7 @@ warning: patterns aren't allowed in methods without bodies
 2 |     fn foo(mut arg: u8);
   |            ^^^^^^^
   |
-  = note: #[warn(patterns_in_fns_without_body)] on by default
+  = note: `#[warn(patterns_in_fns_without_body)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
   = note: for more information, see issue #35203 <https://github.com/rust-lang/rust/issues/35203>
 ```

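The fix is to keep patterns such as `mut` out of the bodiless declaration and use them only in the implementation; the trait name `Incr` below is hypothetical:

```rust
trait Incr {
    // Declaration without a body: plain identifier, no `mut` pattern.
    fn foo(arg: u8) -> u8;
}

struct S;

impl Incr for S {
    // The implementation has a body, so the `mut` pattern is allowed.
    fn foo(mut arg: u8) -> u8 {
        arg += 1;
        arg
    }
}

fn main() {
    println!("{}", S::foo(41));
}
```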
@@ -406,7 +406,7 @@ fn foo() {}
 This will produce:
 
 ```text
-warning: function is marked #[no_mangle], but not exported
+warning: function is marked `#[no_mangle]`, but not exported
  --> src/main.rs:2:1
   |
 2 | fn foo() {}

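A sketch of the usual fix (illustrative, not from the diff): make the item `pub` so the unmangled symbol is actually exported.

```rust
// `pub` (plus, typically, extern "C") makes the unmangled symbol
// visible outside the crate, so #[no_mangle] has an effect.
#[no_mangle]
pub extern "C" fn foo() -> i32 {
    42
}

fn main() {
    println!("{}", foo());
}
```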
@@ -433,7 +433,7 @@ static X: i32 = 4;
 This will produce:
 
 ```text
-warning: static is marked #[no_mangle], but not exported
+warning: static is marked `#[no_mangle]`, but not exported
  --> src/main.rs:2:1
   |
 2 | static X: i32 = 4;

@@ -496,7 +496,7 @@ warning: borrow of packed field requires unsafe function or block (error E0133)
 11 |     let y = &x.data.0;
    |             ^^^^^^^^^
    |
-   = note: #[warn(safe_packed_borrows)] on by default
+   = note: `#[warn(safe_packed_borrows)]` on by default
    = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
    = note: for more information, see issue #46043 <https://github.com/rust-lang/rust/issues/46043>
 ```

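Instead of wrapping the borrow in `unsafe`, it is often enough to copy the value out of the packed struct; a hypothetical sketch:

```rust
#[repr(packed)]
struct Packed {
    data: (u32, u8),
}

fn main() {
    let x = Packed { data: (1, 2) };
    // Copying the value avoids taking a reference into the
    // potentially misaligned packed field.
    let y = x.data.0;
    println!("{}", y);
}
```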
@@ -529,18 +529,21 @@ This lint detects bounds in type aliases. These are not currently enforced.
 Some example code that triggers this lint:
 
 ```rust
+#[allow(dead_code)]
 type SendVec<T: Send> = Vec<T>;
 ```
 
 This will produce:
 
 ```text
-warning: type alias is never used: `SendVec`
- --> src/main.rs:1:1
+warning: bounds on generic parameters are not enforced in type aliases
+ --> src/lib.rs:2:17
   |
-1 | type SendVec<T: Send> = Vec<T>;
-  | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+2 | type SendVec<T: Send> = Vec<T>;
+  |                 ^^^^
   |
+  = note: `#[warn(type_alias_bounds)]` on by default
+  = help: the bound will not be checked when the type alias is used, and should be removed
 ```

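Per the new `help` note, the fix is to remove the bound from the alias and state it where it is actually needed; `send_len` below is a hypothetical helper for illustration:

```rust
// The alias carries no bound; `Send` is required only by the
// function that actually relies on it.
type SendVec<T> = Vec<T>;

fn send_len<T: Send>(v: SendVec<T>) -> usize {
    v.len()
}

fn main() {
    let v: SendVec<i32> = vec![1, 2, 3];
    println!("{}", send_len(v));
}
```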
## tyvar-behind-raw-pointer

@@ -564,7 +567,7 @@ warning: type annotations needed
 4 |     if data.is_null() {}
   |             ^^^^^^^
   |
-  = note: #[warn(tyvar_behind_raw_pointer)] on by default
+  = note: `#[warn(tyvar_behind_raw_pointer)]` on by default
   = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in the 2018 edition!
   = note: for more information, see issue #46906 <https://github.com/rust-lang/rust/issues/46906>
 ```

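Giving the pointer an explicit type resolves the warning; a minimal illustrative sketch:

```rust
fn main() {
    // The pointee type is spelled out, so `is_null` is not resolved
    // through a bare inference variable behind the raw pointer.
    let data = std::ptr::null::<u8>();
    if data.is_null() {
        println!("null");
    }
}
```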
@@ -725,19 +728,17 @@ This lint detects attributes that were not used by the compiler. Some
 example code that triggers this lint:
 
 ```rust
-#![feature(custom_attribute)]
-
-#![mutable_doc]
+#![macro_export]
 ```
 
 This will produce:
 
 ```text
 warning: unused attribute
- --> src/main.rs:4:1
+ --> src/main.rs:1:1
   |
-4 | #![mutable_doc]
-  | ^^^^^^^^^^^^^^^
+1 | #![macro_export]
+  | ^^^^^^^^^^^^^^^^
   |
 ```

@@ -786,7 +787,7 @@ warning: doc comment not used by rustdoc
 
 ## unused-features
 
-This lint detects unused or unknown features found in crate-level #[feature] directives.
+This lint detects unused or unknown features found in crate-level `#[feature]` directives.
 To fix this, simply remove the feature flag.
 
 ## unused-imports

@@ -838,7 +839,7 @@ warning: unused macro definition
 
 ## unused-must-use
 
-This lint detects unused result of a type flagged as #[must_use]. Some
+This lint detects unused result of a type flagged as `#[must_use]`. Some
 example code that triggers this lint:
 
 ```rust

136 src/doc/rustc/src/profile-guided-optimization.md (new file)

@@ -0,0 +1,136 @@
# Profile Guided Optimization

`rustc` supports doing profile-guided optimization (PGO).
This chapter describes what PGO is, what it is good for, and how it can be used.

## What Is Profile-Guided Optimization?

The basic concept of PGO is to collect data about the typical execution of
a program (e.g. which branches it is likely to take) and then use this data
to inform optimizations such as inlining, machine-code layout,
register allocation, etc.

There are different ways of collecting data about a program's execution.
One is to run the program inside a profiler (such as `perf`); another
is to create an instrumented binary, that is, a binary that has data
collection built into it, and run that.
The latter usually provides more accurate data and it is also what is
supported by `rustc`.

## Usage

Generating a PGO-optimized program involves following a workflow with four steps:

1. Compile the program with instrumentation enabled
   (e.g. `rustc -Cprofile-generate=/tmp/pgo-data main.rs`)
2. Run the instrumented program (e.g. `./main`) which generates a
   `default_<id>.profraw` file
3. Convert the `.profraw` file into a `.profdata` file using
   LLVM's `llvm-profdata` tool
4. Compile the program again, this time making use of the profiling data
   (for example `rustc -Cprofile-use=merged.profdata main.rs`)

An instrumented program will create one or more `.profraw` files, one for each
instrumented binary. E.g. an instrumented executable that loads two instrumented
dynamic libraries at runtime will generate three `.profraw` files. Running an
instrumented binary multiple times, on the other hand, will re-use the
respective `.profraw` files, updating them in place.

These `.profraw` files have to be post-processed before they can be fed back
into the compiler. This is done by the `llvm-profdata` tool. This tool
is most easily installed via

```bash
rustup component add llvm-tools-preview
```

Note that installing the `llvm-tools-preview` component won't add
`llvm-profdata` to the `PATH`. Rather, the tool can be found in:

```bash
~/.rustup/toolchains/<toolchain>/lib/rustlib/<target-triple>/bin/
```

Alternatively, an `llvm-profdata` coming with a recent LLVM or Clang
version usually works too.

The `llvm-profdata` tool merges multiple `.profraw` files into a single
`.profdata` file that can then be fed back into the compiler via
`-Cprofile-use`:

```bash
# STEP 1: Compile the binary with instrumentation
rustc -Cprofile-generate=/tmp/pgo-data -O ./main.rs

# STEP 2: Run the binary a few times, maybe with common sets of args.
# Each run will create or update `.profraw` files in /tmp/pgo-data
./main mydata1.csv
./main mydata2.csv
./main mydata3.csv

# STEP 3: Merge and post-process all the `.profraw` files in /tmp/pgo-data
llvm-profdata merge -o ./merged.profdata /tmp/pgo-data

# STEP 4: Use the merged `.profdata` file during optimization. All `rustc`
# flags have to be the same.
rustc -Cprofile-use=./merged.profdata -O ./main.rs
```

### A Complete Cargo Workflow

Using this feature with Cargo works very similarly to using it with `rustc`
directly. Again, we generate an instrumented binary, run it to produce data,
merge the data, and feed it back into the compiler. Some things of note:

- We use the `RUSTFLAGS` environment variable in order to pass the PGO compiler
  flags to the compilation of all crates in the program.

- We pass the `--target` flag to Cargo, which prevents the `RUSTFLAGS`
  arguments from being passed to Cargo build scripts. We don't want the build
  scripts to generate a bunch of `.profraw` files.

- We pass `--release` to Cargo because that's where PGO makes the most sense.
  In theory, PGO can also be done on debug builds but there is little reason
  to do so.

- It is recommended to use *absolute paths* for the argument of
  `-Cprofile-generate` and `-Cprofile-use`. Cargo can invoke `rustc` with
  varying working directories, meaning that `rustc` will not be able to find
  the supplied `.profdata` file. With absolute paths this is not an issue.

- It is good practice to make sure that there is no left-over profiling data
  from previous compilation sessions. Just deleting the directory is a simple
  way of doing so (see `STEP 0` below).

This is what the entire workflow looks like:

```bash
# STEP 0: Make sure there is no left-over profiling data from previous runs
rm -rf /tmp/pgo-data

# STEP 1: Build the instrumented binaries
RUSTFLAGS="-Cprofile-generate=/tmp/pgo-data" \
    cargo build --release --target=x86_64-unknown-linux-gnu

# STEP 2: Run the instrumented binaries with some typical data
./target/x86_64-unknown-linux-gnu/release/myprogram mydata1.csv
./target/x86_64-unknown-linux-gnu/release/myprogram mydata2.csv
./target/x86_64-unknown-linux-gnu/release/myprogram mydata3.csv

# STEP 3: Merge the `.profraw` files into a `.profdata` file
llvm-profdata merge -o /tmp/pgo-data/merged.profdata /tmp/pgo-data

# STEP 4: Use the `.profdata` file for guiding optimizations
RUSTFLAGS="-Cprofile-use=/tmp/pgo-data/merged.profdata" \
    cargo build --release --target=x86_64-unknown-linux-gnu
```

## Further Reading

`rustc`'s PGO support relies entirely on LLVM's implementation of the feature
and is equivalent to what Clang offers via the `-fprofile-generate` /
`-fprofile-use` flags. The [Profile Guided Optimization][clang-pgo] section
in Clang's documentation is therefore an interesting read for anyone who wants
to use PGO with Rust.

[clang-pgo]: https://clang.llvm.org/docs/UsersManual.html#profile-guided-optimization