Auto merge of #56092 - alexcrichton:no-more-std-subodules, r=Mark-Simulacrum

std: Depend directly on crates.io crates

Ever since we added a Cargo-based build system for the compiler, the
standard library has been a little special: it has never been able to
depend on crates.io crates for runtime dependencies. This has been the
result of various limitations, chiefly that Cargo doesn't understand
that crates from crates.io depend on libcore, so it tries to build those
crates before libcore is finished.

I had an idea this afternoon, however, which lifts the strategy
from #52919 to let the standard library depend directly on crates.io
crates. After all is said and done, this removes a whopping three
submodules that we need to manage!

The basic idea here is that every crate `std` depends on adds an
*optional* dependency on an empty crate published to crates.io, in this
case named `rustc-std-workspace-core`. That crate is overridden via
`[patch]` in this repository to point to a local crate we write, and
*that* crate has a `path` dependency on libcore.
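
To make the mechanism concrete, here's a rough sketch of the two pieces
involved. The crate name matches the description above, but the paths,
version numbers, and exact manifest contents are illustrative rather
than the real in-tree layout:

```toml
# In this repository: override the empty crates.io placeholder with a
# local shim crate (path is illustrative).
[patch.crates-io]
rustc-std-workspace-core = { path = "src/tools/rustc-std-workspace-core" }
```

```toml
# The local shim crate's manifest: it just forwards to the in-tree
# libcore through a `path` dependency, so crates.io crates naming the
# placeholder end up building against the real core.
[package]
name = "rustc-std-workspace-core"
version = "1.0.0"

[dependencies]
core = { path = "../../libcore" }
```

(The shim's `lib.rs` would then simply re-export everything from
`core`.)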

Note that all `no_std` crates also depend on `compiler_builtins`, but
if we're not using submodules we can publish `compiler_builtins` to
crates.io and let every crate depend on it directly! The basic strategy
then looks like this:

* The standard library (or some transitive dep) decides to depend on a
  crate `foo`.
* The standard library adds

  ```toml
  [dependencies]
  foo = { version = "0.1", features = ['rustc-dep-of-std'] }
  ```
* The crate `foo` has an optional dependency on `rustc-std-workspace-core`
* The crate `foo` has an optional dependency on `compiler_builtins`
* The crate `foo` has a feature `rustc-dep-of-std` which activates these
  crates and any other necessary infrastructure in the crate (see the
  sketch after this list).
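
For illustration, here's a minimal sketch of what `foo`'s own
`Cargo.toml` might look like under this scheme. The dependency names
follow the list above, the `core` rename relies on Cargo's
`rename-dependency` feature, and the version numbers are made up:

```toml
[package]
name = "foo"
version = "0.1.0"

[dependencies]
# Optional shims that only matter when `foo` is built as part of std.
core = { version = "1.0", optional = true, package = "rustc-std-workspace-core" }
compiler_builtins = { version = "0.1", optional = true }

[features]
# Activated by the standard library; pulls in the shims above.
rustc-dep-of-std = ["core", "compiler_builtins"]
```

When `std` enables `rustc-dep-of-std` (as in the snippet earlier), the
shims come in and `foo` builds against the in-tree core; built normally
from crates.io, the feature stays off and `foo` behaves like any other
`no_std` crate.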

A sample commit for `dlmalloc` [turns out to be quite simple][commit].
After that, all `no_std` crates should largely build "as is" and still
be publishable on crates.io! Notably, they should be able to keep using
stable Rust if necessary, since the `rename-dependency` feature of Cargo
is stabilizing soon.

As a proof of concept, this commit removes the `dlmalloc`,
`libcompiler_builtins`, and `libc` submodules from this repository. Long
thorns in our side, these are now gone for good, and we can depend
directly on crates.io! The hope is that in the long term we can bring in
other crates as necessary, but for now this is largely intended to make
these crates easier to manage and to get rid of the submodules.

This should be a transparent, non-breaking change for all users, but
one possible sticking point is that it almost certainly breaks
out-of-tree `std`-building tools like `xargo` and `cargo-xbuild`. I
think it should be relatively easy to get them working again, however,
as all that's needed is an entry in the `[patch]` section used to build
the standard library (like the one sketched above). Hopefully we can
work with these tools to solve this problem!

[commit]: https://github.com/alexcrichton/dlmalloc-rs/commit/28ee12db813a3b650a7c25d1c36d2c17dcb88ae3
diff --git a/src/ci/docker/mingw-check/Dockerfile b/src/ci/docker/mingw-check/Dockerfile
index aab339f..10aedf6 100644
--- a/src/ci/docker/mingw-check/Dockerfile
+++ b/src/ci/docker/mingw-check/Dockerfile
@@ -20,4 +20,5 @@
 RUN sh /scripts/sccache.sh
 
 ENV RUN_CHECK_WITH_PARALLEL_QUERIES 1
-ENV SCRIPT python2.7 ../x.py check --target=i686-pc-windows-gnu --host=i686-pc-windows-gnu
+ENV SCRIPT python2.7 ../x.py check --target=i686-pc-windows-gnu --host=i686-pc-windows-gnu && \
+           python2.7 ../x.py build --stage 0 src/tools/build-manifest
diff --git a/src/librustc_data_structures/sorted_map.rs b/src/librustc_data_structures/sorted_map.rs
index 29d99a6..3bd3d11 100644
--- a/src/librustc_data_structures/sorted_map.rs
+++ b/src/librustc_data_structures/sorted_map.rs
@@ -10,7 +10,7 @@
 
 use std::borrow::Borrow;
 use std::cmp::Ordering;
-use std::convert::From;
+use std::iter::FromIterator;
 use std::mem;
 use std::ops::{RangeBounds, Bound, Index, IndexMut};
 
@@ -25,11 +25,10 @@
 #[derive(Clone, PartialEq, Eq, PartialOrd, Ord, Hash, Default, Debug, RustcEncodable,
          RustcDecodable)]
 pub struct SortedMap<K: Ord, V> {
-    data: Vec<(K,V)>
+    data: Vec<(K, V)>
 }
 
 impl<K: Ord, V> SortedMap<K, V> {
-
     #[inline]
     pub fn new() -> SortedMap<K, V> {
         SortedMap {
@@ -82,7 +81,10 @@
     }
 
     #[inline]
-    pub fn get(&self, key: &K) -> Option<&V> {
+    pub fn get<Q>(&self, key: &Q) -> Option<&V>
+        where K: Borrow<Q>,
+              Q: Ord + ?Sized
+    {
         match self.lookup_index_for(key) {
             Ok(index) => {
                 unsafe {
@@ -96,7 +98,10 @@
     }
 
     #[inline]
-    pub fn get_mut(&mut self, key: &K) -> Option<&mut V> {
+    pub fn get_mut<Q>(&mut self, key: &Q) -> Option<&mut V>
+        where K: Borrow<Q>,
+              Q: Ord + ?Sized
+    {
         match self.lookup_index_for(key) {
             Ok(index) => {
                 unsafe {
@@ -122,13 +127,13 @@
 
     /// Iterate over the keys, sorted
     #[inline]
-    pub fn keys(&self) -> impl Iterator<Item=&K> + ExactSizeIterator {
+    pub fn keys(&self) -> impl Iterator<Item = &K> + ExactSizeIterator {
         self.data.iter().map(|&(ref k, _)| k)
     }
 
     /// Iterate over values, sorted by key
     #[inline]
-    pub fn values(&self) -> impl Iterator<Item=&V> + ExactSizeIterator {
+    pub fn values(&self) -> impl Iterator<Item = &V> + ExactSizeIterator {
         self.data.iter().map(|&(_, ref v)| v)
     }
 
@@ -138,6 +143,11 @@
     }
 
     #[inline]
+    pub fn is_empty(&self) -> bool {
+        self.len() == 0
+    }
+
+    #[inline]
     pub fn range<R>(&self, range: R) -> &[(K, V)]
         where R: RangeBounds<K>
     {
@@ -207,8 +217,11 @@
 
     /// Looks up the key in `self.data` via `slice::binary_search()`.
     #[inline(always)]
-    fn lookup_index_for(&self, key: &K) -> Result<usize, usize> {
-        self.data.binary_search_by(|&(ref x, _)| x.cmp(key))
+    fn lookup_index_for<Q>(&self, key: &Q) -> Result<usize, usize>
+        where K: Borrow<Q>,
+              Q: Ord + ?Sized
+    {
+        self.data.binary_search_by(|&(ref x, _)| x.borrow().cmp(key))
     }
 
     #[inline]
@@ -247,38 +260,54 @@
 
         (start, end)
     }
+
+    #[inline]
+    pub fn contains_key<Q>(&self, key: &Q) -> bool
+        where K: Borrow<Q>,
+              Q: Ord + ?Sized
+    {
+        self.get(key).is_some()
+    }
 }
 
 impl<K: Ord, V> IntoIterator for SortedMap<K, V> {
     type Item = (K, V);
     type IntoIter = ::std::vec::IntoIter<(K, V)>;
+
     fn into_iter(self) -> Self::IntoIter {
         self.data.into_iter()
     }
 }
 
-impl<K: Ord, V, Q: Borrow<K>> Index<Q> for SortedMap<K, V> {
+impl<'a, K, Q, V> Index<&'a Q> for SortedMap<K, V>
+    where K: Ord + Borrow<Q>,
+          Q: Ord + ?Sized
+{
     type Output = V;
-    fn index(&self, index: Q) -> &Self::Output {
-        let k: &K = index.borrow();
-        self.get(k).unwrap()
+
+    fn index(&self, key: &Q) -> &Self::Output {
+        self.get(key).expect("no entry found for key")
     }
 }
 
-impl<K: Ord, V, Q: Borrow<K>> IndexMut<Q> for SortedMap<K, V> {
-    fn index_mut(&mut self, index: Q) -> &mut Self::Output {
-        let k: &K = index.borrow();
-        self.get_mut(k).unwrap()
+impl<'a, K, Q, V> IndexMut<&'a Q> for SortedMap<K, V>
+    where K: Ord + Borrow<Q>,
+          Q: Ord + ?Sized
+{
+    fn index_mut(&mut self, key: &Q) -> &mut Self::Output {
+        self.get_mut(key).expect("no entry found for key")
     }
 }
 
-impl<K: Ord, V, I: Iterator<Item=(K, V)>> From<I> for SortedMap<K, V> {
-    fn from(data: I) -> Self {
-        let mut data: Vec<(K, V)> = data.collect();
+impl<K: Ord, V> FromIterator<(K, V)> for SortedMap<K, V> {
+    fn from_iter<T: IntoIterator<Item = (K, V)>>(iter: T) -> Self {
+        let mut data: Vec<(K, V)> = iter.into_iter().collect();
+
         data.sort_unstable_by(|&(ref k1, _), &(ref k2, _)| k1.cmp(k2));
         data.dedup_by(|&mut (ref k1, _), &mut (ref k2, _)| {
             k1.cmp(k2) == Ordering::Equal
         });
+
         SortedMap {
             data
         }
diff --git a/src/librustdoc/clean/mod.rs b/src/librustdoc/clean/mod.rs
index 64f66d5..1e33ec8 100644
--- a/src/librustdoc/clean/mod.rs
+++ b/src/librustdoc/clean/mod.rs
@@ -707,8 +707,6 @@
 /// kept separate because of issue #42760.
 #[derive(Clone, RustcEncodable, RustcDecodable, PartialEq, Eq, Debug, Hash)]
 pub enum DocFragment {
-    // FIXME #44229 (misdreavus): sugared and raw doc comments can be brought back together once
-    // hoedown is completely removed from rustdoc.
     /// A doc fragment created from a `///` or `//!` doc comment.
     SugaredDoc(usize, syntax_pos::Span, String),
     /// A doc fragment created from a "raw" `#[doc=""]` attribute.
diff --git a/src/librustdoc/passes/collect_intra_doc_links.rs b/src/librustdoc/passes/collect_intra_doc_links.rs
index 426d3f3..29062ba 100644
--- a/src/librustdoc/passes/collect_intra_doc_links.rs
+++ b/src/librustdoc/passes/collect_intra_doc_links.rs
@@ -459,6 +459,19 @@
     start.to(end)
 }
 
+/// Reports a resolution failure diagnostic.
+///
+/// Ideally we can report the diagnostic with the actual span in the source where the link failure
+/// occurred. However, there's a mismatch between the span in the source code and the span in the
+/// markdown, so we have to do a bit of work to figure out the correspondence.
+///
+/// It's not too hard to find the span for sugared doc comments (`///` and `/**`), because the
+/// source will match the markdown exactly, excluding the comment markers. However, it's much more
+/// difficult to calculate the spans for unsugared docs, because we have to deal with escaping and
+/// other source features. So, we attempt to find the exact source span of the resolution failure
+/// in sugared docs, but use the span of the documentation attributes themselves for unsugared
+/// docs. Because this span might be overly large, we display the markdown line containing the
+/// failure as a note.
 fn resolution_failure(
     cx: &DocContext,
     attrs: &Attributes,
@@ -469,47 +482,75 @@
     let sp = span_of_attrs(attrs);
     let msg = format!("`[{}]` cannot be resolved, ignoring it...", path_str);
 
-    let code_dox = sp.to_src(cx);
-
-    let doc_comment_padding = 3;
     let mut diag = if let Some(link_range) = link_range {
-        // blah blah blah\nblah\nblah [blah] blah blah\nblah blah
-        //                       ^    ~~~~~~
-        //                       |    link_range
-        //                       last_new_line_offset
+        let src = cx.sess().source_map().span_to_snippet(sp);
+        let is_all_sugared_doc = attrs.doc_strings.iter().all(|frag| match frag {
+            DocFragment::SugaredDoc(..) => true,
+            _ => false,
+        });
 
-        let mut diag;
-        if dox.lines().count() == code_dox.lines().count() {
-            let line_offset = dox[..link_range.start].lines().count();
-            // The span starts in the `///`, so we don't have to account for the leading whitespace.
-            let code_dox_len = if line_offset <= 1 {
-                doc_comment_padding
-            } else {
-                // The first `///`.
-                doc_comment_padding +
-                    // Each subsequent leading whitespace and `///`.
-                    code_dox.lines().skip(1).take(line_offset - 1).fold(0, |sum, line| {
-                        sum + doc_comment_padding + line.len() - line.trim_start().len()
-                    })
-            };
+        if let (Ok(src), true) = (src, is_all_sugared_doc) {
+            // The number of markdown lines up to and including the resolution failure.
+            let num_lines = dox[..link_range.start].lines().count();
 
-            // Extract the specific span.
+            // We use `split_terminator('\n')` instead of `lines()` when counting bytes to ensure
+            // that DOS-style line endings do not cause the spans to be calculated incorrectly.
+            let mut src_lines = src.split_terminator('\n');
+            let mut md_lines = dox.split_terminator('\n').take(num_lines).peekable();
+
+            // The number of bytes from the start of the source span to the resolution failure that
+            // are *not* part of the markdown, like comment markers.
+            let mut extra_src_bytes = 0;
+
+            while let Some(md_line) = md_lines.next() {
+                loop {
+                    let source_line = src_lines
+                        .next()
+                        .expect("could not find markdown line in source");
+
+                    match source_line.find(md_line) {
+                        Some(offset) => {
+                            extra_src_bytes += if md_lines.peek().is_some() {
+                                source_line.len() - md_line.len()
+                            } else {
+                                offset
+                            };
+                            break;
+                        }
+                        None => {
+                            // Since this is a source line that doesn't include a markdown line,
+                            // we have to count the newline that we split from earlier.
+                            extra_src_bytes += source_line.len() + 1;
+                        }
+                    }
+                }
+            }
+
             let sp = sp.from_inner_byte_pos(
-                link_range.start + code_dox_len,
-                link_range.end + code_dox_len,
+                link_range.start + extra_src_bytes,
+                link_range.end + extra_src_bytes,
             );
 
-            diag = cx.tcx.struct_span_lint_node(lint::builtin::INTRA_DOC_LINK_RESOLUTION_FAILURE,
-                                                NodeId::from_u32(0),
-                                                sp,
-                                                &msg);
+            let mut diag = cx.tcx.struct_span_lint_node(
+                lint::builtin::INTRA_DOC_LINK_RESOLUTION_FAILURE,
+                NodeId::from_u32(0),
+                sp,
+                &msg,
+            );
             diag.span_label(sp, "cannot be resolved, ignoring");
+            diag
         } else {
-            diag = cx.tcx.struct_span_lint_node(lint::builtin::INTRA_DOC_LINK_RESOLUTION_FAILURE,
-                                                NodeId::from_u32(0),
-                                                sp,
-                                                &msg);
+            let mut diag = cx.tcx.struct_span_lint_node(
+                lint::builtin::INTRA_DOC_LINK_RESOLUTION_FAILURE,
+                NodeId::from_u32(0),
+                sp,
+                &msg,
+            );
 
+            // blah blah blah\nblah\nblah [blah] blah blah\nblah blah
+            //                       ^     ~~~~
+            //                       |     link_range
+            //                       last_new_line_offset
             let last_new_line_offset = dox[..link_range.start].rfind('\n').map_or(0, |n| n + 1);
             let line = dox[last_new_line_offset..].lines().next().unwrap_or("");
 
@@ -522,8 +563,8 @@
                 before=link_range.start - last_new_line_offset,
                 found=link_range.len(),
             ));
+            diag
         }
-        diag
     } else {
         cx.tcx.struct_span_lint_node(lint::builtin::INTRA_DOC_LINK_RESOLUTION_FAILURE,
                                      NodeId::from_u32(0),
diff --git a/src/libtest/lib.rs b/src/libtest/lib.rs
index bca9888..1c87349 100644
--- a/src/libtest/lib.rs
+++ b/src/libtest/lib.rs
@@ -99,6 +99,10 @@
 
 use formatters::{JsonFormatter, OutputFormatter, PrettyFormatter, TerseFormatter};
 
+/// Whether to execute tests concurrently or not
+#[derive(Copy, Clone, Debug, PartialEq, Eq)]
+pub enum Concurrent { Yes, No }
+
 // The name of a test. By convention this follows the rules for rust
 // paths; i.e., it should be a series of identifiers separated by double
 // colons. This way if some test runner wants to arrange the tests
@@ -1073,8 +1077,12 @@
 where
     F: FnMut(TestEvent) -> io::Result<()>,
 {
-    use std::collections::HashMap;
+    use std::collections::{self, HashMap};
+    use std::hash::BuildHasherDefault;
     use std::sync::mpsc::RecvTimeoutError;
+    // Use a deterministic hasher
+    type TestMap =
+        HashMap<TestDesc, Instant, BuildHasherDefault<collections::hash_map::DefaultHasher>>;
 
     let tests_len = tests.len();
 
@@ -1113,9 +1121,9 @@
 
     let (tx, rx) = channel::<MonitorMsg>();
 
-    let mut running_tests: HashMap<TestDesc, Instant> = HashMap::new();
+    let mut running_tests: TestMap = HashMap::default();
 
-    fn get_timed_out_tests(running_tests: &mut HashMap<TestDesc, Instant>) -> Vec<TestDesc> {
+    fn get_timed_out_tests(running_tests: &mut TestMap) -> Vec<TestDesc> {
         let now = Instant::now();
         let timed_out = running_tests
             .iter()
@@ -1133,7 +1141,7 @@
         timed_out
     };
 
-    fn calc_timeout(running_tests: &HashMap<TestDesc, Instant>) -> Option<Duration> {
+    fn calc_timeout(running_tests: &TestMap) -> Option<Duration> {
         running_tests.values().min().map(|next_timeout| {
             let now = Instant::now();
             if *next_timeout >= now {
@@ -1148,7 +1156,7 @@
         while !remaining.is_empty() {
             let test = remaining.pop().unwrap();
             callback(TeWait(test.desc.clone()))?;
-            run_test(opts, !opts.run_tests, test, tx.clone());
+            run_test(opts, !opts.run_tests, test, tx.clone(), Concurrent::No);
             let (test, result, stdout) = rx.recv().unwrap();
             callback(TeResult(test, result, stdout))?;
         }
@@ -1159,7 +1167,7 @@
                 let timeout = Instant::now() + Duration::from_secs(TEST_WARN_TIMEOUT_S);
                 running_tests.insert(test.desc.clone(), timeout);
                 callback(TeWait(test.desc.clone()))?; //here no pad
-                run_test(opts, !opts.run_tests, test, tx.clone());
+                run_test(opts, !opts.run_tests, test, tx.clone(), Concurrent::Yes);
                 pending += 1;
             }
 
@@ -1191,7 +1199,7 @@
         // All benchmarks run at the end, in serial.
         for b in filtered_benchs {
             callback(TeWait(b.desc.clone()))?;
-            run_test(opts, false, b, tx.clone());
+            run_test(opts, false, b, tx.clone(), Concurrent::No);
             let (test, result, stdout) = rx.recv().unwrap();
             callback(TeResult(test, result, stdout))?;
         }
@@ -1393,6 +1401,7 @@
     force_ignore: bool,
     test: TestDescAndFn,
     monitor_ch: Sender<MonitorMsg>,
+    concurrency: Concurrent,
 ) {
     let TestDescAndFn { desc, testfn } = test;
 
@@ -1409,6 +1418,7 @@
         monitor_ch: Sender<MonitorMsg>,
         nocapture: bool,
         testfn: Box<dyn FnBox() + Send>,
+        concurrency: Concurrent,
     ) {
         // Buffer for capturing standard I/O
         let data = Arc::new(Mutex::new(Vec::new()));
@@ -1443,7 +1453,7 @@
         // the test synchronously, regardless of the concurrency
         // level.
         let supports_threads = !cfg!(target_os = "emscripten") && !cfg!(target_arch = "wasm32");
-        if supports_threads {
+        if concurrency == Concurrent::Yes && supports_threads {
             let cfg = thread::Builder::new().name(name.as_slice().to_owned());
             cfg.spawn(runtest).unwrap();
         } else {
@@ -1464,13 +1474,14 @@
         }
         DynTestFn(f) => {
             let cb = move || __rust_begin_short_backtrace(f);
-            run_test_inner(desc, monitor_ch, opts.nocapture, Box::new(cb))
+            run_test_inner(desc, monitor_ch, opts.nocapture, Box::new(cb), concurrency)
         }
         StaticTestFn(f) => run_test_inner(
             desc,
             monitor_ch,
             opts.nocapture,
             Box::new(move || __rust_begin_short_backtrace(f)),
+            concurrency,
         ),
     }
 }
@@ -1753,6 +1764,7 @@
     use std::sync::mpsc::channel;
     use bench;
     use Bencher;
+    use Concurrent;
 
 
     fn one_ignored_one_unignored_test() -> Vec<TestDescAndFn> {
@@ -1793,7 +1805,7 @@
             testfn: DynTestFn(Box::new(f)),
         };
         let (tx, rx) = channel();
-        run_test(&TestOpts::new(), false, desc, tx);
+        run_test(&TestOpts::new(), false, desc, tx, Concurrent::No);
         let (_, res, _) = rx.recv().unwrap();
         assert!(res != TrOk);
     }
@@ -1811,7 +1823,7 @@
             testfn: DynTestFn(Box::new(f)),
         };
         let (tx, rx) = channel();
-        run_test(&TestOpts::new(), false, desc, tx);
+        run_test(&TestOpts::new(), false, desc, tx, Concurrent::No);
         let (_, res, _) = rx.recv().unwrap();
         assert!(res == TrIgnored);
     }
@@ -1831,7 +1843,7 @@
             testfn: DynTestFn(Box::new(f)),
         };
         let (tx, rx) = channel();
-        run_test(&TestOpts::new(), false, desc, tx);
+        run_test(&TestOpts::new(), false, desc, tx, Concurrent::No);
         let (_, res, _) = rx.recv().unwrap();
         assert!(res == TrOk);
     }
@@ -1851,7 +1863,7 @@
             testfn: DynTestFn(Box::new(f)),
         };
         let (tx, rx) = channel();
-        run_test(&TestOpts::new(), false, desc, tx);
+        run_test(&TestOpts::new(), false, desc, tx, Concurrent::No);
         let (_, res, _) = rx.recv().unwrap();
         assert!(res == TrOk);
     }
@@ -1873,7 +1885,7 @@
             testfn: DynTestFn(Box::new(f)),
         };
         let (tx, rx) = channel();
-        run_test(&TestOpts::new(), false, desc, tx);
+        run_test(&TestOpts::new(), false, desc, tx, Concurrent::No);
         let (_, res, _) = rx.recv().unwrap();
         assert!(res == TrFailedMsg(format!("{} '{}'", failed_msg, expected)));
     }
@@ -1891,7 +1903,7 @@
             testfn: DynTestFn(Box::new(f)),
         };
         let (tx, rx) = channel();
-        run_test(&TestOpts::new(), false, desc, tx);
+        run_test(&TestOpts::new(), false, desc, tx, Concurrent::No);
         let (_, res, _) = rx.recv().unwrap();
         assert!(res == TrFailed);
     }
diff --git a/src/test/debuginfo/function-call.rs b/src/test/debuginfo/function-call.rs
new file mode 100644
index 0000000..266e536
--- /dev/null
+++ b/src/test/debuginfo/function-call.rs
@@ -0,0 +1,52 @@
+// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// This test does not pass with gdb < 8.0. See #53497.
+// min-gdb-version 8.0
+
+// compile-flags:-g
+
+// === GDB TESTS ===================================================================================
+
+// gdb-command:run
+
+// gdb-command:print fun(45, true)
+// gdb-check:$1 = true
+// gdb-command:print fun(444, false)
+// gdb-check:$2 = false
+
+// gdb-command:print r.get_x()
+// gdb-check:$3 = 4
+
+#![allow(dead_code, unused_variables)]
+
+struct RegularStruct {
+    x: i32
+}
+
+impl RegularStruct {
+    fn get_x(&self) -> i32 {
+        self.x
+    }
+}
+
+fn main() {
+    let _ = fun(4, true);
+    let r = RegularStruct{x: 4};
+    let _ = r.get_x();
+
+    zzz(); // #break
+}
+
+fn fun(x: isize, y: bool) -> bool {
+    y
+}
+
+fn zzz() { () }
diff --git a/src/test/run-make-fulldeps/libtest-json/output.json b/src/test/run-make-fulldeps/libtest-json/output.json
index d8169ec..80e75c8 100644
--- a/src/test/run-make-fulldeps/libtest-json/output.json
+++ b/src/test/run-make-fulldeps/libtest-json/output.json
@@ -2,7 +2,7 @@
 { "type": "test", "event": "started", "name": "a" }
 { "type": "test", "name": "a", "event": "ok" }
 { "type": "test", "event": "started", "name": "b" }
-{ "type": "test", "name": "b", "event": "failed", "stdout": "thread 'b' panicked at 'assertion failed: false', f.rs:18:5\nnote: Run with `RUST_BACKTRACE=1` for a backtrace.\n" }
+{ "type": "test", "name": "b", "event": "failed", "stdout": "thread 'main' panicked at 'assertion failed: false', f.rs:18:5\nnote: Run with `RUST_BACKTRACE=1` for a backtrace.\n" }
 { "type": "test", "event": "started", "name": "c" }
 { "type": "test", "name": "c", "event": "ok" }
 { "type": "test", "event": "started", "name": "d" }
diff --git a/src/test/rustdoc-ui/.gitattributes b/src/test/rustdoc-ui/.gitattributes
new file mode 100644
index 0000000..2bcabdf
--- /dev/null
+++ b/src/test/rustdoc-ui/.gitattributes
@@ -0,0 +1 @@
+intra-links-warning-crlf.rs eol=crlf
diff --git a/src/test/rustdoc-ui/intra-links-warning-crlf.rs b/src/test/rustdoc-ui/intra-links-warning-crlf.rs
new file mode 100644
index 0000000..20f761f
--- /dev/null
+++ b/src/test/rustdoc-ui/intra-links-warning-crlf.rs
@@ -0,0 +1,23 @@
+// ignore-tidy-cr
+
+// compile-pass
+
+// This file checks the spans of intra-link warnings in a file with CRLF line endings. The
+// .gitattributes file in this directory should enforce it.
+
+/// [error]
+pub struct A;
+
+///
+/// docs [error1]
+
+/// docs [error2]
+///
+pub struct B;
+
+/**
+ * This is a multi-line comment.
+ *
+ * It also has an [error].
+ */
+pub struct C;
diff --git a/src/test/rustdoc-ui/intra-links-warning-crlf.stderr b/src/test/rustdoc-ui/intra-links-warning-crlf.stderr
new file mode 100644
index 0000000..62537f2
--- /dev/null
+++ b/src/test/rustdoc-ui/intra-links-warning-crlf.stderr
@@ -0,0 +1,33 @@
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning-crlf.rs:8:6
+   |
+LL | /// [error]
+   |      ^^^^^ cannot be resolved, ignoring
+   |
+   = note: #[warn(intra_doc_link_resolution_failure)] on by default
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error1]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning-crlf.rs:12:11
+   |
+LL | /// docs [error1]
+   |           ^^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error2]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning-crlf.rs:14:11
+   |
+LL | /// docs [error2]
+   |           ^^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning-crlf.rs:21:20
+   |
+LL |  * It also has an [error].
+   |                    ^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
diff --git a/src/test/rustdoc-ui/intra-links-warning.rs b/src/test/rustdoc-ui/intra-links-warning.rs
index d6bc275..db2fd32 100644
--- a/src/test/rustdoc-ui/intra-links-warning.rs
+++ b/src/test/rustdoc-ui/intra-links-warning.rs
@@ -55,3 +55,33 @@
     }
 }
 f!("Foo\nbar [BarF] bar\nbaz");
+
+/** # for example,
+ *
+ * time to introduce a link [error]*/
+pub struct A;
+
+/**
+ * # for example,
+ *
+ * time to introduce a link [error]
+ */
+pub struct B;
+
+#[doc = "single line [error]"]
+pub struct C;
+
+#[doc = "single line with \"escaping\" [error]"]
+pub struct D;
+
+/// Item docs.
+#[doc="Hello there!"]
+/// [error]
+pub struct E;
+
+///
+/// docs [error1]
+
+/// docs [error2]
+///
+pub struct F;
diff --git a/src/test/rustdoc-ui/intra-links-warning.stderr b/src/test/rustdoc-ui/intra-links-warning.stderr
index c05f99f..ed31421 100644
--- a/src/test/rustdoc-ui/intra-links-warning.stderr
+++ b/src/test/rustdoc-ui/intra-links-warning.stderr
@@ -55,6 +55,76 @@
    |
    = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
 
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:61:30
+   |
+LL |  * time to introduce a link [error]*/
+   |                              ^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:67:30
+   |
+LL |  * time to introduce a link [error]
+   |                              ^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:71:1
+   |
+LL | #[doc = "single line [error]"]
+   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+   |
+   = note: the link appears in this line:
+           
+           single line [error]
+                        ^^^^^
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:74:1
+   |
+LL | #[doc = "single line with /"escaping/" [error]"]
+   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+   |
+   = note: the link appears in this line:
+           
+           single line with "escaping" [error]
+                                        ^^^^^
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:77:1
+   |
+LL | / /// Item docs.
+LL | | #[doc="Hello there!"]
+LL | | /// [error]
+   | |___________^
+   |
+   = note: the link appears in this line:
+           
+            [error]
+             ^^^^^
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error1]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:83:11
+   |
+LL | /// docs [error1]
+   |           ^^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
+warning: `[error2]` cannot be resolved, ignoring it...
+  --> $DIR/intra-links-warning.rs:85:11
+   |
+LL | /// docs [error2]
+   |           ^^^^^^ cannot be resolved, ignoring
+   |
+   = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
+
 warning: `[BarA]` cannot be resolved, ignoring it...
   --> $DIR/intra-links-warning.rs:24:10
    |
@@ -64,37 +134,19 @@
    = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
 
 warning: `[BarB]` cannot be resolved, ignoring it...
-  --> $DIR/intra-links-warning.rs:28:1
+  --> $DIR/intra-links-warning.rs:30:9
    |
-LL | / /**
-LL | |  * Foo
-LL | |  * bar [BarB] bar
-LL | |  * baz
-LL | |  */
-   | |___^
+LL |  * bar [BarB] bar
+   |         ^^^^ cannot be resolved, ignoring
    |
-   = note: the link appears in this line:
-           
-            bar [BarB] bar
-                 ^^^^
    = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
 
 warning: `[BarC]` cannot be resolved, ignoring it...
-  --> $DIR/intra-links-warning.rs:35:1
+  --> $DIR/intra-links-warning.rs:37:6
    |
-LL | / /** Foo
-LL | |
-LL | | bar [BarC] bar
-LL | | baz
-...  |
-LL | |
-LL | | */
-   | |__^
+LL | bar [BarC] bar
+   |      ^^^^ cannot be resolved, ignoring
    |
-   = note: the link appears in this line:
-           
-           bar [BarC] bar
-                ^^^^
    = help: to escape `[` and `]` characters, just add '/' before them like `/[` or `/]`
 
 warning: `[BarD]` cannot be resolved, ignoring it...
diff --git a/src/tools/build-manifest/src/main.rs b/src/tools/build-manifest/src/main.rs
index d9834f9..896b380 100644
--- a/src/tools/build-manifest/src/main.rs
+++ b/src/tools/build-manifest/src/main.rs
@@ -14,7 +14,7 @@
 
 use std::collections::BTreeMap;
 use std::env;
-use std::fs::File;
+use std::fs;
 use std::io::{self, Read, Write};
 use std::path::{PathBuf, Path};
 use std::process::{Command, Stdio};