Deprecate obsolete repository

Change-Id: I41117c2947a330133062c26c7e31b23f33459388
Reviewed-on: https://fuchsia-review.googlesource.com/c/scripts/+/418742
Reviewed-by: Mahesh Saripalli <maheshsr@google.com>
API-Review: Dale Sather <dalesat@google.com>
diff --git a/.gitignore b/.gitignore
deleted file mode 100644
index 8c4722d..0000000
--- a/.gitignore
+++ /dev/null
@@ -1,2 +0,0 @@
-*.pyc
-fuchsia-sdk.tgz
diff --git a/AUTHORS b/AUTHORS
deleted file mode 100644
index 4c61558..0000000
--- a/AUTHORS
+++ /dev/null
@@ -1,9 +0,0 @@
-# This is the list of Fuchsia Authors.
-
-# Names should be added to this file as one of
-#     Organization's name
-#     Individual's name <submission email address>
-#     Individual's name <submission email address> <email2> <emailN>
-
-Google Inc.
-The Chromium Authors
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
deleted file mode 100644
index 81e2938..0000000
--- a/CONTRIBUTING.md
+++ /dev/null
@@ -1,9 +0,0 @@
-This repository accepts contributions using Gerrit.
-
-Instructions for using Gerrit:
-
- * https://gerrit-review.googlesource.com/Documentation/
-
-Before we can land your change, you need to sign the Google CLA:
-
- * https://cla.developers.google.com/
diff --git a/LICENSE b/LICENSE
deleted file mode 100644
index ac6402f..0000000
--- a/LICENSE
+++ /dev/null
@@ -1,27 +0,0 @@
-// Copyright 2016 The Fuchsia Authors. All rights reserved.
-//
-// Redistribution and use in source and binary forms, with or without
-// modification, are permitted provided that the following conditions are
-// met:
-//
-//    * Redistributions of source code must retain the above copyright
-// notice, this list of conditions and the following disclaimer.
-//    * Redistributions in binary form must reproduce the above
-// copyright notice, this list of conditions and the following disclaimer
-// in the documentation and/or other materials provided with the
-// distribution.
-//    * Neither the name of Google Inc. nor the names of its
-// contributors may be used to endorse or promote products derived from
-// this software without specific prior written permission.
-//
-// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-// "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/MAINTAINERS b/MAINTAINERS
deleted file mode 100644
index ee1c6c2..0000000
--- a/MAINTAINERS
+++ /dev/null
@@ -1,2 +0,0 @@
-pylaligand@google.com
-raggi@google.com
diff --git a/PATENTS b/PATENTS
deleted file mode 100644
index 2746e78..0000000
--- a/PATENTS
+++ /dev/null
@@ -1,22 +0,0 @@
-Additional IP Rights Grant (Patents)
-
-"This implementation" means the copyrightable works distributed by
-Google as part of the Fuchsia project.
-
-Google hereby grants to you a perpetual, worldwide, non-exclusive,
-no-charge, royalty-free, irrevocable (except as stated in this
-section) patent license to make, have made, use, offer to sell, sell,
-import, transfer, and otherwise run, modify and propagate the contents
-of this implementation of Fuchsia, where such license applies only to
-those patent claims, both currently owned by Google and acquired in
-the future, licensable by Google that are necessarily infringed by
-this implementation. This grant does not include claims that would be
-infringed only as a consequence of further modification of this
-implementation. If you or your agent or exclusive licensee institute
-or order or agree to the institution of patent litigation or any other
-patent enforcement activity against any entity (including a
-cross-claim or counterclaim in a lawsuit) alleging that this
-implementation of Fuchsia constitutes direct or contributory patent
-infringement, or inducement of patent infringement, then any patent
-rights granted to you under this License for this implementation of
-Fuchsia shall terminate as of the date such litigation is filed.
diff --git a/README.md b/README.md
index cf4f625..856f53e 100644
--- a/README.md
+++ b/README.md
@@ -1,38 +1,5 @@
-Scripts
-=======================================
+# Obsolete
 
-This repository is for scripts useful when hacking on Fuchsia. This repository
-should contain scripts that perform tasks spanning multiple repositories.
-Scripts that only operate within a single repository should live in the relevant
-repository.
+This repository has moved into Fuchsia's main repository:
+https://fuchsia.googlesource.com/fuchsia/+/master/scripts.
 
-
-# push-package.py
-
-The push-package.py script pushes the files listed in the given manifest files.
-No checking is performed for incremental changes.
-
-The sample command lines below can be used to build Modular and then push those
-files to the default device. This assumes you have already booted your device
-with a version of Fuchsia that contains the most recent version of all other
-packages. This command line uses the "system_manifest" file from each of the
-modular directories, such as modular, modular_dev, and modular_tests.
-
-```
-cd $FUCHSIA_DIR
-fx build peridot:modular_all
-scripts/push-package.py out/debug-x64/package/modular*/system_manifest
-```
-
-# fx publish
-
-`fx publish` will take a package from the build and create a Fuchsia package
-manager [package] from a build package. It will then add the package to a local
-update repository which, by default, is put at
-`${FUCHSIA_BUILD_DIR}/amber-files`. It will also add the package content files
-to the update repository and name these files after their [Merkle Root].  If a
-package name is supplied to `fx publish`, only that package will be processed.
-If no name is supplied, all the packages made by the build will be included.
-
-[package]: https://fuchsia.googlesource.com/pm/+/master/README.md#structure-of-a-fuchsia-package
-[Merkle Root]: https://fuchsia.googlesource.com/docs/+/master/merkleroot.md
diff --git a/blobstats/blobstats.dart b/blobstats/blobstats.dart
deleted file mode 100644
index ea2c993..0000000
--- a/blobstats/blobstats.dart
+++ /dev/null
@@ -1,411 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-import "dart:async";
-import "dart:convert";
-import "dart:io";
-
-import 'package:args/args.dart';
-
-ArgResults argResults;
-const lz4Compression = "lz4-compression";
-const zstdCompression = "zstd-compression";
-const output = "output";
-
-class Blob {
-  String hash;
-  String sourcePath = "Unknown";
-  int size;
-  int sizeOnHost;
-  int estimatedCompressedSize;
-  int count;
-
-  int get saved {
-    return size * (count - 1);
-  }
-
-  int get proportional {
-    return size ~/ count;
-  }
-}
-
-class Package {
-  String name;
-  int size;
-  int proportional;
-  int private;
-  int blobCount;
-  Map<String, Blob> blobsByPath;
-}
-
-String pathJoin(String part1, String part2, [String part3]) {
-  var buffer = new StringBuffer();
-  buffer.write(part1);
-  if (!part1.endsWith("/")) buffer.write("/");
-  buffer.write(part2);
-  if (part3 != null) {
-    if (!part2.endsWith("/")) buffer.write("/");
-    buffer.write(part3);
-  }
-  return buffer.toString();
-}
-
-class BlobStats {
-  Directory buildDir;
-  Directory outputDir;
-  String suffix;
-  Map<String, Blob> blobsByHash = new Map<String, Blob>();
-  int duplicatedSize = 0;
-  int deduplicatedSize = 0;
-  List<File> pendingPackages = new List<File>();
-  List<Package> packages = new List<Package>();
-
-  BlobStats(
-      Directory this.buildDir, Directory this.outputDir, String this.suffix);
-
-  Future addManifest(String path) async {
-    var lines = await new File(pathJoin(buildDir.path, path)).readAsLines();
-    for (var line in lines) {
-      var parts = line.split("=");
-      var hash = parts[0];
-      var path = parts[1];
-      if (!path.startsWith("/")) {
-        path = pathJoin(buildDir.path, path);
-      }
-      var file = new File(path);
-      if (path.endsWith("meta.far")) {
-        pendingPackages.add(file);
-      }
-
-      if (suffix != null && !path.endsWith(suffix)) {
-        continue;
-      }
-
-      var stat = await file.stat();
-      if (stat.type == FileSystemEntityType.NOT_FOUND) {
-        print("$path does not exist");
-        continue;
-      }
-      var blob = blobsByHash[hash];
-      if (blob == null) {
-        var blob = new Blob();
-        blob.hash = hash;
-        blob.sizeOnHost = stat.size;
-        blob.estimatedCompressedSize =
-            await estimateCompressedBlobSize(stat.size, hash, path);
-        blob.count = 0;
-        blobsByHash[hash] = blob;
-      }
-    }
-  }
-
-  Future estimateCompressedBlobSize(int size, String hash, String path) async {
-    // TODO(smklein): This is a heuristic matching the internals of blobs.
-    // As this heuristic changes (or the compression algorithm is altered),
-    // this code must be updated.
-    const int minimumSaving = 65536;
-
-    bool compression = argResults[lz4Compression] | argResults[zstdCompression];
-    if (size > minimumSaving && compression) {
-      String tmpPath = Directory.systemTemp.path + "/compressed." + hash;
-      var compressedFile = new File(tmpPath);
-      try {
-        if (argResults[lz4Compression]) {
-          var result = await Process.run("lz4", ["-1", path, tmpPath]);
-          if (result.exitCode > 0) {
-            print("Could not compress $path");
-            return size;
-          }
-        } else if (argResults[zstdCompression]) {
-          var result = await Process.run("zstd", [path, "-o", tmpPath]);
-          if (result.exitCode > 0) {
-            print("Could not compress $path");
-            return size;
-          }
-        } else {
-          print("Bad compression algorithm");
-        }
-        var stat = await compressedFile.stat();
-        if (stat.type == FileSystemEntityType.NOT_FOUND) {
-          print("Could not compress $path");
-          return size;
-        }
-        if (stat.size < size - minimumSaving) {
-          return stat.size;
-        }
-      } finally {
-        await compressedFile.delete();
-      }
-    }
-    return size; // No compression
-  }
-
-  Future addBlobSizes(String path) async {
-    var lines = await new File(pathJoin(buildDir.path, path)).readAsLines();
-    for (var line in lines) {
-      var parts = line.split("=");
-      var hash = parts[0];
-      var blob = blobsByHash[hash];
-      blob.size = int.parse(parts[1]);
-    }
-  }
-
-  void printBlobList(List<Blob> blobs, int limit) {
-    print("     Size Share      Prop     Saved Hash");
-    var n = 0;
-    for (var blob in blobs) {
-      if (n++ > limit) return;
-
-      var sb = new StringBuffer();
-      sb.write(blob.size.toString().padLeft(9));
-      sb.write(" ");
-      sb.write(blob.count.toString().padLeft(5));
-      sb.write(" ");
-      sb.write(blob.proportional.toString().padLeft(9));
-      sb.write(" ");
-      sb.write(blob.saved.toString().padLeft(9));
-      sb.write(" ");
-      sb.write(blob.sourcePath);
-      print(sb);
-    }
-  }
-
-  void printBlobs(int limit) {
-    var blobs = blobsByHash.values.toList();
-    print("Top blobs by size ($limit of ${blobs.length})");
-    blobs.sort((a, b) => b.size.compareTo(a.size));
-    printBlobList(blobs, limit);
-
-    blobs.removeWhere((blob) => blob.count == 1);
-
-    print("");
-    print("Top deduplicated blobs by proportional ($limit of ${blobs.length})");
-    blobs.sort((a, b) => b.proportional.compareTo(a.proportional));
-    printBlobList(blobs, limit);
-
-    print("");
-    print("Top deduplicated blobs by saved ($limit of ${blobs.length})");
-    blobs.sort((a, b) => b.saved.compareTo(a.saved));
-    printBlobList(blobs, limit);
-
-    var percent = (duplicatedSize - deduplicatedSize) * 100 ~/ duplicatedSize;
-
-    print("");
-    print("Total savings from deduplication:");
-    print("   $percent% $deduplicatedSize / $duplicatedSize");
-  }
-
-  String metaFarToBlobsJson(String farPath) {
-    // Assumes details of //build/package.gni, namely that it generates
-    //   <build-dir>/.../<package>.meta/meta.far
-    // and puts a blobs.json file into
-    //   <build-dir>/.../<package>.meta/blobs.json
-    if (!farPath.endsWith(".meta/meta.far")) {
-      throw "Build details have changed";
-    }
-    return farPath.substring(0, farPath.length - "meta.far".length) +
-        "blobs.json";
-  }
-
-  Future computePackagesInParallel(int jobs) async {
-    var tasks = new List<Future>();
-    for (var i = 0; i < jobs; i++) {
-      tasks.add(computePackages());
-    }
-    await Future.wait(tasks);
-  }
-
-  Future computePackages() async {
-    while (!pendingPackages.isEmpty) {
-      File far = pendingPackages.removeLast();
-
-      var package = new Package();
-      package.name = far.path.substring(buildDir.path.length);
-      package.size = 0;
-      package.proportional = 0;
-      package.private = 0;
-      package.blobCount = 0;
-      package.blobsByPath = new Map<String, Blob>();
-
-      var blobs = json
-          .decode(await new File(metaFarToBlobsJson(far.path)).readAsString());
-
-      for (var blob in blobs) {
-        var hash = blob["merkle"];
-        var path = blob["path"];
-        var b = blobsByHash[hash];
-        if (b == null) {
-          print(
-              "$path $hash is in a package manifest but not the final manifest");
-          continue;
-        }
-        b.count++;
-        var sourcePath = blob["source_path"];
-        // If the source_path looks like <something>/blobs/<merkle>, it is from a prebuilt package and has no
-        // meaningful source. Instead, use the path within the package as its identifier.
-        if (sourcePath.endsWith("/blobs/$hash")) {
-          sourcePath = path;
-        }
-        // We may see the same blob referenced from different packages with different source paths.
-        // If all references agree with each other, use that.
-        // Otherwise record the first observed path and append " *" to denote that the path is only one of many.
-        if (b.sourcePath == "Unknown") {
-          b.sourcePath = sourcePath;
-        } else if (b.sourcePath != sourcePath && !b.sourcePath.endsWith(" *")) {
-          b.sourcePath = b.sourcePath + " *";
-        }
-        package.blobsByPath[path] = b;
-      }
-
-      packages.add(package);
-    }
-  }
-
-  void computeStats() {
-    var filteredBlobs = new Map<String, Blob>();
-    blobsByHash.forEach((hash, blob) {
-      if (blob.count == 0) {
-        print(
-            "${blob.hash} is in the final manifest but not any package manifest");
-      } else {
-        filteredBlobs[hash] = blob;
-      }
-    });
-    blobsByHash = filteredBlobs;
-
-    for (var blob in blobsByHash.values) {
-      duplicatedSize += (blob.size * blob.count);
-      deduplicatedSize += blob.size;
-    }
-
-    for (var package in packages) {
-      for (var blob in package.blobsByPath.values) {
-        package.size += blob.size;
-        package.proportional += blob.proportional;
-        if (blob.count == 1) {
-          package.private += blob.size;
-        }
-        package.blobCount++;
-      }
-    }
-  }
-
-  void printPackages() {
-    packages.sort((a, b) => b.proportional.compareTo(a.proportional));
-    print("");
-    print("Packages by proportional (${packages.length})");
-    print("     Size      Prop   Private Path");
-    for (var package in packages) {
-      var sb = new StringBuffer();
-      sb.write(package.size.toString().padLeft(9));
-      sb.write(" ");
-      sb.write(package.proportional.toString().padLeft(9));
-      sb.write(" ");
-      sb.write(package.private.toString().padLeft(9));
-      sb.write(" ");
-      sb.write(package.name);
-      print(sb);
-    }
-  }
-
-  Future packagesToChromiumBinarySizeTree() async {
-    var rootTree = {};
-    rootTree["n"] = "packages";
-    rootTree["children"] = new List();
-    rootTree["k"] = "p"; // kind=path
-    for (var pkg in packages) {
-      var parts = pkg.name.split("/");
-      var pkgName = parts.length > 1 ? parts[parts.length - 2] : parts.last;
-      var pkgTree = {};
-      pkgTree["n"] = pkgName;
-      pkgTree["children"] = new List();
-      pkgTree["k"] = "p"; // kind=path
-      rootTree["children"].add(pkgTree);
-      pkg.blobsByPath.forEach((path, blob) {
-        var blobName = path; //path.split("/").last;
-        var blobTree = {};
-        blobTree["n"] = blobName;
-        blobTree["k"] = "s"; // kind=blob
-        var isUnique = blob.count == 1;
-        var isDart =
-            blobName.endsWith(".dilp") || blobName.endsWith(".aotsnapshot");
-        if (isDart) {
-          if (isUnique) {
-            blobTree["t"] = "uniDart";
-          } else {
-            blobTree["t"] = "dart"; // type=Shared Dart ("blue")
-          }
-        } else {
-          if (isUnique) {
-            blobTree["t"] = "unique";
-          } else {
-            blobTree["t"] = "?"; // type=Other ("red")
-          }
-        }
-        blobTree["c"] = blob.count;
-        blobTree["value"] = blob.proportional;
-        blobTree["originalSize"] = blob.sizeOnHost;
-        blobTree["estimatedCompressedSize"] = blob.estimatedCompressedSize;
-        pkgTree["children"].add(blobTree);
-      });
-    }
-
-    await outputDir.create(recursive: true);
-
-    var sink = new File(pathJoin(outputDir.path, "data.js")).openWrite();
-    sink.write("var tree_data=");
-    sink.write(json.encode(rootTree));
-    await sink.close();
-
-    await new Directory(pathJoin(outputDir.path, "d3")).create(recursive: true);
-    var d3Dir = pathJoin(buildDir.path, "../../scripts/third_party/d3/");
-    for (var file in ["LICENSE", "d3.js"]) {
-      await new File(d3Dir + file).copy(pathJoin(outputDir.path, "d3", file));
-    }
-    var templateDir =
-        pathJoin(buildDir.path, "../../scripts/blobstats/template/");
-    for (var file in ["index.html", "D3BlobTreeMap.js"]) {
-      await new File(templateDir + file).copy(pathJoin(outputDir.path, file));
-    }
-
-    print("");
-    print("  Wrote visualization to file://" +
-        pathJoin(outputDir.path, "index.html"));
-  }
-}
-
-Future main(List<String> args) async {
-  final parser = new ArgParser()
-    ..addFlag("help", abbr: "h", help: "give this help")
-    ..addOption(output, abbr: "o", help: "Directory to output report to")
-    ..addFlag(lz4Compression,
-        abbr: "l", defaultsTo: false, help: "Use (lz4) compressed size")
-    ..addFlag(zstdCompression,
-        abbr: "z", defaultsTo: false, help: "Use (zstd) compressed size");
-
-  argResults = parser.parse(args);
-  if (argResults["help"]) {
-    print("Usage: fx blobstats [OPTION]...\n\nOptions:\n" + parser.usage);
-    return;
-  }
-
-  var suffix;
-  if (argResults.rest.length > 0) {
-    suffix = argResults.rest[0];
-  }
-
-  var outputDir = Directory.current;
-  if (argResults[output] != null) {
-    outputDir = new Directory(argResults[output]);
-  }
-
-  var stats = new BlobStats(Directory.current, outputDir, suffix);
-  await stats.addManifest("blob.manifest");
-  await stats.addBlobSizes("blob.sizes");
-  await stats.computePackagesInParallel(Platform.numberOfProcessors);
-  stats.computeStats();
-  stats.printBlobs(40);
-  stats.printPackages();
-  await stats.packagesToChromiumBinarySizeTree();
-}
diff --git a/blobstats/blobstats.packages b/blobstats/blobstats.packages
deleted file mode 100644
index 4d3fca0..0000000
--- a/blobstats/blobstats.packages
+++ /dev/null
@@ -1,8 +0,0 @@
-# Copyright 2019 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-async:../../third_party/dart-pkg/pub/async/lib/
-convert:../../third_party/dart-pkg/pub/convert/lib/
-io:../../third_party/dart-pkg/pub/io/lib/
-args:../../third_party/dart-pkg/pub/args/lib/
diff --git a/blobstats/template/D3BlobTreeMap.js b/blobstats/template/D3BlobTreeMap.js
deleted file mode 100644
index f98d8ce..0000000
--- a/blobstats/template/D3BlobTreeMap.js
+++ /dev/null
@@ -1,884 +0,0 @@
-// Copyright 2014 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-function D3BlobTreeMap(mapWidth, mapHeight, levelsToShow) {
-  this._mapContainer = undefined;
-  this._mapWidth = mapWidth;
-  this._mapHeight = mapHeight;
-  this.boxPadding = {'l': 5, 'r': 5, 't': 20, 'b': 5};
-  this.infobox = undefined;
-  this._maskContainer = undefined;
-  this._highlightContainer = undefined;
-  // Transition in this order:
-  // 1. Exiting items go away.
-  // 2. Updated items move.
-  // 3. New items enter.
-  this._exitDuration=500;
-  this._updateDuration=500;
-  this._enterDuration=500;
-  this._firstTransition=true;
-  this._layout = undefined;
-  this._currentRoot = undefined;
-  this._currentNodes = undefined;
-  this._treeData = undefined;
-  this._maxLevelsToShow = levelsToShow;
-  this._currentMaxDepth = this._maxLevelsToShow;
-}
-
-/**
- * Make a number pretty, with comma separators.
- */
-D3BlobTreeMap._pretty = function(num) {
-  var asString = String(num);
-  var result = '';
-  var counter = 0;
-  for (var x = asString.length - 1; x >= 0; x--) {
-    counter++;
-    if (counter === 4) {
-      result = ',' + result;
-      counter = 1;
-    }
-    result = asString.charAt(x) + result;
-  }
-  return result;
-}
-
-/**
- * Express a number in terms of KiB, MiB, GiB, etc.
- * Note that these are powers of 2, not of 10.
- */
-D3BlobTreeMap._byteify = function(num) {
-  var suffix;
-  if (num >= 1024) {
-    if (num >= 1024 * 1024 * 1024) {
-      suffix = 'GiB';
-      num = num / (1024 * 1024 * 1024);
-    } else if (num >= 1024 * 1024) {
-      suffix = 'MiB';
-      num = num / (1024 * 1024);
-    } else if (num >= 1024) {
-      suffix = 'KiB';
-      num = num / 1024;
-    }
-    return num.toFixed(2) + ' ' + suffix;
-  }
-  return num + ' B';
-}
-
-D3BlobTreeMap._BLOB_TYPE_DESCRIPTIONS = {
-  'dart': 'Shared Dart',
-  '?': 'Shared Unrecognized',
-  'unique': 'Unique',
-  'uniDart': 'Unique Dart',
-};
-D3BlobTreeMap._BLOB_TYPES = [];
-for (var blob_type in D3BlobTreeMap._BLOB_TYPE_DESCRIPTIONS) {
-  D3BlobTreeMap._BLOB_TYPES.push(blob_type);
-}
-
-/**
- * Given a blob type code, look up and return a human-readable description
- * of that blob type. If the blob type does not match one of the known
- * types, the unrecognized description (corresponding to blob type '?') is
- * returned instead of null or undefined.
- */
-D3BlobTreeMap._getblobDescription = function(type) {
-  var result = D3BlobTreeMap._BLOB_TYPE_DESCRIPTIONS[type];
-  if (result === undefined) {
-    result = D3BlobTreeMap._BLOB_TYPE_DESCRIPTIONS['?'];
-  }
-  return result;
-}
-
-D3BlobTreeMap._colorArray = [
-  'rgb(148,116,204)',  // Shared Dart - Deep Purple 300
-  'rgb(255,213,79)',  // Unique - Amber 300
-  'rgb(77,208,225)',  // Shared Other - Cyan 300
-  'rgb(161,136,127)', // Unique Dart - Brown 300
-];
-
-D3BlobTreeMap._initColorMap = function() {
-  var map = {};
-  var numColors = D3BlobTreeMap._colorArray.length;
-  var count = 0;
-  for (var key in D3BlobTreeMap._BLOB_TYPE_DESCRIPTIONS) {
-    var index = count++ % numColors;
-    map[key] = d3.rgb(D3BlobTreeMap._colorArray[index]);
-  }
-  D3BlobTreeMap._colorMap = map;
-}
-D3BlobTreeMap._initColorMap();
-
-D3BlobTreeMap.getColorForType = function(type) {
-  var result = D3BlobTreeMap._colorMap[type];
-  if (result === undefined) return d3.rgb('rgb(255,255,255)');
-  return result;
-}
-
-D3BlobTreeMap.prototype.init = function() {
-  this.infobox = this._createInfoBox();
-  this._mapContainer = d3.select('body').append('div')
-      .style('position', 'relative')
-      .style('width', this._mapWidth)
-      .style('height', this._mapHeight)
-      .style('padding', 0)
-      .style('margin', 0)
-      .style('box-shadow', '5px 5px 5px #888');
-  this._layout = this._createTreeMapLayout();
-  this._setData(tree_data); // TODO: Don't use global 'tree_data'
-}
-
-/**
- * Sets the data displayed by the treemap and lays out the map.
- */
-D3BlobTreeMap.prototype._setData = function(data) {
-  this._treeData = data;
-  console.time('_crunchStats');
-  this._crunchStats(data);
-  console.timeEnd('_crunchStats');
-  this._currentRoot = this._treeData;
-  this._currentNodes = this._layout.nodes(this._currentRoot);
-  this._currentMaxDepth = this._maxLevelsToShow;
-  this._doLayout();
-}
-
-/**
- * Recursively traverses the entire tree starting from the specified node,
- * computing statistics and recording metadata as it goes. Call this method
- * only once per imported tree.
- */
-D3BlobTreeMap.prototype._crunchStats = function(node) {
-  var stack = [];
-  stack.idCounter = 0;
-  this._crunchStatsHelper(stack, node);
-}
-
-/**
- * Invoke the specified visitor function on all data elements currently shown
- * in the treemap including any and all of their children, starting at the
- * currently-displayed root and descending recursively. The function will be
- * passed the datum element representing each node. No traversal guarantees
- * are made.
- */
-D3BlobTreeMap.prototype.visitFromDisplayedRoot = function(visitor) {
-  this._visit(this._currentRoot, visitor);
-}
-
-/**
- * Helper function for visit functions.
- */
-D3BlobTreeMap.prototype._visit = function(datum, visitor) {
-  visitor.call(this, datum);
-  if (datum.children) for (var i = 0; i < datum.children.length; i++) {
-    this._visit(datum.children[i], visitor);
-  }
-}
-
-D3BlobTreeMap.prototype._crunchStatsHelper = function(stack, node) {
-  // Only overwrite the node ID if it isn't already set.
-  // This allows stats to be crunched multiple times on subsets of data
-  // without breaking the data-to-ID bindings. New nodes get new IDs.
-  if (node.id === undefined) node.id = stack.idCounter++;
-  if (node.children === undefined) {
-    // Leaf node (blob); accumulate stats.
-    for (var i = 0; i < stack.length; i++) {
-      var ancestor = stack[i];
-      if (!ancestor.blob_stats) ancestor.blob_stats = {};
-      if (ancestor.blob_stats[node.t] === undefined) {
-        // New blob type we haven't seen before, just record.
-        ancestor.blob_stats[node.t] = {'count': 1,
-                                         'size': node.value};
-      } else {
-        // Existing blob type, increment.
-        ancestor.blob_stats[node.t].count++;
-        ancestor.blob_stats[node.t].size += node.value;
-      }
-    }
-  } else for (var i = 0; i < node.children.length; i++) {
-    stack.push(node);
-    this._crunchStatsHelper(stack, node.children[i]);
-    stack.pop();
-  }
-}
-
-D3BlobTreeMap.prototype._createTreeMapLayout = function() {
-  var result = d3.layout.treemap()
-      .padding([this.boxPadding.t, this.boxPadding.r,
-                this.boxPadding.b, this.boxPadding.l])
-      .size([this._mapWidth, this._mapHeight]);
-  return result;
-}
-
-D3BlobTreeMap.prototype.resize = function(width, height) {
-  this._mapWidth = width;
-  this._mapHeight = height;
-  this._mapContainer.style('width', width).style('height', height);
-  this._layout.size([this._mapWidth, this._mapHeight]);
-  this._currentNodes = this._layout.nodes(this._currentRoot);
-  this._doLayout();
-}
-
-D3BlobTreeMap.prototype._zoomDatum = function(datum) {
-  if (this._currentRoot === datum) return; // already here
-  this._hideHighlight(datum);
-  this._hideInfoBox(datum);
-  this._currentRoot = datum;
-  this._currentNodes = this._layout.nodes(this._currentRoot);
-  this._currentMaxDepth = this._currentRoot.depth + this._maxLevelsToShow;
-  console.log('zooming into datum ' + this._currentRoot.n);
-  this._doLayout();
-}
-
-D3BlobTreeMap.prototype.setMaxLevels = function(levelsToShow) {
-  this._maxLevelsToShow = levelsToShow;
-  this._currentNodes = this._layout.nodes(this._currentRoot);
-  this._currentMaxDepth = this._currentRoot.depth + this._maxLevelsToShow;
-  console.log('setting max levels to show: ' + this._maxLevelsToShow);
-  this._doLayout();
-}
-
-/**
- * Clone the specified tree, returning an independent copy of the data.
- * Only the original attributes expected to exist prior to invoking
- * _crunchStatsHelper are retained, with the exception of the 'id' attribute
- * (which must be retained for proper transitions).
- * If the optional filter parameter is provided, it will be called with 'this'
- * set to this treemap instance and passed the 'datum' object as an argument.
- * When specified, the copy will retain only the data for which the filter
- * function returns true.
- */
-D3BlobTreeMap.prototype._clone = function(datum, filter) {
-  var trackingStats = false;
-  if (this.__cloneState === undefined) {
-    console.time('_clone');
-    trackingStats = true;
-    this.__cloneState = {'accepted': 0, 'rejected': 0,
-                         'forced': 0, 'pruned': 0};
-  }
-
-  // Must go depth-first. All parents of children that are accepted by the
-  // filter must be preserved!
-  var copy = {'n': datum.n, 'k': datum.k};
-  var childAccepted = false;
-  if (datum.children !== undefined) {
-    for (var i = 0; i < datum.children.length; i++) {
-      var copiedChild = this._clone(datum.children[i], filter);
-      if (copiedChild !== undefined) {
-        childAccepted = true; // parent must also be accepted.
-        if (copy.children === undefined) copy.children = [];
-        copy.children.push(copiedChild);
-      }
-    }
-  }
-
-  // Ignore nodes that don't match the filter, when present.
-  var accept = false;
-  if (childAccepted) {
-    // Parent of an accepted child must also be accepted.
-    this.__cloneState.forced++;
-    accept = true;
-  } else if (filter !== undefined && filter.call(this, datum) !== true) {
-    this.__cloneState.rejected++;
-  } else if (datum.children === undefined) {
-    // Accept leaf nodes that passed the filter
-    this.__cloneState.accepted++;
-    accept = true;
-  } else {
-    // Non-leaf node. If no children are accepted, prune it.
-    this.__cloneState.pruned++;
-  }
-
-  if (accept) {
-    if (datum.id !== undefined) copy.id = datum.id;
-    if (datum.lastPathElement !== undefined) {
-      copy.lastPathElement = datum.lastPathElement;
-    }
-    if (datum.t !== undefined) copy.t = datum.t;
-    if (datum.value !== undefined && datum.children === undefined) {
-      copy.value = datum.value;
-    }
-  } else {
-    // Discard the copy we were going to return
-    copy = undefined;
-  }
-
-  if (trackingStats === true) {
-    // We are the first call in the recursive chain.
-    console.timeEnd('_clone');
-    var totalAccepted = this.__cloneState.accepted +
-                        this.__cloneState.forced;
-    console.log(
-        totalAccepted + ' nodes retained (' +
-        this.__cloneState.forced + ' forced by accepted children, ' +
-        this.__cloneState.accepted + ' accepted on their own merits), ' +
-        this.__cloneState.rejected + ' nodes (and their children) ' +
-                                     'filtered out, ' +
-        this.__cloneState.pruned + ' nodes pruned because no ' +
-                                   'children remained.');
-    delete this.__cloneState;
-  }
-  return copy;
-}
-
-D3BlobTreeMap.prototype.filter = function(filter) {
-  // Ensure we have a copy of the original root.
-  if (this._backupTree === undefined) this._backupTree = this._treeData;
-  this._mapContainer.selectAll('div').remove();
-  this._setData(this._clone(this._backupTree, filter));
-}
-
-D3BlobTreeMap.prototype._doLayout = function() {
-  console.time('_doLayout');
-  this._handleInodes();
-  this._handleLeaves();
-  this._firstTransition = false;
-  console.timeEnd('_doLayout');
-}
-
-D3BlobTreeMap.prototype._highlightElement = function(datum, selection) {
-  this._showHighlight(datum, selection);
-}
-
-D3BlobTreeMap.prototype._unhighlightElement = function(datum, selection) {
-  this._hideHighlight(datum, selection);
-}
-
-D3BlobTreeMap.prototype._handleInodes = function() {
-  console.time('_handleInodes');
-  var thisTreeMap = this;
-  var inodes = this._currentNodes.filter(function(datum){
-    return (datum.depth <= thisTreeMap._currentMaxDepth) &&
-            datum.children !== undefined;
-  });
-  var cellsEnter = this._mapContainer.selectAll('div.inode')
-      .data(inodes, function(datum) { return datum.id; })
-      .enter()
-      .append('div').attr('class', 'inode').attr('id', function(datum){
-          return 'node-' + datum.id;});
-
-  // Define enter/update/exit for inodes
-  cellsEnter
-      .append('div')
-      .attr('class', 'rect inode_rect_entering')
-      .style('z-index', function(datum) { return datum.id * 2; })
-      .style('position', 'absolute')
-      .style('left', function(datum) { return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum){ return datum.dx; })
-      .style('height', function(datum){ return datum.dy; })
-      .style('opacity', '0')
-      .style('border', '1px solid black')
-      .style('background-image', function(datum) {
-        return thisTreeMap._makeblobBucketBackgroundImage.call(
-               thisTreeMap, datum);
-      })
-      .style('background-color', function(datum) {
-        if (datum.t === undefined) return 'rgb(220,220,220)';
-        return D3BlobTreeMap.getColorForType(datum.t).toString();
-      })
-      .on('mouseover', function(datum){
-        thisTreeMap._highlightElement.call(
-            thisTreeMap, datum, d3.select(this));
-        thisTreeMap._showInfoBox.call(thisTreeMap, datum);
-      })
-      .on('mouseout', function(datum){
-        thisTreeMap._unhighlightElement.call(
-            thisTreeMap, datum, d3.select(this));
-        thisTreeMap._hideInfoBox.call(thisTreeMap, datum);
-      })
-      .on('mousemove', function(){
-          thisTreeMap._moveInfoBox.call(thisTreeMap, event);
-      })
-      .on('dblclick', function(datum){
-        if (datum !== thisTreeMap._currentRoot) {
-          // Zoom into the selection
-          thisTreeMap._zoomDatum(datum);
-        } else if (datum.parent) {
-          console.log('event.shiftKey=' + event.shiftKey);
-          if (event.shiftKey === true) {
-            // Back to root
-            thisTreeMap._zoomDatum(thisTreeMap._treeData);
-          } else {
-            // Zoom out of the selection
-            thisTreeMap._zoomDatum(datum.parent);
-          }
-        }
-      });
-  cellsEnter
-      .append('div')
-      .attr('class', 'label inode_label_entering')
-      .style('z-index', function(datum) { return (datum.id * 2) + 1; })
-      .style('position', 'absolute')
-      .style('left', function(datum){ return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum) { return datum.dx; })
-      .style('height', function(datum) { return thisTreeMap.boxPadding.t; })
-      .style('opacity', '0')
-      .style('pointer-events', 'none')
-      .style('-webkit-user-select', 'none')
-      .style('overflow', 'hidden') // required for ellipsis
-      .style('white-space', 'nowrap') // required for ellipsis
-      .style('text-overflow', 'ellipsis')
-      .style('text-align', 'center')
-      .style('vertical-align', 'top')
-      .style('visibility', function(datum) {
-        return (datum.dx < 15 || datum.dy < 15) ? 'hidden' : 'visible';
-      })
-      .text(function(datum) {
-        var sizeish = ' [' + D3BlobTreeMap._byteify(datum.value) + ']';
-        var text;
-        if (datum === thisTreeMap._currentRoot) {
-          // The top-most level should always show the complete path
-          text = thisTreeMap.pathFor(datum);
-        } else {
-          // Anything that isn't a bucket or a leaf (blob) or the
-          // current root should just show its name.
-          text = datum.n;
-        }
-        return text + sizeish;
-      }
-  );
-
-  // Complicated transition logic:
-  // For nodes that are entering, we want to fade them in in-place AFTER
-  // any adjusting nodes have resized and moved around. That way, new nodes
-  // seamlessly appear in the right spot after their containers have resized
-  // and moved around.
-  // To do this we do some trickery:
-  // 1. Define a '_entering' class on the entering elements
-  // 2. Use this to select only the entering elements and apply the opacity
-  //    transition.
-  // 3. Use the same transition to drop the '_entering' suffix, so that they
-  //    will correctly update in later zoom/resize/whatever operations.
-  // 4. The update transition is achieved by selecting the elements without
-  //    the '_entering' suffix and applying movement and resizing transition
-  //    effects.
-  this._mapContainer.selectAll('div.inode_rect_entering').transition()
-      .duration(thisTreeMap._enterDuration).delay(
-          this._firstTransition ? 0 : thisTreeMap._exitDuration +
-              thisTreeMap._updateDuration)
-      .attr('class', 'rect inode_rect')
-      .style('opacity', '1');
-  this._mapContainer.selectAll('div.inode_label_entering').transition()
-      .duration(thisTreeMap._enterDuration).delay(
-          this._firstTransition ? 0 : thisTreeMap._exitDuration +
-              thisTreeMap._updateDuration)
-      .attr('class', 'label inode_label')
-      .style('opacity', '1');
-  this._mapContainer.selectAll('div.inode_rect').transition()
-      .duration(thisTreeMap._updateDuration).delay(thisTreeMap._exitDuration)
-      .style('opacity', '1')
-      .style('background-image', function(datum) {
-        return thisTreeMap._makeblobBucketBackgroundImage.call(
-            thisTreeMap, datum);
-      })
-      .style('left', function(datum) { return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum){ return datum.dx; })
-      .style('height', function(datum){ return datum.dy; });
-  this._mapContainer.selectAll('div.inode_label').transition()
-      .duration(thisTreeMap._updateDuration).delay(thisTreeMap._exitDuration)
-      .style('opacity', '1')
-      .style('visibility', function(datum) {
-        return (datum.dx < 15 || datum.dy < 15) ? 'hidden' : 'visible';
-      })
-      .style('left', function(datum){ return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum) { return datum.dx; })
-      .style('height', function(datum) { return thisTreeMap.boxPadding.t; })
-      .text(function(datum) {
-        var sizeish = ' [' + D3BlobTreeMap._byteify(datum.value) + ']';
-        var text;
-        if (datum === thisTreeMap._currentRoot) {
-          // The top-most level should always show the complete path
-          text = thisTreeMap.pathFor(datum);
-        } else {
-          // Anything that isn't a bucket or a leaf (blob) or the
-          // current root should just show its name.
-          text = datum.n;
-        }
-        return text + sizeish;
-      });
-  var exit = this._mapContainer.selectAll('div.inode')
-      .data(inodes, function(datum) { return 'inode-' + datum.id; })
-      .exit();
-  exit.selectAll('div.inode_rect').transition().duration(
-      thisTreeMap._exitDuration).style('opacity', 0);
-  exit.selectAll('div.inode_label').transition().duration(
-      thisTreeMap._exitDuration).style('opacity', 0);
-  exit.transition().delay(thisTreeMap._exitDuration + 1).remove();
-
-  console.log(inodes.length + ' inodes laid out.');
-  console.timeEnd('_handleInodes');
-}
-
-D3BlobTreeMap.prototype._handleLeaves = function() {
-  console.time('_handleLeaves');
-  var color_fn = d3.scale.category10();
-  var thisTreeMap = this;
-  var leaves = this._currentNodes.filter(function(datum){
-    return (datum.depth <= thisTreeMap._currentMaxDepth) &&
-        datum.children === undefined; });
-  var cellsEnter = this._mapContainer.selectAll('div.leaf')
-      .data(leaves, function(datum) { return datum.id; })
-      .enter()
-      .append('div').attr('class', 'leaf').attr('id', function(datum){
-        return 'node-' + datum.id;
-      });
-
-  // Define enter/update/exit for leaves
-  cellsEnter
-      .append('div')
-      .attr('class', 'rect leaf_rect_entering')
-      .style('z-index', function(datum) { return datum.id * 2; })
-      .style('position', 'absolute')
-      .style('left', function(datum){ return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum){ return datum.dx; })
-      .style('height', function(datum){ return datum.dy; })
-      .style('opacity', '0')
-      .style('background-color', function(datum) {
-        if (datum.t === undefined) return 'rgb(220,220,220)';
-        return D3BlobTreeMap.getColorForType(datum.t)
-            .darker(0.3).toString();
-      })
-      .style('border', '1px solid black')
-      .on('mouseover', function(datum){
-        thisTreeMap._highlightElement.call(
-            thisTreeMap, datum, d3.select(this));
-        thisTreeMap._showInfoBox.call(thisTreeMap, datum);
-      })
-      .on('mouseout', function(datum){
-        thisTreeMap._unhighlightElement.call(
-            thisTreeMap, datum, d3.select(this));
-        thisTreeMap._hideInfoBox.call(thisTreeMap, datum);
-      })
-      .on('mousemove', function(){ thisTreeMap._moveInfoBox.call(
-        thisTreeMap, event);
-      });
-  cellsEnter
-      .append('div')
-      .attr('class', 'label leaf_label_entering')
-      .style('z-index', function(datum) { return (datum.id * 2) + 1; })
-      .style('position', 'absolute')
-      .style('left', function(datum){ return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum) { return datum.dx; })
-      .style('height', function(datum) { return datum.dy; })
-      .style('opacity', '0')
-      .style('pointer-events', 'none')
-      .style('-webkit-user-select', 'none')
-      .style('overflow', 'hidden') // required for ellipsis
-      .style('white-space', 'nowrap') // required for ellipsis
-      .style('text-overflow', 'ellipsis')
-      .style('text-align', 'center')
-      .style('vertical-align', 'middle')
-      .style('visibility', function(datum) {
-        return (datum.dx < 15 || datum.dy < 15) ? 'hidden' : 'visible';
-      })
-      .text(function(datum) { return datum.n; });
-
-  // Complicated transition logic: See note in _handleInodes()
-  this._mapContainer.selectAll('div.leaf_rect_entering').transition()
-      .duration(thisTreeMap._enterDuration).delay(
-          this._firstTransition ? 0 : thisTreeMap._exitDuration +
-              thisTreeMap._updateDuration)
-      .attr('class', 'rect leaf_rect')
-      .style('opacity', '1');
-  this._mapContainer.selectAll('div.leaf_label_entering').transition()
-      .duration(thisTreeMap._enterDuration).delay(
-          this._firstTransition ? 0 : thisTreeMap._exitDuration +
-              thisTreeMap._updateDuration)
-      .attr('class', 'label leaf_label')
-      .style('opacity', '1');
-  this._mapContainer.selectAll('div.leaf_rect').transition()
-      .duration(thisTreeMap._updateDuration).delay(thisTreeMap._exitDuration)
-      .style('opacity', '1')
-      .style('left', function(datum){ return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum){ return datum.dx; })
-      .style('height', function(datum){ return datum.dy; });
-  this._mapContainer.selectAll('div.leaf_label').transition()
-      .duration(thisTreeMap._updateDuration).delay(thisTreeMap._exitDuration)
-      .style('opacity', '1')
-      .style('visibility', function(datum) {
-          return (datum.dx < 15 || datum.dy < 15) ? 'hidden' : 'visible';
-      })
-      .style('left', function(datum){ return datum.x; })
-      .style('top', function(datum){ return datum.y; })
-      .style('width', function(datum) { return datum.dx; })
-      .style('height', function(datum) { return datum.dy; });
-  var exit = this._mapContainer.selectAll('div.leaf')
-      .data(leaves, function(datum) { return 'leaf-' + datum.id; })
-      .exit();
-  exit.selectAll('div.leaf_rect').transition()
-      .duration(thisTreeMap._exitDuration)
-      .style('opacity', 0);
-  exit.selectAll('div.leaf_label').transition()
-      .duration(thisTreeMap._exitDuration)
-      .style('opacity', 0);
-  exit.transition().delay(thisTreeMap._exitDuration + 1).remove();
-
-  console.log(leaves.length + ' leaves laid out.');
-  console.timeEnd('_handleLeaves');
-}
-
-D3BlobTreeMap.prototype._makeblobBucketBackgroundImage = function(datum) {
-  if (!(datum.t === undefined && datum.depth === this._currentMaxDepth)) {
-    return 'none';
-  }
-  var text = '';
-  var lastStop = 0;
-  for (var x = 0; x < D3BlobTreeMap._BLOB_TYPES.length; x++) {
-    var blob_type = D3BlobTreeMap._BLOB_TYPES[x];
-    var stats = datum.blob_stats[blob_type];
-    if (stats !== undefined) {
-      if (text.length !== 0) {
-        text += ', ';
-      }
-      var percent = 100 * (stats.size / datum.value);
-      var nowStop = lastStop + percent;
-      var tempcolor = D3BlobTreeMap.getColorForType(blob_type);
-      var color = d3.rgb(tempcolor).toString();
-      text += color + ' ' + lastStop + '%, ' + color + ' ' +
-          nowStop + '%';
-      lastStop = nowStop;
-    }
-  }
-  return 'linear-gradient(' + (datum.dx > datum.dy ? 'to right' :
-                               'to bottom') + ', ' + text + ')';
-}
-
-D3BlobTreeMap.prototype.pathFor = function(datum) {
-  if (datum.__path) return datum.__path;
-  var parts = [];
-  var node = datum;
-  while (node) {
-    if (node.k === 'p') { // path node
-      if (node.n !== '/') parts.unshift(node.n);
-    }
-    node = node.parent;
-  }
-  datum.__path = '/' + parts.join('/');
-  return datum.__path;
-}
-
-D3BlobTreeMap.prototype._createHighlight = function(datum, selection) {
-  var x = parseInt(selection.style('left'));
-  var y = parseInt(selection.style('top'));
-  var w = parseInt(selection.style('width'));
-  var h = parseInt(selection.style('height'));
-  datum.highlight = this._mapContainer.append('div')
-      .attr('id', 'h-' + datum.id)
-      .attr('class', 'highlight')
-      .style('pointer-events', 'none')
-      .style('-webkit-user-select', 'none')
-      .style('z-index', '999999')
-      .style('position', 'absolute')
-      .style('top', y-2)
-      .style('left', x-2)
-      .style('width', w+4)
-      .style('height', h+4)
-      .style('margin', 0)
-      .style('padding', 0)
-      .style('border', '4px outset rgba(250,40,200,0.9)')
-      .style('box-sizing', 'border-box')
-      .style('opacity', 0.0);
-}
-
-D3BlobTreeMap.prototype._showHighlight = function(datum, selection) {
-  if (datum === this._currentRoot) return;
-  if (datum.highlight === undefined) {
-    this._createHighlight(datum, selection);
-  }
-  datum.highlight.transition().duration(200).style('opacity', 1.0);
-}
-
-D3BlobTreeMap.prototype._hideHighlight = function(datum, selection) {
-  if (datum.highlight === undefined) return;
-  datum.highlight.transition().duration(750)
-      .style('opacity', 0)
-      .each('end', function(){
-        if (datum.highlight) datum.highlight.remove();
-        delete datum.highlight;
-      });
-}
-
-D3BlobTreeMap.prototype._createInfoBox = function() {
-  return d3.select('body')
-      .append('div')
-      .attr('id', 'infobox')
-      .style('z-index', '2147483647') // (2^31) - 1: Hopefully safe :)
-      .style('position', 'absolute')
-      .style('visibility', 'hidden')
-      .style('background-color', 'rgba(255,255,255, 0.9)')
-      .style('border', '1px solid black')
-      .style('padding', '10px')
-      .style('-webkit-user-select', 'none')
-      .style('box-shadow', '3px 3px rgba(70,70,70,0.5)')
-      .style('border-radius', '10px')
-      .style('white-space', 'nowrap');
-}
-
-D3BlobTreeMap.prototype._showInfoBox = function(datum) {
-  this.infobox.text('');
-  var numBlobs = 0;
-  var sizeish = D3BlobTreeMap._pretty(datum.value) + ' bytes (' +
-      D3BlobTreeMap._byteify(datum.value) + ')';
-  if (datum.k === 'p') { // path
-    if (datum.blob_stats) { // can be empty if filters are applied
-      for (var x = 0; x < D3BlobTreeMap._BLOB_TYPES.length; x++) {
-        var blob_type = D3BlobTreeMap._BLOB_TYPES[x];
-        var stats = datum.blob_stats[blob_type];
-        if (stats !== undefined) numBlobs += stats.count;
-      }
-    }
-  } else if (datum.k === 's') { // blob
-    numBlobs = 1;
-  }
-
-  if (datum.k === 'p' && !datum.lastPathElement) {
-    this.infobox.append('div').text('Package: ' + this.pathFor(datum));
-    this.infobox.append('div').text('Size: ' + sizeish);
-  } else {
-    if (datum.k === 'p') { // path
-      this.infobox.append('div').text('File: ' + this.pathFor(datum));
-      this.infobox.append('div').text('Size: ' + sizeish);
-    } else if (datum.k === 's') { // blob
-      this.infobox.append('div').text('Blob: ' + datum.n);
-      this.infobox.append('div').text('Type: ' +
-          D3BlobTreeMap._getblobDescription(datum.t));
-      this.infobox.append('div').text('Size (proportional): ' + sizeish);
-      this.infobox.append('div').text('Reference count: ' + datum.c);
-      this.infobox.append('div').text('Location: ' + this.pathFor(datum));
-    }
-  }
-  if (datum.k === 'p') {
-    this.infobox.append('div')
-        .text('Number of blobs: ' + D3BlobTreeMap._pretty(numBlobs));
-    if (datum.blob_stats) { // can be empty if filters are applied
-      var table = this.infobox.append('table')
-          .attr('border', 1).append('tbody');
-      var header = table.append('tr');
-      header.append('th').text('Type');
-      header.append('th').text('Count');
-      header.append('th')
-          .style('white-space', 'nowrap')
-          .text('Total Size (Bytes)');
-      for (var x = 0; x < D3BlobTreeMap._BLOB_TYPES.length; x++) {
-        var blob_type = D3BlobTreeMap._BLOB_TYPES[x];
-        var stats = datum.blob_stats[blob_type];
-        if (stats !== undefined) {
-          var tr = table.append('tr');
-          tr.append('td')
-              .style('white-space', 'nowrap')
-              .text(D3BlobTreeMap._getblobDescription(
-                  blob_type));
-          tr.append('td').text(D3BlobTreeMap._pretty(stats.count));
-          tr.append('td').text(D3BlobTreeMap._pretty(stats.size));
-        }
-      }
-    }
-  }
-  this.infobox.style('visibility', 'visible');
-}
-
-D3BlobTreeMap.prototype._hideInfoBox = function(datum) {
-  this.infobox.style('visibility', 'hidden');
-}
-
-D3BlobTreeMap.prototype._moveInfoBox = function(event) {
-  var element = document.getElementById('infobox');
-  var w = element.offsetWidth;
-  var h = element.offsetHeight;
-  var offsetLeft = 10;
-  var offsetTop = 10;
-
-  var rightLimit = window.innerWidth;
-  var rightEdge = event.pageX + offsetLeft + w;
-  if (rightEdge > rightLimit) {
-    // Too close to screen edge, reflect around the cursor
-    offsetLeft = -1 * (w + offsetLeft);
-  }
-
-  var bottomLimit = window.innerHeight;
-  var bottomEdge = event.pageY + offsetTop + h;
-  if (bottomEdge > bottomLimit) {
-    // Too close to screen edge, reflect around the cursor
-    offsetTop = -1 * (h + offsetTop);
-  }
-
-  this.infobox.style('top', (event.pageY + offsetTop) + 'px')
-      .style('left', (event.pageX + offsetLeft) + 'px');
-}
-
-D3BlobTreeMap.prototype.biggestblobs = function(maxRecords) {
-  var result = undefined;
-  var smallest = undefined;
-  var sortFunction = function(a,b) {
-    var result = b.value - a.value;
-    if (result !== 0) return result; // sort by size
-    var pathA = treemap.pathFor(a); // sort by path
-    var pathB = treemap.pathFor(b);
-    if (pathA > pathB) return 1;
-    if (pathB > pathA) return -1;
-    return a.n > b.n ? 1 : (a.n < b.n ? -1 : 0); // sort by blob name
-  };
-  this.visitFromDisplayedRoot(function(datum) {
-    if (datum.children) return; // ignore non-leaves
-    if (!result) { // first element
-      result = [datum];
-      smallest = datum.value;
-      return;
-    }
-    if (result.length < maxRecords) { // filling the array
-      result.push(datum);
-      return;
-    }
-    if (datum.value > smallest) { // array is already full
-      result.push(datum);
-      result.sort(sortFunction);
-      result.pop(); // get rid of smallest element
-      smallest = result[maxRecords - 1].value; // new threshold for entry
-    }
-  });
-  result.sort(sortFunction);
-  return result;
-}
-
-D3BlobTreeMap.prototype.biggestPaths = function(maxRecords) {
-  var result = undefined;
-  var smallest = undefined;
-  var sortFunction = function(a,b) {
-    var result = b.value - a.value;
-    if (result !== 0) return result; // sort by size
-    var pathA = treemap.pathFor(a); // sort by path
-    var pathB = treemap.pathFor(b);
-    if (pathA > pathB) return 1;
-    if (pathB > pathA) return -1;
-    console.log('warning, multiple entries for the same path: ' + pathA);
-    return 0; // should be impossible
-  };
-  this.visitFromDisplayedRoot(function(datum) {
-    if (!datum.lastPathElement) return; // ignore non-files
-    if (!result) { // first element
-      result = [datum];
-      smallest = datum.value;
-      return;
-    }
-    if (result.length < maxRecords) { // filling the array
-      result.push(datum);
-      return;
-    }
-    if (datum.value > smallest) { // array is already full
-      result.push(datum);
-      result.sort(sortFunction);
-      result.pop(); // get rid of smallest element
-      smallest = result[maxRecords - 1].value; // new threshold for entry
-    }
-  });
-  result.sort(sortFunction);
-  return result;
-}
diff --git a/blobstats/template/index.html b/blobstats/template/index.html
deleted file mode 100644
index 7331446..0000000
--- a/blobstats/template/index.html
+++ /dev/null
@@ -1,453 +0,0 @@
-<!--
-  Copyright 2014 The Fuchsia Authors. All rights reserved.
-  Use of this source code is governed by a BSD-style license that can be
-  found in the LICENSE file.
--->
-<html>
-<head>
-<title>Blob Size Analysis</title>
-<script src="d3/d3.js" charset="utf-8"></script>
-<script src="D3BlobTreeMap.js" charset="utf-8"></script>
-<script src="data.js" charset="utf-8"></script>
-<style>
-body {
-    margin: 0px;
-    padding: 5px;
-}
-.swatch {
-    border: 1px solid rgb(100,100,100);
-    -webkit-user-select: none;
-    cursor: default;
-}
-</style>
-<script>
-var treemap;
-var filterChanging = false;
-var savedSettings = {};
-
-function init() {
-    if (window.metadata !== undefined && window.metadata.subtitle) {
-        document.getElementById('subtitle').innerHTML = ': ' + escape(metadata.subtitle);
-    }
-    initFilterOptions();
-    treemap = new D3BlobTreeMap(
-        savedSettings.width,
-        savedSettings.height,
-        savedSettings.maxLevels);
-    treemap.init();
-}
-
-function getIdealSizes() {
-    var width = window.innerWidth - 20;
-    var height = window.innerHeight - 70;
-    return {'width': width, 'height': height};
-}
-
-function showReport(title, data, headers, dataFunction, styleFunction) {
-    var div =  d3.select('body').append('div')
-        .style('margin', '0')
-        .style('padding', '5px')
-        .style('position', 'absolute')
-        .style('top', '10%')
-        .style('left', '10%')
-        .style('background-color', 'rgba(255,255,255,0.9)')
-        .style('width', '80%')
-        .style('height', '80%')
-        .style('z-index', '2147483647')
-        .style('border', '3px ridge grey')
-        .style('box-shadow', '10px 10px 5px rgba(80,80,80,0.7)')
-        .style('text-align', 'center')
-        .style('border-radius', '10px');
-    var titlebar = div.append('div')
-        .style('margin', '0')
-        .style('padding', '5px')
-        .style('position', 'absolute')
-        .style('top', '0%')
-        .style('left', '0%')
-        .style('width', '100%')
-        .style('height', '10%')
-        .style('font-size', 'x-large');
-    titlebar.text(title);
-    var controls = div.append('div')
-        .style('margin', '0')
-        .style('padding', '5px')
-        .style('position', 'absolute')
-        .style('top', '90%')
-        .style('left', '0%')
-        .style('width', '100%')
-        .style('height', '10%');
-    controls.append('input').attr('type', 'button')
-        .attr('value', 'Dismiss')
-        .on('click', function(){div.remove();});
-
-    var tableDiv = div.append('div')
-        .style('overflow', 'auto')
-        .style('position', 'absolute')
-        .style('top', '10%')
-        .style('left', '0%')
-        .style('width', '100%')
-        .style('height', '80%')
-        .style('border-top', '1px solid rgb(230,230,230)')
-        .style('border-bottom', '1px solid rgb(230,230,230)');
-    var table = tableDiv.append('table')
-        .attr('border', '1')
-        .attr('cellspacing', '0')
-        .attr('cellpadding', '2')
-        .style('margin-left', 'auto')
-        .style('margin-right', 'auto');
-    var header = table.append('tr');
-    for (var i = 0; i < headers.length; i++) {
-        header.append('th').text(headers[i]);
-    }
-
-    for (var i = 0; i < data.length; i++) {
-        var row = table.append('tr');
-        for (var j = 0; j < headers.length; j++) {
-            var td = row.append('td');
-            if (styleFunction) {
-                styleFunction.call(this, td, j);
-            }
-            dataFunction.call(this, data[i], j, td);
-        }
-    }
-}
-
-function bigBlobsReport() {
-    var list = treemap.biggestblobs(100);
-    var headers = ['Rank', 'Size (Bytes)', 'Type', 'Location'];
-    var styleFunction = function(selection, index) {
-        if (index === 3) {
-            selection.style('font-family', 'monospace');
-        }
-    };
-    var recordIndex = 1;
-    var dataFunction = function(record, index, cell) {
-        if (index === 0) {
-            cell.text(recordIndex++);
-        } else if (index === 1) {
-            cell.text(D3BlobTreeMap._pretty(record.value));
-        } else if (index === 2) {
-            cell.text(record.t);
-        } else {
-            cell.append('span').text(treemap.pathFor(record));
-            cell.append('br');
-            cell.append('span').text('Blob: ');
-            cell.append('span').text(record.n);
-        }
-    };
-    showReport('100 Largest Blobs', list, headers, dataFunction, styleFunction);
-}
-
-function blobFilterTextChanged() {
-    if (filterChanging) return true;
-    filterChanging = true;
-    var enabled = document.getElementById('blob_types_filter').value;
-    for (var x=0; x<D3BlobTreeMap._BLOB_TYPES.length; x++) {
-        var checkBox = document.getElementById('check_' + x);
-        checkBox.checked = (enabled.indexOf(checkBox.value) != -1);
-    }
-    filterChanging = false;
-}
-
-function updateFilterText() {
-    if (filterChanging) return true;
-    filterChanging = true;
-    var text = '';
-    for (var x=0; x<D3BlobTreeMap._BLOB_TYPES.length; x++) {
-        var checkBox = document.getElementById('check_' + x);
-        if (checkBox.checked) {
-            text += checkBox.value;
-        }
-    }
-    document.getElementById('blob_types_filter').value=text;
-    filterChanging = false;
-}
-
-function initFilterOptions() {
-    updateFilterText();
-    for (var x=0; x<D3BlobTreeMap._BLOB_TYPES.length; x++) {
-        var checkBox = document.getElementById('check_' + x);
-        checkBox.onchange=updateFilterText;
-        var swatch = document.getElementById('swatch_' + x);
-        swatch.style.backgroundColor = D3BlobTreeMap.getColorForType(checkBox.value).toString();
-    }
-    var gteCheckbox = document.getElementById('check_gte');
-    gteCheckbox.onchange = function() {
-        document.getElementById('blob_filter_gte').disabled = !gteCheckbox.checked;
-    }
-    var regexCheckbox = document.getElementById('check_regex');
-    regexCheckbox.onchange = function() {
-        document.getElementById('blob_filter_regex').disabled = !regexCheckbox.checked;
-    }
-    var excludeRegexCheckbox = document.getElementById('check_exclude_regex');
-    excludeRegexCheckbox.onchange = function() {
-        document.getElementById('blob_filter_exclude_regex').disabled = !excludeRegexCheckbox.checked;
-    }
-    var idealSizes = getIdealSizes();
-    document.getElementById('width').value = idealSizes.width;
-    document.getElementById('height').value = idealSizes.height;
-    saveFilterSettings();
-}
-
-function filterSetAll(enabled) {
-    for (var x=0; x<D3BlobTreeMap._BLOB_TYPES.length; x++) {
-        var checkBox = document.getElementById('check_' + x);
-        checkBox.checked = enabled;
-    }
-    updateFilterText();
-}
-
-function showOptions() {
-    loadFilterSettings();
-    var container = document.getElementById('options_container');
-    var w = container.offsetWidth;
-    var h = container.offsetHeight;
-    container.style.margin = '-' + (h/2) + 'px 0 0 -' + (w/2) + 'px';
-    container.style.visibility = 'visible';
-}
-
-function hideOptions() {
-    var container = document.getElementById('options_container');
-    container.style.visibility = 'hidden';
-}
-
-function applySettings() {
-    hideOptions();
-    var oldWidth = savedSettings.width;
-    var oldHeight = savedSettings.height;
-    var oldblobs = savedSettings.blobTypes;
-    var oldRegex = savedSettings.regex;
-    var oldExcludeRegex = savedSettings.excludeRegex;
-    var oldGte = savedSettings.gte;
-    var oldMaxLevels = savedSettings.maxLevels;
-    saveFilterSettings();
-    var resizeNeeded = oldWidth !== savedSettings.width || oldHeight !== savedSettings.height;
-    var regexChanged = oldRegex !== savedSettings.regex;
-    var excludeRegexChanged = oldExcludeRegex !== savedSettings.excludeRegex;
-    var blobsChanged = oldblobs !== savedSettings.blobTypes;
-    var gteChanged = oldGte !== savedSettings.gte;
-    var filterChanged = regexChanged || excludeRegexChanged || blobsChanged || gteChanged;
-    var maxLevelsChanged = oldMaxLevels !== savedSettings.maxLevels;
-
-    if (filterChanged) {
-        // Type filters
-        var typeFilter = function(datum) {
-            if (datum.depth === 0) return true; // root node
-            if (datum.t === undefined) return true;
-            return savedSettings.blobTypes !== undefined &&
-                savedSettings.blobTypes.indexOf(datum.t) !== -1;
-        }
-
-        // Regex filter
-        var regexFilter = undefined;
-        if (savedSettings.regex !== undefined && savedSettings.regex.length > 0) {
-            console.log('filter: regex is "' + savedSettings.regex + '"');
-            var regex = new RegExp(savedSettings.regex);
-            regexFilter = function(datum) {
-                if (datum.depth === 0) return true; // root node
-                var fullName = this.pathFor(datum);
-                if (datum.children === undefined) { // it is a leaf node (blob)
-                    fullName += ':' + datum.n;
-                }
-                return regex.test(fullName);
-            }
-        }
-
-        // Exclude regex filter
-        var excludeRegexFilter = undefined;
-        if (savedSettings.excludeRegex !== undefined && savedSettings.excludeRegex.length > 0) {
-            console.log('filter: exclude-regex is "' + savedSettings.excludeRegex + '"');
-            var excludeRegex = new RegExp(savedSettings.excludeRegex);
-            excludeRegexFilter = function(datum) {
-                if (datum.depth === 0) return true; // root node
-                var fullName = this.pathFor(datum);
-                if (datum.children === undefined) { // it is a leaf node (blob)
-                    fullName += ':' + datum.n;
-                }
-                return !excludeRegex.test(fullName);
-            }
-        }
-
-        // Size filter
-        var sizeFilter = undefined;
-        if (savedSettings.gte !== undefined) {
-            console.log('filter: minimum size is ' + savedSettings.gte + ' bytes');
-            sizeFilter = function(datum) {
-                if (datum.children !== undefined) return true; // non-leaf
-                if (datum.value === undefined) console.log('filter: leaf node has undefined value');
-                return datum.value >= savedSettings.gte;
-            }
-        }
-
-        // Make a filter to apply to the tree
-        var filter = function(datum) {
-            if (typeFilter && !typeFilter.call(this, datum)) return false;
-            if (regexFilter && !regexFilter.call(this, datum)) return false;
-            if (excludeRegexFilter && !excludeRegexFilter.call(this, datum)) return false;
-            if (sizeFilter && !sizeFilter.call(this, datum)) return false;
-            return true;
-        };
-        treemap.filter(filter);
-    }
-
-    // Adjust levels if needed.
-    if (maxLevelsChanged) {
-        treemap.setMaxLevels(savedSettings.maxLevels);
-    }
-
-    // Resize map if necessary.
-    if (resizeNeeded) {
-        console.log('desired treemap dimensions have changed, requesting resize');
-        treemap.resize(savedSettings.width, savedSettings.height);
-    }
-}
-
-function cancelSettings() {
-    hideOptions();
-    loadFilterSettings();
-}
-
-function saveFilterSettings() {
-    savedSettings.blobTypes = document.getElementById('blob_types_filter').value;
-    if (document.getElementById('check_regex').checked) {
-        savedSettings.regex = document.getElementById('blob_filter_regex').value;
-    } else {
-        savedSettings.regex = undefined;
-    }
-    if (document.getElementById('check_exclude_regex').checked) {
-        savedSettings.excludeRegex = document.getElementById('blob_filter_exclude_regex').value;
-    } else {
-        savedSettings.excludeRegex = undefined;
-    }
-    if (document.getElementById('check_gte').checked) {
-        savedSettings.gte = parseInt(document.getElementById('blob_filter_gte').value);
-    } else {
-        savedSettings.gte = undefined;
-    }
-    savedSettings.width = parseInt(document.getElementById('width').value);
-    savedSettings.height = parseInt(document.getElementById('height').value);
-    savedSettings.maxLevels = parseInt(document.getElementById('max_levels').value);
-}
-
-function loadFilterSettings() {
-    document.getElementById('blob_types_filter').value = savedSettings.blobTypes;
-    blobFilterTextChanged();
-    if (savedSettings.regex !== undefined) {
-        document.getElementById('check_regex').checked = true;
-        document.getElementById('blob_filter_regex').value = savedSettings.regex;
-    } else {
-        document.getElementById('check_regex').checked = false;
-    }
-    if (savedSettings.excludeRegex !== undefined) {
-        document.getElementById('check_exclude_regex').checked = true;
-        document.getElementById('blob_filter_exclude_regex').value = savedSettings.excludeRegex;
-    } else {
-        document.getElementById('check_exclude_regex').checked = false;
-    }
-    if (savedSettings.gte !== undefined) {
-        document.getElementById('check_gte').checked = true;
-        document.getElementById('blob_filter_gte').value = savedSettings.gte;
-    } else {
-        document.getElementById('check_gte').checked = false;
-    }
-    document.getElementById('width').value = savedSettings.width;
-    document.getElementById('height').value = savedSettings.height;
-    document.getElementById('max_levels').value = savedSettings.maxLevels;
-}
-
-function escape(str) {
-    return str.replace(/&/g, '&amp;')
-              .replace(/"/g, '&quot;')
-              .replace(/</g, '&lt;')
-              .replace(/>/g, '&gt;');
-}
-</script>
-</head>
-<body onload='init()'>
-<div style='position: absolute; top: 5px; left: 5px;'>
-  <input type='button' onclick='showOptions()' value='Options &amp; Legend...'>
-  <span style='-webkit-user-select: none; cursor: help;' title='Click to view the blob legend or to configure filters and options for the treemap'>[?]</span>
-</div>
-<div style='position: absolute; right: 5px; top: 5px; white-space: nowrap;'>
-    Reports:
-    <input type='button' onclick='bigBlobsReport()' value='Large Blobs' title='Click to view a report of the largest 100 blobs that are within the bounds of the treemap currently displayed.'>
-</div>
-<div style='text-align: center; margin-bottom: 5px;'>
-    <span style='font-size: x-large; font-weight: bold; font-variant: small-caps'>Blob Size Analysis<span id='subtitle'></span></span>
-    <br><span style='font-size: small; font-style: italic;'>Double-click a box to zoom in, double-click outermost title to zoom out.</span>
-</div>
-<table id='options_container' style='visibility: hidden; border: 3px ridge grey; padding: 0px; top: 50%; left: 50%; position: fixed; z-index: 2147483646; overflow: auto; background-color: rgba(255,255,255,0.9); border-radius: 10px; box-shadow: 10px 10px 5px rgba(80,80,80,0.7);'><tr><td style='vertical-align: top'>
-    <table cellspacing=0 cellborder=0 style='width:100%'>
-        <tr><th colspan=3 style='padding-bottom: .25em; text-decoration: underline;'>Blob Types To Show</th></tr>
-        <tr>
-            <td style='white-space: nowrap; vertical-align: top;'>
-                    <span class='swatch' id='swatch_0'>&nbsp;&nbsp;&nbsp;</span><input checked type='checkbox' id='check_0' value='dart'>Shared Dart
-                <br><span class='swatch' id='swatch_1'>&nbsp;&nbsp;&nbsp;</span><input checked type='checkbox' id='check_1' value='?'>Shared Unrecognized
-                <br><span class='swatch' id='swatch_2'>&nbsp;&nbsp;&nbsp;</span><input checked type='checkbox' id='check_2' value='unique'>Unique
-                <br><span class='swatch' id='swatch_3'>&nbsp;&nbsp;&nbsp;</span><input checked type='checkbox' id='check_3' value='uniDart'>Unique Dart
-            </td>
-        </tr>
-        <tr><td colspan=3 style='text-align: center; white-space: nowrap; padding-top: 1em;'>
-            Select <input type='button' onclick='filterSetAll(true)' value='All'>,
-            <input type='button' onclick='filterSetAll(false)' value='None'>,
-            or type a string: <input id='blob_types_filter' size=30 value='' onkeyup='blobFilterTextChanged()' onblur='updateFilterText()'>
-            <span style='-webkit-user-select: none; cursor: help;' title='Enter codes from the list above for the blobs you want to see. The checkboxes will update automatically to match the string that you enter.'>[?]</span>
-        </td></tr>
-   </table>
-</td></tr><tr><td style='vertical-align: top; padding-top: 10px; border-top: 1px solid grey;'>
-    <table cellspacing=0 cellborder=0 style='width: 100%'>
-        <tr><th colspan=2 style='padding-bottom: .25em; text-decoration: underline;'>Advanced Options</th></tr>
-        <tr>
-            <td style='white-space: nowrap; vertical-align: top;'>
-                <input type='checkbox' id='check_regex'>
-                Only include blobs matching this regex:
-            </td>
-            <td style='text-align: right; vertical-align: top;'>
-                <input disabled id='blob_filter_regex' size=30 value='' style='text-align: right;'>
-                <span style='-webkit-user-select: none; cursor: help;' title='Enter a JavaScript regex. Only blobs that match this regex will be shown. This filter applies before any exclusion regex specified below. The format of each blob is [path]:[blob_name]'>[?]</span>
-            </td>
-        </tr>
-        <tr>
-            <td style='white-space: nowrap; vertical-align: top;'>
-                <input type='checkbox' id='check_exclude_regex'>
-                Exclude all blobs matching this regex:
-            </td>
-            <td style='text-align: right; vertical-align: top;'>
-                <input disabled id='blob_filter_exclude_regex' size=30 value='' style='text-align: right;'>
-                <span style='-webkit-user-select: none; cursor: help;' title='Enter a JavaScript regex. Blobs that match this regex will not be shown. This filter applies after any inclusion filter specified above. The format of each blob is [path]:[blob_name]'>[?]</span>
-            </td>
-        </tr>
-        <tr>
-            <td style='white-space: nowrap; vertical-align: top;'>
-                <input type='checkbox' id='check_gte'>
-                Only include blobs that are at least <span style='font-style: italic;'>n</span> bytes:
-            </td>
-            <td style='text-align: right; vertical-align: top;'>
-                <input disabled id='blob_filter_gte' size=8 value='' style='text-align: right;'>
-                <span style='-webkit-user-select: none; cursor: help;' title='Blobs whose size is less than this value will be hidden.'>[?]</span>
-            </td>
-        </tr>
-        <tr>
-            <td style='white-space: nowrap; vertical-align: top;'>
-                Show at most <span style='font-style: italic;'>n</span> levels of detail at a time:
-            </td>
-            <td style='text-align: right; vertical-align: top;'>
-                <input id='max_levels' size=4 value='2' style='text-align: right;'><span style='-webkit-user-select: none; cursor: help;' title='Increasing this value shows more detail without the need to zoom, but uses more computing power.'>[?]</span>
-            </td>
-        </tr>
-        <tr>
-            <td style='white-space: nowrap; vertical-align: top;'>
-                Set the size of the treemap to <span style='font-style: italic;'>W x H</span> pixels:
-            </td>
-            <td style='text-align: right; vertical-align: top;'>
-                <input id='width' size=4 value='' style='text-align: right;'>
-                &nbsp;x&nbsp;<input id='height' size=4 value='' style='text-align: right;'>
-            </td>
-        </tr>
-    </table>
-</td></tr>
-<tr><td style='padding-top: 10px; text-align: right; border-top: 1px solid grey'>
-    <input type='button' value='Apply' onclick='applySettings()'>
-    <input type='button' value='Cancel' onclick='cancelSettings()'>
-</td></tr></table>
-</body>
-</html>
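The deleted treemap page combines its type, regex, exclude-regex, and size predicates with a short-circuit AND, skipping any filter the user left unchecked. The same composition pattern can be sketched in Python (the sample blobs and filter lambdas below are illustrative, not taken from the original page):

```python
def compose_filters(*predicates):
    """Combine optional predicates into one test: an item passes only
    if every non-None predicate accepts it (short-circuit AND)."""
    active = [p for p in predicates if p is not None]
    def combined(item):
        return all(p(item) for p in active)
    return combined

# Illustrative stand-ins for the page's size and regex filters.
size_filter = lambda blob: blob["size"] >= 1024
name_filter = lambda blob: "lib" in blob["name"]

keep = compose_filters(size_filter, None, name_filter)
print(keep({"name": "libc.so", "size": 4096}))  # True
print(keep({"name": "libc.so", "size": 100}))   # False
```

Passing `None` for an inactive predicate mirrors the original's `if (filter && !filter(...))` guards, which let unchecked options drop out of the chain entirely.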
diff --git a/bootstrap b/bootstrap
deleted file mode 100755
index 6c71b7d..0000000
--- a/bootstrap
+++ /dev/null
@@ -1,44 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-set -e
-
-function usage {
-  cat <<END
-usage: bootstrap [zircon|garnet|peridot|topaz]
-
-Bootstrap a Fuchsia development environment for the given project. Defaults to
-bootstrapping at topaz. For more information about the Fuchsia tree
-see <https://fuchsia.googlesource.com/docs/+/master/development/source_code/layers.md>.
-END
-}
-
-if [[ $# -gt 1 ]]; then
-  usage
-  exit 1
-fi
-
-project=${1:-topaz}
-
-if [[ "${project}" != "zircon" ]] &&
-   [[ "${project}" != "garnet" ]] &&
-   [[ "${project}" != "peridot" ]] &&
-   [[ "${project}" != "topaz" ]]; then
-  usage
-  exit 1
-fi
-
-# The fetched script will
-# - create "fuchsia" directory if it does not exist,
-# - download "jiri" command to "fuchsia/.jiri_root/bin"
-curl -s "https://fuchsia.googlesource.com/jiri/+/master/scripts/bootstrap_jiri?format=TEXT" | base64 --decode | bash -s fuchsia
-cd fuchsia
-
-.jiri_root/bin/jiri import -name="integration" "${project}/${project}" "https://fuchsia.googlesource.com/integration"
-.jiri_root/bin/jiri override ${project} "https://fuchsia.googlesource.com/${project}"
-.jiri_root/bin/jiri update
-
-echo "Done creating ${project} development environment at \"$(pwd)\"."
-echo "Recommended: export PATH=\"$(pwd)/.jiri_root/bin:\$PATH\""
diff --git a/build-qemu.sh b/build-qemu.sh
deleted file mode 100755
index eec1b44..0000000
--- a/build-qemu.sh
+++ /dev/null
@@ -1,92 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly ROOT_DIR="$(dirname "${SCRIPT_DIR}")"
-
-readonly HOST_ARCH="$(uname -m)"
-readonly HOST_OS="$(uname | tr '[:upper:]' '[:lower:]')"
-readonly HOST_TRIPLE="${HOST_ARCH}-${HOST_OS}"
-
-if [[ "x${HOST_OS}" == "xlinux" ]]; then
-  readonly QEMU_HOST_FLAGS="${QEMU_FLAGS} --disable-gtk --enable-sdl=internal --enable-kvm"
-elif [[ "x${HOST_OS}" == "xdarwin" ]]; then
-  readonly QEMU_HOST_FLAGS="${QEMU_FLAGS} --enable-cocoa"
-else
-  echo "unsupported system: ${HOST_OS}" 1>&2
-  exit 1
-fi
-
-set -eo pipefail; [[ "${TRACE}" ]] && set -x
-
-usage() {
-  printf >&2 '%s: [-c] [-o outdir] [-d destdir] [-j jobs] [-s srcdir]\n' "$0"
-  exit 1
-}
-
-build() {
-  local srcdir="$1" outdir="$2" destdir="$3" clean="$4" jobs="$5"
-
-  if [[ "${clean}" = "true" ]]; then
-    rm -rf -- "${outdir}/build-qemu-${HOST_TRIPLE}"
-  fi
-
-  rm -rf "${destdir}/qemu-${HOST_TRIPLE}"
-
-  mkdir -p -- "${outdir}/build-qemu-${HOST_TRIPLE}"
-  pushd "${outdir}/build-qemu-${HOST_TRIPLE}"
-  ${srcdir}/configure \
-    ${QEMU_HOST_FLAGS} \
-    --prefix= \
-    --target-list=aarch64-softmmu,x86_64-softmmu \
-    --without-system-pixman \
-    --without-system-fdt \
-    --disable-vnc-jpeg \
-    --disable-vnc-png \
-    --disable-vnc-sasl \
-    --disable-vte \
-    --disable-docs \
-    --disable-curl \
-    --disable-debug-info \
-    --disable-qom-cast-debug \
-    --disable-guest-agent \
-    --disable-bluez \
-    --disable-brlapi \
-    --disable-gnutls \
-    --disable-gcrypt \
-    --disable-nettle \
-    --disable-virtfs \
-    --disable-vhost-net \
-    --disable-vhost-scsi \
-    --disable-vhost-vsock \
-    --disable-libusb \
-    --disable-smartcard \
-    --disable-tools \
-    --disable-tasn1
-  make -j "${jobs}"
-  make DESTDIR="${destdir}/qemu-${HOST_TRIPLE}" install
-  popd
-}
-
-declare CLEAN="${CLEAN:-false}"
-declare SRCDIR="${SRCDIR:-${ROOT_DIR}/third_party/qemu}"
-declare OUTDIR="${OUTDIR:-${ROOT_DIR}/out}"
-declare DESTDIR="${DESTDIR:-${OUTDIR}}"
-declare JOBS="${JOBS:-$(getconf _NPROCESSORS_ONLN)}"
-
-while getopts "cd:j:o:s:" opt; do
-  case "${opt}" in
-    c) CLEAN="true" ;;
-    d) DESTDIR="$(cd "${OPTARG}" >/dev/null 2>&1; pwd -P)" ;;
-    j) JOBS="${OPTARG}" ;;
-    o) OUTDIR="$(cd "${OPTARG}" >/dev/null 2>&1; pwd -P)" ;;
-    s) SRCDIR="$(cd "${OPTARG}" >/dev/null 2>&1; pwd -P)" ;;
-    *) usage;;
-  esac
-done
-
-readonly CLEAN SRCDIR OUTDIR DESTDIR JOBS
-
-build "${SRCDIR}" "${OUTDIR}" "${DESTDIR}" "${CLEAN}" "${JOBS}"
diff --git a/build-zircon.sh b/build-zircon.sh
deleted file mode 100755
index f18edf4..0000000
--- a/build-zircon.sh
+++ /dev/null
@@ -1,117 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly ROOT_DIR="$(dirname "${SCRIPT_DIR}")"
-
-JOBS=`getconf _NPROCESSORS_ONLN` || {
-  echo "Cannot get number of processors" >&2
-  exit 1
-}
-
-set -eo pipefail; [[ "${TRACE}" ]] && set -x
-
-usage() {
-  echo "$0 <options> <extra-make-arguments>"
-  echo ""
-  echo "Options:"
-  echo "  -c: Clean before building"
-  echo "  -v: Level 1 verbosity"
-  echo "  -V: Level 2 verbosity"
-  echo "  -A: Build with ASan"
-  echo "  -H: Build host tools with ASan"
-  echo "  -n: Just print make recipes to STDOUT."
-  echo "  -l: Only build tools; do not build zircon."
-  echo "  -j N: Passed along to make (number of parallel jobs)"
-  echo "  -t <target>: Architecture (GN style) to build, instead of all"
-  echo "  -o <outdir>: Directory in which to put the build-zircon directory."
-  echo ""
-  echo "Additional arguments may be passed to make using standard FOO=bar syntax."
-  echo "E.g., build-zircon.sh USE_CLANG=true"
-}
-
-declare ASAN="false"
-declare CLEAN="false"
-declare DRY_RUN="false"
-declare HOST_ASAN="false"
-declare TOOLS_ONLY="false"
-declare OUTDIR="${ROOT_DIR}/out"
-declare VERBOSE="0"
-declare -a ARCHLIST=(arm64 x64)
-
-while getopts "AcHhlnj:t:p:o:vV" opt; do
-  case "${opt}" in
-    A) ASAN="true" ;;
-    c) CLEAN="true" ;;
-    H) HOST_ASAN="true" ;;
-    h) usage ; exit 0 ;;
-    n) DRY_RUN="true" ;;
-    j) JOBS="${OPTARG}" ;;
-    l) TOOLS_ONLY="true" ;;
-    o) OUTDIR="${OPTARG}" ;;
-    t) ARCHLIST=("${OPTARG}") ;;
-    v) VERBOSE="1" ;;
-    V) VERBOSE="2" ;;
-    *) usage 1>&2 ; exit 1 ;;
-  esac
-done
-shift $(($OPTIND - 1))
-
-readonly ASAN CLEAN DRY_RUN HOST_ASAN TOOLS_ONLY OUTDIR VERBOSE
-readonly ZIRCON_BUILDROOT="${OUTDIR}/build-zircon"
-readonly -a ARCHLIST
-
-if [[ "${CLEAN}" = "true" ]]; then
-  rm -rf -- "${ZIRCON_BUILDROOT}"
-fi
-
-# These variables are picked up by make from the environment.
-case "${VERBOSE}" in
-  1) QUIET=0 ; V=0 ;;
-  2) QUIET=0 ; V=1 ;;
-  *) QUIET=1 ; V=0 ;;
-esac
-export QUIET V
-
-if [[ "${ASAN}" = "true" ]]; then
-  readonly NOT_ASAN=false
-else
-  readonly NOT_ASAN=true
-fi
-
-if [[ "${DRY_RUN}" = "true" ]]; then
-  readonly DRY_RUN_ARGS="-Bnwk"
-fi
-
-make_zircon_common() {
-  (test $QUIET -ne 0 || set -x
-   exec make ${DRY_RUN_ARGS} --no-print-directory -C "${ROOT_DIR}/zircon" \
-             -j ${JOBS} DEBUG_BUILDROOT=../../zircon "$@")
-}
-
-# Build host tools.
-make_zircon_common \
-  BUILDDIR=${OUTDIR}/build-zircon HOST_USE_ASAN="${HOST_ASAN}" tools "$@"
-
-if [[ "${TOOLS_ONLY}" = "true" ]]; then
-  exit 0
-fi
-
-make_zircon_target() {
-  make_zircon_common \
-    BUILDROOT=${ZIRCON_BUILDROOT} TOOLS=${OUTDIR}/build-zircon/tools "$@"
-}
-
-for ARCH in "${ARCHLIST[@]}"; do
-    # Build without ASan for sysroot.  If all of userland will be ASan,
-    # then this build is only user libraries.
-    make_zircon_target PROJECT="${ARCH}" \
-        BUILDDIR_SUFFIX= ENABLE_ULIB_ONLY="${ASAN}" "$@"
-
-    # Always build at least the libraries with ASan, but never the sysroot.
-    make_zircon_target PROJECT="${ARCH}" \
-        BUILDDIR_SUFFIX=-asan USE_ASAN=true ENABLE_BUILD_SYSROOT=false \
-        ENABLE_ULIB_ONLY="${NOT_ASAN}"
-done
diff --git a/build_id_conv.py b/build_id_conv.py
deleted file mode 100755
index 80b67f0..0000000
--- a/build_id_conv.py
+++ /dev/null
@@ -1,128 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""
-This script allows for the conversion between ids.txt and .build-id formats.
-"""
-
-import os
-import sys
-import argparse
-
-# rel_to is expected to be absolute
-def abs_path(path, rel_to):
-    if os.path.isabs(path):
-        return path
-    else:
-        return os.path.abspath(os.path.join(rel_to, path))
-
-# rel_to is assumed to be absolute
-def read_ids_txt(ids_path, rel_to):
-    with open(ids_path) as f:
-        return {build_id:abs_path(path, rel_to) for (build_id, path) in (x.split() for x in f.readlines())}
-
-# build_id_dir is assumed to be absolute
-def read_build_id_dir(build_id_dir):
-    out = {}
-    for root, dirs, files in os.walk(build_id_dir):
-        if len(files) != 0 and len(dirs) != 0:
-            raise Exception("%s is not a valid .build-id directory" % build_id_dir)
-        for f in files:
-            out[os.path.basename(root) + f] = os.path.join(root, f)
-    return out
-
-def link(src, dst):
-    src = os.path.realpath(src)
-    if os.path.exists(dst):
-        os.remove(dst)
-    os.link(src, dst)
-
-def mkdir(path):
-    try:
-        os.makedirs(path)
-    except OSError as e:
-        if e.errno != os.errno.EEXIST:
-            raise e
-
-def touch(path):
-    if os.path.exists(path):
-        os.utime(path, None)
-    else:
-        with open(path, 'w'):
-            return
-
-def write_build_id_dir(build_id_dir, mods):
-    for build_id, path in mods.iteritems():
-        mkdir(os.path.join(build_id_dir, build_id[:2]))
-        link(path, os.path.join(build_id_dir, build_id[:2], build_id[2:] + ".debug"))
-
-# rel_to and path are assumed to be absolute
-# if rel_to is None fix_path returns the absolute path. If rel_to
-# is not None it turns the path into a relative path.
-def fix_path(path, rel_to):
-    if rel_to is None:
-        return path
-    return os.path.relpath(path, rel_to)
-
-# rel_to is assumed to be an absolute path
-def write_ids_txt(ids_path, rel_to, mods):
-    with open(ids_path, "w") as f:
-        for build_id, path in sorted(mods.iteritems()):
-            path = fix_path(path, rel_to)
-            f.write("%s %s\n" % (build_id, path))
-
-def main():
-    ids_fmt = "ids.txt"
-    build_id_fmt = ".build-id"
-
-    parser = argparse.ArgumentParser(description="Convert between ids.txt and .build-id")
-    parser.add_argument("-O", "--output-format", help="Sets the output format.",
-                        metavar="FMT",
-                        choices=[ids_fmt, build_id_fmt])
-    parser.add_argument("--ids-rel-to-in",
-                        help="When reading ids.txt use paths relative to DIR",
-                        metavar="DIR")
-    parser.add_argument("--ids-rel-to-out",
-                        help="When writing ids.txt use paths relative to DIR",
-                        metavar="DIR")
-    parser.add_argument("--stamp",
-                        help="Touch STAMP after finishing",
-                        metavar="STAMP")
-    parser.add_argument("input")
-    parser.add_argument("output")
-
-    args = parser.parse_args()
-
-    input_path = os.path.abspath(args.input)
-    output_path = args.output
-    in_fmt = build_id_fmt if os.path.isdir(input_path) else ids_fmt
-    rel_to_in = os.path.abspath(args.ids_rel_to_in) if args.ids_rel_to_in is not None else None
-    rel_to_out = os.path.abspath(args.ids_rel_to_out) if args.ids_rel_to_out is not None else None
-
-    if in_fmt == ids_fmt:
-        if rel_to_in is None:
-          rel_to_in = os.path.abspath(os.path.dirname(input_path))
-        mods = read_ids_txt(input_path, rel_to_in)
-    else:
-        mods = read_build_id_dir(input_path)
-
-    if args.output_format is None:
-        if in_fmt == ids_fmt:
-            out_fmt = build_id_fmt
-        else:
-            out_fmt = ids_fmt
-    else:
-        out_fmt = args.output_format
-
-    if out_fmt == ids_fmt:
-        write_ids_txt(output_path, rel_to_out, mods)
-    else:
-        write_build_id_dir(output_path, mods)
-
-    if args.stamp is not None:
-        touch(args.stamp)
-
-if __name__ == "__main__":
-    main()
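The `.build-id` layout that the deleted `build_id_conv.py` reads and writes splits each build ID into a two-hex-character directory and a `.debug`-suffixed filename. A minimal sketch of the mapping and its inverse (the build ID below is made up for illustration):

```python
def build_id_to_path(build_id):
    # .build-id layout: the first two hex characters name a directory,
    # the remainder names the file, with a ".debug" suffix.
    return "%s/%s.debug" % (build_id[:2], build_id[2:])

def path_to_build_id(path):
    # Inverse mapping: rejoin the directory prefix and the file stem.
    dirname, filename = path.split("/")
    return dirname + filename[:-len(".debug")]

bid = "4fcb712aa6387724a9f465a32cd8c14b"  # made-up build ID
path = build_id_to_path(bid)
print(path)                           # 4f/cb712aa6387724a9f465a32cd8c14b.debug
print(path_to_build_id(path) == bid)  # True
```

This two-character fan-out is the same convention debuggers use to locate detached debug info, which is why the script can round-trip between the flat `ids.txt` list and the directory tree.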
diff --git a/check-gn-format b/check-gn-format
deleted file mode 100755
index 83e0417..0000000
--- a/check-gn-format
+++ /dev/null
@@ -1,60 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly BUILDTOOLS_DIR="${SCRIPT_DIR}/../buildtools"
-readonly GN="${BUILDTOOLS_DIR}/gn"
-
-if [[ $# -gt 0 && "$1" = --fix ]]; then
-  readonly mode=fix
-  shift
-else
-  readonly mode=check
-fi
-
-if [[ $# -eq 0 ]]; then
-  echo >&2 "Usage: $0 [--fix] project..."
-  exit 1
-fi
-
-check_file() {
-  local -r file="$1"
-  "$GN" format --dry-run "$file" && return
-  local -r status=$?
-  if [[ $status -eq 2 ]]; then
-    echo "*** File $file is not formatted correctly."
-    echo "*** To fix, run: gn format $file"
-    echo
-    diff -u "$file" <("$GN" format --stdin < "$file")
-  fi
-  return $status
-}
-
-fix_file() {
-  local -r file="$1"
-  "$GN" format "$file"
-}
-
-handle_files() {
-  local status=0
-  while read file; do
-    ${mode}_file "$project/$file" || status=$?
-  done
-  return $status
-}
-
-set -o pipefail
-status=0
-for project in "$@"; do
-  git -C "$project" ls-files -- '*.gn*' | handle_files || status=$?
-done
-
-if [[ $status -ne 0 ]]; then
-  echo
-  echo "*** For a good time, try git file-format"
-  echo
-fi
-
-exit $status
diff --git a/colorize_logs b/colorize_logs
deleted file mode 100755
index 80c99b3..0000000
--- a/colorize_logs
+++ /dev/null
@@ -1,133 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright 2017 The Fuchsia Authors
-#
-# Use of this source code is governed by a MIT-style
-# license that can be found in the LICENSE file or at
-# https://opensource.org/licenses/MIT
-
-"""
-
-This tool will color the lines from loglistener
-
-Example usage #1:
-  # Colorize log lines, with all messages from the same thread in the same color
-  loglistener | scripts/colorize_logs -t
-
-Example usage #2:
-  # Colorize log lines, with all messages from the same process in the same color
-  loglistener | scripts/colorize_logs
-
-Example usage #3:
-  # Print the colorization of log.txt to stdout
-  # Identical to `scripts/colorize_logs < log.txt`
-  scripts/colorize_logs log.txt
-
-Example usage #4:
-  # Colorize all ERROR and INFO lines in log.txt
-  scripts/colorize_logs -r ERROR -r INFO log.txt
-
-Example usage #5:
-  # Colorize all lines with drv='<something>' in log.txt with distinct colors
-  # for each <something>
-  scripts/colorize_logs -r "drv='[^']*'" log.txt
-
-"""
-
-import argparse
-import re
-import sys
-
-BASE_COLORS = [
-    #'\033[40m', # black
-    '\033[91m', # red
-    '\033[92m', # green
-    '\033[93m', # yellow
-    '\033[94m', # blue
-    '\033[95m', # magenta
-    '\033[96m', # cyan
-    #'\033[47m', # white
-]
-RESET_BG = '\033[49m'
-RESET_FG = '\033[39m'
-
-class ColorAssigner(object):
-  def __init__(self, colors):
-    self.lru = list(colors)
-    self.task_colors = { }
-
-  def get_bg_color(self, task_id):
-    if task_id not in self.task_colors:
-      c = self.lru.pop(0)
-      self.task_colors[task_id] = c
-    else:
-      c = self.task_colors[task_id]
-      self.lru.remove(c)
-
-    self.lru.append(c)
-    return c
-
-
-PROCESS_RE = r'^\[\d+\.\d+] (\d+)\.\d+> .*$'
-THREAD_RE = r'^\[\d+\.\d+] (\d+\.\d+)> .*$'
-
-def main():
-  parser = argparse.ArgumentParser(
-      description=__doc__,
-      formatter_class=argparse.RawDescriptionHelpFormatter)
-  parser.add_argument("--process", "-p", dest="patterns", action="append_const",
-                      const=PROCESS_RE, help="Color code by process (default)")
-  parser.add_argument("--thread", "-t", dest="patterns", action="append_const",
-                      const=THREAD_RE, help="Color code by thread")
-  parser.add_argument("--regex", "-r", dest="patterns", action="append",
-                      help="Color by matching regexp")
-  parser.add_argument("input", nargs='?', action="store", default=None,
-                      help="The file to colorize.  Defaults to stdin")
-  args = parser.parse_args()
-
-  if args.input:
-    f = open(args.input, 'r')
-  else:
-    f = sys.stdin
-
-  # If no patterns were specified, use the process pattern.
-  if not args.patterns:
-    args.patterns = [PROCESS_RE]
-
-  # Define the identifier extractor.  It should be in group 1.
-  patterns = []
-  for pattern in args.patterns:
-    regex = re.compile(pattern)
-    if not regex.groups:
-      # if there's no group, wrap the pattern
-      regex = re.compile(r'^.*(' + pattern + r').*$')
-    patterns.append(regex)
-
-  assigner = ColorAssigner(BASE_COLORS)
-
-  while True:
-    line = f.readline()
-    if not line:
-      break
-
-    line = line.strip()
-    matched = False
-    for line_re in patterns:
-      m = line_re.match(line)
-      if m:
-        matched = True
-        task_id = m.group(1)
-        color = assigner.get_bg_color(task_id)
-
-        # Use join to avoid python putting a space between each value being
-        # printed.
-        print ''.join([color, line, RESET_BG, RESET_FG])
-        sys.stdout.flush()
-        break
-
-    if not matched:
-      print line
-      sys.stdout.flush()
-
-if __name__ == '__main__':
-  main()
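The `ColorAssigner` above hands each task ID a color from a fixed pool and, once the pool is exhausted, recycles whichever color was used least recently. The same LRU idea can be sketched in modern Python (the pool values here are arbitrary placeholder strings, not the script's ANSI codes):

```python
class LruAssigner:
    """Map keys to values from a fixed pool, recycling the
    least-recently-used value once the pool is exhausted."""
    def __init__(self, pool):
        self.lru = list(pool)   # front = least recently used
        self.assigned = {}

    def get(self, key):
        if key in self.assigned:
            value = self.assigned[key]
            self.lru.remove(value)   # refresh: pull out of the LRU order
        else:
            value = self.lru.pop(0)  # claim (or recycle) the LRU value
            self.assigned[key] = value
        self.lru.append(value)       # back = most recently used
        return value

a = LruAssigner(["red", "green"])
print(a.get("pid1"))  # red
print(a.get("pid2"))  # green
print(a.get("pid3"))  # red (pid1's color, now least recently used, is reused)
```

As in the original, a recycled color ends up shared by two tasks rather than being revoked from the older one, which is an acceptable trade-off for a log colorizer.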
diff --git a/crash/upload-symbols.go b/crash/upload-symbols.go
deleted file mode 100755
index 86e8294..0000000
--- a/crash/upload-symbols.go
+++ /dev/null
@@ -1,204 +0,0 @@
-///bin/true ; exec /usr/bin/env go run "$0" "$@"
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-package main
-
-import (
-	"bufio"
-	"flag"
-	"fmt"
-	"io/ioutil"
-	"log"
-	"os"
-	"os/exec"
-	"path"
-	"path/filepath"
-	"sort"
-	"strings"
-)
-
-var dryRun = flag.Bool("n", false, "Dry run - print what would happen but don't actually do it")
-var symbolDir = flag.String("symbol-dir", "", "Location for the symbol files")
-var upload = flag.Bool("upload", true, "Whether to upload the dumped symbols")
-var url = flag.String("url", "https://clients2.google.com/cr/symbol", "Endpoint to use")
-var verbose = flag.Bool("v", false, "Verbose output")
-var x64Symbols = flag.Bool("x64-symbols", true, "Include symbols from x64 build")
-var armSymbols = flag.Bool("arm-symbols", true, "Include symbols from arm64 build")
-
-func getBinariesFromIds(idsFilename string) sort.StringSlice {
-	var binaries []string
-	file, err := os.Open(idsFilename)
-	if err != nil {
-		log.Fatal(err)
-	}
-	defer file.Close()
-
-	scanner := bufio.NewScanner(file)
-	for scanner.Scan() {
-		binary := strings.SplitAfterN(scanner.Text(), " ", 2)[1]
-		binaries = append(binaries, binary)
-	}
-
-	if err := scanner.Err(); err != nil {
-		log.Fatal(err)
-	}
-
-	return binaries
-}
-
-func dump(bin, fuchsiaRoot string) string {
-	if *verbose || *dryRun {
-		fmt.Println("Dumping binary", bin)
-	}
-	relToRoot, err := filepath.Rel(fuchsiaRoot, bin)
-	if err != nil {
-		log.Fatal("could not filepath.Rel ", bin, ": ", err)
-	}
-	symfile := path.Join(*symbolDir, strings.Replace(relToRoot, "/", "#", -1)+".sym")
-	if *dryRun {
-		return symfile
-	}
-	out, err := exec.Command("buildtools/linux-x64/dump_syms/dump_syms", bin).Output()
-	if err != nil {
-		log.Fatal("could not dump_syms ", bin, ": ", err)
-	}
-
-	// Many Fuchsia binaries are built as "something.elf", but then packaged as
-	// just "something". In the ids.txt file, the name still includes the ".elf"
-	// extension, which dump_syms emits into the .sym file, and the crash server
-	// uses as part of the lookup (that is, both the name and the buildid have to
-	// match). So, if the first header line ends in ".elf" strip it off.
-	lines := strings.Split(string(out), "\n")
-	lines[0] = strings.TrimSuffix(lines[0], ".elf")
-	out = []byte(strings.Join(lines, "\n"))
-
-	err = ioutil.WriteFile(symfile, out, 0644)
-	if err != nil {
-		log.Fatal("could not write output file", symfile, ": ", err)
-	}
-
-	return symfile
-}
-
-func uploadSymbols(symfile string) {
-	if *verbose || *dryRun {
-		fmt.Println("Uploading symbols", symfile)
-	}
-	if *dryRun {
-		return
-	}
-
-	out, err := exec.Command("buildtools/linux-x64/symupload/sym_upload", symfile, *url).CombinedOutput()
-	if err != nil {
-		log.Fatal("sym_upload for ", symfile, " failed with output ", string(out), " error ", err)
-	}
-}
-
-func mkdir(d string) {
-	if *verbose || *dryRun {
-		fmt.Println("Making directory", d)
-	}
-	if *dryRun {
-		return
-	}
-	_, err := exec.Command("mkdir", "-p", d).Output()
-	if err != nil {
-		log.Fatal("could not create directory", d)
-	}
-}
-
-// Based on https://github.com/xtgo/set/blob/master/mutators.go#L17.
-func uniq(data sort.StringSlice) sort.StringSlice {
-	p, l := 0, data.Len()
-	if l <= 1 {
-		return data
-	}
-	for i := 1; i < l; i++ {
-		if !data.Less(p, i) {
-			continue
-		}
-		p++
-		if p < i {
-			data.Swap(p, i)
-		}
-	}
-	return data[:p+1]
-}
-
-func main() {
-	flag.Usage = func() {
-		fmt.Fprintf(os.Stderr, `Usage: ./upload-symbols.go [flags] /path/to/fuchsia/root
-
-This script converts the symbols for a built tree into a format suitable for the
-crash server and then optionally uploads them.
-`)
-		flag.PrintDefaults()
-	}
-
-	flag.Parse()
-
-	fuchsiaRoot := flag.Arg(0)
-	if _, err := os.Stat(fuchsiaRoot); os.IsNotExist(err) {
-		flag.Usage()
-		log.Fatalf("Fuchsia root not found at \"%v\"\n", fuchsiaRoot)
-	}
-	cwd, err := os.Getwd()
-	if err != nil {
-		log.Fatal("Could not Getwd: ", err)
-	}
-	cwd, err = filepath.EvalSymlinks(cwd)
-	if err != nil {
-		log.Fatal("Could not EvalSymlinks ", cwd, ": ", err)
-	}
-	fuchsiaRoot = filepath.Join(cwd, fuchsiaRoot)
-
-	if *symbolDir == "" {
-		var err error
-		*symbolDir, err = ioutil.TempDir("", "crash-symbols")
-		if err != nil {
-			log.Fatal("Could not create temporary directory: ", err)
-		}
-		defer os.RemoveAll(*symbolDir)
-	} else if _, err := os.Stat(*symbolDir); os.IsNotExist(err) {
-		mkdir(*symbolDir)
-	}
-
-	var binaries sort.StringSlice
-	zxBuildDir := "out/build-zircon"
-	if *x64Symbols {
-		x64BuildDir := "out/release-x64"
-		x64ZxBuildDir := path.Join(zxBuildDir, "build-x64")
-		binaries = append(binaries, getBinariesFromIds(path.Join(fuchsiaRoot, x64BuildDir, "ids.txt"))...)
-		binaries = append(binaries, getBinariesFromIds(path.Join(fuchsiaRoot, x64ZxBuildDir, "ids.txt"))...)
-	}
-	if *armSymbols {
-		armBuildDir := "out/release-arm64"
-		armZxBuildDir := path.Join(zxBuildDir, "build-arm64")
-		binaries = append(binaries, getBinariesFromIds(path.Join(fuchsiaRoot, armBuildDir, "ids.txt"))...)
-		binaries = append(binaries, getBinariesFromIds(path.Join(fuchsiaRoot, armZxBuildDir, "ids.txt"))...)
-	}
-	sort.Sort(binaries)
-	binaries = uniq(binaries)
-	for i, v := range binaries {
-		binaries[i], err = filepath.EvalSymlinks(v)
-		if err != nil {
-			log.Fatal("Could not EvalSymlinks ", v, ": ", err)
-		}
-	}
-
-	sem := make(chan bool, len(binaries))
-	for _, binary := range binaries {
-		go func(binary string) {
-			symfile := dump(binary, fuchsiaRoot)
-			if *upload {
-				uploadSymbols(symfile)
-			}
-			sem <- true
-		}(binary)
-	}
-	for i := 0; i < len(binaries); i++ {
-		<-sem
-	}
-}
diff --git a/dart/package_importer.py b/dart/package_importer.py
deleted file mode 100755
index 22daf3d..0000000
--- a/dart/package_importer.py
+++ /dev/null
@@ -1,288 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import paths
-import shutil
-import subprocess
-import sys
-import tempfile
-
-sys.path += [os.path.join(paths.FUCHSIA_ROOT, 'third_party', 'pyyaml', 'lib')]
-import yaml
-
-
-LICENSE_FILES = ['LICENSE', 'LICENSE.txt']
-
-
-IGNORED_EXTENSIONS = ['css', 'html', 'js', 'log', 'old', 'out',
-                      'packages', 'snapshot', 'zip']
-
-LOCAL_PACKAGES = {
-  'analyzer': '//third_party/dart/pkg/analyzer',
-  'build_integration': '//third_party/dart/pkg/build_integration',
-  'flutter': '//third_party/dart-pkg/git/flutter/packages/flutter',
-  'flutter_test': '//third_party/dart-pkg/git/flutter/packages/flutter_test',
-  'front_end': '//third_party/dart/pkg/front_end',
-  'func': '//third_party/dart/third_party/pkg/func',
-  'intl': '//third_party/dart/third_party/pkg/intl',
-  'kernel': '//third_party/dart/pkg/kernel',
-  'testing': '//third_party/dart/pkg/testing',
-  'linter': '//third_party/dart/third_party/pkg/linter',
-  'typed_mock': '//third_party/dart/pkg/typed_mock',
-}
-
-FORBIDDEN_PACKAGES = ['mojo', 'mojo_services']
-
-def parse_packages_file(dot_packages_path):
-    """ parse the list of packages and paths in .packages file """
-    packages = []
-    with open(dot_packages_path) as dot_packages:
-        # The packages specification says both '\r' and '\n' are valid line
-        # delimiters, which matches Python's 'universal newline' concept.
-        # Packages specification: https://github.com/dart-lang/dart_enhancement_proposals/blob/master/Accepted/0005%20-%20Package%20Specification/DEP-pkgspec.md
-        contents = dot_packages.read()
-        for line in unicode.splitlines(unicode(contents)):
-            if line.startswith('#'):
-                continue
-            delim = line.find(':')
-            if delim == -1:
-                continue
-            name = line[:delim]
-            path = line[delim + 1:-1]
-            packages.append((name, path))
-    return packages
-
-
-def parse_full_dependencies(yaml_path):
-    """ parse the content of a pubspec.yaml """
-    with open(yaml_path) as yaml_file:
-        parsed = yaml.safe_load(yaml_file)
-        if not parsed:
-            raise Exception('Could not parse yaml file: %s' % yaml_path)
-        package_name = parsed['name']
-        get_deps = lambda dep_type: parsed[dep_type] if dep_type in parsed and parsed[dep_type] else {}
-        deps = get_deps('dependencies')
-        dev_deps = get_deps('dev_dependencies')
-        dep_overrides = get_deps('dependency_overrides')
-        return (package_name, deps, dev_deps, dep_overrides)
-
-
-def parse_dependencies(yaml_path):
-    """ parse the dependency map out of a pubspec.yaml """
-    _, deps, _, _ = parse_full_dependencies(yaml_path)
-    return deps
-
-
-def write_build_file(build_gn_path, package_name, name_with_version, deps):
-    """ writes BUILD.gn file for Dart package with dependencies """
-    with open(build_gn_path, 'w') as build_gn:
-        build_gn.write('''# This file is generated by importer.py for %s
-
-import("//build/dart/dart_library.gni")
-
-dart_library("%s") {
-  package_name = "%s"
-
-  # This parameter is left empty as we don't care about analysis or exporting
-  # these sources outside of the tree.
-  sources = []
-
-  disable_analysis = true
-
-  deps = [
-''' % (name_with_version, package_name, package_name))
-        for dep in deps:
-            if dep in LOCAL_PACKAGES:
-                build_gn.write('    "%s",\n' % LOCAL_PACKAGES[dep])
-            else:
-                build_gn.write('    "//third_party/dart-pkg/pub/%s",\n' % dep)
-        build_gn.write('''  ]
-}
-''')
-
-
-def read_package_versions(base):
-    '''Scans the packages in a given directory.'''
-    result = {}
-    for (root, dirs, files) in os.walk(base):
-        for dir in dirs:
-            spec = os.path.join(root, dir, 'pubspec.yaml')
-            if not os.path.exists(spec):
-                continue
-            with open(spec, 'r') as spec_file:
-                data = yaml.safe_load(spec_file)
-                result[data['name']] = data['version']
-        break
-    return result
-
-
-def generate_package_diff(old_packages, new_packages, changelog):
-    '''Writes a changelog file with package version changes.'''
-    old = set(old_packages.iteritems())
-    new = set(new_packages.iteritems())
-    changed_keys = set([k for (k, _) in (old | new) - (old & new)])
-    if not changed_keys:
-        return
-    max_key_width = max(map(lambda k: len(k), changed_keys))
-    with open(changelog, 'w') as changelog_file:
-        for key in sorted(changed_keys):
-            old = old_packages.get(key, '<none>')
-            new = new_packages.get(key, '<none>')
-            changelog_file.write('%s %s --> %s\n' % (key.rjust(max_key_width),
-                                                     old.rjust(10),
-                                                     new.ljust(10)))
-
-
-def main():
-    parser = argparse.ArgumentParser(description='Import dart packages from pub')
-    parser.add_argument('--pub', required=True,
-                        help='Path to the pub executable')
-    parser.add_argument('--pubspecs', nargs='+',
-                        help='Paths to packages containing pubspec.yaml files')
-    parser.add_argument('--projects', nargs='+',
-                        help='Paths to projects containing dependency files')
-    parser.add_argument('--output', required=True,
-                        help='Path to the output directory')
-    parser.add_argument('--changelog',
-                        help='Path to the changelog file to write',
-                        default=None)
-    parser.add_argument('--debug',
-                        help='Turns on debugging mode',
-                        action='store_true')
-    args = parser.parse_args()
-
-    def debug_print(message):
-        if args.debug:
-            print(message)
-
-    tempdir = tempfile.mkdtemp()
-    debug_print('Working directory: ' + tempdir)
-    try:
-        importer_dir = os.path.join(tempdir, 'importer')
-        os.mkdir(importer_dir)
-
-        # Read the requested dependencies from the canonical packages.
-        packages = {}
-        additional_deps = {}
-        debug_print('------------------------')
-        debug_print('Development dependencies')
-        debug_print('------------------------')
-        for path in args.pubspecs:
-            yaml_file = os.path.join(path, 'pubspec.yaml')
-            package_name, _, dev_deps, _ = parse_full_dependencies(yaml_file)
-            packages[package_name] = path
-            additional_deps.update(dev_deps)
-            debug_print('# From ' + yaml_file)
-            for pair in sorted(dev_deps.items()):
-                debug_print(' - %s: %s' % pair)
-
-        # Generate a manifest containing all the dependencies we care about.
-        manifest = {
-            'name': 'importer',
-        }
-        dependencies = {}
-        for package_name in packages.keys():
-            dependencies[package_name] = 'any'
-        for dep, version in additional_deps.iteritems():
-            if dep in packages:
-                continue
-            dependencies[dep] = version
-        debug_print('-------------------------')
-        debug_print('Manually-set dependencies')
-        debug_print('-------------------------')
-        for project in args.projects:
-            yaml_file = os.path.join(project, 'dart_dependencies.yaml')
-            project_deps = parse_dependencies(yaml_file)
-            debug_print('# From ' + yaml_file)
-            for dep, version in sorted(project_deps.iteritems()):
-                dependencies[dep] = version
-                debug_print(' - %s: %s' % (dep, version))
-        manifest['dependencies'] = dependencies
-        overrides = {}
-        for package_name, path in packages.iteritems():
-            overrides[package_name] = {
-                'path': path,
-            }
-        manifest['dependency_overrides'] = overrides
-        with open(os.path.join(importer_dir, 'pubspec.yaml'), 'w') as pubspec:
-            yaml.safe_dump(manifest, pubspec)
-
-        old_packages = read_package_versions(args.output)
-
-        # Use pub to load the dependencies into a local cache.
-        pub_cache_dir = os.path.join(tempdir, 'pub_cache')
-        os.mkdir(pub_cache_dir)
-        env = os.environ.copy()
-        env['PUB_CACHE'] = pub_cache_dir
-        subprocess.check_call([args.pub, 'get'], cwd=importer_dir, env=env)
-
-        # Walk the cache and copy the packages we are interested in.
-        if os.path.exists(args.output):
-            for (root, dirs, files) in os.walk(args.output):
-                for dir in dirs:
-                    if dir != '.git':
-                        shutil.rmtree(os.path.join(root, dir))
-                # Only process the root of the output tree.
-                break
-
-        pub_packages = parse_packages_file(os.path.join(importer_dir, '.packages'))
-        for package in pub_packages:
-            if package[0] in packages:
-                # Skip canonical packages.
-                continue
-            if not package[1].startswith('file://'):
-                continue
-            source_dir = package[1][len('file://'):]
-            if not os.path.exists(source_dir):
-                continue
-            if source_dir.find('pub.dartlang.org') == -1:
-                print 'Package %s not from dartlang (%s), ignoring' % (package[0], source_dir)
-                continue
-            package_name = package[0]
-            # Don't import packages that live canonically in the tree.
-            if package_name in LOCAL_PACKAGES:
-                continue
-            if package_name in FORBIDDEN_PACKAGES:
-                print 'Warning: dependency on forbidden package %s' % package_name
-                continue
-            # We expect the .packages file to point to a directory called 'lib'
-            # inside the overall package, which will contain the LICENSE file
-            # and other potentially useful directories like 'bin'.
-            source_base_dir = os.path.dirname(os.path.abspath(source_dir))
-            name_with_version = os.path.basename(source_base_dir)
-            has_license = any(os.path.exists(os.path.join(source_base_dir, file_name))
-                              for file_name in LICENSE_FILES)
-            if not has_license:
-                print 'Could not find license file for %s, skipping' % package_name
-                continue
-            pubspec_path = os.path.join(source_base_dir, 'pubspec.yaml')
-            deps = []
-            if os.path.exists(pubspec_path):
-                deps = parse_dependencies(pubspec_path)
-            dest_dir = os.path.join(args.output, package_name)
-            shutil.copytree(source_base_dir, dest_dir,
-                            ignore=shutil.ignore_patterns(
-                                *('*.' + extension for extension in IGNORED_EXTENSIONS)))
-            # We don't need the 'test' directory of packages we import as that
-            # directory exists to test that package and some of our packages
-            # have very heavy test directories, so nuke those.
-            test_path = os.path.join(dest_dir, 'test')
-            if os.path.exists(test_path):
-                shutil.rmtree(test_path)
-            write_build_file(os.path.join(dest_dir, 'BUILD.gn'), package_name,
-                             name_with_version, deps)
-
-        if args.changelog:
-            new_packages = read_package_versions(args.output)
-            generate_package_diff(old_packages, new_packages, args.changelog)
-
-    finally:
-        if not args.debug:
-            shutil.rmtree(tempdir)
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/dart/paths.py b/dart/paths.py
deleted file mode 100755
index 68e16a3..0000000
--- a/dart/paths.py
+++ /dev/null
@@ -1,11 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import os
-
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # dart
-    os.path.abspath(__file__))))
diff --git a/dart/report_coverage.py b/dart/report_coverage.py
deleted file mode 100755
index 7045570..0000000
--- a/dart/report_coverage.py
+++ /dev/null
@@ -1,195 +0,0 @@
-#!/usr/bin/env python
-
-# This program generates a combined coverage report for all host-side dart tests.
-# See example_commands and arg help strings in ParseArgs() for usage.
-#
-# Implementation sketch:
-# Search the host_tests directory for tests that use dart-tools/fuchsia_tester.
-# Run each test with --coverage and --coverage-path.
-# Combine the coverage data from each test into one.
-# Generate an HTML report.
-#
-# This is all pretty hacky. Longer-term efforts to make this more automatic and
-# less hacky are tracked by IN-427.
-
-from __future__ import print_function
-import argparse
-import collections
-import distutils.spawn
-import glob
-import os
-from multiprocessing.pool import ThreadPool
-import paths
-import re
-import subprocess
-import sys
-import tempfile
-
-
-TestResult = collections.namedtuple(
-    'TestResult', ('exit_code', 'coverage_data_path', 'package_dir'))
-DEV_NULL = open('/dev/null', 'w')
-LCOV = 'lcov'
-GENHTML = 'genhtml'
-
-def ParseArgs():
-  example_commands = """
-
-  Examples:
-  $ report_coverage.py --report-dir /tmp/cov
-  $ report_coverage.py --test-patterns 'foo_*_test,bar_test' --report-dir ...
-  $ report_coverage.py --out-dir out/x64 --report-dir ...
-  """
-  p = argparse.ArgumentParser(
-      description='Generates a coverage report for dart tests',
-      epilog=example_commands,
-      formatter_class=argparse.RawDescriptionHelpFormatter)
-
-  p.add_argument(
-      '--report-dir',
-      type=str,
-      help='Where to write the report. Will be created if needed',
-      required=True)
-  p.add_argument(
-      '--test-patterns',
-      type=str,
-      help=('Comma-separated list of glob patterns to match against test file '
-            'base names'),
-      default='*')
-  p.add_argument('--out-dir', type=str, help='fuchsia build out dir')
-
-  return p.parse_args()
-
-
-def OutDir(args):
-  if args.out_dir:
-    out_dir = args.out_dir
-
-    if not os.path.isabs(out_dir):
-      out_dir = os.path.join(paths.FUCHSIA_ROOT, out_dir)
-
-    if not os.path.isdir(out_dir):
-      sys.exit(out_dir + ' is not a directory')
-    return out_dir
-
-  if os.environ.get('FUCHSIA_BUILD_DIR'):
-    return os.environ.get('FUCHSIA_BUILD_DIR')
-
-  fuchsia_dir = os.environ.get('FUCHSIA_DIR', paths.FUCHSIA_ROOT)
-  fuchsia_config_file = os.path.join(fuchsia_dir, '.config')
-  if os.path.isfile(fuchsia_config_file):
-    fuchsia_config = open(fuchsia_config_file).read()
-    m = re.search(r'FUCHSIA_BUILD_DIR=[\'"]([^\s\'"]*)', fuchsia_config)
-    if m:
-      return os.path.join(fuchsia_dir, m.group(1))
-
-  return None
-
-
-class TestRunner(object):
-
-  def __init__(self, out_dir):
-    self.out_dir = out_dir
-
-  def RunTest(self, test_path):
-    # This whole function is super hacky. It assumes implementation details
-    # which are not meant to be public.
-
-    # test_path actually refers to a script that executes other tests.
-    # The other tests that get executed go into this list.
-    leaf_test_paths = []
-    test_lines = open(test_path, 'r').readlines()
-    # We expect a script that starts with shebang.
-    if not test_lines or not test_lines[0].startswith('#!'):
-      return []
-    for test_line in test_lines[1:]:  # Skip the shebang.
-      test_line_parts = test_line.strip().split()
-      if not test_line_parts:
-        continue
-      if os.path.join(self.out_dir, 'dartlang', 'gen') in test_line_parts[0]:
-        leaf_test_paths.append(test_line_parts[0])
-    results = [self._RunLeafTest(p) for p in leaf_test_paths]
-    return [result for result in results if result] # filter None
-
-  def _RunLeafTest(self, test_path):
-    # Initialize so the checks below work even if the expected lines are absent.
-    is_dart_test = False
-    test_directory = None
-    test_lines = open(test_path, 'r').readlines()
-    for test_line in test_lines:
-      test_line_parts = test_line.strip().split()
-      if not test_line_parts:
-        continue
-      if test_line_parts[0].endswith('dart-tools/fuchsia_tester'):
-        is_dart_test = True
-      elif test_line_parts[0].startswith('--test-directory='):
-        test_directory = test_line_parts[0].split('=')[1]
-    if not is_dart_test:
-      return None
-    if not test_directory:
-      raise ValueError('Failed to find --test-directory arg in %s' % test_path)
-    coverage_data_handle, coverage_data_path = tempfile.mkstemp()
-    os.close(coverage_data_handle)
-    exit_code = subprocess.call((
-        test_path, '--coverage', '--coverage-path=%s' % coverage_data_path),
-        stdout=DEV_NULL, stderr=DEV_NULL)
-    if not os.stat(coverage_data_path).st_size:
-      print('%s produced no coverage data' % os.path.basename(test_path),
-            file=sys.stderr)
-      return None
-    return TestResult(
-        exit_code, coverage_data_path, os.path.dirname(test_directory))
-
-
-def MakeRelativePathsAbsolute(test_result):
-  """Change source-file paths from relative-to-the-package to absolute."""
-  with open(test_result.coverage_data_path, 'r+') as coverage_data_file:
-    fixed_data = coverage_data_file.read().replace(
-        'SF:', 'SF:%s/' % test_result.package_dir)
-    coverage_data_file.seek(0)
-    coverage_data_file.write(fixed_data)
-
-
-def CombineCoverageData(test_results):
-  output_handle, output_path = tempfile.mkstemp()
-  os.close(output_handle)
-  lcov_cmd = [LCOV, '--output-file', output_path]
-  for test_result in test_results:
-    lcov_cmd.extend(['--add-tracefile', test_result.coverage_data_path])
-  subprocess.check_call(lcov_cmd, stdout=DEV_NULL, stderr=DEV_NULL)
-  return output_path
-
-
-def main():
-  args = ParseArgs()
-  out_dir = OutDir(args)
-  if not out_dir:
-    sys.exit('Couldn\'t find the output directory, pass --out-dir '
-             '(absolute or relative to Fuchsia root) or set FUCHSIA_BUILD_DIR.')
-  if not (distutils.spawn.find_executable(LCOV) and
-          distutils.spawn.find_executable(GENHTML)):
-    sys.exit('\'lcov\' and \'genhtml\' must be installed and in the PATH')
-  host_tests_dir = os.path.join(out_dir, 'host_tests')
-  test_patterns = args.test_patterns.split(',')
-  test_paths = []
-  for test_pattern in test_patterns:
-    test_paths.extend(glob.glob(os.path.join(host_tests_dir, test_pattern)))
-  thread_pool = ThreadPool()
-  test_runner = TestRunner(out_dir)
-  results_lists = thread_pool.map(test_runner.RunTest, test_paths)
-  # flatten
-  results = [result for sublist in results_lists for result in sublist]
-  if not results:
-    sys.exit('Found no dart tests that produced coverage data')
-  for result in results:
-    if result.exit_code:
-      sys.exit('a test under %s failed (exit code %d)' %
-               (result.package_dir, result.exit_code))
-  thread_pool.map(MakeRelativePathsAbsolute, results)
-  combined_coverage_path = CombineCoverageData(results)
-  subprocess.check_call(
-      (GENHTML, combined_coverage_path, '--output-directory', args.report_dir),
-      stdout=DEV_NULL, stderr=DEV_NULL)
-  print('Open file://%s to view the report' %
-        os.path.join(os.path.abspath(args.report_dir), 'index.html'),
-        file=sys.stderr)
-
-
-if __name__ == '__main__':
-    main()
diff --git a/dart/update_3p_packages.py b/dart/update_3p_packages.py
deleted file mode 100755
index fa809c4..0000000
--- a/dart/update_3p_packages.py
+++ /dev/null
@@ -1,94 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import paths
-import subprocess
-import sys
-
-# These are the locations of pubspec files that are roots of the dependency
-# graph.  If they contain conflicting requirements for a package 'pub get' will
-# error out and the conflicts will have to be resolved before the packages can
-# be updated.
-ROOT_PUBSPECS = [
-    'third_party/dart/pkg/analyzer',
-    'third_party/dart/pkg/build_integration',
-    'third_party/dart/pkg/expect',
-    'third_party/dart/pkg/front_end',
-    'third_party/dart/pkg/kernel',
-    'third_party/dart/pkg/testing',
-    'third_party/dart-pkg/git/flutter/examples/flutter_gallery',
-    'third_party/dart-pkg/git/flutter/packages/flutter',
-    'third_party/dart-pkg/git/flutter/packages/flutter_test',
-    'third_party/dart-pkg/git/flutter/packages/flutter_tools',
-    'third_party/flutter/sky/packages/sky_engine',
-]
-
-# These are the locations of yaml files listing the Dart dependencies of a git
-# project.
-PROJECT_DEPENDENCIES = [
-    'topaz/app/dashboard',
-    'topaz/public/dart/widgets',
-    'topaz/tools',
-]
-
-
-def main():
-    parser = argparse.ArgumentParser(description='Update third-party Dart packages')
-    parser.add_argument('--changelog',
-                        help='Path to the changelog file to write',
-                        default=None)
-    parser.add_argument('--debug',
-                        help='Turns on debugging mode',
-                        action='store_true')
-    script_args = parser.parse_args()
-
-    if sys.platform.startswith('linux'):
-        platform = 'linux-x64'
-    elif sys.platform.startswith('darwin'):
-        platform = 'mac-x64'
-    else:
-        print('Unsupported platform: %s' % sys.platform)
-        return 1
-    pub_path = os.path.join(paths.FUCHSIA_ROOT, 'topaz', 'tools',
-                            'prebuilt-dart-sdk', platform, 'bin', 'pub')
-    importer_path = os.path.join(paths.FUCHSIA_ROOT, 'scripts', 'dart',
-                                 'package_importer.py')
-    output_path = os.path.join(paths.FUCHSIA_ROOT, 'third_party', 'dart-pkg',
-                               'pub')
-    flutter_root = os.path.join(paths.FUCHSIA_ROOT, 'third_party', 'dart-pkg',
-                                'git', 'flutter')
-
-    # 'flutter --version' has the side effect of creating a version file that
-    # pub uses to find which package versions are compatible with the current
-    # version of flutter.
-    flutter_tool = os.path.join(flutter_root, 'bin', 'flutter')
-    subprocess.check_call([flutter_tool, "--version"])
-
-    args = [importer_path]
-    if script_args.debug:
-        args.append('--debug')
-    args.append('--pub')
-    args.append(pub_path)
-    args.append('--output')
-    args.append(output_path)
-    args.append('--pubspecs')
-    for root in ROOT_PUBSPECS:
-        args.append(os.path.join(paths.FUCHSIA_ROOT, root))
-    args.append('--projects')
-    for project in PROJECT_DEPENDENCIES:
-        args.append(os.path.join(paths.FUCHSIA_ROOT, project))
-    if script_args.changelog:
-        args.extend([
-            '--changelog',
-            script_args.changelog,
-        ])
-
-    subprocess.check_call(args, env={"FLUTTER_ROOT": flutter_root})
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/dart/vmos.py b/dart/vmos.py
deleted file mode 100755
index 04d7572..0000000
--- a/dart/vmos.py
+++ /dev/null
@@ -1,122 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import paths
-import subprocess
-import sys
-
-CATEGORIES = [
-  'dart-codespace',
-  'dart-oldspace',
-  'dart-newspace',
-  'jemalloc-heap',
-  'pthread_t',
-  'magma_create_buffer',
-  'ScudoPrimary',
-  'ScudoSecondary',
-  'lib',
-]
-
-def FxSSH(address, command):
-  fx = os.path.join(paths.FUCHSIA_ROOT, 'scripts', 'fx')
-  cmd = [fx, 'ssh', address] + command
-  try:
-    result = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
-  except subprocess.CalledProcessError as e:
-    print ("command failed: " + ' '.join(cmd) + "\n" +
-           "output: " + e.output)
-    return None
-  return result
-
-
-def HumanToBytes(size_str):
-  last = size_str[-1]
-  KB = 1024
-  if last == 'B':
-    multiplier = 1
-  elif last == 'k':
-    multiplier = KB
-  elif last == 'M':
-    multiplier = KB * KB
-  elif last == 'G':
-    multiplier = KB * KB * KB
-  elif last == 'T':
-    multiplier = KB * KB * KB * KB
-  else:
-    raise Exception('Unknown multiplier ' + last)
-  return float(size_str[:-1]) * multiplier
-
-
-def BytesToHuman(num, suffix='B'):
-  for unit in ['','Ki','Mi','Gi','Ti','Pi','Ei','Zi']:
-    if abs(num) < 1024.0:
-      return "%3.1f%s%s" % (num, unit, suffix)
-    num /= 1024.0
-  return "%.1f%s%s" % (num, 'Yi', suffix)
-
-
-# The output of vmos is:
-# rights  koid parent #chld #map #shr    size   alloc name [app]
-# on each line
-def ParseVmos(vmos, matchers):
-  vmo_lines = vmos.strip().split('\n')
-  sizes = {}
-  koids = {}
-  for vmo in vmo_lines:
-    # 1: koid, 5: process sharing, 6: size, 7: alloc, 8: name [9: app]
-    data = vmo.split()
-    if len(data) < 9:
-      continue
-    name = data[8]
-    if len(data) >= 10:
-      name = name + ' ' + data[9]
-    try:
-      b = HumanToBytes(data[7])
-    except:
-      continue
-    koid = int(data[1])
-    if koid in koids:
-      continue
-    koids[koid] = True
-    sharing = int(data[5])
-    if sharing == 0:
-      continue
-    for matcher in matchers:
-      if matcher not in name:
-        continue
-      if matcher in sizes:
-        sizes[matcher] = sizes[matcher] + (b / sharing)
-      else:
-        sizes[matcher] = (b / sharing)
-      break
-    if 'total' in sizes:
-      sizes['total'] = sizes['total'] + (b / sharing)
-    else:
-      sizes['total'] = (b / sharing)
-  return sizes
-
-
-def Main():
-  parser = argparse.ArgumentParser(description='Display stats about Dart VMOs')
-  parser.add_argument('--pid', '-p',
-      required=True,
-      help='pid of the target process.')
-  parser.add_argument('--address', '-a',
-      required=True,
-      help='ipv4 address of the target device')
-  args = parser.parse_args()
-
-  vmos = FxSSH(args.address, ['vmos', args.pid])
-  sizes = ParseVmos(vmos, CATEGORIES)
-  for k, v in sizes.iteritems():
-    print k + ", " + BytesToHuman(v)
-
-  return 0
-
-
-if __name__ == '__main__':
-  sys.exit(Main())
diff --git a/devshell/add-driver b/devshell/add-driver
deleted file mode 100755
index 488fa2b..0000000
--- a/devshell/add-driver
+++ /dev/null
@@ -1,26 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### builds, copies to target, and adds a driver to devmgr
-
-## usage: fx add-driver TARGET
-## Builds the specified target (e.g., fxl_unittests), copies it to the target, and
-## adds it to devmgr. Useful for tight iterations on drivers.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-if [[ $# -eq 0 ]]; then
-  fx-command-help
-  exit 1
-fi
-
-target="$1"
-
-fx-command-run build "${target}"
-fx-command-run cp "${FUCHSIA_BUILD_DIR}/${target}" "/tmp/${target}"
-fx-command-run shell "dm add-driver:/tmp/${target}"
diff --git a/devshell/add-update-source b/devshell/add-update-source
deleted file mode 100755
index 264c897..0000000
--- a/devshell/add-update-source
+++ /dev/null
@@ -1,107 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### register dev host as target's update source
-
-## usage: fx add-update-source [--type=TYPE] [--name=NAME] [--disable-source=NAME]
-##
-## Configure target device to use a new update source.
-##
-## When --type is devhost (the default), configure the target to use the dev
-## host's address as seen from the target fuchsia device.
-##
-## When --type is localhost, configure the target to use 127.0.0.1 as its
-## update source.
-##
-## --name=NAME           Name the generated update source config NAME. Defaults to the config type.
-## --disable-source=NAME Disable the update source with NAME after adding the new update source.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help add-update-source
-}
-
-function main {
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  config_type="devhost"
-  source_name=""
-  disable_source=""
-  while [[ $# -ne 0 ]]; do
-    case "$1" in
-      --type)
-        config_type="$2"
-        shift
-        ;;
-      --name)
-        source_name="$2"
-        shift
-        ;;
-      --disable-source)
-        disable_source="$2"
-        shift
-        ;;
-      *)
-        echo >&2 "Unrecognized option: $1"
-        usage
-        exit 1
-    esac
-    shift
-  done
-
-  if [[ -z "${source_name}" ]]; then
-    source_name="${config_type}"
-  fi
-
-  local local_addr
-  local_addr="$(fx-command-run netaddr --local "$(get-device-name)")"
-  if [[ $? -ne 0 || -z "${local_addr}" ]]; then
-    echo >&2 "Unable to determine host's IP.  Is the target up?"
-    exit 1
-  fi
-  # Strip interface name suffix.
-  local_addr="${local_addr%%%*}"
-
-  repository_dir="${FUCHSIA_BUILD_DIR}/amber-files/repository"
-  if [[ ! -d "${repository_dir}" ]]; then
-    echo >&2 "Amber repository does not exist.  Please build first."
-    exit 1
-  fi
-
-  config_url="http://[${local_addr}]:8083/config.json"
-
-  if [[ "${config_type}" == "devhost" ]]; then
-    repo_url="http://[${local_addr}]:8083"
-  elif [[ "${config_type}" == "localhost" ]]; then
-    repo_url="http://127.0.0.1:8083"
-  else
-    echo >&2 "Unknown config type. Valid options: devhost, localhost"
-    exit 1
-  fi
-
-  fx-command-run shell amber_ctl add_src \
-    -n "${source_name}" \
-    -f "${config_url}"
-  err=$?
-
-  if [[ $err -ne 0 ]]; then
-    echo >&2 "Unable to register update source."
-    if [[ $err -eq 2 ]]; then
-      # The GET request failed.
-      echo >&2 " - Is 'fx serve' or 'fx serve-updates' running?"
-      echo >&2 " - Can the target reach the development host on tcp port 8083?"
-    fi
-    return "$err"
-  fi
-
-  if [[ -n "${disable_source}" ]]; then
-    # Best effort, don't show status or fail on error
-    fx-command-run shell amber_ctl disable_src -n "${disable_source}" >/dev/null 2>/dev/null || true
-  fi
-}
-
-main "$@"
diff --git a/devshell/blobstats b/devshell/blobstats
deleted file mode 100755
index 09af448..0000000
--- a/devshell/blobstats
+++ /dev/null
@@ -1,25 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### compute some blobfs statistics from the build
-
-DEVSHELL_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-source "${DEVSHELL_DIR}/lib/vars.sh" || exit $?
-fx-config-read
-
-case "$(uname -s)" in
-    Linux)
-        PREBUILT_DART="${DEVSHELL_DIR}/../../topaz/tools/prebuilt-dart-sdk/linux-x64/bin/dart"
-        ;;
-    Darwin)
-        PREBUILT_DART="${DEVSHELL_DIR}/../../topaz/tools/prebuilt-dart-sdk/mac-x64/bin/dart"
-        ;;
-esac
-
-cd "${FUCHSIA_BUILD_DIR}"
-
-exec "${PREBUILT_DART}" \
-  --packages="${DEVSHELL_DIR}/../blobstats/blobstats.packages" \
-  "${DEVSHELL_DIR}/../blobstats/blobstats.dart" "$@"
diff --git a/devshell/build b/devshell/build
deleted file mode 100755
index 97ae565..0000000
--- a/devshell/build
+++ /dev/null
@@ -1,65 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build Fuchsia
-
-## usage: fx build [ninja option,...] [target,...]
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function main {
-  local args=()
-  local have_load=false
-  local have_jobs=false
-  while [[ $# -gt 0 ]]; do
-    case "$1" in
-    -l) have_load=true ;;
-    -j) have_jobs=true ;;
-    esac
-    args+=("$1")
-    shift
-  done
-
-  if ! $have_load; then
-    if [[ "$(uname -s)" == "Darwin" ]]; then
-      # Load level on Darwin is quite different from that of Linux, wherein a
-      # load level of 1 per CPU is not necessarily a prohibitive load level. An
-      # unscientific study of build side effects suggests that cpus*20 is a
-      # reasonable value to prevent catastrophic load (i.e. the user cannot kill
-      # the build, cannot lock the screen, etc.).
-      local cpus="$(fx-cpu-count)"
-      args=("-l" $(($cpus * 20)) "${args[@]}")
-    fi
-  fi
-
-  if ! $have_jobs; then
-    local concurrency="$(fx-choose-build-concurrency)"
-    # macOS in particular has a low default for number of open file descriptors
-    # per process, which is prohibitive for higher job counts. Here we raise
-    # the number of allowed file descriptors per process if it appears to be
-    # low in order to avoid failures due to the limit. See `getrlimit(2)` for
-    # more information.
-    local min_limit=$((${concurrency} * 2))
-    if [[ $(ulimit -n) -lt "${min_limit}" ]]; then
-      ulimit -n "${min_limit}"
-    fi
-    args=("-j" "${concurrency}" "${args[@]}")
-  fi
-
-
-  # TERM is passed for the pretty ninja UI
-  # PATH is passed as some tools are referenced via $PATH due to platform differences.
-  # TMPDIR is passed for Goma on macOS. TMPDIR must be set or unset, not
-  # empty. Some Dart build tools have been observed writing into source paths
-  # when TMPDIR="", so it is deliberately passed unquoted, using the ${+}
-  # expansion expression.
-  fx-try-locked env -i TERM="${TERM}" PATH="${PATH}" \
-    ${NINJA_STATUS+"NINJA_STATUS=${NINJA_STATUS}"} \
-    ${TMPDIR+"TMPDIR=$TMPDIR"} \
-    "${FUCHSIA_DIR}/buildtools/ninja" -C "${FUCHSIA_BUILD_DIR}" "${args[@]}"
-}
-
-main "$@"
diff --git a/devshell/build-push b/devshell/build-push
deleted file mode 100755
index d484807..0000000
--- a/devshell/build-push
+++ /dev/null
@@ -1,52 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build Fuchsia and push to device
-
-## usage: fx build-push [ninja option,...] [target,...]
-##
-## Build ALL targets. After building all the targets, push the ones that were
-## supplied to this command. If no targets were specified push all of them.
-## The packages are pushed to the device specified. If no device is supplied
-## explicitly, this will push to the single connected device. If multiple devices
-## are connected and no device is specified, pushing will fail.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function main {
-  local args=()
-  local targets=()
-  while (($#)); do
-    case "$1" in
-      -C|-f|-j|-k|-l|-t|-w)
-        args+=("$1")
-        shift
-        args+=("$1")
-        ;;
-      -n|-v)
-        args+=("$1")
-        ;;
-      *)
-        targets+=("$1")
-    esac
-    shift
-  done
-
-  fx-command-run build "${args[@]}"
-
-  if [[ -z "$(pgrep -f "amber-files/repository")" ]]; then
-    echo
-    echo "WARNING: It looks like serve-updates is not running."
-    echo "WARNING: You probably need to start \"fx serve\""
-    echo
-  fi
-
-  fx-command-run push-package "${targets[@]}"
-}
-
-main "$@"
diff --git a/devshell/build-zircon b/devshell/build-zircon
deleted file mode 100755
index 2071072..0000000
--- a/devshell/build-zircon
+++ /dev/null
@@ -1,15 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build the kernel
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-echo "Building zircon..."
-"${FUCHSIA_DIR}/scripts/build-zircon.sh" \
-  -t "${FUCHSIA_ARCH}" \
-  -j "$(fx-choose-build-concurrency)" \
-  "${FUCHSIA_BUILD_ZIRCON_ARGS[@]}" "$@"
diff --git a/devshell/clean b/devshell/clean
deleted file mode 100755
index cc59db1..0000000
--- a/devshell/clean
+++ /dev/null
@@ -1,22 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### `gn clean` the FUCHSIA_BUILD_DIR
-
-## If FUCHSIA_BUILD_DIR is out/x64, this is simply:
-##   gn clean out/x64
-## It is useful to clean the build directory without having to re-gen.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-set -ex
-
-"${FUCHSIA_DIR}/buildtools/gn" clean "$FUCHSIA_BUILD_DIR"
-
-# Also clean zircon; it's not yet gn-ized, but that shouldn't matter.
-# "fx build-zircon -c" interprets "clean" as rm -rf $ZIRCON_BUILDROOT,
-# so we do too.
-rm -rf -- "$ZIRCON_BUILDROOT"
diff --git a/devshell/clean-build b/devshell/clean-build
deleted file mode 100755
index 729f601..0000000
--- a/devshell/clean-build
+++ /dev/null
@@ -1,16 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### remove the out directory, then set and build
-
-## See `fx set` for usage.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-rm -rf -- "${FUCHSIA_DIR}/out"
-fx-command-run set "$@"
-fx-command-run full-build
diff --git a/devshell/compdb b/devshell/compdb
deleted file mode 100755
index eaa12c0..0000000
--- a/devshell/compdb
+++ /dev/null
@@ -1,60 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### generate a compilation database for the current build configuration
-
-## usage: fx compdb [-z]
-##        -z|--zircon to additionally generate compile_commands.json for Zircon
-##        -z option also concatenates the two compile_commands.json
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-generate_zircon () {
-    zirc_compdb=$(which compiledb || echo)
-    if [[ -z ${zirc_compdb} ]]; then
-        echo >&2 "Could not find compiledb; cannot generate Zircon compile_commands.json"
-        exit 1
-    fi
-
-    ( cd ${FUCHSIA_DIR}/zircon; "${FUCHSIA_DIR}/scripts/build-zircon.sh" -t "${FUCHSIA_ARCH}" -n | compiledb -o compile_commands.json -n) \
-    || echo "An unknown error has occurred"     #TODO what errors could this be?
-}
-
-main () {
-    zirc=0  # whether to also generate Zircon's compile_commands.json
-
-    case $1 in
-        -z|--zircon)
-            zirc=1
-        ;;
-        -h|--help)
-            echo "Script to generate compile_commands.json files"
-            echo "-z|--zircon to additionally generate compile_commands.json for Zircon"
-            echo "-z option also concatenates the two compile_commands.json"
-            echo
-            exit
-        ;;
-    esac
-
-    source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-    fx-config-read
-
-    fx-try-locked "${FUCHSIA_DIR}/buildtools/gn" gen "${FUCHSIA_BUILD_DIR}" --export-compile-commands
-    ln -sf "${FUCHSIA_BUILD_DIR}/compile_commands.json" "${FUCHSIA_DIR}/compile_commands.json"
-
-
-    if [[ $zirc -eq 1 ]]; then
-        generate_zircon
-        #concatenate the two files together by making a tmp file and then removing it when done
-
-        ${FUCHSIA_DIR}/scripts/editors/cat_compile_commands.py ${FUCHSIA_DIR}/zircon/compile_commands.json \
-        ${FUCHSIA_DIR}/out/${FUCHSIA_ARCH}/compile_commands.json > ${FUCHSIA_DIR}/compile_commands-tmp.json \
-        && cp ${FUCHSIA_DIR}/compile_commands-tmp.json ${FUCHSIA_DIR}/compile_commands.json \
-        && rm ${FUCHSIA_DIR}/compile_commands-tmp.json
-    fi
-}
-
-main "$@"
diff --git a/devshell/cp b/devshell/cp
deleted file mode 100755
index a39e81b..0000000
--- a/devshell/cp
+++ /dev/null
@@ -1,55 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### copy a file to/from a target device
-
-## usage: fx cp [--to-target|--to-host] SRC DST
-##
-## Copies a file from the host to the target device, or vice versa.
-##
-## --to-target: copy file SRC from host to DST on the target
-## --to-host: copy file SRC from target to DST on the host
-##
-## The default is to copy files to the target.
-
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-to_target=true
-if [[ $# -eq 3 ]]; then
-  case "$1" in
-  --to-target)
-    to_target=true
-    ;;
-  --to-host)
-    to_target=false
-    ;;
-  *)
-    fx-command-help
-    exit 1
-  esac
-  shift
-fi
-
-if [[ $# -ne 2 ]]; then
-  fx-command-help
-  exit 1
-fi
-
-src=$1
-dst=$2
-host="$(get-fuchsia-device-addr)"
-
-if [[ "${to_target}" = "true" ]]; then
-  fx-command-run sftp -q -b - "[${host}]" > /dev/null << EOF
-- rm ${dst}
-put ${src} ${dst}
-EOF
-else
-  rm -f -- "${dst}"
-  fx-command-run sftp -q -b - "[${host}]" > /dev/null << EOF
-get ${src} ${dst}
-EOF
-fi
diff --git a/devshell/dart-remote-test b/devshell/dart-remote-test
deleted file mode 100755
index a33271f..0000000
--- a/devshell/dart-remote-test
+++ /dev/null
@@ -1,47 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### runs a single remote test target through //scripts/run-dart-action.py
-
-## usage: fx dart-remote-test
-##            [-h|--help] <target-pattern>
-##
-## Attempts to run a test targeting a remote fuchsia device.
-##
-## This command requires Topaz in order to run. An example would be to run
-## this command against //topaz/example/test/driver_mod_example/* to run
-## all dart_remote_test targets.
-##
-## Arguments:
-##   -h|--help    Print out this message.
-
-DEVSHELL_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-VERBOSE=false
-source "${DEVSHELL_DIR}"/lib/vars.sh || exit $?
-fx-config-read
-
-case $1 in
-  -h|--help)
-  fx-command-help
-  exit 0
-  ;;
-esac
-
-TARGET_PATTERN="$1"
-shift
-if [ -z "$TARGET_PATTERN" ]; then
-  echo "Expected a target pattern"
-  fx-command-help
-  exit 1
-fi
-
-IP_ADDR_LINK_LOCAL="$(get-fuchsia-device-addr)"
-SSH_CONFIG="${FUCHSIA_BUILD_DIR}/ssh-keys/ssh_config"
-RUN_DART_ACTION="${FUCHSIA_DIR}/scripts/run-dart-action.py"
-"${RUN_DART_ACTION}" target-test \
-  --tree="${TARGET_PATTERN}" \
-  --out="${FUCHSIA_BUILD_DIR}" \
-  "${IP_ADDR_LINK_LOCAL}" \
-  "${SSH_CONFIG}"
diff --git a/devshell/dart-tunnel b/devshell/dart-tunnel
deleted file mode 100755
index 16748be..0000000
--- a/devshell/dart-tunnel
+++ /dev/null
@@ -1,73 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### forward local ports to Dart VMs on the device.
-
-## usage: fx dart-tunnel  [-h|--help] [-v|--verbose] [<isolate>]
-##
-## Creates an SSH tunnel with the device's running Dart VM(s) and leaves it open
-## until the user chooses to close it. Supplying an isolate name will attempt to
-## connect to all isolates whose names match.
-##
-## The verbose flag is strongly discouraged unless you are debugging.
-##
-## This command requires Topaz in order to run.
-##
-## Arguments:
-##   -h|--help    Print out this message.
-
-DEVSHELL_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-VERBOSE=false
-source "${DEVSHELL_DIR}"/lib/vars.sh || exit $?
-fx-config-read
-
-case $1 in
-  -h|--help)
-  fx-command-help
-  exit 0
-  ;;
-  -v|--verbose)
-  shift # name
-  VERBOSE=true
-  ;;
-esac
-
-case "$(uname -s)" in
-    Linux)
-        PREBUILT_DART="${DEVSHELL_DIR}/../../topaz/tools/prebuilt-dart-sdk/linux-x64/bin/dart"
-        ;;
-    Darwin)
-        PREBUILT_DART="${DEVSHELL_DIR}/../../topaz/tools/prebuilt-dart-sdk/mac-x64/bin/dart"
-        ;;
-esac
-IP_ADDR_LINK_LOCAL="$(get-fuchsia-device-addr)"
-
-# Strips the back of the string the longest match of '%' followed by
-# any characters (the first two '%' characters are just the bash syntax).
-IP_ADDR="${IP_ADDR_LINK_LOCAL%%%*}"
-# Splits from the front of the string the longest match of any character
-# followed by '%'.
-IP_IFACE="${IP_ADDR_LINK_LOCAL##*%}"
-SSH_CONFIG="${FUCHSIA_BUILD_DIR}/ssh-keys/ssh_config"
-# Join the remaining args into a single string here.
-ARGS="$*"
-DART_TUNNEL_LIB_DIR="${DEVSHELL_DIR}/dart-tunnel-lib"
-DART_TUNNEL_PACKAGES="${DART_TUNNEL_LIB_DIR}/dart-tunnel.packages"
-DART_BIN="${DART_TUNNEL_LIB_DIR}/dart-tunnel.dart"
-
-# Conditionally prints the --verbose flag.
-function _verbose_flag() {
-  if [ "${VERBOSE}" = true ]; then
-    echo "--verbose"
-  fi
-}
-
-"${PREBUILT_DART}" --packages="${DART_TUNNEL_PACKAGES}" \
-  "${DART_BIN}" \
-  --ssh-config="${SSH_CONFIG}" \
-  --ip-address="${IP_ADDR}" \
-  --network-interface="${IP_IFACE}" \
-  --isolate="$ARGS" \
-  "$(_verbose_flag)"
diff --git a/devshell/dart-tunnel-lib/dart-tunnel.dart b/devshell/dart-tunnel-lib/dart-tunnel.dart
deleted file mode 100644
index c0efcae..0000000
--- a/devshell/dart-tunnel-lib/dart-tunnel.dart
+++ /dev/null
@@ -1,77 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-import 'dart:async';
-import 'dart:core';
-import 'dart:io';
-
-import 'package:args/args.dart';
-import 'package:fuchsia_remote_debug_protocol/fuchsia_remote_debug_protocol.dart';
-import 'package:fuchsia_remote_debug_protocol/logging.dart';
-
-const String kIpAddrFlag = 'ip-address';
-const String kNetIfaceFlag = 'network-interface';
-const String kSshConfigFlag = 'ssh-config';
-const String kIsolateNameFlag = 'isolate';
-const String kVerboseFlag = 'verbose';
-
-/// Utility function: returns `true` if `flag` is neither null nor empty.
-bool flagIsValid(String flag) {
-  return flag != null && flag.isNotEmpty;
-}
-
-Future<Null> main(List<String> args) async {
-  final ArgParser parser = new ArgParser()
-    ..addFlag(kVerboseFlag, defaultsTo: false, negatable: false)
-    ..addOption(kIpAddrFlag)
-    ..addOption(kNetIfaceFlag)
-    ..addOption(kSshConfigFlag)
-    ..addOption(kIsolateNameFlag);
-  final ArgResults results = parser.parse(args);
-  // Since this is being run from a parent script, just exit with an error
-  // instead of printing help text, as extra help text would be confusing.
-  if (!flagIsValid(results[kIpAddrFlag]) ||
-      !flagIsValid(results[kSshConfigFlag]) ||
-      !flagIsValid(results[kNetIfaceFlag])) {
-    exit(1);
-  }
-  if (results[kVerboseFlag]) {
-    Logger.globalLevel = LoggingLevel.all;
-  }
-  final String ipAddress = results[kIpAddrFlag];
-  final String sshConfigFlag = results[kSshConfigFlag];
-  final String netInterfaceFlag = results[kNetIfaceFlag];
-  print('Connecting to device at ${ipAddress} . . .');
-  final FuchsiaRemoteConnection connection =
-      await FuchsiaRemoteConnection.connect(
-    ipAddress,
-    netInterfaceFlag.isEmpty ? null : netInterfaceFlag,
-    sshConfigFlag,
-  );
-  final String isolateName = results[kIsolateNameFlag];
-  final Pattern isolatePattern = flagIsValid(isolateName) ? isolateName : r'';
-  final List<IsolateRef> isolates = await connection.getMainIsolatesByPattern(
-      isolatePattern,
-      includeNonFlutterIsolates: true);
-  final String plural =
-      isolates.length == 0 || isolates.length > 1 ? 'isolates' : 'isolate';
-  final String isolateResultString = 'Found ${isolates.length} $plural';
-  print(isolateResultString);
-  // Creates a fancy dotted line under the result string.
-  print(new String.fromCharCodes(
-      isolateResultString.codeUnits.map((int codeUnit) => '-'.codeUnitAt(0))));
-  for (IsolateRef ref in isolates) {
-    // Replace the websocket protocol with http for browser-friendly links.
-    final Uri dartVmUri = ref.dartVm.uri.replace(scheme: 'http', path: '');
-    print('${ref.name}: $dartVmUri');
-  }
-
-  ProcessSignal.SIGINT.watch().listen((ProcessSignal signal) async {
-    print('');
-    print('SIGINT received. Shutting down.');
-    await connection.stop();
-    exit(0);
-  });
-  print('<Press Ctrl-C to exit>');
-}
diff --git a/devshell/dart-tunnel-lib/dart-tunnel.packages b/devshell/dart-tunnel-lib/dart-tunnel.packages
deleted file mode 100644
index 0f324c9..0000000
--- a/devshell/dart-tunnel-lib/dart-tunnel.packages
+++ /dev/null
@@ -1,20 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-fuchsia_remote_debug_protocol:../../../third_party/dart-pkg/git/flutter/packages/fuchsia_remote_debug_protocol/lib/
-json_rpc_2:../../../third_party/dart-pkg/pub/json_rpc_2/lib/
-web_socket_channel:../../../third_party/dart-pkg/pub/web_socket_channel/lib/
-meta:../../../third_party/dart-pkg/pub/meta/lib/
-process:../../../third_party/dart-pkg/pub/process/lib/
-async:../../../third_party/dart-pkg/pub/async/lib/
-stream_channel:../../../third_party/dart-pkg/pub/stream_channel/lib/
-stack_trace:../../../third_party/dart-pkg/pub/stack_trace/lib/
-crypto:../../../third_party/dart-pkg/pub/crypto/lib/
-collection:../../../third_party/dart-pkg/pub/collection/lib/
-file:../../../third_party/dart-pkg/pub/file/lib/
-path:../../../third_party/dart-pkg/pub/path/lib/
-platform:../../../third_party/dart-pkg/pub/platform/lib/
-convert:../../../third_party/dart-pkg/pub/convert/lib/
-typed_data:../../../third_party/dart-pkg/pub/typed_data/lib/
-charcode:../../../third_party/dart-pkg/pub/charcode/lib/
-args:../../../third_party/dart-pkg/pub/args/lib/
diff --git a/devshell/debug b/devshell/debug
deleted file mode 100755
index ec70235..0000000
--- a/devshell/debug
+++ /dev/null
@@ -1,107 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run the debug agent on target and connect to it with zxdb
-
-## Starts the debug agent on the proposed target and automatically connects zxdb
-## to it. Will close the debug agent on exit.
-##
-## TROUBLESHOOTING TIPS:
-##
-## - Remember to use "fx set-device" when working with multiple devices.
-## - This script by default mutes the SSH connection's stdout/stderr, so any
-##   errors triggered by it won't appear. Use the --verbose-agent flag to see
-##   the output.
-## - This script uses the tool "nc" for testing TCP connections. Check that it
-##   is in $PATH and that it works.
-##
-## Usage: fx debug [(--port|-p) <PORT>] [(--verbose-agent|-va)]
-##
-##    --port            Port the debug agent will be listening on. Will use 2345
-##                      by default.
-##    --verbose-agent   Whether the debug agent's stdout/stderr should be shown.
-##                      Useful for debugging the debugger. Yo' dawg.
-##
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-# Defaults.
-port=2345
-agent_out="/dev/null"
-
-# Flag parsing.
-while [[ "$1" =~ ^- ]]; do
-  case "$1" in
-    --help|-h)
-      fx-command-help
-      exit 0
-      ;;
-    --port|-p)
-      shift
-      port="$1"
-      ;;
-    --verbose-agent|-va)
-      shift
-      agent_out=""
-      break
-      ;;
-    *)
-      break
-  esac
-  shift
-done
-
-# Get the defaulted device address.
-target=$(fx-command-run "netaddr" "--fuchsia" "$(get-device-name)")
-if [[ -z "${target}" ]]; then
-  # netaddr will have put the correct error message by now.
-  echo -e "Could not get a valid target. Run \"fx set-device\""
-  exit 1
-fi
-
-debug_agent_pkg="fuchsia-pkg://fuchsia.com/debug_agent#meta/debug_agent.cmx"
-
-# Leave the SSH connection open. Will be closed on script end.
-# We branch on whether the user passed the verbose-agent flag. If not, the
-# debug agent's output is redirected to /dev/null.
-echo -e "Attempting to start the Debug Agent."
-if [[ -z "${agent_out}" ]]; then
-  (fx-command-run "ssh" "${target}" "run ${debug_agent_pkg} --port=${port}" &) &
-else
-  (fx-command-run "ssh" "${target}" "run ${debug_agent_pkg} --port=${port}" &) > "${agent_out}" 2>&1 &
-fi
-fx_ssh_pid="$!"
-
-# We wait until the debug agent is listening on the given port. We use NC to
-# attempt a TCP connection. This will actually go all the way with the handshake,
-# so the debug agent will think initially that NC is a client. But then it will
-# close the connection and receive the actual client's connection and work fine.
-try_count=0
-max_tries=10
-echo -e "Waiting for the Debug Agent to start."
-while true; do
-  # Use NC to test if the port is open and the debug agent is listening.
-  nc -w5 -6 -z ${target} ${port}
-  # If successful, we proceed to connect.
-  if [[ "$?" -eq 0 ]]; then
-    break
-  fi
-
-  # Otherwise, we count and check if we need to exit.
-  try_count=$(expr "${try_count}" + 1)
-  if [[ "${try_count}" -gt "${max_tries}" ]]; then
-    echo -e "Timed out trying to find the Debug Agent. Exiting."
-    echo -e "Attempting to kill SSH connection."
-    kill "${fx_ssh_pid}"
-    exit 1
-  fi
-  sleep 1
-done
-
-# We start the client with the flag that tells it to quit the agent when zxdb
-# quits.
-echo -e "Connection found. Starting zxdb."
-"${FUCHSIA_BUILD_DIR}/host_x64/zxdb" "--connect" "[${target}]:${port}" "--quit-agent-on-exit"
diff --git a/devshell/debug-report b/devshell/debug-report
deleted file mode 100755
index 6b0d126..0000000
--- a/devshell/debug-report
+++ /dev/null
@@ -1,83 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### generate a report with component exposed data
-
-## usage: fx debug-report [f|--format <FORMAT>]
-##                        [-s|--system-objects]
-##                        [--] [<REGEX> [...<REGEX>]]
-##
-## Runs and generates a debug report from the selected components.
-##
-##    -f|--format <FORMAT>  What formatter to be used for the output.
-##                          These will be passed on to the underlying iquery
-##                          tool. Supported values:
-##                          - text: Human readable output. [DEFAULT]
-##                          - json: Simple to parse JSON format.
-##    -s|--system-objects   Whether to include the system-generated debug
-##                          information. This information includes the current
-##                          registers and stack traces for all threads.
-##                          False by default.
-##
-##    REGEX   Basic Regular Expression (as understood by GNU grep) to filter
-##            out components. Only matching components will be included in the
-##            report.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-# Defaults.
-system_objects=false
-formatter=""
-
-# Flag parsing.
-while [[ "$1" =~ ^- ]]; do
-  case "$1" in
-    -h|--help)
-      fx-command-help
-      exit 0
-      ;;
-    -f|--format)
-      shift
-      formatter="$1"
-      ;;
-    -s|--system-objects)
-      system_objects=true
-      ;;
-    --)
-      break
-      ;;
-    *)
-      break
-  esac
-  shift
-done
-
-# Post-flag processing.
-format="--format=${formatter}"
-host="$(get-fuchsia-device-addr)"
-regexps="$@"
-find_target="/hub"
-
-# Find all the available services.
-find="$(fx-command-exec "ssh" "${host}" "iquery --find ${find_target}")"
-
-# Check if we want the system objects
-if [[ "${system_objects}" = "false" ]]; then
-  find="$(echo "${find}" | grep -v "system_objects$")"
-fi
-
-# Join all given regexps into a format grep can consume.
-if [[ ! -z ${regexps} ]]; then
-  filters=""
-  for regex in ${regexps}; do
-    filters="$(echo "${filters} -e ${regex}")"
-  done
-  find="$(echo "${find}" | grep ${filters})"
-fi
-
-# Flatten to the format iquery expects.
-flatten="$(echo "${find}" | tr '\n' ' ')"
-
-fx-command-exec "ssh" "${host}" "iquery --absolute_paths --cat ${format} --recursive ${flatten}"
diff --git a/devshell/delta b/devshell/delta
deleted file mode 100755
index 924c851..0000000
--- a/devshell/delta
+++ /dev/null
@@ -1,85 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### compare all built Fuchsia packages with a prior package snapshot
-
-## usage: fx delta [[--name|-n NAME] | [--source|-s PATH]] [--build] [--help-delta] [DELTA_ARGS...]
-##
-## Compare metadata of all Fuchsia packages with a prior package snapshot.
-##   --name|-n NAME   Set the NAME of the source package snapshot (Default: "system")
-##   --source|-s PATH Read the source snapshot from the specified PATH (Default: $FUCHSIA_BUILD_DIR/snapshots/$NAME.snapshot)
-##   --help-delta     Show command line help for "pm delta"
-##   --build          Build the current package snapshot before comparing it to the previously saved snapshot
-##   DELTA_ARGS       Unknown arguments are passed through to "pm delta"
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-pm_bin="${FUCHSIA_BUILD_DIR}/host_x64/pm"
-
-function usage {
-  fx-command-help delta
-
-  echo
-  "${pm_bin}" delta --help
-}
-
-function main {
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  snapshot_dir="$FUCHSIA_BUILD_DIR/snapshots"
-  target_path="$FUCHSIA_BUILD_DIR/obj/build/images/system.snapshot"
-
-  build=0
-  delta_args=()
-  source_name=
-  source_path=
-  while [[ $# -ne 0 ]]; do
-    case "$1" in
-      --build)
-        build=1
-        ;;
-      -n|--name)
-        source_name="$2"
-        shift
-        ;;
-      -s|--source)
-        source_path="$2"
-        shift
-        ;;
-      --help-delta)
-        usage
-        exit 0
-        ;;
-      *)
-        delta_args+=("$1")
-    esac
-    shift
-  done
-
-  if [[ -n "${source_name}" && -n "${source_path}" ]]; then
-    echo >&2 "Source name and source path cannot both be specified"
-    usage
-    exit 1
-  elif [[ -z "${source_name}" ]]; then
-    source_name="system"
-  fi
-
-  if [[ -z "${source_path}" ]]; then
-    source_path="${snapshot_dir}/${source_name}.snapshot"
-  fi
-
-  if [[ "${build}" -ne 0 ]]; then
-    fx-command-run build system_snapshot || {
-      echo >&2 "Build of current package state failed, bailing out"
-      exit 1
-    }
-  fi
-
-  "${pm_bin}" delta "${delta_args[@]}" "${source_path}" "${target_path}"
-}
-
-main "$@"
diff --git a/devshell/doctor b/devshell/doctor
deleted file mode 100755
index 7f01414..0000000
--- a/devshell/doctor
+++ /dev/null
@@ -1,132 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run various checks to determine the health of a Fuchsia checkout
-
-## usage: fx doctor
-
-# The goal of this script is to detect common issues with a Fuchsia
-# checkout and potential conflicts in the user's shell environment.
-#
-# For example, on OS X the xcode command line tool
-# installation often lapses. Ensuring that `xcode select --install` is
-# run as part of a checkout or build is problematic: the step involves
-# manual input. Detecting that it needs to be run, however, is
-# perfectly mechanizable.
-#
-# For potential issues in the user's shell initialization script
-# (such as ~/.bashrc), this script will also run a shell checkup
-# script (for example, devshell/lib/bashrc_checkup.sh)
-# under the user's bash "${SHELL}" (if different from /bin/bash),
-# load the user's shell settings, and check for any known issues.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-source "${FUCHSIA_DIR}/scripts/devshell/lib/style.sh" || exit $?
-source "${FUCHSIA_DIR}/scripts/devshell/lib/common_term_styles.sh" || exit $?
-
-fx-config-read || exit $?
-
-dr_mac() {
-  local status=0
-  local xcode_path=$(xcode-select --print-path)
-  local expected_path='/Library/Developer/CommandLineTools'
-  local required_subpath='usr/include/c++'
-
-  if [[ ! -d "${xcode_path}/${required_subpath}" ]]; then
-    if [[ "${xcode_path}" != "${expected_path}" ]] && \
-       [[ -d "${expected_path}/${required_subpath}" ]]; then
-      warn "You may need to run \`sudo xcode-select --switch \"${expected_path}\"\`"
-    else
-      warn "Make sure you've run \`sudo xcode-select --install\`"
-    fi
-
-    details << EOF
-A common issue with Fuchsia development on macOS is needing to
-re-run the \`xcode-select\` command. The typical symptom is
-failure to find system C or C++ headers after a reboot or update.
-
-If the Xcode Command Line Tools are missing, install them with:
-
-EOF
-    code << EOF
-sudo xcode-select --install
-EOF
-    details << EOF
-
-If the Xcode Command Line Tools are already installed, but Xcode
-is configured to use the wrong path (for example, an Xcode application
-directory instead of the "CommandLineTools" directory, which you can
-verify with \`xcode-select --print-path\`), then you may need to
-"switch" to the CommandLineTools, using:
-
-EOF
-    code << EOF
-sudo xcode-select --switch "${expected_path}"
-EOF
-    details << EOF
-
-See $(link 'https://fuchsia.googlesource.com/docs/getting_started.md#macos')
-for more details.
-EOF
-  fi
-
-  return ${status}
-}
-
-dr_linux() {
-  local status=0
-  return ${status}
-}
-
-shell_checkup() {
-  local status=0
-
-  # If the user is using bash, their default interactive "${SHELL}"
-  # may differ from the script-standard "/bin/bash", and their ~/.bashrc
-  # may depend on features of their shell that are not present in
-  # /bin/bash, so launch the shell checkup script using "${SHELL}".
-  #
-  # For example, since macOS includes only bash version 3, Homebrew users
-  # may install bash 4 in /usr/local/bin/bash, and then select
-  # bash 4 by adding it to /etc/shells, and running the "chsh" command.
-
-  local shell_type="$(basename "${SHELL}")"
-  case "${shell_type}" in
-    bash)
-      local current_debug_flag="$(echo $-|sed -n 's/.*x.*/-x/p')"
-      eval "${SHELL}" "${current_debug_flag}" "${FUCHSIA_DIR}/scripts/devshell/lib/bashrc_checkup.sh" || status=$?
-      ;;
-    *)
-      info "No shell checkup for ${shell_type}"
-      ;;
-  esac
-
-  return ${status}
-}
-
-dr_all() {
-  local status=0
-  shell_checkup || status=$?
-  return ${status}
-}
-
-main() {
-  local status=0
-  case $(uname) in
-    Darwin)
-      dr_mac || status=$?
-      ;;
-    Linux)
-      dr_linux || status=$?
-      ;;
-  esac
-  dr_all || status=$?
-  if (( ${status} == 0 )); then
-    info "No known issues were found. You appear to be in good health!"
-  fi
-  return ${status}
-}
-
-main "$@" || exit $?
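The macOS branch of `fx doctor` above reduces to a three-way decision on the `xcode-select` path. A minimal standalone sketch of that decision (the outcome labels "ok", "switch", and "install" are illustrative, not actual fx output):

```shell
# Sketch of the check in dr_mac above: headers present where xcode-select
# points -> ok; present only under the expected CommandLineTools path ->
# suggest `xcode-select --switch`; otherwise -> suggest `--install`.
check_xcode() {
  xcode_path="$1"      # result of `xcode-select --print-path`
  expected_path="$2"   # e.g. /Library/Developer/CommandLineTools
  required_subpath='usr/include/c++'
  if [ -d "${xcode_path}/${required_subpath}" ]; then
    echo ok
  elif [ -d "${expected_path}/${required_subpath}" ]; then
    echo switch
  else
    echo install
  fi
}
```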
diff --git a/devshell/exec b/devshell/exec
deleted file mode 100755
index a2eb5fb..0000000
--- a/devshell/exec
+++ /dev/null
@@ -1,14 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### read the current build config, then exec
-
-## usage: fx exec <command> [args, ...]
-## The build configuration is sourced and read, then command is executed.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-exec "$@"
diff --git a/devshell/flash b/devshell/flash
deleted file mode 100755
index d1e6f37..0000000
--- a/devshell/flash
+++ /dev/null
@@ -1,110 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run fastboot to flash zedboot to device
-
-## usage: fx flash [-a|-b|-r] [-s <serial>] [--pave] <BOARD>
-##   <BOARD>   Board to flash.
-##   -a        Flash Zircon-A partition (default)
-##   -b        Flash Zircon-B partition
-##   -r        Flash Zircon-R partition
-##   -s        Device you wish to flash to (only necessary if multiple
-##             devices are in fastboot mode)
-##   --pave    Pave device after flashing (recommended)
-##   --nopave  Do not pave device after flashing (default)
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/image_build_vars.sh || exit $?
-
-usage() {
-    fx-command-help
-    echo "Available boards:"
-    echo "  vim2"
-    echo "Available devices:"
-    fastboot devices -l
-    exit 1
-}
-
-ZIRCONA="ZIRCON-A"
-ZIRCONB="ZIRCON-B"
-ZIRCONR="ZIRCON-R"
-
-pave=false
-partition=$ZIRCONA
-device=
-while [[ $# -ge 1 ]]; do
-  case "$1" in
-  -h|--help)
-    usage
-    ;;
-  -a)
-    partition=$ZIRCONA
-    ;;
-  -b)
-    partition=$ZIRCONB
-    ;;
-  -r)
-    partition=$ZIRCONR
-    ;;
-  --pave)
-    pave=true
-    ;;
-  --nopave)
-    pave=false
-    ;;
-  -s)
-    shift
-    device="$1"
-    ;;
-  vim2)
-    board="$1"
-    ;;
-  *)
-    break
-  esac
-  shift
-done
-
-case "${board}-${partition}" in
-  "vim2-${ZIRCONA}")
-    partition="boot"
-    ;;
-  "vim2-${ZIRCONB}")
-    partition="misc"
-    ;;
-  "vim2-${ZIRCONR}")
-    partition="recovery"
-    ;;
-  *)
-    echo "Invalid board or partition"
-    usage
-esac
-
-num_devices=$(fastboot devices | wc -l)
-if [[ "${num_devices}" -lt 1 ]]; then
-  echo "Please place device into fastboot mode!"
-  exit 1
-elif [[ "${num_devices}" -gt 1 ]] && [[ -z "${device}" ]]; then
-  echo "More than one device detected, please provide -s <device>!"
-  usage
-  exit 1
-fi
-
-extra_args=()
-if [[ ! -z "${device}" ]]; then
-  if [[ ! "$(fastboot devices -l)" =~ "${device}" ]]; then
-    echo "Device ${device} not found!"
-    usage
-    exit 1
-  fi
-  extra_args=("-s" "${device}")
-fi
-
-fastboot flash "${partition}" "${FUCHSIA_BUILD_DIR}/${IMAGE_ZIRCONR_ZBI}" "${extra_args[@]}"
-fastboot reboot
-
-if [[ "${pave}" == "true" ]]; then
-  fx-command-exec pave -1
-fi
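The board/partition selection in the script above is a fixed table for vim2. As a sketch of just that mapping (values taken from the case statement; other boards are rejected):

```shell
# Map a board and Zircon partition name to the fastboot partition used
# above. Only vim2 is handled; anything else is invalid.
map_partition() {
  case "$1-$2" in
    "vim2-ZIRCON-A") echo boot ;;
    "vim2-ZIRCON-B") echo misc ;;
    "vim2-ZIRCON-R") echo recovery ;;
    *) echo invalid; return 1 ;;
  esac
}
```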
diff --git a/devshell/flutter-attach b/devshell/flutter-attach
deleted file mode 100755
index 595ea13..0000000
--- a/devshell/flutter-attach
+++ /dev/null
@@ -1,66 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### attach to a running flutter module to enable hot reload and debugging
-
-## usage: fx flutter-attach //topaz/examples/hello_world:hello_world
-##
-## This command requires Topaz in order to run.
-##
-## Arguments:
-##   -v|--verbose Enable verbose logging.
-##   -h|--help    Print out this message.
-
-DEVSHELL_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-source "${DEVSHELL_DIR}"/lib/vars.sh || exit $?
-fx-config-read
-
-ARGS=()
-
-NAME="$(get-device-name)"
-if [[ -n "${NAME}" ]]; then
-  ARGS+=("--device" "${NAME}")
-fi
-
-for var in "$@"
-do
-  case "$var" in
-    -h|--help)
-      fx-command-help
-      exit 0
-    ;;
-    -v|--verbose)
-      ARGS+=("--verbose")
-    ;;
-    *)
-      TARGET="$var"
-    ;;
-  esac
-done
-
-case "$(uname -s)" in
-    Linux)
-        PREBUILT_DART="${DEVSHELL_DIR}/../../topaz/tools/prebuilt-dart-sdk/linux-x64/bin/dart"
-        ;;
-    Darwin)
-        PREBUILT_DART="${DEVSHELL_DIR}/../../topaz/tools/prebuilt-dart-sdk/mac-x64/bin/dart"
-        ;;
-esac
-
-if [[ -z "$PREBUILT_DART" ]]; then
->&2 echo "Prebuilt Dart binary not supported on current platform"
-  exit 1
-fi
-
-FUCHSIA_ATTACH_BIN="${FUCHSIA_BUILD_DIR}/host_x64/dart-tools/fuchsia_attach"
-
-# The flutter tool expects the working directory is the fuchsia root.
-cd "${FUCHSIA_DIR}"
-
-"${FUCHSIA_ATTACH_BIN}" \
-  "--dart-sdk=${PREBUILT_DART}" \
-  "--build-dir=${FUCHSIA_BUILD_DIR}"\
-  "--target=${TARGET}" \
-  "${ARGS[@]}"
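The platform selection above picks a prebuilt Dart SDK directory from `uname -s`. A sketch of that mapping in isolation (directory names taken from the script; other platforms are unsupported):

```shell
# Map a `uname -s` value to the prebuilt Dart SDK directory name used
# above; fails for platforms the script does not handle.
prebuilt_dart_dir() {
  case "$1" in
    Linux) echo "linux-x64" ;;
    Darwin) echo "mac-x64" ;;
    *) return 1 ;;
  esac
}
```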
diff --git a/devshell/format-code b/devshell/format-code
deleted file mode 100755
index c51c02f..0000000
--- a/devshell/format-code
+++ /dev/null
@@ -1,185 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### runs source formatters on modified files
-
-## Usage: fx format-code
-##           [--help] [--dry-run] [--verbose] [--all]
-##           [--files=FILES,[FILES ...]]
-##           [--target=GN_TARGET]
-##           [--git]
-##
-##   --help    Prints this message
-##   --dry-run Stops the program short of running the formatters
-##   --all     Formats all code in the git repo under the current working
-##             directory.
-##   --files   Allows the user to specify files.  Files are comma separated.
-##             Globs are dealt with by bash; fx format-code "--files=foo/*" will
-##             work as expected.
-##   --target  Allows the user to specify a gn target.
-##   --git     The default; it uses `git diff-index` against the newest parent
-##             commit in the upstream branch (or against HEAD if no such commit
-##             is found).  Files that are locally modified, staged or touched by
-##             any commits introduced on the local branch are formatted.
-##   --remove-ordinals (temporary)
-##             Removes ordinals from FIDL files.  This option will be removed
-##             when ordinals are no longer legal in FIDL files.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-function usage() {
-  fx-command-help
-}
-
-function zap-commas() {
-  echo $1 | tr ',' '\n'
-}
-
-function get-diff-base() {
-  local upstream=$(git rev-parse --abbrev-ref --symbolic-full-name "@{u}" 2>/dev/null)
-  if [[ -z "${upstream}" ]]; then
-    upstream="origin/master"
-  fi
-  local local_commit=$(git rev-list HEAD ^${upstream} --  2>/dev/null | tail -1)
-  if [[ -z "${local_commit}" ]]; then
-    printf "HEAD"
-  else
-    git rev-parse "${local_commit}"^
-  fi
-}
-
-function format-cmd() {
-  if [ -f $1 ]; then
-    case $1 in
-      *.cc) printf "${CLANG_CMD}" ;;
-      *.cmx) printf "${JSON_FMT_CMD}" ;;
-      *.cpp) printf "${CLANG_CMD}" ;;
-      *.dart) printf "${DART_CMD}" ;;
-      *.fidl) printf "${FIDL_CMD}" ;;
-      *.gn) printf "${GN_CMD}" ;;
-      *.gni) printf "${GN_CMD}" ;;
-      *.go) printf "${GO_CMD}" ;;
-      *.h) printf "${CLANG_CMD}" ;;
-      *.hh) printf "${CLANG_CMD}" ;;
-      *.hpp) printf "${CLANG_CMD}" ;;
-      *.ts) printf "${CLANG_CMD}" ;;
-      *.rs) printf "${RUST_FMT_CMD}" ;;
-    esac
-  fi
-}
-
-function hg-cmd() {
-  [[ $1 =~ .*\.h ]] && printf "${FIX_HEADER_GUARDS_CMD}"
-}
-
-# Removes leading //, resolves to absolute path, and resolves globs.  The first
-# argument is a path prefix, and the remaining arguments are relative to that
-# path prefix.
-function canonicalize() {
-  local root_dir="$1"
-  shift
-  for fileglob in "${@}"; do
-    # // means it comes from gn, [^/]* means it is relative
-    if [[ "${fileglob}" = //* || "${fileglob}" = [^/]* ]]; then
-      local dir="${root_dir}"/
-    else
-      local dir=""
-    fi
-    for file in "${dir}"${fileglob#"//"}; do
-      echo "${file}"
-    done
-  done
-}
-
-DRY_RUN=
-VERBOSE=
-REMOVE_ORDINALS=
-
-fx-config-read
-
-for ARG in "$@"; do
-  case "${ARG}" in
-    --verbose) VERBOSE="1" ;;
-    --dry-run) DRY_RUN="1" ;;
-    --all) FILES=$(canonicalize "${PWD}" $(git ls-files)) ;;
-    --git) ;; # We'll figure this out later
-    --files=*) FILES=$(canonicalize "${PWD}" $(zap-commas "${ARG#--files=}")) ;;
-    --target=*) FILES=$(canonicalize "${FUCHSIA_DIR}" \
-        $(fx-buildtool-run gn desc \
-            "${FUCHSIA_OUT_DIR}/${FUCHSIA_ARCH}" "${ARG#--target=}" sources)) &&
-        RUST_ENTRY_POINT=$(canonicalize "${FUCHSIA_DIR}" \
-            $(fx rustfmt --print-sources ${ARG#--target=})) ;;
-# TODO(FIDL-372): Remove support for --remove-ordinals
-    --remove-ordinals) REMOVE_ORDINALS="1" ;;
-    --help) usage && exit 1 ;;
-    *) usage && printf "Unknown flag %s\n" "${ARG}" && exit 1 ;;
-  esac
-done
-
-if [ -z "${FILES+x}" ]; then
-  FILES=$(canonicalize $(git rev-parse --show-toplevel) \
-    $(git diff-index --name-only $(get-diff-base)))
-fi
-
-if [[ -n "${VERBOSE}" ]]; then
-  printf "Files to be formatted:\n%s\n" "${FILES}"
-fi
-
-declare BUILDTOOLS_ROOT="${FUCHSIA_DIR}"/buildtools
-declare HOST_OS=$(uname | tr '[:upper:]' '[:lower:]')
-
-[[ "${HOST_OS}" == "darwin" ]] && HOST_OS="mac"
-case $(uname -m) in
-  x86_64) HOST_ARCH="x64" ;;
-  aarch64) HOST_ARCH="arm64" ;;
-  *) echo "Unknown arch $(uname -m)" && exit 1 ;;
-esac
-
-declare HOST_PLATFORM="${HOST_OS}-${HOST_ARCH}"
-
-declare CLANG_CMD="${BUILDTOOLS_ROOT}/${HOST_PLATFORM}/clang/bin/clang-format -style=file -fallback-style=Google -sort-includes -i"
-declare DART_CMD="${FUCHSIA_DIR}/topaz/tools/prebuilt-dart-sdk/${HOST_PLATFORM}/bin/dartfmt -w"
-declare FIDL_BIN="${ZIRCON_TOOLS_DIR}"/fidl-format
-if [[ -z "${REMOVE_ORDINALS}" ]]; then
-  declare FIDL_CMD="${FIDL_BIN} -i"
-else
-  declare FIDL_CMD="${FIDL_BIN} -i --remove-ordinals"
-fi
-declare GN_CMD="${BUILDTOOLS_ROOT}/gn format"
-declare GO_CMD="${BUILDTOOLS_ROOT}/${HOST_PLATFORM}/go/bin/gofmt -w"
-declare JSON_FMT_CMD="${FUCHSIA_DIR}"/scripts/style/json-fmt.py
-declare RUST_FMT_CMD="${BUILDTOOLS_ROOT}/${HOST_PLATFORM}/rust/bin/rustfmt --config-path=${FUCHSIA_DIR}/garnet/rustfmt.toml --unstable-features --skip-children"
-declare RUST_ENTRY_POINT_FMT_CMD="${BUILDTOOLS_ROOT}/${HOST_PLATFORM}/rust/bin/rustfmt --config-path=${FUCHSIA_DIR}/garnet/rustfmt.toml"
-
-declare FIX_HEADER_GUARDS_CMD="${FUCHSIA_DIR}/scripts/style/check-header-guards.py --fix"
-
-# If there is a FIDL file to fix, and we don't have a copy of fidl-format,
-# generate one.
-for file in ${FILES}; do
-  if [[ ${file} =~ .*\.fidl ]]; then
-    if [[ ! -x "${FIDL_BIN}" ]]; then
-       printf "fidl-format not found: attempting to build it\n"
-       fx-command-run build-zircon -l
-       break
-     fi
-  fi
-done
-
-[[ -n "${DRY_RUN}" ]] && exit
-
-[[ -n "${RUST_ENTRY_POINT}" ]] && ${RUST_ENTRY_POINT_FMT_CMD} "${RUST_ENTRY_POINT}"
-
-for file in ${FILES}; do
-  # Git reports deleted files, which we don't want to try to format
-  [[ ! -f "${file}" ]] && continue
-
-  # Format the file
-  declare fcmd=$(format-cmd ${file})
-  [[ -n "${fcmd}" ]] && ${fcmd} "${file}"
-  declare hgcmd=$(hg-cmd ${file})
-  [[ -n "${hgcmd}" ]] && ${hgcmd} "${file}"
-done
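The `format-cmd` dispatch above keys each formatter off the file extension. A condensed sketch of that table (the names echoed here stand in for the full command lines the script builds, such as `CLANG_CMD` and `DART_CMD`):

```shell
# Extension-to-formatter dispatch, condensed from format-cmd above.
# Echoed names are placeholders for the script's full command strings.
formatter_for() {
  case "$1" in
    *.cc|*.cpp|*.h|*.hh|*.hpp|*.ts) echo clang-format ;;
    *.cmx) echo json-fmt ;;
    *.dart) echo dartfmt ;;
    *.fidl) echo fidl-format ;;
    *.gn|*.gni) echo gn-format ;;
    *.go) echo gofmt ;;
    *.rs) echo rustfmt ;;
    *) echo none ;;
  esac
}
```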
diff --git a/devshell/full-build b/devshell/full-build
deleted file mode 100755
index 3fb0c5a..0000000
--- a/devshell/full-build
+++ /dev/null
@@ -1,14 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build zircon, then build fuchsia
-## usage: fx full-build
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-fx-command-run build-zircon
-fx-command-run build
diff --git a/devshell/fuzz b/devshell/fuzz
deleted file mode 100755
index 5349fc1..0000000
--- a/devshell/fuzz
+++ /dev/null
@@ -1,438 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Keep the usage info below in sync with //zircon/ulib/fuzz-utils/fuzzer.cpp!
-
-### run a fuzz test on a target device
-
-##
-## Usage: fx fuzz [options] [command] [command-arguments]
-##
-## Options (must be first):
-##   -d, --device   <name>   Connect to device using Fuchsia link-local name.
-##                           Must be specified if multiple devices are present.
-##   -f, --foreground        Run in the foreground (default is background).
-##   -o, --output   <dir>    Use the given directory for saving output files.
-##                           Defaults to the current directory.
-##   -s, --staging  <dir>    Use the given directory for staging temporary
-##                           corpus files being transferred on or off of a
-##                           target device. Defaults to a temporary directory
-##                           that is removed on completion; use this option to
-##                           preserve those temporary files on the host.
-##
-## Commands:
-##   help                    Prints this message and exits.
-##   list    [name]          Lists fuzzers matching 'name' if provided, or all
-##                           fuzzers.
-##   fetch   <name> [digest] Retrieves the corpus for the named fuzzer and
-##                           version given by 'digest'. Defaults to the latest
-##                           if 'digest' is omitted.
-##   start   <name> [...]    Starts the named fuzzer. Additional arguments are
-##                           passed to the fuzzer.  If the target does not have
-##                           a corpus for the fuzzer, and the metadata lists one
-##                           available, this will fetch the corpus first.
-##   check   <name>          Reports information about the named fuzzer, such as
-##                           execution status, corpus location and size, and
-##                           number of crashes.
-##   stop    <name>          Stops all instances of the named fuzzer.
-##   repro   <name> [...]    Runs the named fuzzer on specific inputs. If no
-##                           additional inputs are provided, uses all previously
-##                           found crashes.
-##   merge   <name> [...]    Merges the corpus for the named fuzzer. If no
-##                           additional inputs are provided, minimizes the
-##                           current corpus.
-##   store   <name>          Gathers the current corpus from the target platform
-##                           and publishes it. Requires a pristine repository,
-##                           as it will update the build files with the new
-##                           corpus location.
-##   zbi                     Adds Zircon fuzzers to 'fuchsia.zbi'
-##
-## The RECOMMENDED way to run a fuzzer is to omit 'command', which will use
-## "automatic" mode.  In this mode, 'fx fuzz' will check if a corpus is already
-## present, and if not it will fetch the latest.  It will then start the fuzzer
-## and watch it to see when it stops.  Each of these steps respects the options
-## above.
-##
-## Example workflow:
-##   1. Shows available fuzzers:
-##         fx fuzz list
-##
-##   2. Run a fuzzer for 8 hours (e.g. overnight), fetching the initial corpus
-##      if needed:
-##         fx fuzz -o out foo_package/bar_fuzz_test -max_total_time=28800
-##
-##   3. Check if the fuzzer is still running.
-##         fx fuzz check foo/bar
-##
-##   4. Execute the fuzzer with a crashing input:
-##         fx fuzz repro foo/bar crash-deadbeef
-##
-##   5. Use the artifacts in 'out/foo_package/bar_fuzz_test/latest' to file and
-##      fix bugs. Repeat step 4 until the target doesn't crash.
-##
-##   6. Repeat steps 2-4 until no crashes are found.
-##
-##   7. Minimize the resulting corpus:
-##         fx fuzz merge foo/bar
-##
-##   8. Save the new, minimized corpus in CIPD:
-##         fx fuzz store foo/bar
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-# Constants
-CIPD="${FUCHSIA_DIR}/buildtools/cipd"
-CIPD_PREFIX="fuchsia/test_data/fuzzing"
-
-# Global variables
-device=""
-fg=0
-output="."
-staging=""
-keep=0
-
-host=$(uname)
-fuzzer=""
-status=""
-data_path=""
-corpus_size="0"
-cipd_path=""
-
-# Utility functions
-fatal() {
-  echo "Fatal error: $@"
-  exit 1
-}
-
-abspath() {
-  if [[ ${host} == "Darwin" ]] ; then
-    # Ugh, Mac OSX, why can't you have decent utilities?
-    if [[ -d "$1" ]] ; then
-      cd $1 && pwd -P
-    else
-      cd $(dirname $1) && echo "$(pwd -P)/$(basename $1)"
-    fi
-  else
-    realpath $1
-  fi
-}
-
-set_staging() {
-  if [[ -z "${staging}" ]] ; then
-    staging=$(mktemp -d)
-    [[ $? -eq 0 ]] || fatal "failed to make staging directory"
-  fi
-}
-
-# Ensure the temporary directory is removed as needed
-cleanup() {
-  if [[ -n "${staging}" ]] && [[ ${keep} -eq 0 ]] ; then
-    rm -rf "${staging}"
-  fi
-}
-trap cleanup EXIT
-
-# Commands
-query() {
-  local tmp
-  tmp="$(fx-command-run ssh "${device}" fuzz check "$1" | tr -s '\n' ' ')"
-  [[ $? -eq 0 ]] || fatal "failed to query fuzzer"
-  [[ -n "${tmp}" ]] || fatal "Use 'fx fuzz list' to see available fuzzers."
-  tmp="$(echo "${tmp}" | tr -d ':' | cut -d' ' -f1,2,8,11)"
-  read fuzzer status data_path corpus_size <<< "${tmp}"
-  cipd_path="${CIPD_PREFIX}/${fuzzer}"
-}
-
-fetch() {
-  [[ "${status}" == "STOPPED" ]] || fatal "fuzzer must be stopped to run this command"
-
-  local version="${1:-latest}"
-  # "latest" is a ref, but digests are tags that are prefixed with "version:"
-  if [[ "${version}" != "latest" ]] ; then
-    version="version:${version}"
-  fi
-
-  local corpus="${staging}/${fuzzer}/corpus"
-  mkdir -p "${corpus}"
-  [[ $? -eq 0 ]] || fatal "failed to create local directory: ${corpus}"
-
-  # Get corpus from CIPD
-  if ${CIPD} ls -r "${CIPD_PREFIX}" | grep -q "${fuzzer}" ; then
-    ${CIPD} install "${cipd_path}" ${version} --root "${corpus}"
-    [[ $? -eq 0 ]] || fatal "failed to retrieve corpus"
-  fi
-
-  # Add any upstream third-party corpora
-  if [[ "${version}" == "latest" ]] ; then
-    local seed_corpora=$(fx-command-run ssh ${device} fuzz seeds ${fuzzer})
-    for seed in ${seed_corpora} ; do
-      if echo "${seed}" | grep -q "^//third_party" ; then
-        rsync -a ${FUCHSIA_DIR}/${seed}/ ${corpus}/
-      fi
-    done
-  fi
-
-  # Fuchsia's scp doesn't like to glob
-  fx-command-run scp -r ${corpus} "[${device}]:${data_path}"
-}
-
-store() {
-  [[ "${status}" == "STOPPED" ]] || fatal "fuzzer must be stopped to run this command"
-  [[ "${corpus_size}" != "0" ]] || fatal "refusing to store empty corpus"
-
-  local corpus="${staging}/${fuzzer}/corpus"
-  mkdir -p "${corpus}"
-  [[ $? -eq 0 ]] || fatal "failed to create local directory: ${corpus}"
-
-  # Fuchsia's scp doesn't like to glob
-  fx-command-run scp -r "[${device}]:${data_path}corpus" $(dirname "${corpus}")
-  echo "***"
-  echo "This script may prompt for credentials."
-  echo "This is to allow it to add POSIX-style ACLs to corpus files."
-  echo "***"
-  sudo chmod +x ${corpus}
-  [[ $? -eq 0 ]] || fatal "failed to grant access"
-
-  pushd ${corpus}
-  local version
-  version=$(tar c * | sha256sum | cut -d' ' -f1)
-  [[ $? -eq 0 ]] || fatal "failed to calculate digest"
-
-  cat >cipd.yaml <<EOF
-package: ${cipd_path}
-description: Auto-generated fuzzing corpus for ${fuzzer}.
-install_mode: copy
-data:
-$(ls -1 | grep -v cipd | sed 's/^/  - file: /')
-EOF
-  # TODO: catch the error and tell user to do this
-  # $ cipd auth-login  # One-time auth.
-  ${CIPD} create --pkg-def cipd.yaml -tag version:${version} -ref latest
-  echo "***"
-  echo "Successfully stored package for ${fuzzer}, version ${version}."
-  # cipd creates a .cipd directory in corpus that misses +x so it cannot be
-  # cleaned up properly in cleanup().
-  sudo chmod -R +x ${corpus}
-  popd
-}
-
-start() {
-  # Get fuzzer info and check status
-  query "${fuzzer}"
-  [[ ${status} != "RUNNING" ]] || fatal "${fuzzer} is already running"
-
-  # Ensure we have a directory for this target
-  mkdir -p "${output}/${fuzzer}"
-  [[ $? -eq 0 ]] || fatal "failed to make directory: ${output}/${fuzzer}"
-  pushd "${output}/${fuzzer}" >/dev/null
-
-  # Clear all old logs
-  fx-command-run ssh ${device} rm "${data_path}/fuzz-*.log"
-  killall loglistener 2>/dev/null
-
-  # Create a directory for this run
-  local results="$(date +%F-%T)"
-  mkdir ${results}
-  [[ $? -eq 0 ]] || fatal "failed to make directory: ${results}"
-  rm -f latest
-  ln -s ${results} latest
-  pushd latest >/dev/null
-
-  # Start logging
-  ${ZIRCON_TOOLS_DIR}/loglistener >zircon.log  &
-  echo $! >.loglistener.pid
-
-  # Start the fuzzer
-  if [[ ${fg} -eq 0 ]] ; then
-    fx-command-run ssh ${device} fuzz start ${fuzzer} "$@" &
-  else
-    fx-command-run ssh ${device} fuzz start ${fuzzer} -jobs=0 "$@" 2>&1 | tee "fuzz-0.log"
-  fi
-
-  query "${fuzzer}"
-  if [[ ${status} == "RUNNING" ]] ; then
-    echo "'${fuzzer}' started; you should be notified when it stops."
-    echo "To check its progress, use 'fx fuzz check ${fuzzer}'."
-    echo "To stop it manually, use 'fx fuzz stop ${fuzzer}'."
-  elif [[ ${fg} -ne 0 ]] ; then
-    echo "Test units written to $(pwd -P)"
-  fi
-  monitor &
-
-  # Undo pushds
-  popd >/dev/null
-  popd >/dev/null
-}
-
-monitor() {
-  # Wait for completion
-  query "${fuzzer}"
-  while [[ "${status}" == "RUNNING" ]] ; do
-    sleep 2
-    query "${fuzzer}"
-  done
-  if [[ ${fg} -eq 0 ]] ; then
-    fx-command-run scp "[${device}]:${data_path}/fuzz-*.log" .
-  fi
-
-  # Stop log collection and symbolize
-  if [[ -f .loglistener.pid ]] ; then
-    kill $(cat .loglistener.pid)
-    rm -f .loglistener.pid
-  fi
-  if [[ -f zircon.log ]] ; then
-    ${FUCHSIA_DIR}/zircon/scripts/symbolize \
-      -i ${FUCHSIA_BUILD_DIR}/ids.txt <zircon.log >symbolized.log
-  fi
-
-  # Transfer the fuzz logs
-  local units=0
-  for log in * ; do
-    for unit in $(grep 'Test unit written to ' ${log} | sed 's/.* //') ; do
-      fx-command-run scp "[${device}]:${unit}" .
-      units=$((${units} + 1))
-    done
-  done
-
-  # Notify user
-  local title="${fuzzer} has stopped"
-  local body="${units} test units written to $(pwd -P)"
-  if [[ ${host} == "Linux" ]] ; then
-    if [[ -x /usr/bin/notify-send ]] ; then
-      /usr/bin/notify-send "${title}." "${body}"
-    else
-      wall "${title}; ${body}"
-    fi
-  elif [[ ${host} == "Darwin" ]] ; then
-    osascript -e "display notification \"${body}\" with title \"${title}.\""
-  fi
-}
-
-add_to_zbi() {
-  local image="${FUCHSIA_BUILD_DIR}/fuchsia.zbi"
-  if [[ ! -f "${image}" ]] ; then
-    fatal "No such ZBI file: ${image}"
-  elif [[ -f "${image}.orig" ]] ; then
-    fatal "Cowardly refusing to overwrite existing ${image}.orig"
-  fi
-
-  # Build zircon with instrumentation
-  echo "Building Zircon fuzzers..."
-  pushd "${FUCHSIA_DIR}/zircon"
-  USE_ASAN=1 USE_SANCOV=1 scripts/build-zircon-${FUCHSIA_ARCH} -C
-
-  # Find the lines in the bootfs manifest that are relevant to fuzzing, and create a new fuzz
-# manifest that we can use to inject these objects into a ZBI.
-  local bootfs_manifest="build-${FUCHSIA_ARCH}-asan/bootfs.manifest"
-  local fuzz_manifest="build-${FUCHSIA_ARCH}-asan/fuzz.manifest"
-  grep '{core}lib.*asan.*=' "${bootfs_manifest}" > "${fuzz_manifest}"
-  grep '^{libs}lib/asan' "${bootfs_manifest}" >> "${fuzz_manifest}"
-  grep '^{test}test/fuzz' "${bootfs_manifest}" >> "${fuzz_manifest}"
-
-  # Check that all fuzzers listed in the zircon_fuzzers package are present in the build
-  # The `targets` regex looks at the resources under the data/ directory to find the fuzz target
-  # name, i.e. "data/some_fuzz_target/corpora" => "some_fuzz_target".
-  local zircon_manifest="${FUCHSIA_BUILD_DIR}/obj/garnet/tests/zircon/zircon_fuzzers.manifest"
-  local targets="$(sed -n 's/^data\/\([^\/]*\)\/.*/\1/p' "${zircon_manifest}" | sort | uniq )"
-  for target in $targets ; do
-    grep -q "$target" "${fuzz_manifest}" || \
-      fatal "target not found in ${fuzz_manifest}: ${target}"
-  done
-
-  # Copy fuzzers into Fuchsia
-  mv "${image}" "${image}.orig"
-  if ! ${ZIRCON_TOOLS_DIR}/zbi -o "${image}" "${image}.orig" "${fuzz_manifest}" ; then
-    mv "${image}.orig" "${image}"
-    fatal "Could not create ${image}"
-  fi
-  popd
-  echo "Zircon fuzzers added to ${image}"
-  rm -f "${image}.orig"
-}
-
-# Main
-main() {
-  fx-config-read
-  # Parse options
-  while [[ "$1" == "-"* ]] ; do
-    local opt="$1"
-    shift
-
-    local has_optval
-    case "${opt}" in
-      -f|--foreground)
-        fg=1
-        has_optval=0
-        ;;
-      -o|--output)
-        output="$1"
-        has_optval=1
-        ;;
-      -s|--staging)
-        keep=1
-        staging="$1"
-        has_optval=1
-        ;;
-      *)
-        fatal "unknown option: ${opt}"
-        ;;
-    esac
-    if [[ ${has_optval} -ne 0 ]] ; then
-      if [[ -z "$1" ]] || [[ "$1" == "-"* ]] ; then
-        fatal "missing value for ${opt}"
-      fi
-      shift
-    fi
-  done
-  output=$(abspath "${output}")
-
-  # Parse command
-  local device="$(get-fuchsia-device-addr)"
-  local command=$1
-  local fuzzer=$2
-  local args="${@:3}"
-  case ${command} in
-    help)
-      fx-command-help
-      exit 0
-      ;;
-    list|check|stop|repro|merge)
-      fx-command-run ssh ${device} fuzz ${command} ${fuzzer} ${args}
-      ;;
-    start)
-      start "${args}"
-      ;;
-    fetch|store)
-      set_staging
-      query "${fuzzer}"
-      ${command} ${args}
-      ;;
-    zbi)
-      # TODO(security): SEC-141. This command should be replaced by something using //build/images
-      # once vanilla drivers in instrumented devhosts are fixed and/or partial Zircon
-      # instrumentation is implemented.
-      echo "NOTE: This command is subject to change. Check the documentation at"
-      echo "//docs/development/workflows/libfuzzer.md for the currently supported way of"
-      echo "running Zircon fuzzers in a Fuchsia environment."
-      add_to_zbi
-      ;;
-    *)
-      # "Automatic" mode
-      fuzzer="${command}"
-      args="${@:2}"
-      echo "Command omitted; starting fuzzer '${fuzzer}' in automatic mode."
-      echo "If this isn't what you intended, try 'fx fuzz help'."
-      set_staging
-      query "${fuzzer}"
-      if [[ ${corpus_size} == "0" ]] ; then
-        fetch
-      fi
-      start "${args}"
-      ;;
-  esac
-}
-
-main "$@"
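The `fetch()` function above normalizes the requested corpus version before talking to CIPD: "latest" is a ref and passes through unchanged, while any digest becomes a "version:"-prefixed tag. That normalization as a standalone sketch:

```shell
# CIPD version normalization as used in fetch() above: "latest" (also the
# default) stays a ref; anything else is treated as a digest tag.
normalize_version() {
  version="${1:-latest}"
  if [ "${version}" != "latest" ]; then
    version="version:${version}"
  fi
  echo "${version}"
}
```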
diff --git a/devshell/gce b/devshell/gce
deleted file mode 100755
index d165c6f..0000000
--- a/devshell/gce
+++ /dev/null
@@ -1,13 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### Google Compute Engine commands
-
-## See `fx gce help` for more information.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-"${FUCHSIA_DIR}/scripts/gce/gce" "$@"
diff --git a/devshell/gen b/devshell/gen
deleted file mode 100755
index 0fb852f..0000000
--- a/devshell/gen
+++ /dev/null
@@ -1,18 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### `gn gen` the FUCHSIA_BUILD_DIR
-
-## If FUCHSIA_BUILD_DIR is out/x64, this is simply:
-##   gn gen out/x64
-## It is useful if one has by some mechanism deleted the ninja artifacts, but
-## not the args.gn, e.g. if one CTRL+C's a regen step (gn bug).
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-set -ex
-
-"${FUCHSIA_DIR}/buildtools/gn" gen "$FUCHSIA_BUILD_DIR" "$@"
diff --git a/devshell/gen-cargo b/devshell/gen-cargo
deleted file mode 100755
index ecc83a1..0000000
--- a/devshell/gen-cargo
+++ /dev/null
@@ -1,59 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### generates symlinks to Rust Cargo.toml output files
-
-import argparse
-import os
-import subprocess
-import sys
-
-import lib.rust
-from lib.rust import ROOT_PATH
-
-def main():
-    parser = argparse.ArgumentParser(
-            "Generate symlinks to Rust Cargo.toml output files")
-    parser.add_argument("gn_target",
-                        type=lib.rust.GnTarget,
-                        help="GN target to generate a symlink for. \
-                              Use '.[:target]' to discover the cargo target \
-                              for the current directory or use the \
-                              absolute path to the target \
-                              (relative to $FUCHSIA_DIR). \
-                              For example: //garnet/bin/foo/bar:baz")
-    parser.add_argument("--output",
-                        help="Path to Cargo.toml to generate",
-                        required=False)
-    parser.add_argument("--out-dir",
-                        help="Path to the Fuchsia output directory",
-                        required=False)
-    args = parser.parse_args()
-
-    if args.out_dir:
-        out_dir = args.out_dir
-    else:
-        out_dir = lib.rust.find_out_dir()
-
-    path = args.gn_target.manifest_path(out_dir)
-
-    if args.output:
-        output = args.output
-    else:
-        output = os.path.join(args.gn_target.path, "Cargo.toml")
-
-    if os.path.exists(path):
-        try:
-            os.remove(output)
-        except OSError:
-            pass
-        print("Creating '{}' pointing to '{}'".format(output, path))
-        os.symlink(path, output)
-    else:
-        print("Internal error: path '{}' does not point to a Cargo.toml file".format(path))
-
-if __name__ == '__main__':
-    sys.exit(main())
-
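The core of gen-cargo is mapping a GN target label to its generated Cargo.toml and refreshing a symlink to it. A minimal Python sketch of that mapping (the helper names `manifest_path` and `relink` are illustrative, and the `out/default` path is an assumption, not part of the script):

```python
import os

def manifest_path(out_dir, gn_target):
    """Map a GN target label to its generated Cargo.toml path.

    '//foo/bar:baz' -> <out_dir>/gen/foo/bar/baz/Cargo.toml
    '//foo/bar'     -> <out_dir>/gen/foo/bar/bar/Cargo.toml
    """
    target = gn_target.lstrip("/")
    if ":" in target:
        # Turn foo/bar:baz into foo/bar/baz
        path = target.replace(":", os.sep)
    else:
        # Turn foo/bar into foo/bar/bar
        path = os.path.join(target, os.path.basename(target))
    return os.path.join(out_dir, "gen", path, "Cargo.toml")

def relink(source, link_name):
    """Replace link_name with a fresh symlink to source, as gen-cargo does."""
    try:
        os.remove(link_name)
    except OSError:
        pass  # the link did not exist yet
    os.symlink(source, link_name)
```

Removing the old link first makes the operation idempotent: rerunning gen-cargo after a build simply repoints the symlink.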
diff --git a/devshell/get-build-dir b/devshell/get-build-dir
deleted file mode 100755
index c858a7b..0000000
--- a/devshell/get-build-dir
+++ /dev/null
@@ -1,13 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### print the current fuchsia build directory
-
-## equivalent to `fx exec echo ${FUCHSIA_BUILD_DIR}`
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-echo "${FUCHSIA_BUILD_DIR}"
diff --git a/devshell/go b/devshell/go
deleted file mode 100755
index 1c9fea0..0000000
--- a/devshell/go
+++ /dev/null
@@ -1,83 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run the go tool in fuchsia target configuration
-## Usage:
-##  fx go <go tool args>
-##  fx go --package=packagename <go tool args>
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help
-  exit 1
-}
-
-case "${FUCHSIA_ARCH}" in
-  arm64)
-    GOARCH=arm64
-    ;;
-  x64)
-    GOARCH=amd64
-    ;;
-esac
-GOROOT="${FUCHSIA_BUILD_DIR}/tools/goroot"
-
-if [[ ! -d "${GOROOT}" ]]; then
-  echo >&2 "ERROR: You must build the goroot before running this command"
-  exit 1
-fi
-
-while getopts ":-:" opt; do
-  case $opt in
-    \?)
-      echo >&2 "ERROR: Unrecognized short option: -$OPTARG"
-      usage
-      ;;
-    -)
-        case $OPTARG in
-        package=*) PACKAGE=${OPTARG#*=};;
-        *)
-            echo >&2 "ERROR: Unrecognized long option: $OPTARG"
-            usage
-            ;;
-        esac
-      ;;
-  esac
-done
-shift $((OPTIND-1))
-
-## If --package is provided and has a gopath in $FUCHSIA_BUILD_DIR/gen/gopaths,
-## prepend the given gopath to $GOPATH before invoking $GOROOT/bin/go.
-function package_gopath {
-  local package="$1"
-  if [[ -z "$package" ]]; then
-    return 1
-  fi
-
-  gopathdir="$FUCHSIA_BUILD_DIR/gen/gopaths"
-  d="$gopathdir/$package"
-  if [[ ! -d "$d" ]]; then
-    echo >&2 "ERROR: Package not found in $gopathdir: $package"
-    return 1
-  fi
-
-  echo $d
-}
-
-if [[ -n $PACKAGE ]]; then
-  maybe_gopath="$(package_gopath $PACKAGE)"
-  GOPATH="${maybe_gopath}:${GOPATH}"
-fi
-
-source "${FUCHSIA_DIR}/buildtools/vars.sh"
-CC="${GOROOT}/misc/fuchsia/clangwrap.sh" \
- FDIO_INCLUDE="${FUCHSIA_DIR}/zircon/system/ulib/fdio/include" \
- FUCHSIA_SHARED_LIBS="${FUCHSIA_BUILD_DIR}/${FUCHSIA_ARCH}-shared" \
- ZIRCON_SYSROOT="${FUCHSIA_BUILD_DIR}/sdk/exported/zircon_sysroot/arch/${FUCHSIA_ARCH}/sysroot" \
- CLANG_PREFIX="${BUILDTOOLS_CLANG_DIR}/bin" \
- GOOS=fuchsia GOARCH=${GOARCH} CGO_ENABLED=1 GOPATH="${GOPATH}" GOROOT="${GOROOT}" \
- "${GOROOT}/bin/go" "$@"
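The environment `fx go` assembles above, a GOARCH derived from FUCHSIA_ARCH plus an optional per-package gopath prepended to the default GOPATH, can be sketched in Python (the `go_env` helper is hypothetical; only the arch mapping and the prepend rule come from the script):

```python
# Mirrors the case statement in `fx go`: Fuchsia arch names map to Go's.
GOARCH_FOR_FUCHSIA_ARCH = {"arm64": "arm64", "x64": "amd64"}

def go_env(fuchsia_arch, gopath="", package_gopath=None):
    """Build the environment overrides passed to the go tool."""
    env = {
        "GOOS": "fuchsia",
        "GOARCH": GOARCH_FOR_FUCHSIA_ARCH[fuchsia_arch],
        "CGO_ENABLED": "1",
    }
    # A --package gopath is prepended so its sources shadow the default GOPATH.
    env["GOPATH"] = ":".join(p for p in (package_gopath, gopath) if p)
    return env
```

Prepending (rather than appending) the package gopath is what lets a package-specific source layout win over anything already on GOPATH.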
diff --git a/devshell/lib/__init__.py b/devshell/lib/__init__.py
deleted file mode 100644
index de784fc..0000000
--- a/devshell/lib/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
diff --git a/devshell/lib/add_symlink_to_bin.sh b/devshell/lib/add_symlink_to_bin.sh
deleted file mode 100755
index 2b9af46..0000000
--- a/devshell/lib/add_symlink_to_bin.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-devshell_lib_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-FUCHSIA_DIR="$(dirname $(dirname $(dirname "${devshell_lib_dir}")))"
-
-if [[ -d "${FUCHSIA_DIR}/.jiri_root/bin" ]]; then
-  rm -f "${FUCHSIA_DIR}/.jiri_root/bin/fx"
-  ln -s "../../scripts/fx" "${FUCHSIA_DIR}/.jiri_root/bin/fx"
-fi
diff --git a/devshell/lib/bashrc_checkup.sh b/devshell/lib/bashrc_checkup.sh
deleted file mode 100644
index 33915d0..0000000
--- a/devshell/lib/bashrc_checkup.sh
+++ /dev/null
@@ -1,129 +0,0 @@
-# No #!/bin/bash - See "usage"
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### check the health of a user's interactive bash environment (from "fx doctor")
-
-## usage:
-##   "${SHELL}" [-flags] "${FUCHSIA_DIR}/scripts/devshell/lib/bashrc_checkup.sh" || status=$?
-##   (Valid only for bash ${SHELL} since this script is bash.)
-
-# Detect potential problems for Fuchsia development from settings specific
-# to the user's interactive shell environment. Potential customizations can
-# include bash version and settings introduced in the user's ~/.bashrc file
-# such as bash functions, aliases, and non-exported variables such as
-# "${CDPATH}" and "${PATH}" that can impact how bash executes some commands
-# from the command line.
-#
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/vars.sh || exit $?
-source "${FUCHSIA_DIR}/scripts/devshell/lib/style.sh" || exit $?
-source "${FUCHSIA_DIR}/scripts/devshell/lib/common_term_styles.sh" || exit $?
-
-fx-config-read || exit $?
-
-# For bash users, this script also attempts to load your preferred
-# bash interpreter (if different from the default, such as Homebrew
-# bash on Mac), and load your ~/.bashrc settings, as would happen
-# in an interactive shell. This allows doctor to check for
-# potential issues with settings that don't normally propagate to
-# bash scripts (unless executed with "source"), such as bash
-# functions, aliases, and unexported variables.
-
-# For bash users, load settings that would exist in the user's
-# interactive bash shells as per
-# [The GNU Bash Reference Manual, for Bash, Version 4.4](https://www.gnu.org/software/bash/manual/html_node/Bash-Startup-Files.html)
-if [ -f ~/.bashrc ]; then
-  source ~/.bashrc
-fi
-
-check_cd() {
-  # Returns an error status if the current definition of "cd"
-  # writes anything to the stdout stream, which would break common bash
-  # script lines similar to the following:
-  #
-  #   SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-
-  local cd_output=$(
-    CDPATH=""
-    cd "${FUCHSIA_DIR}" 2>/dev/null
-    cd scripts 2>/dev/null
-  )
-
-  local cdpath_output=$(
-    if [ -z "${cd_output}" ] && [ "${CDPATH}" != "" ]; then
-      CDPATH="${FUCHSIA_DIR}"
-      cd scripts 2>/dev/null
-    fi
-  )
-
-  if [ -z "${cdpath_output}" ] && [ -z "${cd_output}" ]; then
-    return 0  # The check passed!
-  fi
-
-  # The check failed. Print recommendations based on what we found.
-
-  local status=1
-
-  warn 'Your implementation of the "cd" command writes to stdout.'
-
-  details << EOF
-Many common developer scripts and tools use "cd" to find relative
-file paths and will fail in unpredictable ways.
-EOF
-
-  if [ ! -z "${cdpath_output}" ]; then
-    details << EOF
-
-The "cd" command writes to stdout based on your CDPATH environment variable.
-
-You can remove or unset CDPATH in your shell initialization script, or
-define a cd wrapper function.
-EOF
-  fi
-
-  details << EOF
-
-If you have not redefined "cd", and the builtin "cd" is writing to stdout,
-define a wrapper function and redirect the output to /dev/null or stderr.
-
-EOF
-  code << EOF
-cd() {
-  builtin cd "\$@"
-}
-EOF
-  details << EOF
-
-If you already redefine "cd" during shell initialization, find the alias,
-function, or script, and either remove it, or silence its output by
-appending ">/dev/null", as in this example:
-
-EOF
-  code << EOF
-cd() {
-  builtin cd "\$@" >/dev/null
-  update_terminal_cwd
-}
-EOF
-  details << EOF
-
-(Note, in this example, "update_terminal_cwd" is a common MacOS function
-to call when changing directories. Other common "cd" overrides may invoke
-"pwd", "print", or other commands.)
-
-EOF
-
-  return ${status}
-}
-
-main() {
-  local status=0
-
-  check_cd || status=$?
-
-  return ${status}
-}
-
-main "$@" || exit $?
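The check above boils down to: run `cd` in a subshell and see whether anything lands on stdout. A standalone sketch, assuming a `bash` binary on PATH (`cd_is_quiet` is a hypothetical helper, not part of the script):

```python
import subprocess

def cd_is_quiet(cd_command="cd /tmp", setup=""):
    """Return True if `cd` writes nothing to stdout under the given shell setup."""
    result = subprocess.run(
        ["bash", "-c", setup + cd_command],
        capture_output=True, text=True,
    )
    return result.stdout == ""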
diff --git a/devshell/lib/common_term_styles.sh b/devshell/lib/common_term_styles.sh
deleted file mode 100644
index 26940f4..0000000
--- a/devshell/lib/common_term_styles.sh
+++ /dev/null
@@ -1,64 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### add-on functions for common styling of text messages to the terminal
-
-## "source style.sh" before sourcing this script.
-## Functions include:
-##
-## * info
-## * warn
-## * error
-## * link
-## * code
-## * details
-
-## usage examples:
-##
-## # First import style.sh
-##
-## source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-## source "${FUCHSIA_DIR}/scripts/devshell/lib/style.sh" || exit $?
-## source "${FUCHSIA_DIR}/scripts/devshell/lib/common_term_styles.sh" || exit $?
-##
-##  warn 'The warning message.'
-##
-##  details << EOF
-##A multi-line message with bash ${variable} expansion.
-##Escape dollars with backslash \$
-##See $(link 'https://some/hyper/link') to insert a link.
-##EOF
-##
-## Visual tests (and demonstration of capabilities) can be run from:
-##   //scripts/tests/common_term_styles-test-visually
-
-info() {
-  style::info --stdout "
-INFO: $@
-"
-}
-
-warn() {
-  style::warning --stdout "
-WARNING: $@
-"
-}
-
-error() {
-  style::error --stdout "
-ERROR: $@
-"
-}
-
-details() {
-  style::cat --indent 2
-}
-
-code() {
-  style::cat --bold --magenta --indent 4
-}
-
-link() {
-  style::link "$@"
-}
diff --git a/devshell/lib/disktools.sh b/devshell/lib/disktools.sh
deleted file mode 100755
index 50ce0aa..0000000
--- a/devshell/lib/disktools.sh
+++ /dev/null
@@ -1,37 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-fx-truncate() {
-  local -r file="$1"
-  local -r size="$2"
-  touch "${file}"
-
-  if [[ -z "${size}" ]]; then
-    echo >&2 "fx-truncate: size \"${size}\" not given"
-    return 1
-  fi
-
-  case $(uname) in
-    Darwin)
-      mkfile -n "${size}" "${file}"
-      ;;
-    Linux)
-      truncate -s "${size}" "${file}"
-      ;;
-    *)
-      head -c "${size}" /dev/zero > "${file}"
-      ;;
-  esac
-  return $?
-}
-
-fx-need-mtools() {
-  for tool in "mmd" "mcopy"; do
-    if ! which "${tool}" > /dev/null 2>&1; then
-      echo >&2 "Tool \"${tool}\" not found. You may need to install GNU mtools"
-      return 1
-    fi
-  done
-}
\ No newline at end of file
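fx-truncate papers over the platform differences (`mkfile -n` on Darwin, `truncate -s` on Linux, a `/dev/zero` fallback elsewhere). In Python the same "ensure the file exists, then set its size" operation is portable; a minimal sketch with a hypothetical `make_blank_file` helper:

```python
import os

def make_blank_file(path, size_bytes):
    """Create (or resize) path to exactly size_bytes, like fx-truncate."""
    with open(path, "a"):
        pass  # ensure the file exists, like `touch`
    # os.truncate produces a sparse file where the filesystem supports it,
    # matching `truncate -s` on Linux and `mkfile -n` on macOS.
    os.truncate(path, size_bytes)
```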
diff --git a/devshell/lib/image_build_vars.sh b/devshell/lib/image_build_vars.sh
deleted file mode 100644
index 51ae1f2..0000000
--- a/devshell/lib/image_build_vars.sh
+++ /dev/null
@@ -1,9 +0,0 @@
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/vars.sh || return $?
-fx-config-read
-
-source "${FUCHSIA_BUILD_DIR}"/image_paths.sh
-source "${FUCHSIA_BUILD_DIR}"/zedboot_image_paths.sh
diff --git a/devshell/lib/rust.py b/devshell/lib/rust.py
deleted file mode 100644
index a176d6e..0000000
--- a/devshell/lib/rust.py
+++ /dev/null
@@ -1,88 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import os
-import sys
-import subprocess
-
-ROOT_PATH = os.path.abspath(__file__ + "/../../../..")
-FX_PATH = os.path.join(ROOT_PATH, "scripts", "fx")
-CONFIG_PATH = os.path.join(ROOT_PATH, ".config")
-
-def _walk_up_path(path):
-    res = set([path])
-    while True:
-        path, basename = os.path.split(path)
-        if not path:
-            break
-        res.add(path)
-    return res
-
-def _find_cargo_target(path, target_filter=None):
-    match_paths = _walk_up_path(path)
-    all_targets = subprocess.check_output([FX_PATH, "build", "-t", "targets"])
-    for gn_target in all_targets.split("\n"):
-        target_parts = gn_target.split(":")
-        if len(target_parts) < 2:
-            continue
-        target_path, gn_target = target_parts[0], target_parts[1]
-        if target_path in match_paths and gn_target.endswith("_cargo"):
-            gn_target=gn_target[:gn_target.rindex("_")]
-            if target_filter and target_filter != gn_target:
-                continue
-            yield "{path}:{target}".format(
-                    path=target_path,
-                    target=gn_target,
-            )
-
-def find_out_dir():
-    with open(CONFIG_PATH, "r") as config:
-        for line in config.readlines():
-            if line.startswith("FUCHSIA_BUILD_DIR="):
-                key, value = line.split("=")
-                return value.strip().strip("'")
-    print("Invalid fuchsia/.config: no FUCHSIA_BUILD_DIR entry found")
-    sys.exit(1)
-
-class GnTarget:
-    def __init__(self, gn_target):
-        gn_target = gn_target.lstrip("/")
-        gn_target_parts = gn_target.split(":", 1)
-
-        if gn_target_parts[0] == ".":
-            cwd_rel_path = os.path.relpath(os.path.abspath("."), ROOT_PATH)
-            target_filter = None if len(gn_target_parts) == 1 else gn_target_parts[1]
-            gn_targets = list(_find_cargo_target(cwd_rel_path, target_filter))
-            if not gn_targets:
-                print("No cargo targets found at '{}'".format(cwd_rel_path))
-                raise ValueError(gn_target)
-            elif len(gn_targets) > 1:
-                print("Multiple cargo targets found at '{}'".format(cwd_rel_path))
-                for gn_target in gn_targets:
-                    print("- {}".format(gn_target))
-                raise ValueError(gn_target)
-            else:
-                gn_target, = gn_targets
-                gn_target_parts = gn_target.split(":", 1)
-
-        self.gn_target = gn_target
-        self.parts = gn_target_parts
-
-    def __str__(self):
-        return self.gn_target
-
-    @property
-    def path(self):
-        return os.path.join(ROOT_PATH, self.parts[0])
-
-    def manifest_path(self, out_dir=None):
-        if len(self.parts) == 1:
-            # Turn foo/bar into foo/bar/bar
-            path = os.path.join(self.gn_target, os.path.basename(self.gn_target))
-        else:
-            # Turn foo/bar:baz into foo/bar/baz
-            path = self.gn_target.replace(":", os.sep)
-
-        return os.path.join(ROOT_PATH, out_dir, "gen", path, "Cargo.toml")
-
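Target discovery in rust.py hinges on `_walk_up_path`: the current directory and all of its ancestors are matched against the path half of each GN label, and only names ending in `_cargo` qualify. A Python 3 rendering of that matching (function names are illustrative; the target list would really come from `fx build -t targets`):

```python
import os

def walk_up_path(path):
    """Return path plus all of its ancestors: 'a/b/c' -> {'a/b/c', 'a/b', 'a'}."""
    res = {path}
    while True:
        path, _ = os.path.split(path)
        if not path:
            break
        res.add(path)
    return res

def cargo_targets(all_targets, cwd_rel_path, target_filter=None):
    """Yield 'path:target' labels whose path is the current directory or an
    ancestor of it, and whose GN name ends in '_cargo'."""
    match_paths = walk_up_path(cwd_rel_path)
    for line in all_targets:
        parts = line.split(":")
        if len(parts) < 2:
            continue
        path, name = parts[0], parts[1]
        if path in match_paths and name.endswith("_cargo"):
            name = name[: name.rindex("_")]  # strip the '_cargo' suffix
            if target_filter and target_filter != name:
                continue
            yield "{}:{}".format(path, name)
```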
diff --git a/devshell/lib/style.sh b/devshell/lib/style.sh
deleted file mode 100644
index 8e0a2cf..0000000
--- a/devshell/lib/style.sh
+++ /dev/null
@@ -1,311 +0,0 @@
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# This script adds text style options to echo, cat, and printf. For
-# example:
-#
-#   style::echo --stderr --bold --underline --color red -n Yo ho ho
-#
-# prints "Yo ho ho" without a newline (echo's -n flag), in bold red
-# with underline. The text is written to stderr instead of the default
-# stdout.
-#
-#   style::cat --color black --background cyan --indent 4 <<EOF
-#   Multi-line text with expanded bash ${variables}
-#   can be styled and indented.
-#   EOF
-#
-# style::info, style::warning, and style::error use echo to stderr
-# with default color and bold text. For example:
-#
-#   style::warning "WARNING: This is a warning!"
-#
-# style::link uses echo to stdout, with dark_blue underlined text
-#
-# You can override these styles with your own preferences, for example:
-#
-#   export STYLE_WARNING="--stderr --faint --dark_red --background dark_yellow"
-#
-# Visual tests (and demonstration of capabilities) can be run from:
-#   //scripts/tests/style-test-visually
-
-# This script should be sourced. It is compatible with Bash 3.
-# MacOS still comes with Bash 3, so unfortunately no associative arrays.
-
-STYLE_TO_TTY_ONLY=false  # Set to true to suppress styling if output is redirected
-
-[[ "${STYLE_ERROR}" != "" ]] || STYLE_ERROR="--stderr --bold --color red"
-[[ "${STYLE_WARNING}" != "" ]] || STYLE_WARNING="--stderr --bold --color dark_yellow"
-[[ "${STYLE_INFO}" != "" ]] || STYLE_INFO="--stderr --bold --color dark_green"
-[[ "${STYLE_LINK}" != "" ]] || STYLE_LINK="--underline --color dark_blue"
-
-declare -i TERM_ATTRIBUTES__reset=0
-declare -i TERM_ATTRIBUTES__bold=1
-declare -i TERM_ATTRIBUTES__faint=2
-declare -i TERM_ATTRIBUTES__italic=3
-declare -i TERM_ATTRIBUTES__underline=4
-declare -i TERM_ATTRIBUTES__blink=5
-
-declare -i TERM_COLORS__default=39
-declare -i TERM_COLORS__black=30
-declare -i TERM_COLORS__dark_red=31
-declare -i TERM_COLORS__dark_green=32
-declare -i TERM_COLORS__dark_yellow=33
-declare -i TERM_COLORS__dark_blue=34
-declare -i TERM_COLORS__dark_magenta=35
-declare -i TERM_COLORS__purple=35
-declare -i TERM_COLORS__dark_cyan=36
-declare -i TERM_COLORS__light_gray=37
-declare -i TERM_COLORS__gray=90
-declare -i TERM_COLORS__red=91
-declare -i TERM_COLORS__green=92
-declare -i TERM_COLORS__yellow=93
-declare -i TERM_COLORS__blue=94
-declare -i TERM_COLORS__magenta=95
-declare -i TERM_COLORS__pink=95
-declare -i TERM_COLORS__cyan=96
-declare -i TERM_COLORS__white=97
-
-style::colors() {
-  set | sed -n "s/^TERM_COLORS__\([^=]*\)=.*$/\1/p" >&2
-}
-
-style::attributes() {
-  set | sed -n "s/^TERM_ATTRIBUTES__\([^=]*\)=.*$/--\1/p" >&2
-}
-
-style::usage() {
-  local help_option="$1"; shift
-  if [[ "${help_option}" == "colors" ]]; then
-    style::colors
-    return
-  elif [[ "${help_option}" == "attributes" ]]; then
-    style::attributes
-    return
-  fi
-  local function_call="$1"
-  local -a words=( $function_call )
-  local funcname="${words[0]}"
-  local command="$2"
-  local specifics="$3"
-
-  >&2 echo "
-Usage: ${function_call} [style options] [command parameters]"
-
-  if [[ "${specifics}" != "" ]]; then
-    >&2 echo "
-${specifics}"
-  fi
-  >&2 cat << EOF
-
-style options include:
-  --bold, --faint, --underline, etc.
-  --color <color_name>
-  --background <color_name>
-  --indent <spaces_count>
-  --stderr (output to standard error instead of standard out)
-
-  echo "This is \$(style::echo -f --bold LOUD) and soft."
-
-command parameters are those supported by the ${command} command.
-
-Use ${funcname} --help colors for a list of colors or backgrounds
-Use ${funcname} --help attributes for a list of style attribute flags
-EOF
-}
-
-style::attribute() {
-  local name="$1"
-  local fallback="$2"
-  local var=TERM_ATTRIBUTES__${name}
-  local -i attribute=${!var}
-  if ! (( attribute )); then
-    if [[ $fallback != "" ]]; then
-      echo "${fallback}"
-      return 0
-    else
-      >&2 echo "Invalid attribute name: $name"
-      return 1
-    fi
-  fi
-  echo ${attribute}
-}
-
-style::color() {
-  local name="$1"
-  local fallback="$2"
-  local var=TERM_COLORS__${name}
-  local -i color=${!var}
-  if ! (( color )); then
-    if [[ $fallback != "" ]]; then
-      echo "${fallback}"
-      return 0
-    else
-      >&2 echo "Invalid color name: $name"
-      return 1
-    fi
-  fi
-  echo ${color}
-}
-
-style::background() {
-  local color
-  color=$(style::color "$1" "$2" || exit $?) || return $?
-  echo $((10+${color}))
-}
-
-style::stylize() {
-  if [[ "$1" == --* || "$1" == "" ]]; then
-    style::usage "$2" "${FUNCNAME[0]} <command>" "stylized" "\
-<command> is any command with output to stylize, followed by style options,
-and then the command's normal parameters."
-    return
-  fi
-
-  local command="$1"; shift
-  if [[ "$1" == "--help" ]]; then
-    style::usage "$2" "style::${command}" "'${command}'"
-    return
-  fi
-
-  local get_flags=true
-  local -i fd=1
-  local styles
-  local semicolon
-  local name
-  local -i indent=0
-  local prefix
-  local -i code=0
-
-  while $get_flags; do
-    case "$1" in
-      --stderr)
-        fd=2
-        shift
-        ;;
-      --stdout)
-        fd=1
-        shift
-        ;;
-      --color)
-        shift; name="$1"; shift
-        styles="${styles}${semicolon}$(style::color $name || exit $?)" || return $?
-        semicolon=';'
-        ;;
-      --background)
-        shift; name="$1"; shift
-        styles="${styles}${semicolon}$(style::background $name || exit $?)" || return $?
-        semicolon=';'
-        ;;
-      --indent)
-        shift; indent=$1; shift
-        prefix="$(printf "%${indent}s")"
-        ;;
-      --*)
-        name="${1:2}"
-        code=$(style::attribute $name 0)
-        if (( code )); then
-          shift
-          styles="${styles}${semicolon}${code}"
-          semicolon=';'
-        else
-          code=$(style::color $name 0)
-          if (( code )); then
-            shift
-            styles="${styles}${semicolon}${code}"
-            semicolon=';'
-          else
-            get_flags=false
-          fi
-        fi
-        ;;
-      *)
-        get_flags=false
-        ;;
-    esac
-  done
-
-  if [ ! -t ${fd} ] && ${STYLE_TO_TTY_ONLY}; then
-    # Output is not to a TTY so don't stylize
-    if [[ "${prefix}" == "" ]]; then
-      >&${fd} "${command}" "$@" || status=$?
-    else
-      >&${fd} "${command}" "$@" | sed "s/^/${prefix}/"
-      if (( ${PIPESTATUS[0]} != 0 )); then
-        status=${PIPESTATUS[0]}
-      fi
-    fi
-    return 0
-  fi
-
-  local if_newline=''
-  local text
-
-  # Add placeholder (.) so command substitution doesn't strip trailing newlines
-  text="$("${command}" "$@" || exit $?;echo -n '.')" || return $?
-  if [[ "${prefix}" != "" ]]; then
-    text="$(echo "${text}" | sed "s/^/${prefix}/;\$s/^${prefix}[.]\$/./")"
-  fi
-
-  local -i len=$((${#text}-2))
-  if [[ "${text:$len:1}" == $'\n' ]]; then
-    # Save last newline to add back after styling.
-    if_newline='\n'
-  else
-    ((len++))
-  fi
-  # Strip trailing newline, if any, and placeholder.
-  text="${text:0:$((len))}"
-
-  # Style everything except newlines, otherwise background color highlights
-  # entire line. Add extra line with a character so sed does not add it's own
-  # last newline, then delete the line after substitutions.
-  local styled=$(printf '%s\n.' "${text}" | sed -e $'s/$/\033[0m/;s/^/\033['"${styles}"'m/;$d')
-
-  >&${fd} printf "%s${if_newline}" "${styled}"
-
-  return 0
-}
-
-style::echo() {
-  style::stylize "${FUNCNAME[0]:7}" "$@" || return $?
-}
-
-style::cat() {
-  style::stylize "${FUNCNAME[0]:7}" "$@" || return $?
-}
-
-style::printf() {
-  style::stylize "${FUNCNAME[0]:7}" "$@" || return $?
-}
-
-style::_echo_with_styles() {
-  local funcname="$1";shift
-  local style_options="$1";shift
-  if [[ "$1" == "--help" ]]; then
-
-    style::usage "$2" "${funcname}" "echo" "\
-Default style options for ${funcname}:
-  $(style::echo ${style_options} --stdout \"${style_options}\")"
-
-    return
-  fi
-  style::echo ${style_options} "$@" || return $?
-}
-
-style::error() {
-  style::_echo_with_styles "${FUNCNAME[0]}" "${STYLE_ERROR}" "$@" || return $?
-}
-
-style::warning() {
-  style::_echo_with_styles "${FUNCNAME[0]}" "${STYLE_WARNING}" "$@" || return $?
-}
-
-style::info() {
-  style::_echo_with_styles "${FUNCNAME[0]}" "${STYLE_INFO}" "$@" || return $?
-}
-
-style::link() {
-  style::_echo_with_styles "${FUNCNAME[0]}" "${STYLE_LINK}" "$@" || return $?
-}
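style.sh ultimately wraps text in ANSI SGR escape sequences built from its name-to-code tables, with background codes derived as foreground code + 10 (see `style::background`). A compact Python sketch of that construction, with abbreviated tables (the `stylize` helper is hypothetical):

```python
COLORS = {
    "default": 39, "black": 30, "dark_red": 31, "dark_green": 32,
    "dark_yellow": 33, "dark_blue": 34, "red": 91, "green": 92,
    "yellow": 93, "blue": 94, "magenta": 95, "cyan": 96, "white": 97,
}
ATTRIBUTES = {"reset": 0, "bold": 1, "faint": 2, "italic": 3,
              "underline": 4, "blink": 5}

def stylize(text, color=None, background=None, *attrs):
    """Wrap text in one SGR escape and reset afterwards."""
    codes = [ATTRIBUTES[a] for a in attrs]
    if color:
        codes.append(COLORS[color])
    if background:
        # Background codes are the foreground code plus 10 (style::background).
        codes.append(COLORS[background] + 10)
    return "\033[{}m{}\033[0m".format(";".join(str(c) for c in codes), text)
```

Resetting with `\033[0m` after every styled span is what keeps a background color from bleeding across the rest of the line, the same concern the script handles by styling everything except newlines.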
diff --git a/devshell/lib/vars.sh b/devshell/lib/vars.sh
deleted file mode 100644
index 4c13d10..0000000
--- a/devshell/lib/vars.sh
+++ /dev/null
@@ -1,335 +0,0 @@
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -n "${ZSH_VERSION}" ]]; then
-  devshell_lib_dir=${${(%):-%x}:a:h}
-else
-  devshell_lib_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-fi
-
-export FUCHSIA_DIR="$(dirname $(dirname $(dirname "${devshell_lib_dir}")))"
-export FUCHSIA_OUT_DIR="${FUCHSIA_OUT_DIR:-${FUCHSIA_DIR}/out}"
-export FUCHSIA_CONFIG="${FUCHSIA_CONFIG:-${FUCHSIA_DIR}/.config}"
-unset devshell_lib_dir
-
-export ZIRCON_TOOLS_DIR="${FUCHSIA_OUT_DIR}/build-zircon/tools"
-
-if [[ "${FUCHSIA_DEVSHELL_VERBOSITY}" -eq 1 ]]; then
-  set -x
-fi
-
-function fx-symbolize {
-  if [[ -z "$FUCHSIA_BUILD_DIR" ]]; then
-    fx-config-read
-  fi
-  if [[ -z "$BUILDTOOLS_CLANG_DIR" ]]; then
-    source "${FUCHSIA_DIR}/buildtools/vars.sh"
-  fi
-  local idstxt="${FUCHSIA_BUILD_DIR}/ids.txt"
-  if [[ $# > 0 ]]; then
-    idstxt="$1"
-  fi
-  local prebuilt_dir="${FUCHSIA_DIR}/zircon/prebuilt/downloads"
-  local llvm_symbolizer="${BUILDTOOLS_CLANG_DIR}/bin/llvm-symbolizer"
-  "${prebuilt_dir}/symbolize" -ids-rel -ids "$idstxt" -llvm-symbolizer "$llvm_symbolizer"
-}
-
-function fx-config-read-if-present {
-  if [[ "${FUCHSIA_CONFIG}" = "-" && -n "${FUCHSIA_BUILD_DIR}" ]]; then
-    if [[ -f "${FUCHSIA_BUILD_DIR}/fx.config" ]]; then
-      source "${FUCHSIA_BUILD_DIR}/fx.config"
-    else
-      FUCHSIA_ARCH="$(
-        fx-config-glean-arch "${FUCHSIA_BUILD_DIR}/args.gn")" || return
-    fi
-  elif [[ -f "${FUCHSIA_CONFIG}" ]]; then
-    source "${FUCHSIA_CONFIG}"
-    # If there's a file written by `gn gen` (//build/gn/BUILD.gn),
-    # then it can supplement with extra settings and exports.
-    # Note FUCHSIA_BUILD_DIR was just set by the previous line!
-    if [[ -f "${FUCHSIA_BUILD_DIR}/fx.config" ]]; then
-      source "${FUCHSIA_BUILD_DIR}/fx.config"
-    fi
-  else
-    return 1
-  fi
-
-  # Paths are relative to FUCHSIA_DIR unless they're absolute paths.
-  if [[ "${FUCHSIA_BUILD_DIR:0:1}" != "/" ]]; then
-    FUCHSIA_BUILD_DIR="${FUCHSIA_DIR}/${FUCHSIA_BUILD_DIR}"
-  fi
-
-  export FUCHSIA_BUILD_DIR FUCHSIA_ARCH
-
-  export ZIRCON_BUILDROOT="${ZIRCON_BUILDROOT:-${FUCHSIA_OUT_DIR}/build-zircon}"
-  export ZIRCON_BUILD_DIR="${ZIRCON_BUILD_DIR:-${ZIRCON_BUILDROOT}/build-${FUCHSIA_ARCH}}"
-  return 0
-}
-
-function fx-config-read {
-  if ! fx-config-read-if-present ; then
-    echo >& 2 "error: Cannot read config from ${FUCHSIA_CONFIG}. Did you run \"fx set\"?"
-    exit 1
-  fi
-
-  # The user may have done "rm -rf out".
-  local -r args_gn_file="${FUCHSIA_BUILD_DIR}/args.gn"
-  if [[ ! -f "$args_gn_file" ]]; then
-    echo >&2 "Build directory problem, args.gn is missing."
-    echo >&2 "Did you \"rm -rf out\" and not rerun \"fx set\"?"
-    exit 1
-  fi
-}
-
-function fx-config-glean-arch {
-  local -r args_file="$1"
-  # Glean the architecture from the args.gn file written by `gn gen`.
-  local arch=''
-  if [[ -r "$args_file" ]]; then
-    arch=$(
-      sed -n '/target_cpu/s/[^"]*"\([^"]*\).*$/\1/p' "$args_file"
-    ) || return $?
-  fi
-  if [[ -z "$arch" ]]; then
-    # Hand-invoked gn might not have had target_cpu in args.gn.
-    # Since gn defaults target_cpu to host_cpu, we need to do the same.
-    local -r host_cpu=$(uname -m)
-    case "$host_cpu" in
-      x86_64)
-        arch=x64
-        ;;
-      aarch64*|arm64*)
-        arch=arm64
-        ;;
-      *)
-        echo >&2 "ERROR: Cannot default target_cpu to this host's cpu: $host_cpu"
-        return 1
-        ;;
-    esac
-  fi
-  echo "$arch"
-}
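fx-config-glean-arch pulls `target_cpu` out of args.gn with sed and otherwise falls back to the host CPU, mirroring gn's own default. The same two-step logic in Python (a hypothetical `glean_arch` helper; the regex is a sketch of the sed expression above, not a full GN parser):

```python
import re
import platform

def glean_arch(args_gn_text):
    """Extract target_cpu from args.gn contents, defaulting to the host CPU
    the way `gn gen` does when target_cpu is unset."""
    m = re.search(r'target_cpu\s*=\s*"([^"]*)"', args_gn_text)
    if m:
        return m.group(1)
    host = platform.machine()
    if host == "x86_64":
        return "x64"
    if host.startswith(("aarch64", "arm64")):
        return "arm64"
    raise ValueError("cannot default target_cpu for host cpu: " + host)
```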
-
-function fx-config-write {
-  local -r build_dir="$1"
-  if [[ "$build_dir" == /* ]]; then
-    local -r args_file="${build_dir}/args.gn"
-    local -r zircon_args_file="${build_dir}.zircon-args"
-  else
-    local -r args_file="${FUCHSIA_DIR}/${build_dir}/args.gn"
-    local -r zircon_args_file="${FUCHSIA_DIR}/${build_dir}.zircon-args"
-  fi
-  local arch zircon_args
-  arch="$(fx-config-glean-arch "$args_file")" || return
-  if [[ -f "${zircon_args_file}" ]]; then
-    zircon_args=$(<"${zircon_args_file}")
-  fi
-  shift
-  echo > "${FUCHSIA_CONFIG}" "\
-# Generated by \`fx set\` or \`fx use\`.
-FUCHSIA_BUILD_DIR='${build_dir}'
-FUCHSIA_ARCH='${arch}'
-FUCHSIA_BUILD_ZIRCON_ARGS=(${zircon_args})
-"
-}
-
-function get-device-name {
-  fx-config-read
-  # If DEVICE_NAME was passed in fx -d, use it
-  if [[ "${FUCHSIA_DEVICE_NAME+isset}" == "isset" ]]; then
-    echo "${FUCHSIA_DEVICE_NAME}"
-    return
-  fi
-  # Uses a file outside the build dir so that it is not removed by `gn clean`
-  local pairfile="${FUCHSIA_BUILD_DIR}.device"
-  # If .device file exists, use that
-  if [[ -f "${pairfile}" ]]; then
-    echo "$(<"${pairfile}")"
-    return
-  fi
-  echo ""
-}
-
-function get-fuchsia-device-addr {
-  fx-command-run netaddr "$(get-device-name)" --fuchsia "$@"
-}
-
-function get-netsvc-device-addr {
-  fx-command-run netaddr "$(get-device-name)" "$@"
-}
-
-# if $1 is "-d" or "--device" return
-#   - the netaddr if $2 looks like foo-bar-baz-flarg
-#     OR
-#   - $2 if it doesn't
-# else return ""
-# if -z is suppled as the third argument, get the netsvc
-# address instead of the netstack one
-function get-device-addr {
-  device=""
-  if [[ "$1" == "-d" || "$1" == "--device" ]]; then
-    shift
-    if [[ "$1" == *"-"* ]]; then
-      if [[ "$2" != "-z" ]]; then
-        device="$(fx-command-run netaddr "$1" --fuchsia)"
-      else
-        device="$(fx-command-run netaddr "$1")"
-      fi
-    else
-      device="$1"
-    fi
-    shift
-  fi
-  echo "${device}"
-}
-
-function fx-command-run {
-  local -r command_name="$1"
-  local -r command_path="${FUCHSIA_DIR}/scripts/devshell/${command_name}"
-
-  if [[ ! -f "${command_path}" ]]; then
-    echo >& 2 "error: Unknown command ${command_name}"
-    exit 1
-  fi
-
-  shift
-  "${command_path}" "$@"
-}
-
-buildtools_whitelist=" gn ninja "
-
-function fx-buildtool-run {
-  local -r command_name="$1"
-  local -r command_path="${FUCHSIA_DIR}/buildtools/${command_name}"
-
-  if [[ ! "${buildtools_whitelist}" =~ .*[[:space:]]"${command_name}"[[:space:]].* ]]; then
-    echo >& 2 "error: command ${command_name} not allowed"
-    exit 1
-  fi
-
-  if [[ ! -f "${command_path}" ]]; then
-    echo >& 2 "error: Unknown command ${command_name}"
-    exit 1
-  fi
-
-  shift
-  "${command_path}" "$@"
-}
-
-function fx-command-exec {
-  local -r command_name="$1"
-  local -r command_path="${FUCHSIA_DIR}/scripts/devshell/${command_name}"
-
-  if [[ ! -f "${command_path}" ]]; then
-    echo >& 2 "error: Unknown command ${command_name}"
-    exit 1
-  fi
-
-  shift
-  exec "${command_path}" "$@"
-}
-
-function fx-print-command-help {
-  local -r command_path="$1"
-  if grep '^## ' "$command_path" > /dev/null; then
-    sed -n -e 's/^## //p' -e 's/^##$//p' < "$command_path"
-  else
-    local -r command_name=$(basename "$command_path")
-    echo "No help found. Try \`fx $command_name -h\`"
-  fi
-}
-
-function fx-command-help {
-  fx-print-command-help "$0"
-}
-
-# This function massages arguments to an fx subcommand so that a single
-# argument `--switch=value` becomes two arguments `--switch` `value`.
-# This lets each subcommand's main function use simpler argument parsing
-# while still supporting the preferred `--switch=value` syntax.  It also
-# handles the `--help` argument by redirecting to what `fx help command`
-# would do.  Because of the complexities of shell quoting and function
-# semantics, the only way for this function to yield its results
-# reasonably is via a global variable.  FX_ARGV is an array of the
-# results.  The standard boilerplate for using this looks like:
-#   function main {
-#     fx-standard-switches "$@"
-#     set -- "${FX_ARGV[@]}"
-#     ...
-#     }
-# Arguments following a `--` are also added to FX_ARGV but not split, as they
-# should usually be forwarded as-is to subprocesses.
-function fx-standard-switches {
-  # In bash 4, this can be `declare -a -g FX_ARGV=()` to be explicit
-  # about setting a global array.  But bash 3 (shipped on macOS) does
-  # not support the `-g` flag to `declare`.
-  FX_ARGV=()
-  while [[ $# -gt 0 ]]; do
-    if [[ "$1" = "--help" || "$1" = "-h" ]]; then
-      fx-print-command-help "$0"
-      # Exit rather than return, so we bail out of the whole command early.
-      exit 0
-    elif [[ "$1" == --*=* ]]; then
-      # Turn --switch=value into --switch value.
-      FX_ARGV+=("${1%%=*}" "${1#*=}")
-    elif [[ "$1" == "--" ]]; then
-      # Do not parse remaining parameters after --
-      FX_ARGV+=("$@")
-      return
-    else
-      FX_ARGV+=("$1")
-    fi
-    shift
-  done
-}
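The switch-splitting behavior described above can be demonstrated in isolation. This self-contained sketch mirrors the loop's three cases (the function and array names here are illustrative, not part of fx):

```shell
# Mirror of fx-standard-switches' argument massaging:
# "--switch=value" becomes two words, a bare "--" stops parsing and forwards
# the rest verbatim, and everything else passes through unchanged.
split-switches() {
  SPLIT_ARGV=()
  while [[ $# -gt 0 ]]; do
    if [[ "$1" == --*=* ]]; then
      # Turn --switch=value into --switch value.
      SPLIT_ARGV+=("${1%%=*}" "${1#*=}")
    elif [[ "$1" == "--" ]]; then
      # Do not parse remaining parameters after --.
      SPLIT_ARGV+=("$@")
      return
    else
      SPLIT_ARGV+=("$1")
    fi
    shift
  done
}

split-switches --out=dir build -- --raw=yes
printf '%s\n' "${SPLIT_ARGV[@]}"
```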
-
-function fx-choose-build-concurrency {
-  if grep -q "use_goma = true" "${FUCHSIA_BUILD_DIR}/args.gn"; then
-    # The recommendation from the Goma team is to use 10*cpu-count.
-    local cpus="$(fx-cpu-count)"
-    echo $(($cpus * 10))
-  else
-    fx-cpu-count
-  fi
-}
-
-function fx-cpu-count {
-  local -r cpu_count=$(getconf _NPROCESSORS_ONLN)
-  echo "$cpu_count"
-}
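The concurrency rule above (10x the CPU count under Goma, the CPU count otherwise) reduces to a small pure function. A sketch, with the `args.gn` grep replaced by an explicit flag for illustration:

```shell
# Pick a build concurrency: the Goma team's recommendation is 10 * cpu-count;
# local builds just use the CPU count. goma_enabled stands in for the
# "use_goma = true" check against args.gn.
choose-concurrency() {
  local -r cpus="$1" goma_enabled="$2"
  if [[ "$goma_enabled" == "true" ]]; then
    echo $((cpus * 10))
  else
    echo "$cpus"
  fi
}

choose-concurrency 8 true   # 80
choose-concurrency 8 false  # 8
```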
-
-# Define the global lock file in a readonly variable, setting its value if it
-# was never defined before. Need to do it in two steps since "readonly
-# FX_LOCK_FILE=..." does not work because we include this file multiple times in
-# one `fx` run.
-: ${FX_LOCK_FILE:="${FUCHSIA_DIR}/.build_lock"}
-readonly FX_LOCK_FILE
-
-# Use a lock file around a command if possible.
-# Print a message if the lock isn't immediately entered,
-# and block until it is.
-function fx-try-locked {
-  if ! command -v shlock >/dev/null; then
-    # Can't lock! Fall back to unlocked operation.
-    fx-exit-on-failure "$@"
-  elif shlock -f "${FX_LOCK_FILE}" -p $$; then
-    # This will cause a deadlock if any subcommand calls back to fx build,
-    # because shlock isn't reentrant by forked processes.
-    fx-cmd-locked "$@"
-  else
-    echo "Locked by ${FX_LOCK_FILE}..."
-    while ! shlock -f "${FX_LOCK_FILE}" -p $$; do sleep .1; done
-    fx-cmd-locked "$@"
-  fi
-}
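`shlock` is not installed everywhere (hence the unlocked fallback above). The same acquire-or-poll pattern can be sketched portably with `mkdir`, which is atomic on POSIX filesystems. This is an illustration of the pattern only, not what fx itself uses:

```shell
# Acquire-or-poll lock sketch using mkdir as the atomic primitive.
# LOCK_DIR is an invented demo path.
LOCK_DIR="${TMPDIR:-/tmp}/fx-demo-lock.$$"

try-locked() {
  # Poll until the lock directory can be created, then run the command.
  while ! mkdir "$LOCK_DIR" 2>/dev/null; do
    sleep .1
  done
  "$@"
  local status=$?
  rmdir "$LOCK_DIR"   # release the lock, preserving the command's status
  return $status
}

try-locked echo "ran under lock"
```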
-
-function fx-cmd-locked {
-  # Exit trap to clean up lock file
-  trap "[[ -n \"${FX_LOCK_FILE}\" ]] && rm -f \"${FX_LOCK_FILE}\"" EXIT
-  fx-exit-on-failure "$@"
-}
-
-function fx-exit-on-failure {
-  "$@" || exit $?
-}
diff --git a/devshell/list-usb-disks b/devshell/list-usb-disks
deleted file mode 100755
index 3b8b3b1..0000000
--- a/devshell/list-usb-disks
+++ /dev/null
@@ -1,30 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### list attached usb disks
-
-case "$(uname)" in
-Darwin)
-  disks=$(diskutil list | grep '^/dev/' | grep -v 'internal\|synthesized\|image' | grep -v 'virtual' | cut -d ' ' -f 1)
-  for disk in $disks; do
-    details="$(diskutil info "${disk}" | grep 'Media Name' | cut -d : -f 2-)"
-    echo "${disk} - ${details}"
-  done
-  ;;
-Linux)
-  for disk in $(ls /dev/disk/by-path/*-usb-* 2>/dev/null); do
-    if [[ "${disk}" =~ part ]]; then
-      continue
-    fi
-    disk=$(readlink -f "${disk}")
-    details="$(cat /sys/block/$(basename "${disk}")/device/model)"
-    echo "${disk} - ${details}"
-  done
-  ;;
-*)
-  echo "Unsupported platform $(uname)"
-  exit 1
-  ;;
-esac
diff --git a/devshell/log b/devshell/log
deleted file mode 100755
index 1d8c0ff..0000000
--- a/devshell/log
+++ /dev/null
@@ -1,23 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### listen for kernel logs.
-
-## This command delegates to the Zircon `loglistener` binary.
-## This will listen to the device specified with `fx set-device`; otherwise
-## one of the devices on the link-local network.
-
-set -e
-set -o pipefail
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-if [[ $# -gt 0 && $1 = "--raw" ]]; then
-  shift
-  exec "${ZIRCON_TOOLS_DIR}/loglistener" "$(get-device-name)"
-else
-  "${ZIRCON_TOOLS_DIR}/loglistener" "$(get-device-name)" | fx-symbolize
-fi
diff --git a/devshell/make-fuchsia-vol b/devshell/make-fuchsia-vol
deleted file mode 100755
index 32aa908..0000000
--- a/devshell/make-fuchsia-vol
+++ /dev/null
@@ -1,11 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build a fuchsia persistent disk
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-"${FUCHSIA_BUILD_DIR}/tools/make-fuchsia-vol" "$@"
diff --git a/devshell/mkzedboot b/devshell/mkzedboot
deleted file mode 100755
index 2b73bc7..0000000
--- a/devshell/mkzedboot
+++ /dev/null
@@ -1,152 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### make a zedboot USB key
-
-## usage: fx mkzedboot [options] <usb device>
-##  -f            force writing to a non-usb target
-##  -i|--install  include "offline" install
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/image_build_vars.sh || exit $?
-
-if [[ "${FUCHSIA_ARCH}" != "x64" ]]; then
-  echo >&2 mkzedboot is not supported for ${FUCHSIA_ARCH}
-  exit 1
-fi
-
-force=false
-if [[ "$1" == "-f" ]]; then
-  shift
-  force=true
-fi
-
-include_install=false
-if [[ "$1" == "-i" ]] || [[ "$1" == "--install" ]]; then
-  shift
-  include_install=true
-fi
-
-is_usb() {
-  if ! ${force}; then
-    fx-command-run list-usb-disks | grep "$1"
-  fi
-}
-
-USB_DEVICE="$1"
-if [[ -z "${USB_DEVICE}" ]]; then
-  echo >&2 "device argument required"
-  echo "USB disks:"
-  fx-command-run list-usb-disks
-  exit 1
-fi
-if ! is_usb "${USB_DEVICE}"; then
-  echo >&2 "${USB_DEVICE} does not look like a USB device, use -f to force, or pick from below"
-  echo "USB disks:"
-  fx-command-run list-usb-disks
-  exit 1
-fi
-
-echo >&2 "Changing ownership of ${USB_DEVICE} to ${USER}"
-sudo chown "${USER}" "${USB_DEVICE}"
-
-echo >&2 "Opening device..."
-# We open the device and hold onto an fd for the duration of our modifications.
-# This prevents automounting solutions from observing a final close and
-# rescanning the partition table until we're all done making changes -
-# particularly important on macOS where users would otherwise receive
-# EAGAIN/EBUSY and so on.
-open_device() {
-  case "$(uname)" in
-  Darwin)
-    if ! diskutil quiet unmountDisk "${USB_DEVICE}"; then
-      echo >&2 "Failed to unmount ${USB_DEVICE}, cannot continue"
-      exit 1
-    fi
-    ;;
-  esac
-  exec 3>>"${USB_DEVICE}"
-}
-close_device() {
-  echo >&2 "Closing device."
-  exec 3>&-
-}
-open_device
-
-# Destroy any existing GPT/MBR on the device and re-create
-echo "Create new GPT partition table... "
-"${FUCHSIA_BUILD_DIR}/tools/cgpt" create "${USB_DEVICE}"
-"${FUCHSIA_BUILD_DIR}/tools/cgpt" boot -p "${USB_DEVICE}"
-echo "done"
-
-echo "Create new partitions... "
-# ESP needs to be a FAT compatible size
-esp_size=$(((63*1024*1024)/512))
-vboot_size=$(((64*1024*1024)/512))
-esp_offset=2048
-vboot_offset=$(($esp_size + $esp_offset))
-"${FUCHSIA_BUILD_DIR}/tools/cgpt" add -s "${esp_size}" -t efi -b "${esp_offset}" -l esp "${USB_DEVICE}"
-"${FUCHSIA_BUILD_DIR}/tools/cgpt" add -s "${vboot_size}" -t kernel -b "${vboot_offset}" -l zedboot "${USB_DEVICE}"
-
-# NOTE: Ok, so here goes some stuff. I could have written a much smarter "dd"
-# (a thing that can operate on block-boundaries for seek and copy, but that
-# doesn't do operations ONE BLOCK AT A TIME because it's 2018 yo), or I could
-# do what follows. Before this change, adding the install image to a disk via
-# DD would take 20 minutes. That's just absurd.
-# The stuff:
-# Align the install_offset to a 4mb boundary.
-# Pad the partition size to a 4mb boundary.
-# Set the dd block size to 4mb, even though it really isn't 4mb.
-# Seek offset*lba/4mb
-# Write with osync
-
-if $include_install; then
-  if [[ ! -f "${FUCHSIA_BUILD_DIR}/${IMAGE_INSTALLER_RAW}" ]]; then
-    echo >&2 "Install image not found at ${FUCHSIA_BUILD_DIR}/${IMAGE_INSTALLER_RAW}; did you build it?"
-    exit 1
-  fi
-
-  install_image_size=$(wc -c "${FUCHSIA_BUILD_DIR}/${IMAGE_INSTALLER_RAW}" | awk '{print $1}')
-  # Add some slack, like the build does, as the file size doesn't represent the
-  # volume size and there's no host tool that presently will print the
-  # superblock volume size data.
-  install_image_size=$((($install_image_size * 14) / 10))
-  # It begins. Pad the image size to a 4mb boundary above its size:
-  install_size=$((($install_image_size + 4194303) / 4194304))
-  # We need to specify the install size in 512byte lba's:
-  install_size=$(($install_size * 8192))
-
-  install_min_offset=$(($esp_size + $esp_offset + $vboot_size))
-  # Align the partition offset to a 4mb "block size"
-  install_offset=$(( (($install_min_offset * 512) + 4194303) / 4194304))
-  # The lba offset of that is:
-  install_lba_offset=$(($install_offset * 8192))
-  "${FUCHSIA_BUILD_DIR}/tools/cgpt" add -s "${install_size}" -t "48435546-4953-2041-494E-5354414C4C52" -b "${install_lba_offset}" -l install "${USB_DEVICE}"
-fi
-"${FUCHSIA_BUILD_DIR}/tools/cgpt" add -i 2 -T 1 -S 1 -P 2 "${USB_DEVICE}"
-echo "done"
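The install-partition arithmetic above repeatedly rounds a byte count up to the next 4 MiB boundary and then expresses it in 512-byte LBAs (4 MiB / 512 B = 8192 LBAs). That step can be factored into one helper; a standalone sketch with an invented function name:

```shell
# Round a byte count up to a 4 MiB boundary, then convert to 512-byte LBAs.
# Mirrors: size=$((($bytes + 4194303) / 4194304)); size=$(($size * 8192))
align-4mib-lbas() {
  local -r bytes="$1"
  local -r units=$(( (bytes + 4194303) / 4194304 ))  # ceil to 4 MiB units
  echo $(( units * 8192 ))                           # units -> 512-byte LBAs
}

align-4mib-lbas 1          # a single byte still occupies one 4 MiB unit: 8192
align-4mib-lbas 4194304    # exactly one unit: 8192
align-4mib-lbas 4194305    # just over one unit: 16384
```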
-
-echo "Writing zedboot for EFI"
-dd if="${FUCHSIA_BUILD_DIR}/${IMAGE_ZEDBOOT_ESP}" of="${USB_DEVICE}" seek=${esp_offset}
-echo "Writing zedboot for Cros"
-dd if="${FUCHSIA_BUILD_DIR}/${IMAGE_ZEDBOOT_VBOOT}" of="${USB_DEVICE}" seek=${vboot_offset}
-if $include_install; then
-  echo "Writing install partition"
-  dd if="${FUCHSIA_BUILD_DIR}/${IMAGE_INSTALLER_RAW}" of="${USB_DEVICE}" seek=${install_offset} bs=4194304
-fi
-echo "done"
-
-close_device
-
-case "$(uname)" in
-  Linux)
-    eject "${USB_DEVICE}"
-    ;;
-  Darwin)
-    diskutil eject "${USB_DEVICE}"
-    ;;
-esac
-
diff --git a/devshell/net-run b/devshell/net-run
deleted file mode 100755
index 0abb941..0000000
--- a/devshell/net-run
+++ /dev/null
@@ -1,98 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run Fuchsia on QEMU in the background and run an SSH command once netstack is up
-
-## Run Fuchsia on QEMU (using the standard fx setup) in the background and ping
-## the target until netstack is up and running. If successful, issue the given SSH
-## command. Always kills QEMU before exiting. Redirects kernel logs to out/ and
-## by default runs:
-##
-## `fx run -- -N -u "${FUCHSIA_DIR}"/scripts/start-dhcp-server.sh`
-##
-## These options can be overridden with the FX_NET_RUN_OPTIONS environment
-## variable.
-##
-## usage: fx net-run [--target TARGET] SSH_COMMAND
-##
-##    --target      IP address to connect to on launching. Defaults to the
-##                  value of the FX_NET_RUN_TARGET environment variable.
-##
-##    SSH_COMMAND   Any argument will be passed directly to "fx ssh".
-##                  If not specified, it will open an interactive SSH session.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-# make sure to terminate qemu and any other children when this script exits!
-trap 'kill 0 &>/dev/null' INT EXIT TERM
-
-klog="$FUCHSIA_DIR/out/qemu-klog"
-
-# Defaults
-# If --run is set, it will overwrite this value
-run_options="${FX_NET_RUN_OPTIONS}"
-if [[ -z "${run_options}" ]]; then
-  # NOTE: this script could tell us about the hostname this qemu instance
-  # is assigned using dnsmasq's leasefile, rather than using --target/envvars
-  run_options="-N -u ${FUCHSIA_DIR}/scripts/start-dhcp-server.sh"
-fi
-target="${FX_NET_RUN_TARGET}"
-
-# Flag parsing
-while [[ "$1" =~ ^- ]]; do
-  case "$1" in
-    -h|--help)
-      fx-command-help
-      exit 0
-      ;;
-    --target)
-      shift
-      target="$1"
-      ;;
-    *)
-      break
-  esac
-  shift
-done
-
-# Error check
-if [[ -z "${target}" ]]; then
-  echo -e "No target found. Use --target or export \$FX_NET_RUN_TARGET."
-  exit 1
-fi
-
-# Run fx run in background. If successful, it will be sent to background.
-echo "Running sudo now in case the DHCP server asks for it..."
-sudo echo "sudo successful."
-
-echo "Using \"${run_options}\" to boot Fuchsia."
-echo
-echo "To see live kernel logs, run \`tail -f $klog\`."
-echo
-fx-command-run "run" ${run_options} &> "$klog" &
-
-ping_count=1
-ping_max=120
-ping_wait=1
-max_wait=$((ping_wait * ping_max))
-
-echo "Waiting for device (started at $(date +%X), timeout in ${max_wait}s)."
-while ((ping_count <= ping_max)); do
-  # check for network first, then for ssh-ability
-  if ping -c1 -W1 "$target" &>/dev/null
-    then
-    if fx-command-run ssh "$target" echo &>/dev/null
-    then
-      break
-    fi
-  fi
-
-  ping_count=$((ping_count + 1))
-  sleep $ping_wait
-done
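The wait loop above (ping, then probe SSH, up to a fixed attempt budget) is an instance of a generic retry-until-success pattern. A reusable sketch; the function name and parameters are invented for illustration:

```shell
# Retry a probe command until it succeeds or the attempt budget runs out.
# Usage: wait-until MAX_ATTEMPTS SLEEP_SECONDS COMMAND [ARGS...]
wait-until() {
  local -r max="$1" wait_s="$2"
  shift 2
  local count=1
  while ((count <= max)); do
    if "$@"; then
      return 0          # probe succeeded
    fi
    count=$((count + 1))
    sleep "$wait_s"
  done
  return 1              # budget exhausted
}

wait-until 3 0 true && echo "up"
wait-until 2 0 false || echo "timed out"
```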
-
-echo "Running \`$*\`:"
-echo
-fx-command-run "ssh" "${target}" "$@"
diff --git a/devshell/netaddr b/devshell/netaddr
deleted file mode 100755
index d58dd8b..0000000
--- a/devshell/netaddr
+++ /dev/null
@@ -1,13 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### get the address of a running fuchsia system
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-"${ZIRCON_TOOLS_DIR}/netaddr" --nowait "$@"
diff --git a/devshell/netboot b/devshell/netboot
deleted file mode 100755
index 03fdeb5..0000000
--- a/devshell/netboot
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run bootserver for netbooting
-
-## usage: fx netboot [extra bootserver arguments]
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-name_args=()
-name="$(get-device-name)"
-if [[ -n "$name" ]]; then
-  name_args+=("-n" "${name}")
-fi
-
-exec "${FUCHSIA_BUILD_DIR}/netboot.sh" "${name_args[@]}" "$@"
diff --git a/devshell/netls b/devshell/netls
deleted file mode 100755
index 27ada98..0000000
--- a/devshell/netls
+++ /dev/null
@@ -1,13 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### list running fuchsia systems on the local network
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-"${ZIRCON_TOOLS_DIR}/netls" "$@"
diff --git a/devshell/old-symbolize b/devshell/old-symbolize
deleted file mode 100755
index 33bbf78..0000000
--- a/devshell/old-symbolize
+++ /dev/null
@@ -1,13 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### symbolize call stacks provided as input
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-# TODO(jeffbrown): Fix symbolize to support arch other than x64
-exec "${FUCHSIA_DIR}/zircon/scripts/old-symbolize" \
-      --build-dir "${FUCHSIA_BUILD_DIR}" "$@"
diff --git a/devshell/ota b/devshell/ota
deleted file mode 100755
index a4e3a96..0000000
--- a/devshell/ota
+++ /dev/null
@@ -1,40 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### do a system OTA
-
-## usage: fx ota [-d|--device <device_address>] [-h|--help]
-##
-## Ask the target to do an OTA. The target will use any update server available
-## to it to do the update. This requires that the target have an update server
-## available. The 'serve' command is typically used to make your
-## development host available to the target as an update server.
-##
-## Arguments:
-##   -h|--help    Print out this message.
-##   -d|--device  Fuchsia link-local name of the device. If not
-##                specified, will connect to the only available
-##                device on the link-local network.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function main {
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  if [[ -z "$(pgrep -f "amber-files/repository")" ]]; then
-    echo "WARNING: It looks like serve-updates is not running."
-    echo "WARNING: You probably need to start \"fx serve\""
-    return 1
-  fi
-
-  fx-command-run shell "$@" amber_ctl system_update
-  local r=$?
-  echo "Check the target's log for update progress"
-  return $r
-}
-
-main "$@"
diff --git a/devshell/pave b/devshell/pave
deleted file mode 100755
index 4a19a38..0000000
--- a/devshell/pave
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### run bootserver for paving
-
-## usage: fx pave [extra bootserver arguments]
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-name_args=()
-name="$(get-device-name)"
-if [[ -n "$name" ]]; then
-  name_args+=("-n" "${name}")
-fi
-
-exec "${FUCHSIA_BUILD_DIR}/pave.sh" "${name_args[@]}" --authorized-keys "${FUCHSIA_DIR}/.ssh/authorized_keys" "$@"
diff --git a/devshell/pending-commits b/devshell/pending-commits
deleted file mode 100755
index 441a63e..0000000
--- a/devshell/pending-commits
+++ /dev/null
@@ -1,170 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### view commits not yet published to global integration
-
-import argparse
-from argparse import RawTextHelpFormatter
-import base64
-from datetime import datetime, timedelta, tzinfo
-import json
-import sys
-import urllib2
-import xml.etree.ElementTree as xml
-
-
-PETALS = [
-    'topaz',
-    'peridot',
-    'garnet',
-    'zircon',
-    'build',
-    'buildtools',
-    'scripts',
-]
-
-
-# Authors whose commits are not displayed.
-IGNORED_AUTHORS = [
-    'skia-fuchsia-autoroll@skia-buildbots.google.com.iam.gserviceaccount.com',
-    'third-party-roller',
-    'topaz-roller',
-    'peridot-roller',
-    'garnet-roller',
-    'zircon-roller',
-]
-
-
-def http_get(url):
-    """Fetches the content at a given URL."""
-    try:
-        target = urllib2.urlopen(url)
-        return target.read()
-    finally:
-        if target:
-            target.close()
-
-
-def get_published_commit_for(petal):
-    """Returns the pinned revision of a petal in global integration."""
-    url = ('https://fuchsia.googlesource.com/integration/+/master/%s/minimal?format=TEXT'
-           % petal)
-    content = http_get(url)
-    content = base64.b64decode(content)
-    manifest = xml.fromstring(content)
-    nodes = manifest.findall('./projects/project[@name="%s"]' % petal)
-    return (petal, nodes[0].get('revision'))
-
-
-def get_published_commits():
-    """Returns the published revision of all the petals."""
-    return [get_published_commit_for(petal) for petal in PETALS]
-
-
-def get_commits(petal, revision):
-    """Returns the commits in the given petal up to a given commit."""
-    url = 'https://fuchsia.googlesource.com/%s/+log/master?format=JSON' % petal
-    def get_more(result, start=None):
-        get_url = url
-        if start:
-            get_url = '%s&s=%s' % (url, start)
-        content = http_get(get_url)
-        # Remove the anti-XSSI header.
-        content = content[5:]
-        data = json.loads(content)
-        for commit in data['log']:
-            if commit['commit'] == revision:
-                return
-            result.append(commit)
-        get_more(result, start=data['next'])
-    result = []
-    get_more(result)
-    return result
-
-
-def filter_commit(commit):
-    """Returns True if a commit should be listed."""
-    return commit['author']['name'] not in IGNORED_AUTHORS
-
-
-class MyTimezone(tzinfo):
-    """Simple timezone implementation, since for some reason Python 2.7 doesn't
-       provide one.
-       """
-
-    def __init__(self, data=None):
-        self.data = data if data else '+0000'
-
-    def utcoffset(self, dt):
-        hours = int(self.data[1:3])
-        minutes = int(self.data[3:5])
-        delta = timedelta(hours=hours, minutes=minutes)
-        if self.data[0] == '-':
-            delta = -delta
-        return delta
-
-    def tzname(self, dt):
-        return 'Bogus'
-
-    def dst(self, dt):
-        return timedelta(0)
-
-
-def get_time_since(timestamp):
-    """Returns a string describing the amount of time elapsed since the given
-       timestamp.
-       Timestamp format: Sat Feb 10 03:17:06 2018 +0000
-       """
-    timestamp_no_tz = timestamp[:-6]
-    date_no_tz = datetime.strptime(timestamp_no_tz, '%a %b %d %H:%M:%S %Y')
-    date = date_no_tz.replace(tzinfo=MyTimezone(timestamp[-5:]))
-    now = datetime.utcnow().replace(tzinfo=MyTimezone())
-    delta = now - date
-    if delta.days >= 1:
-        return '>1d'
-    hours = delta.seconds / 3600
-    if hours >= 1:
-        return '%sh' % hours
-    minutes = (delta.seconds % 3600) / 60
-    return '%sm' % minutes
-
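The bucketing in `get_time_since` (more than a day, whole hours, else minutes) can be restated over an elapsed-seconds value. A hedged sketch in shell terms; the Python above works on datetimes, and this helper name is invented:

```shell
# Format an elapsed duration, mirroring get_time_since's buckets:
# >= 1 day -> ">1d", >= 1 hour -> "<N>h", otherwise -> "<N>m".
time-since() {
  local -r seconds="$1"
  if (( seconds >= 86400 )); then
    echo ">1d"
  elif (( seconds >= 3600 )); then
    echo "$((seconds / 3600))h"
  else
    echo "$(((seconds % 3600) / 60))m"
  fi
}

time-since 90000   # >1d
time-since 7200    # 2h
time-since 300     # 5m
```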
-
-def print_commits(petal, commits, print_all=False):
-    """Prints the given commits in a user-pleasing format."""
-    commit_filter = (lambda c: c) if print_all else filter_commit
-    commits = filter(commit_filter, commits)
-    if commits:
-        timestamp = commits[-1]['committer']['time']
-        elapsed_time = get_time_since(timestamp)
-    else:
-        elapsed_time = ''
-    print('--------------')
-    print('| %s | %s' % ('{:^10}'.format(petal), elapsed_time))
-    print('--------------')
-    for commit in commits:
-        print('%s | %s | %s' % (commit['commit'][:7],
-                                commit['author']['name'][:15].ljust(15),
-                                commit['message'].splitlines()[0]))
-    if not commits:
-        print('None')
-
-
-def main():
-    parser = argparse.ArgumentParser(formatter_class=RawTextHelpFormatter,
-        description="""Displays the commits not yet published to global integration.""")
-    parser.add_argument('--all',
-                        help='Whether to print all commits, including rollers',
-                        action='store_true')
-    args = parser.parse_args()
-
-    for (petal, published_commit) in get_published_commits():
-        commits = get_commits(petal, published_commit)
-        print_commits(petal, commits, print_all=args.all)
-
-    return 0
-
-
-if __name__ == "__main__":
-    sys.exit(main())
diff --git a/devshell/push-package b/devshell/push-package
deleted file mode 100755
index 76349b8..0000000
--- a/devshell/push-package
+++ /dev/null
@@ -1,72 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### push packages to a device
-
-## usage: fx push-package [pkg1 pkg2 ...]
-##
-## Push packages to a device. One or more package names may be supplied. If no
-## package name is supplied, all packages in the build output will be pushed. The
-## target must be reachable from the host and must already know how to reach
-## the host package server (e.g. fx serve must be running).
-##
-## See https://fuchsia.googlesource.com/docs/+/master/development/workflows/package_update.md
-## for more information about using this workflow.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help push-package
-}
-
-function main {
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  # if the second arg is set, but 0-length, publish nothing
-  if [[ $# -eq 1 ]] && [[ -z "${1}" ]]; then
-    exit 0
-  fi
-
-  local all_pkgs=($("${FUCHSIA_DIR}/scripts/list-available-packages.py" --build-dir "${FUCHSIA_BUILD_DIR}"))
-
-  # pkgs is the last argument
-  local pkgs=()
-
-  if [[ $# -eq 0 ]]; then
-    pkgs=("${all_pkgs[@]}")
-  else
-    for pkg in "$@"; do
-      for p in "${all_pkgs[@]}"; do
-        if [[ "$p" == "$pkg" ]]; then
-          pkgs+=("$p")
-          continue 2
-        fi
-      done
-      # TODO(BLD-338): Remove "package_name" from error message
-      echo >&2 "Package $pkg will not be pushed. It is not part of the current build, is not a package, overrides 'package_name=\"...\"' or has 'deprecated_system_image=true'."
-      exit 1
-    done
-  fi
-
-  # The target doesn't support expansions, and a local expansion makes a command
-  # too long to spawn, so sending a loop makes things work ok.
-  local cmd="
-    code=0;
-    for c in ${pkgs[@]}; do
-      amber_ctl get_up -n \"\${c}\" -v 0 || code=\$((\$code + \$?));
-      manifest_path=\"/pkgfs/packages/\${c}/0/meta/module.json\";
-      if [ -s \"\$manifest_path\" ]; then
-        run fuchsia-pkg://fuchsia.com/module_package_indexer#meta/module_package_indexer.cmx \"\${c}\" 0;
-      fi
-    done;
-    exit \$code
-  "
-
-  fx-command-run shell "${cmd}"
-}
-
-main "$@"
diff --git a/devshell/reboot b/devshell/reboot
deleted file mode 100755
index b4d47f6..0000000
--- a/devshell/reboot
+++ /dev/null
@@ -1,64 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### reboot a target fuchsia system
-
-## usage: fx reboot [-r|--recovery] [-b|--bootloader]
-##   -r|--recovery   Reboot into recovery image
-##   -b|--bootloader Reboot into bootloader
-##
-## This will reboot the device specified with `fx set-device`; otherwise
-## one of the devices on the link-local network.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage() {
-    fx-command-help
-}
-
-reboot_type="reboot"
-while [[ "$1" =~ ^- ]]; do
-  case "$1" in
-  -h|--help)
-    usage
-    exit 0
-    ;;
-  -r|--recovery)
-    reboot_type="reboot-recovery"
-    ;;
-  -b|--bootloader)
-    reboot_type="reboot-bootloader"
-    ;;
-  *)
-    break
-  esac
-  shift
-done
-
-if [[ $# -gt 1 ]]; then
-  usage
-  exit 1
-fi
-
-# If the OS X firewall is enabled, add timeout so users can click the network connection warning
-# dialog.
-if [[ "$(uname -s)" = "Darwin" ]] &&
-   [[ "$("/usr/libexec/ApplicationFirewall/socketfilterfw" "--getglobalstate" | grep -i -c "enabled" || echo -n 0)" = "1" ]]; then
-  timeout_flag="--timeout=3000"
-else
-  timeout_flag="--nowait"
-fi
-
-device=$(get-device-name)
-if [[ -z ${device} ]] &&
-   [[ $("${ZIRCON_TOOLS_DIR}/netls" "${timeout_flag}" | grep -i -c "device ") -gt 1 ]]; then
-  echo "Rebooting some device... Consider using \`fx set-device\` if you have multiple devices."
-else
-  echo "Rebooting ${device}..."
-fi
-"${ZIRCON_TOOLS_DIR}/netruncmd" "${timeout_flag}" "${device}" "dm ${reboot_type}"
diff --git a/devshell/run b/devshell/run
deleted file mode 100755
index 7dd07fd..0000000
--- a/devshell/run
+++ /dev/null
@@ -1,42 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### start fuchsia in qemu with a FVM disk
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/image_build_vars.sh || exit $?
-source "${FUCHSIA_DIR}/buildtools/vars.sh"
-
-qemu_dir="${BUILDTOOLS_QEMU_DIR}/bin"
-
-# Construction of a qcow image prevents qemu from writing back to the
-# build-produced image file, which could cause timestamp issues with that file.
-# Construction of the new ZBI adds //.ssh/authorized_keys for SSH access.
-imgdir="$(mktemp -d)"
-if [[ ! -d "${imgdir}" ]]; then
-  echo >&2 "Failed to create temporary directory"
-  exit 1
-fi
-qimg="${imgdir}/fuchsia.qcow2"
-kernelzbi="${imgdir}/fuchsia-ssh.zbi"
-trap 'rm "${qimg}" "${kernelzbi}" && rmdir "${imgdir}"' EXIT
-
-"${qemu_dir}/qemu-img" create -f qcow2 -b "${FUCHSIA_BUILD_DIR}/${IMAGE_FVM_RAW}" \
-  "${qimg}"
-
-"${ZIRCON_TOOLS_DIR}/zbi" -o "${kernelzbi}" "${FUCHSIA_BUILD_DIR}/${IMAGE_ZIRCONA_ZBI}" \
-  --entry "data/ssh/authorized_keys=${FUCHSIA_DIR}/.ssh/authorized_keys"
-
-"${FUCHSIA_DIR}/zircon/scripts/run-zircon" \
-  -a "${FUCHSIA_ARCH}" \
-  -q "${qemu_dir}" \
-  -G 3 \
-  -t "${FUCHSIA_BUILD_DIR}/${IMAGE_QEMU_KERNEL_RAW}" \
-  -z "${kernelzbi}" \
-  -d \
-  -D "${qimg}" \
-  --diskfmt="qcow2" \
-  "$@"
diff --git a/devshell/run-host-tests b/devshell/run-host-tests
deleted file mode 100755
index 3ffd338..0000000
--- a/devshell/run-host-tests
+++ /dev/null
@@ -1,68 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build and run tests on host
-
-##
-## Usage: fx run-host-tests [-z] [host test names ...] [-- [test runner flags]]
-## Builds and runs the given host tests.
-## With "-z" passed, only Zircon tests will be run;
-## without it, only tests from Garnet and above are run.
-## If no host test names are provided, then all available
-## host tests will be run.
-## Test runner flags can typically be --gtest_filter=TestSuiteName.TestName
-## to restrict to a particular test or set of tests.
-##
-
-set -o errexit
-set -o pipefail
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-ARGS=()
-TEST_NAMES=()
-function main {
-  while [[ -n "$1" ]]; do
-    case "$1" in
-      -v) ARGS+=("-v");;
-      -z) ZIRCON=1 ;;
-      # break at bare double dash
-      # allow passing args to runtests
-      --) shift
-          break
-          ;;
-      *) TEST_NAMES+=("$1");;
-    esac
-    shift
-  done
-
-  if [[ $ZIRCON -eq 1 ]]; then
-    host_test_dir="${ZIRCON_BUILD_DIR}/host_tests"
-    fx-command-run build-zircon "-v"
-  else
-    host_test_dir="${FUCHSIA_BUILD_DIR}/host_tests"
-    # If test names are supplied, rebuild the associated tests; else rebuild
-    # everything under the GN 'host_tests' label.
-    if [[ ${#TEST_NAMES[@]} -gt 0 ]]; then
-      build_targets=(${TEST_NAMES[@]/#/host_tests/})
-    else
-      build_targets=("./build/gn:host_tests")
-    fi
-    fx-command-run build "${build_targets[@]}"
-  fi
-
-  runtests_cmd=("${ZIRCON_TOOLS_DIR}/runtests" "${ARGS[@]}")
-
-  if [[ ${#TEST_NAMES[@]} -gt 0 ]]; then
-    # Comma-separated list of host test names to filter by.
-    IFS="," runtests_cmd+=("-t" "${TEST_NAMES[*]}")
-  fi
-
-  # remaining arguments after -- are passed to test runner
-  "${runtests_cmd[@]}" "${host_test_dir}" -- "$@"
-}
-
-main "$@"
diff --git a/devshell/run-image-test b/devshell/run-image-test
deleted file mode 100755
index d80cedf..0000000
--- a/devshell/run-image-test
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build a test that is part of the system image, copy it to the target, and run it
-
-## usage: fx run-image-test TARGET [ARGS ...]
-## Builds the specified target (e.g., fxl_unittests), copies it to the
-## target, and executes it.
-## Before using this please consider moving your test to a package and then use fx run-test.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help run-image-test
-}
-
-function main {
-  if [[ $# -eq 0 ]]; then
-    usage
-    return 1
-  fi
-
-  target="$1"
-  fx-command-run build "${target}"
-  fx-command-run cp "${FUCHSIA_BUILD_DIR}/${target}" "/tmp/${target}"
-  shift
-  fx-command-run shell "/tmp/${target}" "$@"
-}
-
-main "$@"
diff --git a/devshell/run-netboot b/devshell/run-netboot
deleted file mode 100755
index 5b7035d..0000000
--- a/devshell/run-netboot
+++ /dev/null
@@ -1,18 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### start fuchsia in qemu via netboot
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/image_build_vars.sh || exit $?
-source "${FUCHSIA_DIR}/buildtools/vars.sh"
-
-qemu_dir="${BUILDTOOLS_QEMU_DIR}/bin"
-
-"${FUCHSIA_DIR}/zircon/scripts/run-zircon" \
-  -a "${FUCHSIA_ARCH}" \
-  -q "${qemu_dir}" \
-  -t "${FUCHSIA_BUILD_DIR}/${IMAGE_QEMU_KERNEL_RAW}" \
-  -z "${FUCHSIA_BUILD_DIR}/${IMAGE_NETBOOT_ZBI}" \
-  "$@"
diff --git a/devshell/run-test b/devshell/run-test
deleted file mode 100755
index 6f6a146..0000000
--- a/devshell/run-test
+++ /dev/null
@@ -1,92 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build a test package and run on target.
-### PKG_TARGET is fully qualified or under fuchsia-pkg://fuchsia.com/
-
-## usage: fx run-test [-t|--test <test_name>] PKG_TARGET -- [-args -to -test]
-## Builds the specified test package (e.g., appmgr_integration_tests), copies it to the
-## target, and executes it.
-##
-## If using this command, please run 'fx build' again before paving your device,
-## because 'fx build updates', which this script uses, does not build images and
-## can therefore leave the paver in a weird state.
-## Arguments:
-##   -t|--test    Test to run. If not specified, it will run all tests in PKG_TARGET.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help run-test
-}
-
-
-function main {
-  all_args="$@"
-  test_args="${all_args#* -- }"
-
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  test=""
-  target=""
-
-  while (($#)); do
-    case $1 in
-      -t|--test)
-        shift  # name
-        if [[ -z "$1" ]]; then
-          echo "Missing parameter: <test_name>" >&2
-          usage
-          exit 1
-        fi
-        test="$1"
-        ;;
-      --)
-        break
-        ;;
-      *)
-        if [[ "$target" == "" ]]; then
-          target="$1"
-        else
-          usage
-          exit 1
-        fi
-        ;;
-    esac
-    shift  # value
-  done
-
-  if [[ $target == "" ]]; then
-    usage
-    return 1
-  fi
-
-  if [[ -z "$(pgrep -f "amber-files/repository")" ]]; then
-    echo "WARNING: It looks like the update server is not running."
-    echo "WARNING: You probably need to start \"fx serve-updates\""
-    exit 1
-  fi
-
-  echo -e "Building ..."
-  # build all packages as there is no way to only build one and push it to
-  # update repository.
-  fx-command-run build updates
-  echo -e "\nPush package to device"
-  fx-command-run push-package "${target}"
-
-  if [[ "${test}" == "" ]]; then
-    echo -e "\nRun all tests in ${target}"
-    fx-command-run shell runtests "pkgfs/packages/${target}/0/test" -- "$test_args"
-  else
-    fx-command-run shell runtests -t ${test} "pkgfs/packages/${target}/0/test" -- "$test_args"
-  fi
-}
-
-
-main "$@"
diff --git a/devshell/run-test-component b/devshell/run-test-component
deleted file mode 100755
index 2748054..0000000
--- a/devshell/run-test-component
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### build a test package and run on target.
-### PKG_TARGET is fully qualified or under fuchsia-pkg://fuchsia.com/
-
-## Deprecated, please use fx run-test
-##
-## usage: fx run-test-component [-t|--test <test_name>] [-d|--device <device>] PKG_TARGET
-## Builds the specified test package (e.g., appmgr_integration_tests), copies it to the
-## target, and executes it.
-##
-## If using this command, please run 'fx build' again before paving your device,
-## because 'fx build updates', which this script uses, does not build images and
-## can therefore leave the paver in a weird state.
-## Arguments:
-##   -t|--test    Test to run. If not specified, it will run all tests in PKG_TARGET.
-##   -d|--device  Target device.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help run-test-component
-}
-
-echo $'This is deprecated. Please use fx run-test.\n'
-fx-command-run run-test "$@"
diff --git a/devshell/rustdoc b/devshell/rustdoc
deleted file mode 100755
index 4078371..0000000
--- a/devshell/rustdoc
+++ /dev/null
@@ -1,137 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### generates documentation for a Rust target
-
-import argparse
-import os
-import platform
-import subprocess
-import sys
-
-import lib.rust
-from lib.rust import ROOT_PATH
-
-def manifest_path_from_path_or_gn_target(arg):
-    if arg.endswith("Cargo.toml"):
-        return os.path.abspath(arg)
-    else:
-        gn_target = lib.rust.GnTarget(arg)
-        return gn_target.manifest_path(lib.rust.find_out_dir())
-
-def main():
-    parser = argparse.ArgumentParser("Generates documentation for a Rust target")
-    parser.add_argument("manifest_path",
-                        metavar="gn_target",
-                        type=manifest_path_from_path_or_gn_target,
-                        help="GN target to document. \
-                              Use '.[:target]' to discover the cargo target \
-                              for the current directory or use the \
-                              absolute path to the target \
-                              (relative to $FUCHSIA_DIR). \
-                              For example: //garnet/bin/foo/bar:baz. \
-                              Alternatively, this can be a path to a \
-                              Cargo.toml file of a package for which to \
-                              generate docs.")
-    parser.add_argument("--target",
-                        help="Target triple for which this crate is being compiled",
-                        default="x86_64-fuchsia")
-    parser.add_argument("--out-dir",
-                        help="Path to the Fuchsia output directory",
-                        required=False)
-    parser.add_argument("--no-deps",
-                        action="store_true",
-                        help="Disable building of docs for dependencies")
-    parser.add_argument("--doc-private",
-                        action="store_true",
-                        help="Document private items")
-    parser.add_argument("--open",
-                        action="store_true",
-                        help="Open the generated documentation")
-
-    args = parser.parse_args()
-
-    if args.out_dir:
-        out_dir = args.out_dir
-    else:
-        out_dir = lib.rust.find_out_dir()
-
-    env = os.environ.copy()
-
-    host_platform = "%s-%s" % (
-        platform.system().lower().replace("darwin", "mac"),
-        {
-            "x86_64": "x64",
-            "aarch64": "arm64",
-        }[platform.machine()],
-    )
-
-    target_cpu = {
-        "x86_64-fuchsia": "x64",
-        "aarch64-fuchsia": "aarch64",
-        "x86_64-unknown-linux-gnu": "x64",
-        "aarch64-unknown-linux-gnu": "aarch64",
-    }[args.target]
-
-    # run cargo from third_party/rust-crates/rustc_deps which has an appropriate .cargo/config
-    cwd = os.path.join(ROOT_PATH, "third_party", "rust-crates", "rustc_deps")
-    buildtools_dir = os.path.join(ROOT_PATH, "buildtools", host_platform)
-    clang_prefix = os.path.join(buildtools_dir, "clang", "bin")
-    cmake_dir = os.path.join(buildtools_dir, "cmake", "bin")
-    cargo = os.path.join(buildtools_dir, "rust", "bin", "cargo")
-    rustc = os.path.join(buildtools_dir, "rust", "bin", "rustc")
-    rustdoc = os.path.join(ROOT_PATH, "scripts", "rust", "rustdoc_no_ld_library_path.sh")
-
-    shared_libs_root = os.path.join(ROOT_PATH, out_dir)
-    sysroot = os.path.join(ROOT_PATH, out_dir, "sdk", "exported", "zircon_sysroot", \
-            "arch", target_cpu, "sysroot")
-
-    clang_c_compiler = os.path.join(clang_prefix, "clang")
-
-    env["CARGO_TARGET_LINKER"] = clang_c_compiler
-    env["CARGO_TARGET_X86_64_APPLE_DARWIN_LINKER"] = clang_c_compiler
-    env["CARGO_TARGET_X86_64_UNKNOWN_LINUX_GNU_LINKER"] = clang_c_compiler
-    env["CARGO_TARGET_%s_LINKER" % args.target.replace("-", "_").upper()] = clang_c_compiler
-    if "fuchsia" in args.target:
-        env["CARGO_TARGET_%s_RUSTFLAGS" % args.target.replace("-", "_").upper()] = (
-            "-Clink-arg=--target=" + args.target +
-            " -Clink-arg=--sysroot=" + sysroot +
-            " -Lnative=" + shared_libs_root
-        )
-    else:
-        env["CARGO_TARGET_%s_RUSTFLAGS" % args.target.replace("-", "_").upper()] = (
-            "-Clink-arg=--target=" + args.target
-        )
-    env["RUSTC"] = rustc
-    env["RUSTDOC"] = rustdoc
-    env["RUST_BACKTRACE"] = "1"
-    env["CC"] = clang_c_compiler
-    if "fuchsia" in args.target:
-        env["CFLAGS"] = "--sysroot=%s -L %s" % (sysroot, shared_libs_root)
-    env["CXX"] = os.path.join(clang_prefix, "clang++")
-    env["AR"] = os.path.join(clang_prefix, "llvm-ar")
-    env["RANLIB"] = os.path.join(clang_prefix, "llvm-ranlib")
-    env["PATH"] = "%s:%s" % (env["PATH"], cmake_dir)
-
-    call_args = [
-        cargo,
-        "doc",
-        "--manifest-path=%s" % args.manifest_path,
-        "--target=%s" % args.target,
-    ]
-
-    if args.no_deps:
-        call_args.append("--no-deps")
-
-    if args.open:
-        call_args.append("--open")
-
-    if args.doc_private:
-        call_args.append("--document-private-items")
-
-    return subprocess.call(call_args, env=env, cwd=cwd)
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/devshell/rustfmt b/devshell/rustfmt
deleted file mode 100755
index cd94416..0000000
--- a/devshell/rustfmt
+++ /dev/null
@@ -1,89 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### runs rustfmt on a Rust target
-
-import argparse
-import os
-import platform
-import subprocess
-import sys
-
-import lib.rust
-from lib.rust import ROOT_PATH, CONFIG_PATH
-
-sys.path += [os.path.join(ROOT_PATH, "third_party", "pytoml")]
-import pytoml as toml
-
-def main():
-    parser = argparse.ArgumentParser("Format a rust target")
-    parser.add_argument("gn_target",
-                        type=lib.rust.GnTarget,
-                        help="GN target to format. \
-                              Use '.[:target]' to discover the cargo target \
-                              for the current directory or use the \
-                              absolute path to the target \
-                              (relative to $FUCHSIA_DIR). \
-                              For example: //garnet/bin/foo/bar:baz.")
-    parser.add_argument("-v", "--verbose",
-                        action='store_true',
-                        help="Show verbose output")
-    parser.add_argument("-s", "--print-sources",
-                        action='store_true',
-                        help="Only print sources; do not format")
-
-    args = parser.parse_args()
-
-    out_dir = lib.rust.find_out_dir()
-
-    if args.print_sources and not os.path.exists(args.gn_target.manifest_path(out_dir)):
-        return 0
-
-    with open(args.gn_target.manifest_path(out_dir), "r") as fin:
-        cargo_toml = toml.load(fin)
-
-    main_file = None
-    if 'bin' in cargo_toml:
-        bins = cargo_toml['bin']
-        if len(bins) != 1:
-            print("Expected a single bin target for {gn_target}, found {n}".format(
-                    gn_target = args.gn_target,
-                    n = len(bins)))
-            return 1
-        main_file = bins[0]['path']
-    elif 'lib' in cargo_toml:
-        main_file = cargo_toml['lib']['path']
-
-    if args.print_sources:
-        if main_file:
-            print(main_file)
-        return 0
-
-    if not main_file or not os.path.exists(main_file):
-        print("No source root (typically lib.rs or main.rs) found for this gn target")
-        return 1
-
-    host_platform = "%s-%s" % (
-        platform.system().lower().replace("darwin", "mac"),
-        {
-            "x86_64": "x64",
-            "aarch64": "arm64",
-        }[platform.machine()],
-    )
-    buildtools_dir = os.path.join(ROOT_PATH, "buildtools", host_platform)
-    rustfmt = os.path.join(buildtools_dir, "rust", "bin", "rustfmt")
-
-    call_args = [
-        rustfmt,
-        main_file,
-    ]
-
-    if args.verbose:
-        call_args.append("-v")
-
-    return subprocess.call(call_args)
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/devshell/save-package-stats b/devshell/save-package-stats
deleted file mode 100755
index 5b5417f..0000000
--- a/devshell/save-package-stats
+++ /dev/null
@@ -1,77 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### take a snapshot of all built Fuchsia packages
-
-## usage: fx save-package-stats [--build] [[--name|-n NAME] | [OUTPUT_PATH]]
-##
-## Save a snapshot of metadata of all Fuchsia packages for later analysis.
-##   --build          Build the current package snapshot before saving it
-##   --name|-n NAME   Set the NAME of the package snapshot (Default: "system")
-##   OUTPUT_PATH      Write the snapshot at the specified OUTPUT_PATH (Default: $FUCHSIA_BUILD_DIR/snapshots/$NAME.snapshot)
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help save-package-stats
-}
-
-function main {
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  snapshot_dir="$FUCHSIA_BUILD_DIR/snapshots"
-
-  build=0
-  target_name=
-  output_path=
-  while [[ $# -ne 0 ]]; do
-    case "$1" in
-      --build)
-        build=1
-        ;;
-      -n|--name)
-        target_name="$2"
-        shift
-        ;;
-      *)
-        if [[ -z "${output_path}" ]]; then
-          output_path="$1"
-        else
-          echo >&2 "Multiple output paths specified"
-          usage
-          exit 1
-        fi
-    esac
-    shift
-  done
-
-  if [[ -n "${target_name}" && -n "${output_path}" ]]; then
-    echo >&2 "Output name and output path cannot both be specified"
-    usage
-    exit 1
-  elif [[ -z "${target_name}" ]]; then
-    target_name="system"
-  fi
-
-  if [[ -z "${output_path}" ]]; then
-    mkdir -p "$snapshot_dir" || exit 1
-    output_path="${snapshot_dir}/${target_name}.snapshot"
-  fi
-
-  if [[ "${build}" -ne 0 ]]; then
-    fx-command-run build system_snapshot || {
-      echo >&2 "Build of current package state failed, bailing out"
-      exit 1
-    }
-  fi
-
-  cp \
-    "${FUCHSIA_BUILD_DIR}/obj/build/images/system.snapshot" \
-    "${output_path}"
-}
-
-main "$@"
diff --git a/devshell/scp b/devshell/scp
deleted file mode 100755
index b86dbfb..0000000
--- a/devshell/scp
+++ /dev/null
@@ -1,27 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### invoke scp with the build ssh config
-
-## usage: fx scp <arguments to scp>
-##
-## This command invokes scp (SSH's file copy tool) with Fuchsia's SSH
-## configuration.  Run "scp -h" to see the options that scp accepts.
-##
-## Example usage:
-##
-##   fx scp "[$(fx netaddr --fuchsia)]:source_file" dest_file
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-case $1 in
-  -h|--help)
-  fx-command-help
-  exit 0
-  ;;
-esac
-
-SSH_AUTH_SOCK="" scp -F "${FUCHSIA_BUILD_DIR}/ssh-keys/ssh_config" "$@"
diff --git a/devshell/screenshot b/devshell/screenshot
deleted file mode 100755
index 87730b1..0000000
--- a/devshell/screenshot
+++ /dev/null
@@ -1,73 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### takes a screenshot and copies it to the host
-
-## usage: fx screenshot [--landscape] [--png] [--trim] [-o <screencap_file>]
-##
-## This command invokes Fuchsia's screencap tool to create a screenshot.
-## The result is written to screencap.ppm or screencap.png. To write to
-## another filename, use the -o parameter.
-## The --trim, --landscape and --png commands require ImageMagick to be
-## installed.  Unrecognized parameters will be passed to ssh.
-##
-## -o FILENAME     Write to the given filename instead of screencap.ppm
-## --png           Create a .png file instead of a .ppm file
-## --trim          Remove black borders
-## --landscape     Rotate image ninety degrees
-##
-## Example usage:
-##   fx screenshot --trim --png --landscape
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-OUT=screencap.ppm
-png_command=cat
-enable_trim=0
-enable_landscape=0
-while [[ $# -ne 0 ]]; do
-  case $1 in
-  -h|--help)
-    fx-command-help
-    exit 0
-    ;;
-  -o|--out)
-    shift
-    OUT=$1
-    ;;
-  --png)
-    OUT=${OUT%.*}.png
-    png_command=pnmtopng
-    ;;
-  --landscape)
-    enable_landscape=1
-    ;;
-  --trim)
-    enable_trim=1
-    ;;
-  *)
-    break
-    ;;
-  esac
-  shift
-done
-
-fx-command-run shell "$@" screencap | $png_command > "$OUT"
-mog_opts=
-if [[ $enable_trim -ne 0 ]]; then
-  mog_opts="$mog_opts -trim +repage"
-fi
-if [[ $enable_landscape -ne 0 ]]; then
-  mog_opts="$mog_opts -rotate 90"
-fi
-if [[ ! -z $mog_opts ]]; then
-  # Also add a black border so it's easier to see the image when
-  # pasted into the bug database.
-  mogrify -bordercolor black -border 3 $mog_opts $OUT
-fi
-
diff --git a/devshell/serve b/devshell/serve
deleted file mode 100755
index 7d32b87..0000000
--- a/devshell/serve
+++ /dev/null
@@ -1,62 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### start `pave` and `serve-updates` in a single command
-## usage: fx serve [-v] [-l host[:port]]
-##   -l host:port for "pm serve" to listen on
-##   -v enable more verbose output (must be first argument)
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-kill_child_processes() {
-  child_pids=$(jobs -p)
-  if [[ -n "${child_pids}" ]]; then
-    # Note: child_pids must be expanded to args here.
-    kill ${child_pids} 2> /dev/null
-    wait 2> /dev/null
-  fi
-}
-trap kill_child_processes EXIT
-
-serve_args=()
-
-fx-standard-switches "$@"
-set -- "${FX_ARGV[@]}"
-
-while (($#)); do
-  case "$1" in
-    -v|-vv|--verbose)
-      serve_args+=("$1")
-      ;;
-    -l)
-      serve_args+=("$1" "$2")
-      shift
-      ;;
-    *)
-      echo >&2 "Unknown argument: \"${1}\" ignored"
-      ;;
-  esac
-  shift
-done
-
-fx-command-exec pave &
-pave_pid=$!
-fx-command-exec serve-updates "${serve_args[@]}" &
-serve_pid=$!
-
-while true; do
-  sleep 1
-
-  # If any child exits, then exit the whole process, causing other children to
-  # be cleaned up by the exit trap.
-  for pid in "${pave_pid}" "${serve_pid}"; do
-    if ! kill -0 $pid 2> /dev/null; then
-      exit
-    fi
-  done
-done
-
-# See EXIT trap above for cleanup that occurs
diff --git a/devshell/serve-updates b/devshell/serve-updates
deleted file mode 100755
index b0f56ff..0000000
--- a/devshell/serve-updates
+++ /dev/null
@@ -1,129 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### start the update server and attach to a running fuchsia device
-## usage: fx serve-updates [-v] [-l host[:port]]
-##   -l host:port for "pm serve" to listen on
-##   -v verbose (do not suppress `pm serve` output)
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-function usage {
-  fx-command-help serve-updates
-}
-
-fx-standard-switches "$@"
-set -- "${FX_ARGV[@]}"
-
-serve_flags=()
-verbose=false
-very_verbose=false
-while (($#)); do
-  case "$1" in
-    -l)
-      shift
-      serve_flags+=(-l "$1")
-      ;;
-    -v|--verbose)
-      if $verbose; then
-        very_verbose=true
-      else
-        verbose=true
-      fi
-      ;;
-    -vv)
-      verbose=true
-      very_verbose=true
-      ;;
-    *)
-      echo "Unrecognized option: $1"
-      usage
-      exit 1
-      ;;
-  esac
-  shift
-done
-
-if [[ "${verbose}" != true ]]; then
-  serve_flags+=("-q")
-fi
-
-pm_srv_pid=
-cleanup() {
-  if [[ -n "${pm_srv_pid}" ]]; then
-    if kill -0 "${pm_srv_pid}" 2> /dev/null; then
-      kill -TERM "${pm_srv_pid}" 2> /dev/null
-      wait "${pm_srv_pid}" 2> /dev/null
-    fi
-  fi
-}
-trap cleanup EXIT
-
-log() {
-  # This format matches bootserver so that `fx serve` ui is easier to read.
-  echo "$(date '+%Y-%m-%d %H:%M:%S') [serve-updates] $@"
-}
-
-log_verbose() {
-  if [[ "$very_verbose" == true ]]; then
-    log "$@"
-  fi
-}
-
-if [[ -z "${pm_srv_pid}" ]]; then
-  "${FUCHSIA_BUILD_DIR}/host_x64/pm" serve -d "${FUCHSIA_BUILD_DIR}/amber-files/repository" "${serve_flags[@]}" &
-  pm_srv_pid=$!
-fi
-
-# Allow a little slack for pm serve to startup, that way the first kill -0 will catch a dead server
-sleep 0.1
-if ! kill -0 "${pm_srv_pid}" 2> /dev/null; then
-  log "Server died, exiting"
-  wait
-  exit $?
-fi
-
-log "Discovery..."
-
-# State is used to prevent too much output
-state="discover"
-while true; do
-  if ! kill -0 "${pm_srv_pid}" 2> /dev/null; then
-    log "Server died, exiting"
-    pm_srv_pid=
-    exit 1
-  fi
-  
-  fx-command-run shell -o ConnectionAttempts=1 -o ConnectTimeout=1 echo >/dev/null 2>&1
-  ping_result=$?
-
-  if [[ "$state" == "discover" && "$ping_result" == 0 ]]; then
-    log "Device up"
-    state="config"
-  fi
-
-  if [[ "$state" == "config" ]]; then
-    log "Registering devhost as update source"
-    if fx-command-run add-update-source; then
-      log "Ready to push packages!"
-      state="ready"
-    else
-      log "Device lost while configuring update source"
-      state="discover"
-    fi
-  fi
-
-  if [[ "$state" == "ready" ]]; then
-    if [[ "$ping_result" != 0 ]]; then
-      log "Device lost"
-      state="discover"
-    else
-      sleep 1
-    fi
-  fi
-done
-
-# See EXIT trap above for cleanup that occurs
diff --git a/devshell/set b/devshell/set
deleted file mode 100755
index e3e6d2c..0000000
--- a/devshell/set
+++ /dev/null
@@ -1,562 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### set up a build directory
-
-## usage: fx set TARGET [OUTDIR]
-##               [--board NAME|PATH]
-##               [--product NAME|PATH]
-##               [--monolith PATH]
-##               [--preinstall PATH]
-##               [--available PATH]
-##               [--netboot]
-##               [--args ARG] [--help-args [ARG]] [--variant VARIANT]
-##               [--goma|--no-goma] [--no-ensure-goma]
-##               [--goma-dir DIR]
-##               [--ccache|--no-ccache]
-##               [--release]
-##               [--zircon-arg ARG]
-##
-## where TARGET is x64 or arm64
-##
-## OUTDIR is the directory where the build output goes.
-## If it begins with `//` or `out/` then it's taken as relative to FUCHSIA_DIR.
-## Otherwise it should be an absolute path or a path relative to the current
-## working directory that winds up in `FUCHSIA_DIR/out`.
-## It defaults to `out/TARGET`.
-##
-## This is a wrapper around running `gn gen --check OUTDIR --args ...`.
-## If GN fails with an error, `fx set` does not change anything.
-## If GN succeeds, this also points subsequent `fx` commands at OUTDIR,
-## just as `fx use` does and ensures Goma is ready (if enabled).
-##
-## NAME|PATH can be provided as arguments to --board and --product. In the
-## case of NAME, the name is searched for in //layer/board/NAME.gni; otherwise it
-## is treated as a PATH.
-##
-## optional arguments:
-##   --board               Use the given board target definition. Board
-##                         configurations are used to modulate hardware specific
-##                         behavior, such as configuring the set of drivers
-##                         included, or adding peripheral specific configurations.
-##                         If no board is given, a default board for the architecture
-##                         is selected from the current layer, e.g.
-##                         garnet/boards/x64.gni.
-##   --product             Include the given product in the build. Defaults to
-##                         the default product for the current layer (e.g.,
-##                         "garnet/products/default.gni" for the Garnet layer).
-##                         Product configurations define a set of packages to be
-##                         included in the monolith, preinstall and available
-##                         package sets, as well as product oriented configurations.
-##   --monolith            Additional packages to be built and included in the
-##                         monolithic system image. Monolith is the set of packages
-##                         that make up an OTA image.
-##                         If the --monolith argument is given multiple times,
-##                         all the specified packages are included in this set.
-##                         These packages are added to the available set defined
-##                         by the board and product specifications.
-##   --preinstall          Additional packages to be built and included in the
-##                         system image alongside the monolithic system image.
-##                         Packages in preinstall are not part of OTA updates,
-##                         instead they are updated dynamically.
-##                         If the --preinstall argument is given multiple times,
-##                         all the specified packages are included in this set.
-##                         These packages are added to the available set defined
-##                         by the board and product specifications.
-##   --available           Additional packages to be built and included in the
-##                         set of packages available for pushing dynamically.
-##                         If the --available argument is given multiple times,
-##                         all the specified packages are included in this set.
-##                         These packages are added to the available set defined
-##                         by the board and product specifications.
-##   --netboot             Ensure that a network ramboot image is always built.
-##   --variant             Pass a `select_variant=[VARIANT*,...]` GN build argument
-##                         collecting all the --variant arguments in order.
-##   --fuzz-with           Pass a sanitizer name, e.g. "--fuzz-with asan" to
-##                         enable ALL supporting fuzzers.  Use --variant for
-##                         individual fuzzers, e.g. "--variant asan-fuzzer/foo".
-##   --args                Additional argument to pass to gn. If the --args
-##                         argument is given multiple times, all the specified
-##                         arguments are passed to gn.
-##                         N.B. Arguments must be expressed using GN's syntax.
-##                         In particular this means that for strings they must
-##                         be quoted with double-quotes, and the quoting must
-##                         survive, for example, the shell. Thus when passing
-##                         an argument that takes a string, pass it with
-##                         something like --args=foo='"bar"'. E.g.,
-##                         bash$ fx set x64 --args=foo='"bar"'
-##                         More complicated arguments, e.g., lists, require
-##                         their own special syntax. See GN documentation
-##                         for the syntax of each kind of argument.
-##   --help-args           Display GN arguments documentation.  If --help-args
-##                         is followed by a GN build argument identifier, just
-##                         that argument's documentation is displayed.
-##                         If --help-args is used alone, all GN build arguments
-##                         are displayed (lots of output).
-##                         This option requires an existing build directory.
-##   --goma|--no-goma      Whether to use the goma service during the build. Goma
-##                         attempts to make builds faster using remote build
-##                         servers. Defaults to detecting whether goma is installed
-##                         on your machine.
-##   --no-ensure-goma      Skip ensuring that goma is started when using goma.
-##   --goma-dir            The directory where goma is installed. Defaults to
-##                         ~/goma.
-##   --ccache|--no-ccache  Whether to use ccache during the build. Ccache attempts
-##                         to make builds faster by caching build artifacts.
-##                         Defaults to detecting whether the CCACHE_DIR environment
-##                         variable is set to a directory.
-##   --ide                 Pass --ide=VALUE to gn when generating to create project
-##                         files usable with that IDE. Useful values include "vs"
-##                         for Visual Studio or "xcode" for Xcode.
-##   --release             an alias for "--args=is_debug=false"
-##
-## Deprecated flags:
-##
-##   --boards              Use the listed board target definitions (separated by
-##                         commas) in the build. Each board results in an import
-##                         statement and MUST define the `fuchsia_packages`
-##                         variable as well as deal with variable re-definition.
-##                         This flag is deprecated, please use --board instead.
-##   --products            Include the listed products (separated by commas) in
-##                         the build. Defaults to the default product for the
-##                         current layer (e.g., "garnet/packages/default.gni" for
-##                         the Garnet layer). If the --products argument is
-##                         given multiple times, the products configurations
-##                         are merged.
-##                         This flag is deprecated, please use exactly one
-##                         --product and specify packages to add in addition to
-##                         the product with --packages.
-##   --packages            Additional packages to be built and made available.
-##                         If the --packages argument is given multiple times,
-##                         all the specified packages are included in the
-##                         build.
-##                         This flag is deprecated, please use one of
-##                         --available, --preinstall, or --monolith.
-##   --zircon-arg ARG      Additional arguments to pass to Zircon make. Can be given
-##                         multiple times.
-##
-## Example:
-##
-##   $ fx set x64 kitchensink --product ermine --available topaz/packages/kitchen_sink
-##   -> architecture: x64
-##      build directory: out/kitchensink
-##      board: topaz/boards/x64.gni
-##      product: topaz/products/ermine.gni
-##      available: topaz/packages/kitchen_sink (all other packages)
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-function guess_config_within_layer {
-  local config_type="$1"
-  local config_name="$2"
-  local __resultvar="$3"
-
-  # Guess the petal we're using. We'll search at this petal and below for
-  # config matching the type and short name
-  local current_petal
-  current_petal="$(${FUCHSIA_DIR}/build/gn/guess_layer.py)" || return 1
-  readonly current_petal
-
-  # Compute a lookup order starting at the current petal going down.
-  local petal_order="${current_petal}"
-  case $current_petal in
-    vendor/*)
-      petal_order="${petal_order},topaz,peridot,garnet"
-      ;;
-    "topaz")
-      petal_order="${petal_order},peridot,garnet"
-      ;;
-    "peridot")
-      petal_order="${petal_order},garnet"
-      ;;
-  esac
-  readonly petal_order
-
-  # Look through petals in this order to find configs with matching names.
-  IFS=,
-  local petal
-  for petal in $petal_order; do
-    guessed_config="${petal}/${config_type}/${config_name}.gni"
-    if [[ -a "${FUCHSIA_DIR}/${guessed_config}" ]]; then
-      echo "Guessing ${config_type} config ${guessed_config}"
-      eval "${__resultvar}"="${guessed_config}"
-      return
-    fi
-  done
-
-  echo "Could not guess a ${config_type} configuration matching \"${config_name}\""
-  echo "Please specify the full path from the root of the checkout such as"
-  echo "garnet/${config_type}/base.gni"
-  exit 1
-}
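The petal search order computed above can be sketched as a standalone helper (hypothetical function name; not part of fx itself):

```shell
# compute_petal_order prints the comma-separated lookup order used when
# guessing a board/product config: start at the current petal, then fall
# through the petals below it.
compute_petal_order() {
  local current_petal="$1"
  local petal_order="${current_petal}"
  case "${current_petal}" in
    vendor/*) petal_order+=",topaz,peridot,garnet" ;;
    topaz)    petal_order+=",peridot,garnet" ;;
    peridot)  petal_order+=",garnet" ;;
  esac
  echo "${petal_order}"
}
```

For example, `compute_petal_order vendor/acme` yields `vendor/acme,topaz,peridot,garnet`, while `garnet` (the lowest petal) searches only itself.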
-
-
-function main {
-  fx-standard-switches "$@"
-  set -- "${FX_ARGV[@]}"
-
-  if [[ $# -lt 1 ]]; then
-    fx-command-help
-    return 1
-  fi
-
-  local arch=
-  case $1 in
-    x64 | x86 | x64-64)
-      arch=x64
-      ;;
-    arm64 | aarch64)
-      arch=arm64
-      ;;
-    *)
-      # TODO(alainv): Add support for extracting arch from board configs.
-      echo "Unknown target \"$1\""
-      fx-command-help
-      return 1
-      ;;
-  esac
-  shift
-
-  cd "${FUCHSIA_DIR}"
-
-  local gn_cmd='gen'
-  local -a gn_switches=(--check)
-  local gn_args="target_cpu=\"${arch}\""
-  local boards=()
-  local products=()
-  local available=()
-  local preinstall=()
-  local monolith=()
-  local packages=()
-  local zircon_args=()
-  local extra_packages=()
-  local build_dir=
-  local variant=
-  local use_goma
-  local goma_dir
-  local ensure_goma=1
-  local ccache
-  while [[ $# -ne 0 ]]; do
-    case "$1" in
-      --board)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        boards+=("$2")
-        shift
-        ;;
-      --product)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        products+=("$2")
-        shift
-        ;;
-      --available)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        IFS=, available+=("$2")
-        shift
-        ;;
-      --preinstall)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        IFS=, preinstall+=("$2")
-        shift
-        ;;
-      --monolith)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        IFS=, monolith+=("$2")
-        shift
-        ;;
-      --boards)
-        echo >&2 "--boards is deprecated, --board is preferred"
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        # Multiple boards may be comma separated
-        IFS=, boards+=("$2")
-        shift
-        ;;
-      --products)
-        echo >&2 "--products is deprecated, --product is preferred"
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        # Multiple products may be comma separated
-        IFS=, products+=("$2")
-        shift
-        ;;
-      --packages)
-        echo >&2 "--packages is deprecated, --monolith, --preinstall or --available are preferred"
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        # Multiple packages may be comma separated
-        IFS=, packages+=("$2")
-        shift
-        ;;
-      --zircon-arg)
-        echo >&2 "--zircon-arg is deprecated, common use case is no longer required"
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        zircon_args+=("$2")
-        shift
-        ;;
-      --netboot)
-        extra_packages+=('build/packages/netboot')
-        ;;
-      --goma)
-        use_goma=1
-        ;;
-      --no-goma)
-        use_goma=0
-        ;;
-      --no-ensure-goma)
-        ensure_goma=0
-        ;;
-      --goma-dir)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        goma_dir=$2
-        if [[ ! -d "${goma_dir}" ]]; then
-          echo -e "error: GOMA directory does not exist: ${goma_dir}"
-          return 1
-        fi
-        shift
-        ;;
-      --release)
-        gn_args+=" is_debug=false"
-        ;;
-      --variant)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        variant+="\"$2\","
-        shift
-        ;;
-      --fuzz-with)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        variant+="{variant=\"$2-fuzzer\" target_type=[\"fuzzed_executable\"]},"
-        shift
-        ;;
-      --args)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        gn_args+=" $2"
-        shift
-        ;;
-      --help-args)
-        gn_cmd=args
-        if [[ $# -ge 2 ]] && [[ "$2" != --* ]]; then
-          gn_switches+=("--list=$2")
-          shift
-        else
-          gn_switches+=(--list)
-        fi
-        ;;
-      --ccache)
-        ccache=1
-        ;;
-      --no-ccache)
-        ccache=0
-        ;;
-      --ide)
-        if [[ $# -lt 2 ]]; then
-          fx-command-help
-          return 1
-        fi
-        gn_switches+=("--ide=$2")
-        shift
-        ;;
-      --*)
-        fx-command-help
-        return 1
-        ;;
-      *)
-        # A non-option argument is the build_dir, but there can be only one.
-        if [[ -n "$build_dir" ]]; then
-          fx-command-help
-          return 1
-        fi
-        build_dir="$1"
-        ;;
-    esac
-    shift
-  done
-
-  if [[ -z "${products}${packages}" ]]; then
-    # This is the default logic GN would use, but if a user specified --netboot
-    # we would short-circuit the logic, so repeat it here.
-    local layers
-    layers="$(${FUCHSIA_DIR}/build/gn/guess_layer.py)" || return 1
-    readonly layers
-    local layer
-    for layer in $layers; do
-      products+="${products:+,}$layer/products/default.gni"
-    done
-  fi
-
-  if [[ -z "${boards}" ]]; then
-    # Import the board target definition from the current layer if it exists.
-    local layers
-    if [[ -z "${layers}" ]]; then
-      layers="$(${FUCHSIA_DIR}/build/gn/guess_layer.py)" || return 1
-    fi
-    readonly layers
-    local layer
-    for layer in $layers; do
-      if [[ -e "$layer/boards/$arch.gni" ]]; then
-        boards+="${boards:+,}$layer/boards/$arch.gni"
-      fi
-    done
-  fi
-
-  # Remove any trailing slash from build directory name.
-  build_dir="${build_dir%/}"
-
-  local config_build_dir
-  case "$build_dir" in
-    '')
-      # Default is "//out/$target_cpu".  Store it as relative.
-      config_build_dir="out/${arch}"
-      build_dir="${FUCHSIA_DIR}/${config_build_dir}"
-      ;;
-    //*|out/*)
-      # GN-style "source-relative" path or relative out/something.
-      config_build_dir="${build_dir#//}"
-      build_dir="${FUCHSIA_DIR}/${config_build_dir}"
-      ;;
-    *)
-      # Absolute or relative path.  Canonicalize it to source-relative.
-      local abs_build_dir
-      abs_build_dir="$(cd "${build_dir%/*}" >/dev/null 2>&1; pwd)/${build_dir##*/}" || {
-        echo >&2 "ERROR: Missing parent directories for ${build_dir}"
-        return 1
-      }
-      if [[ "$abs_build_dir" == "${FUCHSIA_DIR}"/out/* ]]; then
-        config_build_dir="${abs_build_dir#${FUCHSIA_DIR}/}"
-      else
-        echo >&2 "WARNING: ${abs_build_dir} is not a subdirectory of ${FUCHSIA_DIR}/out"
-        config_build_dir="$abs_build_dir"
-      fi
-      ;;
-  esac
-
-  # If a goma directory wasn't specified explicitly then default to "~/goma".
-  if [[ -z "${goma_dir}" ]]; then
-    goma_dir="$HOME/goma"
-  fi
-
-  # Automatically detect goma and ccache if not specified explicitly.
-  if [[ -z "${use_goma}" ]] && [[ -z "${ccache}" ]]; then
-    if [[ -d "${goma_dir}" ]]; then
-      use_goma=1
-    elif [[ -n "${CCACHE_DIR}" ]] && [[ -d "${CCACHE_DIR}" ]]; then
-      ccache=1
-    fi
-  fi
-
-  for board in ${boards[@]}; do
-    if [[ ! -a "${FUCHSIA_DIR}/${board}" ]]; then
-      local guessed_board=""
-      guess_config_within_layer "boards" "${board}" guessed_board
-      board="${guessed_board}"
-    fi
-    gn_args+=" import(\"//${board}\")"
-  done
-
-  # Add goma or ccache settings as appropriate.
-  if [[ "${use_goma}" -eq 1 ]]; then
-    gn_args+=" use_goma=true goma_dir=\"${goma_dir}\""
-  elif [[ "${ccache}" -eq 1 ]]; then
-    gn_args+=" use_ccache=true"
-  fi
-
-  for product in ${products[@]}; do
-    if [[ ! -a "${FUCHSIA_DIR}/${product}" ]]; then
-      local guessed_product=""
-      guess_config_within_layer "products" "${product}" guessed_product
-      product="${guessed_product}"
-    fi
-    gn_args+=" import(\"//${product}\")"
-  done
-
-  # Board configs MUST declare fuchsia_packages, and boards must be non-empty.
-  gn_args+=" fuchsia_packages+=["
-  for package in ${packages[@]} ${extra_packages[@]}; do
-    gn_args+="\"${package}\","
-  done
-  gn_args+="]"
-
-  gn_args+=" if (!defined(available)) { available = [] }"
-  gn_args+=" available+=["
-  for package in ${available[@]}; do
-    gn_args+="\"${package}\","
-  done
-  gn_args+="]"
-
-  gn_args+=" if (!defined(preinstall)) { preinstall = [] }"
-  gn_args+=" preinstall+=["
-  for package in ${preinstall[@]}; do
-    gn_args+="\"${package}\","
-  done
-  gn_args+="]"
-  gn_args+=" if (!defined(monolith)) { monolith = [] }"
-  gn_args+=" monolith+=["
-  for package in ${monolith[@]}; do
-    gn_args+="\"${package}\","
-  done
-  gn_args+="]"
-
-  if [[ -n "${variant}" ]]; then
-    gn_args+=" select_variant=[${variant}]"
-  fi
-
-  mkdir -p "${build_dir}"
-  echo "${zircon_args[@]}" > "${build_dir}.zircon-args"
-
-  # Using a subshell with -x prints out the gn command precisely with shell
-  # quoting so a cut&paste to the command line works.  Always show the real
-  # meaning of what this script does so everyone learns how GN works.
-  (
-    set -x
-    "${FUCHSIA_DIR}/buildtools/gn" ${gn_cmd} "${build_dir}" \
-                                   "${gn_switches[@]}" --args="${gn_args}" "$@"
-  # If GN failed, don't update .config.
-  ) || return
-
-  fx-config-write "${config_build_dir}"
-
-  if [[ "${use_goma}" -eq 1 ]] && [[ "${ensure_goma}" -eq 1 ]]; then
-    if ! [[ $("${goma_dir}/gomacc" port) =~ ^[0-9]+$ ]]; then
-      "${goma_dir}/goma_ctl.py" ensure_start || return $?
-    fi
-  fi
-}
-
-main "$@"
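The build-directory canonicalization in `fx set` above (default, GN-style `//`, and `out/`-relative forms) can be sketched in isolation; this is a simplified sketch that omits the absolute-path branch, which needs a real checkout to resolve via `pwd`:

```shell
# canonicalize_build_dir maps a user-supplied build dir to a
# source-relative path, mirroring the case statement in `fx set`.
canonicalize_build_dir() {
  local build_dir="${1%/}"  # strip any trailing slash
  local arch="$2"
  case "${build_dir}" in
    '')    echo "out/${arch}" ;;       # default is //out/$target_cpu
    //*)   echo "${build_dir#//}" ;;   # GN-style source-relative path
    out/*) echo "${build_dir}" ;;      # already relative to out/
    *)     echo "${build_dir}" ;;      # absolute/other: resolved via pwd in the real script
  esac
}
```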
diff --git a/devshell/set-clock b/devshell/set-clock
deleted file mode 100755
index fa307ae..0000000
--- a/devshell/set-clock
+++ /dev/null
@@ -1,17 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### set the clock on target using host clock
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-if [[ "$(uname -s)" = "Darwin" ]]; then
-  device_date=`date +%Y-%m-%dT%T`
-else
-  device_date=`date -Iseconds`
-fi
-
-echo "Setting device's clock to ${device_date}"
-fx-command-run shell "clock --set ${device_date}"
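The OS check above exists because BSD `date` on macOS lacks the `-I` (ISO 8601) flag that GNU `date` provides; a portable sketch (hypothetical helper name):

```shell
# iso_now prints the current local time in an ISO-8601-like form on
# both macOS (BSD date) and Linux (GNU date).
iso_now() {
  if [[ "$(uname -s)" == "Darwin" ]]; then
    date +%Y-%m-%dT%T     # BSD date: spell out the format manually
  else
    date -Iseconds        # GNU date: built-in ISO 8601 output
  fi
}
```

Note the GNU form also appends a UTC offset (e.g. `+00:00`), which the manual BSD format omits.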
diff --git a/devshell/set-device b/devshell/set-device
deleted file mode 100755
index 21cd5a5..0000000
--- a/devshell/set-device
+++ /dev/null
@@ -1,49 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### set the default device to interact with
-
-## usage: fx set-device [<device-name>]
-##
-## fx set-device is used to specify the default device to target for various
-## fx commands, such as serve, syslog, and so on.
-## A device is set within the scope of a build directory (i.e. out/arm64 may
-## have a different default device set than out/x64).
-##
-## If no device name is given, set-device will attempt to discover devices. If
-## one device is found, that device is set as the default for the current build
-## directory. If more than one device is found, the user must select one and
-## provide it to a subsequent invocation of the command.
-##
-## To unset, use `fx unset-device`.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-standard-switches "$@"
-fx-config-read
-
-device="$1"
-if [[ -z "$device" ]]; then
-  devices="$("${ZIRCON_TOOLS_DIR}/netls" | awk '/device /{print $2}')"
-  if [[ "$(echo "$devices" | wc -l)" -gt 2 ]]; then
-    echo 2>&1 "Multiple devices found, please pick one from the list:"
-    echo "${devices}"
-    exit 1
-  fi
-  if [[ -z "${devices}" ]]; then
-    echo 2>&1 "No devices discovered, please supply a device name"
-    exit 1
-  fi
-  device="${devices}"
-fi
-
-if [[ ! -d "${FUCHSIA_BUILD_DIR}" ]]; then
-  echo 2>&1 "Build directory ${FUCHSIA_BUILD_DIR} does not exist, run \"fx set\" first."
-  exit 1
-fi
-
-echo "New default device: ${device}"
-echo "$device" > "${FUCHSIA_BUILD_DIR}.device"
diff --git a/devshell/set-petal b/devshell/set-petal
deleted file mode 100755
index 68b2216..0000000
--- a/devshell/set-petal
+++ /dev/null
@@ -1,63 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### configure jiri to manage a specific petal
-
-## usage: fx set-petal zircon|garnet|peridot|topaz
-## Configures jiri to fetch the source code for the given petal and its
-## dependencies.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-if [[ "$#" -ne 1 ]]; then
-  fx-command-help
-  exit 1
-fi
-
-petal="$1"
-
-if [[ "${petal}" != "zircon" ]] &&
-   [[ "${petal}" != "garnet" ]] &&
-   [[ "${petal}" != "peridot" ]] &&
-   [[ "${petal}" != "topaz" ]]; then
-  fx-command-help
-  exit 1
-fi
-
-# Under "set -e", a failing "which jiri" would exit the script; the
-# "|| echo" fallback catches that and leaves ${jiri} empty.
-jiri=$(which jiri || echo)
-if [[ -z ${jiri} ]]; then
-  jiri="${FUCHSIA_DIR}/.jiri_root/bin/jiri"
-  if [[ ! -f ${jiri} ]]; then
-    echo >&2 "error: Cannot find \"jiri\" in your PATH nor at ${jiri}."
-    exit 1
-  fi
-fi
-
-cd "${FUCHSIA_DIR}"
-rm -f -- "${FUCHSIA_DIR}/.jiri_manifest"
-"${jiri}" import -name=integration topaz/topaz "https://fuchsia.googlesource.com/integration"
-"${jiri}" override "${petal}" "https://fuchsia.googlesource.com/${petal}"
-
-echo "Configured jiri for ${petal}. Run these commands to update your build:"
-
-if [[ "${petal}" == "zircon" ]]; then
-cat <<END
- * jiri update -gc     # Updates your source tree to contain ${petal} and
-                       # removes unneeded repositories.
-
- * cd zircon && scripts/build-zircon-<arch>  # Actually builds ${petal}.
-END
-else
-cat <<END
- * jiri update -gc     # Updates your source tree to contain ${petal} and
-                       # removes unneeded repositories.
- * fx set x64          # Updates your build directory to build ${petal}.
- * fx full-build       # Actually builds ${petal}
-END
-fi
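The jiri lookup above (PATH first, then a checkout-local fallback) is a common pattern; a generic sketch with a hypothetical helper name:

```shell
# find_tool prints the path to a binary: the PATH copy if present,
# otherwise a given fallback path if that file exists, otherwise nothing.
find_tool() {
  local name="$1" fallback="$2"
  local found
  found="$(command -v "${name}" || true)"   # || true: survive set -e
  if [[ -z "${found}" && -f "${fallback}" ]]; then
    found="${fallback}"
  fi
  echo "${found}"
}
```

Using `command -v` rather than `which` is the more portable, POSIX-specified spelling of the same check.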
diff --git a/devshell/setup-macos b/devshell/setup-macos
deleted file mode 100755
index bd2b3c7..0000000
--- a/devshell/setup-macos
+++ /dev/null
@@ -1,50 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Run this command in Fuchsia development environment, or at minimum,
-# source ${FUCHSIA_DIR}/scripts/devshell/lib/vars.sh || exit $?
-
-### register Zircon tools with the macOS Application Firewall
-
-FIREWALL_CMD="/usr/libexec/ApplicationFirewall/socketfilterfw"
-
-function list_zircon_tools() {
-  TOOL_LIST="$(${FIREWALL_CMD} --listapps | grep zircon | awk '{print $3}')"
-  for f in "${TOOL_LIST[@]}"; do
-    echo "${f}"
-  done
-}
-
-function clear_zircon_tools() {
-  TOOL_LIST="$(${FIREWALL_CMD} --listapps | grep zircon | awk '{print $3}')"
-  for f in ${TOOL_LIST}; do
-    sudo ${FIREWALL_CMD} --remove "${f}" &> /dev/null
-  done
-}
-
-function allow_zircon_tools() {
-  for f in ${ZIRCON_TOOLS_DIR}/*; do
-    sudo ${FIREWALL_CMD} --add "$f" --unblockapp "$f" &> /dev/null
-  done
-}
-
-function main() {
-  echo "  clearing firewall rules.."
-  clear_zircon_tools
-  echo "  adding firewall rules.."
-  allow_zircon_tools
-
-  # Activate the changes
-  sudo ${FIREWALL_CMD} --setglobalstate off &> /dev/null
-  sudo ${FIREWALL_CMD} --setglobalstate on &> /dev/null
-  echo "..done"
-
-  echo "  following tools are registered in the firewall rules:"
-  echo " "
-  list_zircon_tools
-  echo " "
-}
-
-main
diff --git a/devshell/sftp b/devshell/sftp
deleted file mode 100755
index 4f5d956..0000000
--- a/devshell/sftp
+++ /dev/null
@@ -1,28 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### invoke sftp with the build ssh config
-
-## usage: fx sftp <arguments to sftp>
-##
-## This command invokes sftp (one of SSH's file copy tools) with
-## Fuchsia's SSH configuration.  Run "sftp -h" to see the options that
-## sftp accepts.
-##
-## Example usage:
-##
-##   fx sftp "[$(fx netaddr --fuchsia)]:source_file" dest_file
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-case $1 in
-  -h|--help)
-  fx-command-help
-  exit 0
-  ;;
-esac
-
-SSH_AUTH_SOCK="" sftp -F "${FUCHSIA_BUILD_DIR}/ssh-keys/ssh_config" "$@"
diff --git a/devshell/shell b/devshell/shell
deleted file mode 100755
index f7e93ea..0000000
--- a/devshell/shell
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### start a remote interactive shell in the target device
-
-## usage: fx shell [-h|--help] [<command>]
-##
-## Creates an SSH connection with a device and executes a command.
-##
-## Arguments:
-##   -h|--help    Print out this message.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-case $1 in
-  -h|--help)
-  fx-command-help
-  exit 0
-  ;;
-esac
-
-device_addr="$(get-fuchsia-device-addr)"
-if [[ -z "${device_addr}" ]]; then
-  echo >&2 "Device not found"
-  exit 1
-fi
-# Note: I know there are people who don't like the host-key message, but DO NOT
-# apply -q here, it silences error messages and makes network and configuration
-# failures much harder to diagnose when helping people. The control master will
-# mean you only get one per TCP socket, which is once per newly booted host.
-# It's not a huge burden compared to end user support.
-fx-command-exec ssh "${device_addr}" "$@"
diff --git a/devshell/ssh b/devshell/ssh
deleted file mode 100755
index fd8ba96..0000000
--- a/devshell/ssh
+++ /dev/null
@@ -1,16 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### invoke ssh with the keys from $FUCHSIA_BUILD_DIR/ssh-keys
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-# Note: I know there are people who don't like the host-key message, but DO NOT
-# apply -q here, it silences error messages and makes network and configuration
-# failures much harder to diagnose when helping people. The control master will
-# mean you only get one per TCP socket, which is once per newly booted host.
-# It's not a huge burden compared to end user support.
-SSH_AUTH_SOCK="" exec ssh -F "${FUCHSIA_BUILD_DIR}/ssh-keys/ssh_config" "$@"
diff --git a/devshell/symbolize b/devshell/symbolize
deleted file mode 100755
index d338a08..0000000
--- a/devshell/symbolize
+++ /dev/null
@@ -1,20 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### symbolize backtraces and program locations provided as input on stdin
-
-## This tool takes in log output from either zx_log or syslog and processes
-## the results to make the symbolizer markup in them human readable.
-## Anything that is not valid markup is left alone. This is similar
-## to how c++filt works for demangling C++ symbols.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-if [[ $# -gt 1 && "$1" = "-i" ]]; then
-  fx-symbolize "$2"
-else
-  fx-symbolize
-fi
diff --git a/devshell/syslog b/devshell/syslog
deleted file mode 100755
index 34d7bb9..0000000
--- a/devshell/syslog
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### listen for logs
-
-## usage: fx syslog [--raw] [flags]
-##
-## Creates an SSH connection with a device and starts listening for logs.
-## Pass -h to get help with log-listener flags.
-## Pass --raw as the first argument to get the raw, unsymbolized logs.
-
-set -o pipefail
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-function listen {
-  while true; do
-    fx-command-run wait || return
-    fx-command-run shell log_listener "$@"
-    echo "Connection lost, reconnecting..."
-  done
-}
-
-echo "Connecting..."
-if [[ $# -gt 0 && "$1" = "--raw" ]]; then
-  shift
-  listen "$@"
-else
-  listen "$@" | fx-symbolize
-fi
diff --git a/devshell/tests/build_test.sh b/devshell/tests/build_test.sh
deleted file mode 100755
index b2d7136..0000000
--- a/devshell/tests/build_test.sh
+++ /dev/null
@@ -1,64 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Halt on use of undeclared variables.
-set -o nounset
-
-# Set up the environment
-readonly TEST_DIR="$(mktemp -d)"
-if [[ ! -d "${TEST_DIR}" ]]; then
-  echo >&2 "Failed to create temporary directory"
-  exit 1
-fi
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly FUCHSIA_DIR="$(dirname "$(dirname "$(dirname "${SCRIPT_DIR}")")")"
-readonly BUILD_DIR="foo"
-readonly TEST_BUILD_DIR="${TEST_DIR}/${BUILD_DIR}"
-
-readonly ENV="FUCHSIA_DIR=${TEST_DIR}"
-
-ln -s \
-  "${FUCHSIA_DIR}/scripts" \
-  "${TEST_DIR}"
-
-cat <<END >> "${TEST_DIR}/.config"
-FUCHSIA_BUILD_DIR='${BUILD_DIR}'
-FUCHSIA_ARCH='x64'
-END
-
-mkdir -p "${TEST_BUILD_DIR}"
-cat <<END >> "${TEST_BUILD_DIR}/args.gn"
-target_cpu = "x64"
-use_goma = false
-END
-
-ln -s \
-  "${FUCHSIA_DIR}/buildtools" \
-  "${TEST_DIR}"
-
-cat <<END >> "${TEST_BUILD_DIR}/build.ninja"
-rule touch
-  command = touch \$out
-build foo.o: touch
-END
-
-# Invoke `fx build`.
-env -i "${ENV}" "${FUCHSIA_DIR}/scripts/fx" build
-
-declare RETURN_CODE
-
-if [[ -f "${TEST_BUILD_DIR}/foo.o" ]]; then
-  echo "SUCCESS"
-  RETURN_CODE=0
-else
-  echo "FAILURE"
-  RETURN_CODE=1
-fi
-
-# Clean up
-rm -rf -- "${TEST_DIR}"
-
-exit "${RETURN_CODE}"
diff --git a/devshell/tests/shell_test.sh b/devshell/tests/shell_test.sh
deleted file mode 100755
index 1eb2392..0000000
--- a/devshell/tests/shell_test.sh
+++ /dev/null
@@ -1,20 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-#
-# Usage: shell_test.sh
-
-# Halt on use of undeclared variables and errors.
-set -o nounset
-set -o errexit
-
-# Choose a directory that is unlikely to have its name changed or appear
-# in an error message.
-readonly PKGFS="pkgfs"
-readonly RESULT=$(fx shell ls | grep "${PKGFS}")
-
-if [ "${RESULT}" != "${PKGFS}" ]; then
-  exit 1
-fi
-
diff --git a/devshell/unset-device b/devshell/unset-device
deleted file mode 100755
index 0577bf7..0000000
--- a/devshell/unset-device
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### unset the default device to interact with
-
-## usage: fx unset-device
-##
-## Unset the default device to work with. See "fx set-device" for more
-## information.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-standard-switches "$@"
-fx-config-read
-
-if [[ -n "${FUCHSIA_BUILD_DIR}" ]]; then
-  rm -f "${FUCHSIA_BUILD_DIR}.device"
-fi
diff --git a/devshell/update-rustc-third-party b/devshell/update-rustc-third-party
deleted file mode 100755
index 555be39..0000000
--- a/devshell/update-rustc-third-party
+++ /dev/null
@@ -1,49 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### updates rustc_library and rustc_binary third_party dependencies
-
-## usage: fx update-rustc-third-party
-## Updates third_party/rust-crates/rustc_deps based on the contents of
-## third_party/rust-crates/rustc_deps/Cargo.toml
-##
-## After updating third_party/rust-crates/rustc_deps, the pinned revision of
-## third_party/rust-crates will need to be updated in garnet/manifest/third_party.
-## See https://fuchsia.googlesource.com/docs/+/master/development/languages/rust/third_party.md
-## for more details.
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-case "$(uname -s)" in
-  Linux*) OS=linux-x64;;
-  Darwin*) OS=mac-x64;;
-  *) echo "Error: unrecognized OS"; exit 1;;
-esac
-
-declare -x PATH=$FUCHSIA_DIR/buildtools/$OS/cmake/bin:$PATH
-
-if [[ "$OS" = "mac-x64" ]]; then
-  if ! [[ -x "$(command -v brew)" ]]; then
-    echo >&2 "'brew' binary not found"
-    echo >&2 "A homebrew <https://brew.sh> installation of opensslis required in order to update"       echo >&2 "Rust third party crates on the Mac."
-    exit 1
-  fi
-
-  declare -x LDFLAGS="-L$(brew --prefix)/opt/openssl/lib"
-  declare -x CPPFLAGS="-I$(brew --prefix)/opt/openssl/include"
-fi
-
-export RUSTC=$FUCHSIA_DIR/buildtools/$OS/rust/bin/rustc
-
-(cd $FUCHSIA_DIR; $FUCHSIA_DIR/buildtools/$OS/rust/bin/cargo run \
-  --target-dir $FUCHSIA_DIR/out/cargo_vendor_target \
-  --manifest-path $FUCHSIA_DIR/third_party/rust-mirrors/cargo-vendor/Cargo.toml \
-  -- vendor --sync $FUCHSIA_DIR/third_party/rust-crates/rustc_deps/Cargo.toml \
-  $FUCHSIA_DIR/third_party/rust-crates/rustc_deps/vendor)
-
-python $FUCHSIA_DIR/scripts/rust/check_rust_licenses.py \
-  --directory $FUCHSIA_DIR/third_party/rust-crates/rustc_deps/vendor
diff --git a/devshell/use b/devshell/use
deleted file mode 100755
index a02505d..0000000
--- a/devshell/use
+++ /dev/null
@@ -1,42 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### re-use a previous build directory set up by `fx set`
-
-## usage: fx use DIR
-##
-## Switches further `fx` commands to using a different build directory.
-## This only works if `fx set ... --build-dir DIR` succeeded previously
-## (and DIR has not been removed since).  The next `fx build` or other
-## such command will now refer to DIR.  The previous build directory is
-## left in place, so you can switch back again with `fx use` later.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-function main {
-  if [[ $# -ne 1 ]]; then
-    fx-command-help
-    return 1
-  fi
-
-  local -r build_dir="$1"
-
-  if [[ "$build_dir" == /* ]]; then
-    local -r full_build_dir="${build_dir}"
-  else
-    local -r full_build_dir="${FUCHSIA_DIR}/${build_dir}"
-  fi
-
-  if [[ -e "${full_build_dir}/args.gn" ]]; then
-    fx-config-write "${build_dir}"
-  else
-    echo "\"${build_dir}\" is not a valid build dir."
-    echo ""
-    fx-command-help
-    return 1
-  fi
-}
-
-main "$@"
diff --git a/devshell/vendor b/devshell/vendor
deleted file mode 100755
index f09661e..0000000
--- a/devshell/vendor
+++ /dev/null
@@ -1,41 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### forward commands to vendor/*/scripts/devshell
-
-## usage: fx vendor <vendor-dir> [command]
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-function main {
-  if [[ $# -lt 2 ]]; then
-    fx-command-help
-
-    echo >&2 "commands: "
-    for d in "${FUCHSIA_DIR}"/vendor/*; do
-      for f in "$d"/scripts/devshell/*; do
-        if [[ -x "$f" ]]; then
-          echo >&2 "  $(basename $d)" "$(basename $f)"
-        fi
-      done
-    done
-
-    return 1
-  fi
-
-  vendor_cmd_path="${FUCHSIA_DIR}/vendor/$1/scripts/devshell/$2"
-
-  if [[ ! -x "${vendor_cmd_path}" ]]; then
-    echo >&2 "command $1 $2 not found!"
-    return 1
-  fi
-
-  shift
-  shift
-
-  exec "${vendor_cmd_path}" "$@"
-}
-
-main "$@"
diff --git a/devshell/verify-build-packages b/devshell/verify-build-packages
deleted file mode 100755
index 1bf05cf..0000000
--- a/devshell/verify-build-packages
+++ /dev/null
@@ -1,48 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### verify the structure of the build package directory in a layer
-
-## usage: fx verify-build-packages zircon|garnet|peridot|topaz|vendor/foo
-
-set -e
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-fx-config-read
-
-if [[ "$#" -ne 1 ]]; then
-  fx-command-help
-  exit 1
-fi
-
-readonly layer="$1"
-
-layer_arg=""
-if [[ ${layer} =~ vendor/([a-zA-Z0-9_-]+) ]]; then
-  readonly vendor="${BASH_REMATCH[1]}"
-  layer_arg="--vendor-layer ${vendor}"
-elif [[ "${layer}" == "zircon" ]]; then
-  # Zircon does not have build packages.
-  exit 0
-elif [[ "${layer}" == "garnet" ]] ||
-     [[ "${layer}" == "peridot" ]] ||
-     [[ "${layer}" == "topaz" ]]; then
-  layer_arg="--layer ${layer}"
-else
-  fx-command-help
-  exit 1
-fi
-
-readonly validator="${FUCHSIA_BUILD_DIR}/host_x64/json_validator"
-
-if [[ ! -f ${validator} ]]; then
-  echo >&2 "error: Cannot find JSON validator at ${validator}."
-  echo >&2 "Try running 'fx build' first."
-  exit 1
-fi
-
-"${FUCHSIA_DIR}/scripts/packages/verify_layer.py" \
-  --json-validator "${validator}" \
-  ${layer_arg}
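The layer-to-argument mapping above can be expressed compactly. A Python 3 sketch (the `--vendor-layer` and `--layer` flag names come from the script; everything else is illustrative, and this version uses an anchored match where the bash `=~` test would also accept a substring):

```python
import re

def layer_args(layer):
    """Map a layer name to verify_layer.py arguments, as the script above does."""
    m = re.fullmatch(r"vendor/([a-zA-Z0-9_-]+)", layer)
    if m:
        return ["--vendor-layer", m.group(1)]
    if layer == "zircon":
        return []  # Zircon does not have build packages.
    if layer in ("garnet", "peridot", "topaz"):
        return ["--layer", layer]
    raise ValueError("unknown layer: " + layer)
```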
diff --git a/devshell/wait b/devshell/wait
deleted file mode 100755
index 8685300..0000000
--- a/devshell/wait
+++ /dev/null
@@ -1,22 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-### wait for a shell to become available
-
-## usage: fx wait
-##
-## Attempts to SSH to the target repeatedly until the target becomes
-## available.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/lib/vars.sh || exit $?
-
-device_addr=""
-until [[ -n "${device_addr}" ]]; do
-  device_addr="$(get-fuchsia-device-addr 2>/dev/null)"
-done
-until fx-command-run ssh "${device_addr}" \
-  -o ConnectionAttempts=1 -o ConnectTimeout=1 echo >/dev/null 2>&1; do
-  echo -n
-done
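The polling idiom in `fx wait` (retry a probe until it succeeds) translates directly. A Python 3 sketch that adds an optional timeout, which the original script deliberately omits by spinning forever:

```python
import time

def wait_until(probe, interval=1.0, timeout=None):
    """Poll `probe` until it returns a truthy value, like fx wait retries ssh.

    The timeout is an embellishment for illustration; the shell script above
    has none and loops until the target answers.
    """
    deadline = None if timeout is None else time.monotonic() + timeout
    while True:
        result = probe()
        if result:
            return result
        if deadline is not None and time.monotonic() >= deadline:
            return None
        time.sleep(interval)
```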
diff --git a/editors/cat_compile_commands.py b/editors/cat_compile_commands.py
deleted file mode 100755
index 4cc5c30..0000000
--- a/editors/cat_compile_commands.py
+++ /dev/null
@@ -1,49 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Logically concatenates given compile_commands.json files to stdout.
-
-Filenames of compile_commands.json files are provided as arguments.  A logical
-concatenation of all the compile commands is output to stdout.
-
-Example:
-    <layer>$ ./scripts/cat_compile_commands.py zircon/compile_commands.json \
-    out/debug-x64/compile_commands.json > compile_commands.json
-
-The output is suitable for use with vscode + cquery which needs a single
-compile_commands.json for the whole editor workspace.  This way a single vscode
-workspace can handle zircon + other layers of fuchsia, by adding the topaz
-folder with the combined compile_commands.json in it.
-
-Todo:
-    * For zircon, in docs that mention "bear make -j20", see also this script.
-    * For rest of fuchsia, integrate something like
-      "~/topaz/buildtools/ninja -v -C \
-      /usr/local/google/home/dustingreen/topaz/out/debug-x64 \
-      -t compdb cc cxx objc objcxx x64-shared_cc x64-shared_cxx \
-      > ~/topaz/out/debug-x64/compile_commands.json
-      into the build - ideally only if build.ninja or maybe *.ninja change.
-    * Iff the ~0.65s execution time becomes an issue, consider a more optimized
-      raw binary splice based on fixing up the "]" and "[" and the end of first
-      file and start of second file.
-    * Document all the steps to get a nicely working vscode + cquery workspace
-      with zircon + the rest of fuchsia in a combined editor workspace.
-"""
-
-import json
-import sys
-
-
-def main():
-    """Cat compile_commands.json files given as args to stdout"""
-    data = []
-    for arg in sys.argv[1:]:
-        data += json.load(open(arg))
-    json.dump(data, sys.stdout, indent=True)
-    return 0
-
-
-if __name__ == "__main__":
-    sys.exit(main())
diff --git a/fd.py b/fd.py
deleted file mode 100755
index cb27257..0000000
--- a/fd.py
+++ /dev/null
@@ -1,289 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-"""fd.py is a fascinating directory changer that saves you typing.
-
-Use the shell function "fd()" to enable autocompletion.
-Use "fd.py" directly to run without autocompletion.
-
-See examples by
-$ fd.py --help
-"""
-
-from __future__ import print_function
-
-import argparse
-import os
-import pickle
-import sys
-import termios
-import tty
-
-SEARCH_BASE = os.environ['FUCHSIA_DIR']  # or 'HOME'
-TMP_BASE = '/tmp/'
-DIRS_FILE = TMP_BASE + 'fd.txt'
-PICKLE_FILE = TMP_BASE + 'fd.pickle'
-
-EXCLUDE_DIRS = [
-    '"*/.git"', './build', './buildtools', './out', './third_party',
-    './zircon/build', './zircon/prebuilt', './cmake-build-debug', './zircon/third_party',
-]
-
-
-def eprint(*args, **kwargs):
-  print(*args, file=sys.stderr, **kwargs)
-
-
-class Trie(object):
-  """Trie keyed by directory basename; each key maps to its relative paths.
-  """
-
-  def __init__(self):
-    self.name = ''  # Set only on nodes that terminate a complete key;
-    # mere prefixes of a key do not count as keys themselves.
-    self.vals = []
-    self.kids = {}
-
-  def __getitem__(self, name, idx=0):
-    if self.name == name:
-      return self.vals
-    if idx == name.__len__() or name[idx] not in self.kids:
-      return None
-    return self.kids[name[idx]].__getitem__(name, idx + 1)
-
-  def __setitem__(self, name, val, idx=0):
-    if idx < name.__len__():
-      self.kids.setdefault(name[idx], Trie()).__setitem__(name, val, idx + 1)
-      return
-    self.name = name
-    self.vals.append(val)
-
-  def __contains__(self, name):
-    return self[name] is not None
-
-  def walk(self):
-    descendants = []
-    if self.name:
-      descendants.append(self.name)
-    for k in self.kids:
-      descendants.extend(self.kids[k].walk())
-    return descendants
-
-  def prefixed(self, name, idx=0):
-    if idx < name.__len__():
-      if name[idx] in self.kids:
-        return self.kids[name[idx]].prefixed(name, idx + 1)
-      return []
-
-    return self.walk()
-
-
-def build_trie():
-  """build trie.
-
-  Returns:
-    Trie
-  """
-
-  def build_find_cmd():
-    paths = []
-    for path in EXCLUDE_DIRS:
-      paths.append('{} {}'.format('-path', path))
-    return (r'cd {}; find . \( {} \) -prune -o -type d -print > '
-            '{}').format(SEARCH_BASE, ' -o '.join(paths), DIRS_FILE)
-
-  cmd_str = build_find_cmd()
-  os.system(cmd_str)
-
-  t = Trie()
-  with open(DIRS_FILE, 'r') as f:
-    for line in f:
-      line = line[2:][:-1]
-      tokens = line.split('/')
-      if tokens.__len__() == 0:
-        continue
-      target = tokens[-1]
-
-      t[target] = line
-
-  return t
-
-
-def get_trie():
-  """get_trie.
-
-  Returns:
-    trie
-  """
-
-  def save_pickle(obj):
-    with open(PICKLE_FILE, 'wb+') as f:
-      pickle.dump(obj, f, protocol=pickle.HIGHEST_PROTOCOL)
-
-  def load_pickle():
-    with open(PICKLE_FILE, 'rb') as f:
-      return pickle.load(f)
-
-  if os.path.exists(PICKLE_FILE):
-    return load_pickle()
-
-  t = build_trie()
-  save_pickle(t)
-  return t
-
-
-def button(idx):
-  """button maps idx to an ascii value.
-  """
-  ascii = 0
-  if 0 <= idx <= 8:
-    ascii = ord('1') + idx
-  elif 9 <= idx <= 34:
-    ascii = ord('a') + idx - 9
-  elif 35 <= idx <= 60:
-    ascii = ord('A') + idx - 35
-  elif 61 <= idx <= 75:
-    ascii = ord('!') + idx - 61
-  return str(unichr(ascii))
-
-
-def get_button():  # Unix way
-  fd = sys.stdin.fileno()
-  old_settings = termios.tcgetattr(fd)
-  try:
-    tty.setraw(sys.stdin.fileno())
-    ch = sys.stdin.read(1)
-  finally:
-    termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
-  return ch
-
-
-def choose_options(t, key, choice):
-  # Build options by the given key
-  if key in t:
-    options = t[key]
-  else:
-    prefixed_keys = t.prefixed(key)
-    options = []
-    for pk in prefixed_keys:
-      options.extend(t[pk])
-
-  options = sorted(options)
-  if options.__len__() == 0:
-    eprint('No such directory: {}'.format(key))
-    return None
-  elif options.__len__() == 1:
-    return options[0]
-  elif options.__len__() > 75:  # See def button() for the limit.
-    eprint('Too many ({}) results for "{}". '
-           'Refine your prefix, or maybe it is time to buy a 4K '
-           'monitor\n'.format(options.__len__(), key))
-    return None
-
-  def list_choices(l):
-    for i in range(l.__len__()):
-      eprint('[{}] {}'.format(button(i), l[i]))
-    eprint()
-
-  choice_dic = {}
-  for idx, val in enumerate(options):
-    choice_dic[button(idx)] = val
-
-  if choice is not None and choice not in choice_dic:
-    # Invalid pre-choice
-    eprint('Choice "{}" not available\n'.format(choice))
-
-  if choice not in choice_dic:
-    list_choices(options)
-    choice = get_button()
-    if choice not in choice_dic:
-      return None
-
-  return choice_dic[choice]
-
-
-def main():
-
-  def parse_cmdline():
-    example_commands = """
-
-[eg] # Use "fd" for autocompletion (See //scripts/fx-env.sh)
-  $ fd ral        # change directory to an only option: ralink
-  $ fd wlan       # shows all "wlan" directories and asks you to choose
-  $ fd wlan 3     # change directory matching to option 3 of "fd wlan"
-  $ fd [TAB]      # Autocomplete subdirectories from the current directory
-  $ fd //[TAB]    # Autocomplete subdirectories from ${FUCHSIA_DIR}
-  $ fd --rebuild  # rebuilds the directory structure cache
-"""
-    p = argparse.ArgumentParser(
-        description='A fascinating directory changer',
-        epilog=example_commands,
-        formatter_class=argparse.RawDescriptionHelpFormatter)
-
-    p.add_argument(
-        '--rebuild', action='store_true', help='rebuild the directory DB')
-    p.add_argument('--base', type=str, default=None)
-    p.add_argument('target', nargs='?', default='')
-    p.add_argument('choice', nargs='?', default=None)
-
-    # Redirect help messages to stderr
-    if len(sys.argv) == 2:
-      if sys.argv[1] in ['-h', '--help']:
-        eprint(p.format_help())
-        print('.')  # Stay at the current directory
-        sys.exit(0)
-
-    return p.parse_args()
-
-  def get_abs_path(relative_dir):
-    if relative_dir is not None:
-      return os.path.join(SEARCH_BASE, relative_dir)
-    return os.getcwd()
-
-  def derive_dest(target):
-    if not target:
-      # Handle the case where this command was invoked just to rebuild.
-      return get_abs_path('.') if args.rebuild is False else os.getcwd()
-
-    if target[:2] == '//':
-      target = target[2:]
-
-    candidate = target
-
-    # Skip the guesswork when the user explicitly specifies a choice.
-    # Do the guesswork otherwise.
-    if not args.choice:
-      if os.path.exists(candidate):
-        return candidate
-
-      candidate = get_abs_path(target)
-      if os.path.exists(candidate):
-        return candidate
-
-      candidate = os.path.abspath(target)
-      if os.path.exists(candidate):
-        return candidate
-
-    t = get_trie()
-    return get_abs_path(choose_options(t, target, args.choice))
-
-  args = parse_cmdline()
-  if args.base:
-    global SEARCH_BASE
-    SEARCH_BASE = args.base
-
-  if args.rebuild:
-    os.remove(PICKLE_FILE)
-
-  dest = derive_dest(args.target)
-  dest = os.path.normpath(dest)
-  print(dest)
-
-
-if __name__ == '__main__':
-  try:
-    main()
-  except Exception as e:  # Catch all
-    eprint(str(e), e.args)
-    print('.')  # Stay at the current directory upon exception
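For reference, the Trie above can be written without the recursive dunder methods. A Python 3 sketch with the same semantics: exact lookup only matches complete keys, and prefix lookup returns every stored basename under a prefix (method names here are illustrative):

```python
class Trie:
    """Sketch of fd.py's basename trie: maps a directory basename to every
    relative path ending in it, with prefix lookup for completion."""

    def __init__(self):
        self.name = ""   # set only on nodes that terminate a complete key
        self.vals = []
        self.kids = {}

    def insert(self, name, val):
        node = self
        for ch in name:
            node = node.kids.setdefault(ch, Trie())
        node.name = name
        node.vals.append(val)

    def exact(self, name):
        """Return the paths stored under a complete key, or None."""
        node = self
        for ch in name:
            node = node.kids.get(ch)
            if node is None:
                return None
        return node.vals if node.name == name else None

    def prefixed(self, prefix):
        """Return every complete key that starts with the given prefix."""
        node = self
        for ch in prefix:
            node = node.kids.get(ch)
            if node is None:
                return []
        out, stack = [], [node]
        while stack:
            n = stack.pop()
            if n.name:
                out.append(n.name)
            stack.extend(n.kids.values())
        return out
```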
diff --git a/fetch-build-artifacts b/fetch-build-artifacts
deleted file mode 100755
index 3d40547..0000000
--- a/fetch-build-artifacts
+++ /dev/null
@@ -1,146 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# NOTE: This tool must be able to be downloaded and run independently of the
-# rest of this repo: the manual release processes that use it do not have a
-# fuchsia checkout.
-
-## usage: fetch-build-artifacts \
-##            [-r] [-p <parent_dir>] [-o <out_dir>] [-h] \
-##            <buildbucket_id>
-##
-## Downloads artifacts created by the specified build.
-##
-## <buildbucket_id> is a large number like '8938070794014050064', which you can
-## find on the build results page. This tool accepts an optional leading 'b',
-## letting you copy-and-paste the ID component from a URL like
-## https://ci.chromium.org/p/fuchsia/builders/luci.fuchsia.ci/garnet-arm64-release-qemu_kvm/b8938011100592458048
-##
-## Prerequisites:
-## - Install the 'gsutil' tool:
-##   https://cloud.google.com/storage/docs/gsutil_install
-## - Run 'gcloud init'. The default project does not matter, but if you need to
-##   provide one you can use 'fuchsia-infra'.
-##   - If you don't have gcloud installed, you can run 'gsutil config' instead.
-## - The user must have permission to access the target artifacts in Google
-##   Cloud Storage (GCS).
-##
-## Optional arguments:
-##   -r    Indicates that the build ID points to a release build, whose
-##         artifacts are stored in a special GCS bucket.
-##   -p    The parent directory of the destination build artifact directory:
-##         see -o. Defaults to "${HOME}/.cache/fuchsia/build-artifacts".
-##         Mutually exclusive with -o.
-##   -o    The directory to write the artifacts to. Defaults to
-##         <parent_dir>/<buildbucket_id>. Mutually exclusive with -p.
-##   -h    Prints this help message.
-
-set -o errexit
-set -o nounset
-set -o pipefail
-
-readonly DEFAULT_PARENT_DIR="${HOME}/.cache/fuchsia/build-artifacts"
-
-function usage {
-  # Prints lines from this file that begin with ##.
-  sed -n -e 's/^## //p' -e 's/^##$//p' < "${BASH_SOURCE[0]}"
-}
-
-function main {
-  if [[ -z "$(which gsutil)" ]]; then
-    echo "ERROR: Please install gsutil and run 'gcloud init'."
-    echo "See https://cloud.google.com/storage/docs/gsutil_install"
-    return 1
-  fi
-
-  local parent_dir=''
-  local out_dir=''
-  local is_release=0
-  while getopts "rp:o:h" arg; do
-    case ${arg} in
-      r)
-        is_release=1
-        ;;
-      p)
-        parent_dir="${OPTARG}"
-        ;;
-      o)
-        out_dir="${OPTARG}"
-        ;;
-      *)
-        usage
-        return 1
-        ;;
-    esac
-  done
-  shift $((${OPTIND} - 1))
-
-  if [[ "${parent_dir}" ]] && [[ "${out_dir}" ]]; then
-    echo "ERROR: Cannot specify both -p and -o."
-    usage
-    return 1
-  fi
-
-  # Expect exactly one build ID.
-  # TODO(dbort): We could support multiple build IDs if -o isn't specified.
-  if [[ $# -ne 1 ]]; then
-    echo "ERROR: Build ID missing."
-    usage
-    return 1
-  fi
-  local build_id="$1"
-
-  # Make sure it's a number, but allow a leading 'b' since some LUCI URLs
-  # tack it onto the build ID component.
-  if [[ ! "${build_id}" =~ ^b?[0-9]+$ ]]; then
-    echo "ERROR: Build ID '${build_id}' is not a number."
-    usage
-    return 1
-  fi
-  build_id="${build_id#b}"  # Remove leading 'b' if present.
-
-  if [[ -z "${parent_dir}" ]]; then
-    parent_dir="${DEFAULT_PARENT_DIR}"
-  fi
-
-  if [[ -z "${out_dir}" ]]; then
-    out_dir="${parent_dir}/${build_id}"
-  fi
-
-  if (( is_release )); then
-    local archive_bucket='fuchsia-release-archive'
-  else
-    local archive_bucket='fuchsia-archive'
-  fi
-
-  echo "Fetching build ${build_id} to ${out_dir} ..."
-  mkdir -p "${out_dir}"
-
-  local failed=0
-  (
-    set -x  # Prints the gsutil command
-    # -m: Download files in parallel
-    # -c: Continue downloading later files even if an earlier one fails
-    # -L: Log file to use for resuming; also avoids re-downloading files
-    gsutil -m cp \
-      -c \
-      -L "${out_dir}/.gsutil-cp.log" \
-      -r \
-      "gs://${archive_bucket}/builds/${build_id}/*" "${out_dir}"
-  ) || failed=1
-
-  if (( failed )); then
-    if (( ! is_release )); then
-      echo "NOTE: If you are trying to download a release build, remember"
-      echo "      to specify the '-r' flag."
-    fi
-    return 1
-  fi
-
-  echo "Downloaded artifacts to ${out_dir}"
-  echo "DONE"
-}
-
-main "$@"
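The build-ID validation above, a number with an optional leading 'b' as copied from a LUCI URL, is easy to get subtly wrong. A Python 3 sketch of the same check (the function name is illustrative):

```python
import re

def normalize_build_id(raw):
    """Accept '8938070794014050064' or 'b8938070794014050064' (the LUCI URL
    form); return the digits only, or None if it is not a build ID."""
    if not re.fullmatch(r"b?[0-9]+", raw):
        return None
    return raw[1:] if raw.startswith("b") else raw
```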
diff --git a/find_integration_revision.py b/find_integration_revision.py
deleted file mode 100755
index aa8e110..0000000
--- a/find_integration_revision.py
+++ /dev/null
@@ -1,85 +0,0 @@
-#!/usr/bin/env python
-
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import itertools
-import os
-import subprocess
-import sys
-import xml.etree.ElementTree as ET
-
-
-def main():
-    parser = argparse.ArgumentParser(
-        description="Find the first integration revision that integrated "
-        "a given petal revision.")
-    parser.add_argument("petal_path")
-    parser.add_argument("petal_revision")
-    args = parser.parse_args()
-
-    fuchsia_dir = os.environ["FUCHSIA_DIR"]
-
-    imports = ET.parse(os.path.join(
-        fuchsia_dir,
-        ".jiri_manifest")).findall('./imports/import[@name=\'integration\']')
-
-    components = []
-    for n in imports:
-        if n.attrib["remote"].startswith("sso://"):
-            components.append("fuchsia")
-            break
-
-    components.extend([args.petal_path, "minimal"])
-
-    integration_dir = os.path.join(fuchsia_dir, "integration")
-    petal_dir = os.path.join(fuchsia_dir, args.petal_path)
-
-    manifest_path = os.path.join(*components)
-    project_xpath = './projects/project[@name=\'%s\']' % args.petal_path
-
-    # Used to avoid shelling out to git multiple times with the same petal revision.
-    last_petal_revision_checked = None
-    descendant_found = False
-    for i in itertools.count(start=0, step=1):
-        manifest_string = subprocess.check_output([
-            "git",
-            "-C",
-            integration_dir,
-            "show",
-            "HEAD~%d:%s" % (i, manifest_path),
-        ])
-        for project in ET.fromstring(manifest_string).findall(project_xpath):
-            petal_revision_at_integration_revision = project.attrib["revision"]
-
-        if petal_revision_at_integration_revision != last_petal_revision_checked:
-            last_petal_revision_checked = petal_revision_at_integration_revision
-            if not subprocess.call([
-                    "git",
-                    "-C",
-                    petal_dir,
-                    "merge-base",
-                    "--is-ancestor",
-                    args.petal_revision,
-                    petal_revision_at_integration_revision,
-            ]):
-                descendant_found = True
-            elif descendant_found:
-                subprocess.check_call(
-                    [
-                        "git",
-                        "-C",
-                        integration_dir,
-                        "rev-parse",
-                        "HEAD~%d" % (i - 1),
-                    ],
-                    stdout=sys.stdout,
-                    stderr=sys.stderr,
-                )
-                break
-
-
-if __name__ == "__main__":
-    main()
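The manifest query at the heart of the loop, extracting a project's pinned revision from a jiri manifest snapshot, can be isolated. A Python 3 sketch using the same ElementTree XPath predicate as the script (the function name and sample XML are illustrative):

```python
import xml.etree.ElementTree as ET

def petal_revision(manifest_xml, petal_path):
    """Return the pinned revision of a project in a jiri manifest, as the
    script above reads it out of each HEAD~N snapshot of integration."""
    root = ET.fromstring(manifest_xml)
    for project in root.findall("./projects/project[@name='%s']" % petal_path):
        return project.attrib["revision"]
    return None
```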
diff --git a/flog b/flog
deleted file mode 100755
index 4be0949..0000000
--- a/flog
+++ /dev/null
@@ -1,368 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-"""flog is a developer-friendly log listener with an automatic crash symbol decoder.
-
-Crashes are archived to /tmp/fuchsia_crash or to a specified directory.
-Configurable keywords, or whole lines containing them, can be color coded.
-"""
-
-import argparse
-import os
-import re
-import subprocess
-import sys
-from time import gmtime
-from time import strftime
-
-CASE_SENSITIVE = False
-BEGINS = []
-ENDS = []
-SUPPRESSES = []
-CRASH_DIR = '/tmp/fuchsia_crash'
-
-# TODO(porce): Support regular expression
-# TODO(porce): Support compatibility with REGEX_ regex expressions
-COLOR_LINES = {'error': 'red'}
-COLOR_WORDS = {
-    'warn': 'white-on-red',
-    'error': 'white-on-red',
-    'fail': 'white-on-red',
-    'exception': 'white-on-red',
-    'address': 'green',
-    'did not add device in bind': 'green',
-}
-
-RESET = '\033[1;0m'
-
-COLORS = {
-    'WHITE-ON-RED': '\033[41;37m',
-    'BLACK': '\033[30;1m',
-    'RED': '\033[31;1m',
-    'GREEN': '\033[32;1m',
-    'YELLOW': '\033[33;1m',
-    'BLUE': '\033[34;1m',
-    'MAGENTA': '\033[35;1m',
-    'CYAN': '\033[36;1m',
-}
-
-FIRST_LOG_AFTER_BOOTUP = '[00000.000] 00000.00000> bootdata:'
-
-# REGEX_* are constructed when the command line arguments are parsed.
-REGEX_BEGINS = ''
-REGEX_ENDS = ''
-REGEX_SUPPRESS = ''
-REGEX_COLOR_LINES = ''
-REGEX_COLOR_WORDS = ''
-
-
-def static_vars(**kwargs):
-
-  def decorate(func):
-    for k in kwargs:
-      setattr(func, k, kwargs[k])
-    return func
-
-  return decorate
-
-
-def now_str():
-  return strftime('%Y%m%d_%H%M%S', gmtime())
-
-
-def get_log_listener(args):
-  # return 'cat /tmp/z' # Unit test
-  listener = 'fx log'
-  return '{} {}'.format(listener, ' '.join(x for x in args))
-
-
-@static_vars(crash_dump=[])
-@static_vars(is_crashing=False)
-def monitor_crash(line):
-  if '<==' in line and 'exception' in line:
-    monitor_crash.is_crashing = True
-
-  if monitor_crash.is_crashing is True:
-    monitor_crash.crash_dump.append(line)
-
-  if ': end' in line and 'bt#' in line:
-    decode_backtrace(monitor_crash.crash_dump, CRASH_DIR)
-    monitor_crash.crash_dump = []
-    monitor_crash.is_crashing = False
-
-
-def color(string, color_name):
-  if color_name.upper() not in COLORS:
-    return string
-
-  ends_in_newline = string.endswith('\n')
-
-  if ends_in_newline:
-    string = string[:-1]
-
-  result = '{}{}{}'.format(COLORS[color_name.upper()], string, RESET)
-
-  if ends_in_newline:
-    result += '\n'
-
-  return result
-
-
-def anymatch(test_string, regexes):
-  global CASE_SENSITIVE
-  if CASE_SENSITIVE:
-    ans = re.search(regexes, test_string)
-  else:
-    ans = re.search(regexes, test_string, flags=re.IGNORECASE)
-
-  return ans.group(0) if ans else None
-
-
-@static_vars(is_in_session=False)
-def is_suppressed(line):
-  """Test whether the log line should be suppressed.
-
-  Args:
-    line (str): A log line.
-
-  Returns:
-    True if the log line should be suppressed. False otherwise.
-  """
-  if line.startswith(FIRST_LOG_AFTER_BOOTUP):
-    # A reboot occurs. Reset to default.
-    is_suppressed.is_in_session = False
-
-  if not BEGINS or anymatch(line, REGEX_BEGINS):
-    if not is_suppressed.is_in_session:
-      print '\n' * 3
-      is_suppressed.is_in_session = True
-
-  if anymatch(line, REGEX_ENDS):
-    is_suppressed.is_in_session = False
-
-  if not is_suppressed.is_in_session:
-    return True
-
-  if anymatch(line, REGEX_SUPPRESSES):
-    return True
-
-  return False
-
-
-def colorize(line_incoming):
-  """Color-code the log line.
-
-  Args:
-    line_incoming: log line.
-
-  Returns:
-    color-coded log line
-  """
-
-  line = line_incoming
-
-  k = anymatch(line, REGEX_COLOR_LINES)
-  if k:
-    v = COLOR_LINES.get(k.lower(), '')
-    line = color(line, v)
-    return line
-
-  if anymatch(line, REGEX_COLOR_WORDS):
-    # TODO(porce): Maybe there are less costly ways.
-    for k, v in COLOR_WORDS.iteritems():
-      replacement = r'%s\1%s' % (COLORS[v.upper()], RESET)
-      search = '(?i)(' + '|'.join(map(re.escape, [k])) + ')'
-      line = re.sub(search, replacement, line)
-
-  return line
-
-
-def print_log(line):
-  if is_suppressed(line):
-    return
-
-  print colorize(line),
-
-
-def hijack_stdout(cmd):
-  proc = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE)
-  return iter(proc.stdout.readline, '')
-
-
-def get_decode_cmd():
-  decode_cmd = 'fx symbolize'
-  return decode_cmd
-
-
-def print_bt(line):
-  """Print backtrace with some colors.
-
-  Args:
-    line (str): Backtrace line.
-  """
-  tokens = line.split(' at ')
-
-  methods = tokens[:-1]
-  methods.append('')
-  methods_text = ' at '.join(x for x in methods)
-
-  path = tokens[-1]
-  path_tokens = path.split('/')
-  paths = path_tokens[:-1]
-  paths.append('')
-
-  path_text = '/'.join(x for x in paths)
-  path_colored = path_text + color(path_tokens[-1], 'red')
-
-  print methods_text, path_colored,
-
-
-def decode_backtrace(crash_dump, dst_dir):
-  """A wrapper around 'fx symbolize'.
-
-  Args:
-    crash_dump (array): array of log lines.
-    dst_dir    (str): directory to archive to.
-  """
-  os.system('mkdir -p {}'.format(dst_dir))
-  crash_file_path = '{}/{}.crash'.format(dst_dir, now_str())
-  tmp_file = '{}/tmp.dump'.format(dst_dir)
-
-  f = open(tmp_file, 'w')
-  for line in crash_dump:
-    f.write(line)
-  f.close()
-
-  cmd = '{} < {} > {}'.format(get_decode_cmd(), tmp_file, crash_file_path)
-  os.system(cmd)
-
-  print '\n\n'
-
-  is_start = False
-  with open(crash_file_path, 'r') as f:
-    for line in f:
-      if 'start of symbolized stack:' in line:
-        is_start = True
-
-      if is_start:
-        print_bt(line)
-  print '\n\n'
-
-
-def parse_color_map(string):
-  """Convert comma-separated text into a color code map.
-
-  Args:
-    string (str): a text line
-
-  Returns:
-   Dictionary whose keys are text patterns and values are color names.
-  """
-  m = {}
-  items = string.split(',')
-  for item in items:
-    sep = ':'
-    if sep not in item:
-      continue
-
-    idx = item.rfind(sep)
-    text = item[:idx]
-    color_name = item[idx + 1:]
-
-    if text.__len__() == 0:
-      continue
-
-    m[text] = color_name
-
-  return m
-
-
-def proc_cmdline():
-  """Argument parser.
-
-  Returns:
-    args.
-  """
-  example_commands = """
-
-  Pro tip: Use comma-separated texts for multiple matches
-
-  Example:
-  $ flog --begin \'my module starts,rare event\'
-         --end \'my module ends\'
-         --suppress \'verbose,chatty\'
-         --lines \'error msg:red,warn:blue\'
-         --words \'register 0x00:green,exit:yellow\'
-
-  """
-
-  p = argparse.ArgumentParser(
-      description='A friendly Fuchsia log listener',
-      epilog=example_commands,
-      formatter_class=argparse.RawDescriptionHelpFormatter)
-
-  p.add_argument('--begin', type=str, help='trigger texts to start logging')
-  p.add_argument('--end', type=str, help='trigger texts to end logging')
-
-  p.add_argument(
-      '--case', type=bool, help='match case-sensitively', default=False)
-
-  p.add_argument('--suppress', type=str, help='text to suppress the line')
-  p.add_argument('--lines', type=str, help='colorize the line. {text:color}')
-  p.add_argument('--words', type=str, help='colorize the word. {text:color}')
-  p.add_argument('--crashdir', type=str, help='directory to store crash files.')
-  p.add_argument(
-      'remainders',
-      nargs=argparse.REMAINDER,
-      help='arguments passed to loglistener')
-
-  args = p.parse_args()
-
-  global CASE_SENSITIVE
-  global BEGINS
-  global ENDS
-  global SUPPRESSES
-  global COLOR_LINES
-  global COLOR_WORDS
-  global CRASH_DIR
-
-  CASE_SENSITIVE = args.case
-  if args.begin:
-    BEGINS.extend(args.begin.split(','))
-  if args.end:
-    ENDS.extend(args.end.split(','))
-  if args.suppress:
-    SUPPRESSES.extend(args.suppress.split(','))
-  if args.lines:
-    COLOR_LINES.update(parse_color_map(args.lines))
-  if args.words:
-    COLOR_WORDS.update(parse_color_map(args.words))
-  if args.crashdir:
-    CRASH_DIR = args.crashdir
-
-  global REGEX_BEGINS
-  global REGEX_ENDS
-  global REGEX_SUPPRESSES
-  global REGEX_COLOR_LINES
-  global REGEX_COLOR_WORDS
-
-  # TODO(porce): Support regex input
-  REGEX_BEGINS = '|'.join(BEGINS)
-  REGEX_ENDS = '|'.join(ENDS)
-  REGEX_SUPPRESSES = '|'.join(SUPPRESSES)
-  REGEX_COLOR_LINES = '|'.join(k for k, v in COLOR_LINES.iteritems())
-  REGEX_COLOR_WORDS = '|'.join(k for k, v in COLOR_WORDS.iteritems())
-
-  return args
-
-
-def main():
-  args = proc_cmdline()
-  cmd = get_log_listener(args.remainders)
-  for line in hijack_stdout(cmd):
-    print_log(line)
-    monitor_crash(line)
-
-
-main()
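The `--lines`/`--words` syntax handled by `parse_color_map` splits on the last colon, so patterns may themselves contain colons. A Python 3 sketch of the same parser:

```python
def parse_color_map(spec):
    """Parse 'text:color,text2:color2' into a dict, as flog does above.

    Splitting on the LAST ':' of each item lets patterns contain colons
    (e.g. 'error msg:red' or 'register 0x00:green')."""
    mapping = {}
    for item in spec.split(","):
        if ":" not in item:
            continue
        text, _, color_name = item.rpartition(":")
        if text:
            mapping[text] = color_name
    return mapping
```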
diff --git a/fx b/fx
deleted file mode 100755
index acf5242..0000000
--- a/fx
+++ /dev/null
@@ -1,267 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-function get_build_dir {
-  (source "${vars_sh}" && fx-config-read-if-present && echo "${FUCHSIA_BUILD_DIR}")
-}
-
-function commands {
-  local cmds="$(ls "${fuchsia_dir}/scripts/devshell" | grep -v -e '^lib$' -e '^tests$')"
-
-  local newline=$'\n'
-  local build_dir=$(get_build_dir)
-  if [[ -n "${build_dir}" ]]; then
-    cmds="${cmds}${newline}$(ls "${build_dir}/tools" 2>/dev/null)"
-  fi
-
-  for tool in ${buildtools_whitelist}; do
-    cmds="${cmds}${newline}${tool}"
-  done
-
-  echo "$(echo "${cmds}" | sort)"
-}
-
-function find_command {
-  local cmd=$1
-
-  local command_path="${fuchsia_dir}/scripts/devshell/${cmd}"
-  if [[ -x "${command_path}" ]]; then
-    echo "${command_path}"
-    return 0
-  fi
-
-  local build_dir=$(get_build_dir)
-  if [[ -n "${build_dir}" ]]; then
-    local command_path="${build_dir}/tools/${cmd}"
-    if [[ -x "${command_path}" ]]; then
-      echo "${command_path}"
-      return 0
-    fi
-  fi
-
-  for tool in ${buildtools_whitelist}; do
-    if [[ "$cmd" != "$tool" ]]; then
-      continue
-    fi
-    local command_path="${fuchsia_dir}/buildtools/${tool}"
-    if [[ -x "${command_path}" ]]; then
-      echo "${command_path}"
-      return 0
-    fi
-  done
-
-  return 1
-}
-
-function help {
-  local cmd="$1"
-  if [[ -z "${cmd}" ]]; then
-    for cmd in $(commands); do
-      local cmd_path="$(find_command "${cmd}")"
-      if [[ $(file -b --mime "${cmd_path}" | cut -d / -f 1) == "text" ]]; then
-        echo "${cmd} | $(sed -n '1,/^###/s/^### //p' < "${cmd_path}")"
-      else
-        echo "${cmd}"
-      fi
-    done | column -t -s '|' -c 2
-  else
-    local cmd_path="$(find_command "${cmd}")"
-    if [[ -z "$cmd_path" ]]; then
-      echo "Command not found"
-    elif [[ $(file -b --mime "${cmd_path}" | cut -d / -f 1) == "text" ]]; then
-      fx-print-command-help "${cmd_path}"
-    else
-      echo "No help found. Try \`fx ${cmd} -h\`"
-    fi
-  fi
-}
-
-function usage {
-  cat <<END
-usage: fx [--config CONFIG_FILE | --dir BUILD_DIR] [-d DEVICE_NAME] [-i] [-x] COMMAND [...]
-
-Run Fuchsia development commands. Must be run with either a current working
-directory that is contained in a Fuchsia source tree or the FUCHSIA_DIR
-environment variable set to the root of a Fuchsia source tree.
-
-commands:
-$(help)
-
-optional arguments:
-  --config=CONFIG_FILE  Path to the config file to use when running COMMAND.
-                        Defaults to FUCHSIA_CONFIG if set in the
-                        environment and //.config otherwise.  The config
-                        file determines which build directory (and therefore
-                        build configuration) is used by COMMAND.
-  --dir=BUILD_DIR       Path to the build directory to use when running COMMAND.
-                        If specified, CONFIG_FILE is ignored.
-  -d=DEVICE_NAME        Target a specific device. DEVICE_NAME may be a Fuchsia
-                        device name. Note: "fx set-device" can be used to set a
-                        default DEVICE_NAME for a BUILD_DIR.
-  -i                    Iterative mode.  Repeat the command whenever a file is
-                        modified under your Fuchsia directory, not including
-                        out/.
-  -x                    Print commands and their arguments as they are executed.
-
-optional shell extensions:
-  fx-go
-  fx-update-path
-  fx-set-prompt
-
-To use these shell extensions, first source fx-env.sh into your shell:
-
-  $ source scripts/fx-env.sh
-
-END
-}
-
-buildtools_whitelist="gn ninja"
-
-fuchsia_dir="${FUCHSIA_DIR}"
-if [[ -z "${fuchsia_dir}" ]]; then
-  # We walk the parent directories looking for .jiri_root rather than using
-  # BASH_SOURCE so that we find the fuchsia_dir enclosing the current working
-  # directory instead of the one containing this file in case the user has
-  # multiple Fuchsia source trees and is picking up this file from another one.
-  fuchsia_dir="$(pwd)"
-  while [[ ! -d "${fuchsia_dir}/.jiri_root" ]]; do
-    fuchsia_dir="$(dirname "${fuchsia_dir}")"
-    if [[ "${fuchsia_dir}" == "/" ]]; then
-      echo >& 2 "error: Cannot find Fuchsia source tree containing $(pwd)"
-      exit 1
-    fi
-  done
-fi
-
-declare -r vars_sh="${fuchsia_dir}/scripts/devshell/lib/vars.sh"
-source "${vars_sh}" || exit $?
-
-while [[ $# -ne 0 ]]; do
-  case $1 in
-    --config=*|--dir=*|-d=*)
-      # Turn --switch=value into --switch value.
-      arg="$1"
-      shift
-      set -- "${arg%%=*}" "${arg#*=}" "$@"
-      continue
-      ;;
-    --config)
-      if [[ $# -lt 2 ]]; then
-        usage
-        echo >&2 "ERROR: Missing path to config file for --config argument"
-        exit 1
-      fi
-      shift # Removes --config.
-      export FUCHSIA_CONFIG="$1"
-      ;;
-    --dir)
-      if [[ $# -lt 2 ]]; then
-        usage
-        echo >&2 "ERROR: Missing path to build directory for --dir argument"
-        exit 1
-      fi
-      shift # Removes --dir.
-      export FUCHSIA_BUILD_DIR="$1"
-      if [[ "$FUCHSIA_BUILD_DIR" == //* ]]; then
-        FUCHSIA_BUILD_DIR="${fuchsia_dir}/${FUCHSIA_BUILD_DIR#//}"
-      fi
-      if [[ ! -d "$FUCHSIA_BUILD_DIR" ]]; then
-        echo >&2 "ERROR: Build directory $FUCHSIA_BUILD_DIR does not exist"
-        exit 1
-      fi
-      # This tells fx-config-read not to use the file.
-      export FUCHSIA_CONFIG=-
-      ;;
-    -d)
-      if [[ $# -lt 2 ]]; then
-        usage
-        echo >&2 "ERROR: Missing device name for -d argument"
-        exit 1
-      fi
-      shift # removes -d
-      export FUCHSIA_DEVICE_NAME="$1"
-      ;;
-    -i)
-      declare iterative=1
-      ;;
-    -x)
-      export FUCHSIA_DEVSHELL_VERBOSITY=1
-      ;;
-    --)
-      shift
-      break
-      ;;
-    help)
-      if [[ $# -gt 1 ]]; then
-        shift
-        help "$@"
-        exit
-      else
-        usage
-        exit 1
-      fi
-      ;;
-    -*)
-      usage
-      echo >&2 "error: Unknown global argument $1"
-      exit 1
-      ;;
-    *)
-      break
-      ;;
-  esac
-  shift
-done
-
-if [[ $# -lt 1 ]]; then
-  usage
-  echo >&2 "error: Missing command name"
-  exit 1
-fi
-
-command_name="$1"
-command_path="$(find_command ${command_name})"
-
-if [[ $? -ne 0 ]]; then
-  usage
-  echo >&2 "error: Unknown command ${command_name}"
-  exit 1
-fi
-
-declare -r cmd_and_args="$@"
-shift # Removes the command name.
-
-"${command_path}" "$@"
-declare -r retval=$?
-if [ -z "${iterative}" ]; then
-  exit ${retval}
-elif which inotifywait >/dev/null; then
-  # Watch everything except out/ and files/directories beginning with "."
-  # such as lock files, swap files, .git, etc.
-  inotifywait -qrme modify --exclude "(/\.|lock|compile_commands.json)" "${fuchsia_dir}" @"${fuchsia_dir}"/out @"${fuchsia_dir}"/zircon/public | while read; do
-    # Drain all subsequent events in a batch.
-    # Otherwise, when multiple files are changed at once, we'd run multiple
-    # times.
-    read -d "" -t .01
-    # Allow at most one fx -i invocation per Fuchsia dir at a time.
-    # Otherwise multiple concurrent fx -i invocations can trigger each other
-    # and cause a storm.
-    echo "---------------------------------- fx -i ${cmd_and_args} ---------------------------------------"
-    "${command_path}" "$@"
-    echo "--- Done!"
-  done
-elif which apt-get >/dev/null; then
-  echo "Missing inotifywait"
-  echo "Try: sudo apt-get install inotify-tools"
-elif which fswatch >/dev/null; then
-  fswatch --one-per-batch --event=Updated -e "${fuchsia_dir}"/out/ -e "${fuchsia_dir}"/zircon/public/ -e "/\." -e "lock" -e "/compile_commands.json" . | while read; do
-    echo "---------------------------------- fx -i ${cmd_and_args} ---------------------------------------"
-    "${command_path}" "$@"
-    echo "--- Done!"
-  done
-else
-  echo "Missing fswatch"
-  echo "Try: brew install fswatch"
-fi
diff --git a/fx-env.sh b/fx-env.sh
deleted file mode 100755
index e5eb91c..0000000
--- a/fx-env.sh
+++ /dev/null
@@ -1,156 +0,0 @@
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -n "${ZSH_VERSION}" ]]; then
-  source "$(cd "$(dirname "${(%):-%x}")" >/dev/null 2>&1 && pwd)"/devshell/lib/vars.sh || return $?
-else
-  source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/devshell/lib/vars.sh || return $?
-fi
-
-# __patched_path <old-regex> <new-component>
-# Prints a new path value based on the current $PATH, removing path components
-# that match <old-regex> and adding <new-component> to the end.
-function __patched_path {
-    local old_regex="$1"
-    local new_component="$2"
-    local stripped
-    # Put each PATH component on a line, delete any lines that match the regex,
-    # then glue back together with ':' characters.
-    stripped="$(
-        set -o pipefail &&
-        echo "${PATH}" |
-        tr ':' '\n' |
-        grep -v -E "^${old_regex}$" |
-        tr '\n' ':'
-    )"
-    # The trailing newline will have become a colon, so no need to add another
-    # one here.
-    echo "${stripped}${new_component}"
-}
-
-### fx-update-path: add useful tools to the PATH
-
-# Add tools to path, removing prior tools directory if any. This also
-# matches the Zircon tools directory added by zset, so add it back too.
-function fx-update-path {
-  local rust_dir="$(source "${FUCHSIA_DIR}/buildtools/vars.sh" && echo -n "${BUILDTOOLS_RUST_DIR}/bin")"
-
-  local build_dir="$(fx-config-read; echo "${FUCHSIA_BUILD_DIR}")"
-
-  local tools_dirs="${ZIRCON_TOOLS_DIR}"
-  if [[ -n "${build_dir}" ]]; then
-    tools_dirs="${build_dir}/tools:${tools_dirs}"
-  fi
-
-  export PATH="$(__patched_path \
-      "${FUCHSIA_OUT_DIR}/[^/]*-[^/]*/tools" \
-      "${tools_dirs}"
-  )"
-
-  export PATH="$(__patched_path "${rust_dir}" "${rust_dir}")"
-}
-
-### fx-prompt-info: prints the current configuration for display in a prompt
-
-function fx-prompt-info {
-  # Run fx-config-read in a subshell to avoid polluting this shell's environment
-  # with data from the config file, which can change without updating this
-  # shell's environment.
-  (
-    fx-config-read
-    echo "${FUCHSIA_BUILD_DIR##*/}"
-  )
-}
-
-### fx-set-prompt: displays the current configuration in the prompt
-
-function fx-set-prompt {
-  if [[ -n "${ZSH_VERSION}" ]]; then
-    autoload -Uz colors && colors
-    setopt PROMPT_SUBST
-    export PS1='%B%F{yellow}[$(fx-prompt-info)] %B%F{blue}%m:%~%#%f%b '
-    export PS2='%B%F{blue}>%f%b '
-  else
-    export PS1='\[\e[0;1;33m\][$(fx-prompt-info)] \[\e[34m\]\h:\w\\$\[\e[0m\] '
-    export PS2='\[\e[0;1;34m\]>\[\e[0m\] '
-  fi
-}
-
-### fd: navigate to directories with autocomplete
-
-# $ fd --help   # for usage.
-function fd {
-  local fd_python
-  local dest
-  fd_python="${FUCHSIA_DIR}/scripts/fd.py"
-  dest=$(eval ${fd_python} "$@")
-  cd -- "${dest}"
-}
-
-if [[ -z "${ZSH_VERSION}" ]]; then
-  function __fd {
-    local cur
-    COMPREPLY=()
-    cur="${COMP_WORDS[COMP_CWORD]}"
-    if [[ ${cur:0:2} == "//" ]]; then
-      COMPREPLY=($(/bin/ls -dp1 ${FUCHSIA_DIR}/${cur}* 2>/dev/null | \
-        sed -n "s|^${FUCHSIA_DIR}/\(.*/\)\$|\1|p" | xargs echo))
-    else
-      COMPREPLY=($(/bin/ls -dp1 ${cur}* 2>/dev/null | grep "/$" | xargs echo))
-    fi
-  }
-  complete -o nospace -F __fd fd
-fi
-
-### fx-go: alias of fd, for backward compatibility
-
-function fx-go {
-  echo "fx-go is to be deprecated in Q1 2018 in favor of 'fd'. For help, run: fd --help"
-  fd "$@"
-}
-
-
-# Support command-line auto-completions for the fx command.
-if [[ -z "${ZSH_VERSION}" ]]; then
-  function __fx_complete_cmd {
-    local cmd cur prev
-    cmd="${COMP_WORDS[1]}"
-    cur="${COMP_WORDS[COMP_CWORD]}"
-    prev="${COMP_WORDS[COMP_CWORD-1]}"
-
-    case "${cmd}" in
-      set)
-        if [[ ${COMP_CWORD} -eq 2 ]]; then
-          COMPREPLY=($(compgen -W "x64 arm64" "${cur}"))
-          return
-        fi
-        case "${prev}" in
-          --packages)
-            COMPREPLY=($(/bin/ls -dp1 ${FUCHSIA_DIR}/*/packages/${cur}* 2>/dev/null | \
-              sed -n "s|^${FUCHSIA_DIR}/\(.*\)\$|\1|p" | xargs echo))
-            return
-            ;;
-        esac
-        ;;
-
-      set-petal)
-        if [[ ${COMP_CWORD} -eq 2 ]]; then
-          COMPREPLY=($(compgen -W "garnet peridot topaz" "${cur}"))
-          return
-        fi
-        ;;
-    esac
-  }
-
-  function __fx {
-    COMPREPLY=()
-    if [[ ${COMP_CWORD} -eq 1 ]]; then
-      COMPREPLY=($(/bin/ls -dp1 ${FUCHSIA_DIR}/scripts/devshell/${COMP_WORDS[1]}* 2>/dev/null | \
-        sed -n "s|^${FUCHSIA_DIR}/scripts/devshell/\([^/]*\)\$|\1|p" | xargs echo))
-    else
-      __fx_complete_cmd
-    fi
-  }
-  complete -o default -F __fx fx
-fi
diff --git a/fx-wrapper b/fx-wrapper
deleted file mode 100755
index 749a6d9..0000000
--- a/fx-wrapper
+++ /dev/null
@@ -1,11 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# This script is a wrapper around the fx script that allows it to be run from
-# a current working directory outside the fuchsia source tree.
-
-export FUCHSIA_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." >/dev/null 2>&1 && pwd)"
-
-exec ${FUCHSIA_DIR}/scripts/fx "$@"
diff --git a/gce/README.md b/gce/README.md
deleted file mode 100644
index 8b32522..0000000
--- a/gce/README.md
+++ /dev/null
@@ -1,68 +0,0 @@
-# Fuchsia GCE Scripts
-
-The scripts in this directory are primarily convenience wrappers around two
-tools: `make-fuchsia-vol`, a host tool in the build, and `gcloud`, the
-command-line client for Google Cloud.
-
-The scripts are usable via `//scripts/gce/gce` as an executable bash script.
-
-## Prerequisites
-
- * You have followed the Fuchsia getting started documents: https://fuchsia.googlesource.com/docs/+/master/getting_started.md
- * You have installed & configured gcloud(1): https://cloud.google.com/sdk/gcloud/
- * You have set up defaults for gcloud, e.g.:
-```
-gcloud auth login
-gcloud config set project my-awesome-cloud
-gcloud config set compute/region us-west1
-gcloud config set compute/zone us-west1-a
-```
-
-## Building and running an image
-
-The following incantation will create the relevant disk images and boot a
-Fuchsia instance:
-
-```
-cd $FUCHSIA_ROOT
-fx set x64 --release
-fx full-build
-fx gce create-fuchsia-image
-fx gce create-instance
-sleep 60
-fx gce serial
-fx gce delete-instance
-```
-
-## How the sausage is made
-
-### gce/env.sh
-
-The gce script suite comes with an `env.sh` that sets default environment
-variables used by gce subcommands. The `gce` entrypoint script sources this.
-Users can override any of the variables set in `env.sh`. For example, you can
-alter the name of the GCE instance you create/delete by setting
-`$FUCHSIA_GCE_INSTANCE`. See the contents of the script for more options.
-
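The override mechanism is plain shell default-expansion. A minimal sketch of the pattern (variable names match `env.sh`; the values here are illustrative):

```shell
#!/bin/bash
# env.sh-style defaults: ${VAR:-default} keeps a caller-provided value and
# falls back otherwise, so exporting FUCHSIA_GCE_INSTANCE before running
# `fx gce` changes the instance name that create/delete operate on.
FUCHSIA_GCE_USER="${FUCHSIA_GCE_USER:-$USER}"
FUCHSIA_GCE_INSTANCE="${FUCHSIA_GCE_INSTANCE:-$FUCHSIA_GCE_USER-fuchsia}"
echo "instance: $FUCHSIA_GCE_INSTANCE"
```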
-### Fuchsia volumes
-
-The Fuchsia disk images are built as standard x86-64 EFI bootable GPT volumes.
-
-### Command execution
-
-The `gce` entrypoint script sources `env.sh`, then looks for a command script in
-the `gce` scripts directory that matches the first argument given to it. If it
-finds one, it shifts the first argument and execs that script.
-
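The dispatch scheme described above can be sketched as follows (the function and throwaway script here are hypothetical; the real entrypoint uses `exec` rather than running the command in a child process):

```shell
#!/bin/bash
# Dispatch sketch: the first argument names a sibling <cmd>.sh script,
# which receives the remaining arguments.
dispatch() {
  local dir="$1" cmd="$2"
  shift 2
  if [[ ! -f "$dir/$cmd.sh" ]]; then
    echo "usage: gce [command]" >&2
    return 1
  fi
  bash "$dir/$cmd.sh" "$@"   # the real entrypoint execs here
}

# Demo with a throwaway command script:
tmp="$(mktemp -d)"
printf '#!/bin/bash\necho "serial to $1"\n' > "$tmp/serial.sh"
dispatch "$tmp" serial my-instance   # prints: serial to my-instance
rm -rf "$tmp"
```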
-## Commands
-
- * make-fuchsia-image - builds a local image file containing your local fuchsia
-   build.
- * create-fuchsia-image - builds and uploads a fuchsia image to GCE.
- * serial - attaches to the instance serial port. Note that you will need to
-   have an appropriately configured compute engine ssh key for this to work. If
-   you have a more tuned ssh configuration, you may need to add
-   `~/.ssh/google_compute_engine` to your `ssh-agent`.
- * create-instance - create a GCE instance running fuchsia based on the most
-   recently created fuchsia image.
- * delete-instance - deletes a GCE instance created by `create-instance`.
diff --git a/gce/create-fuchsia-image.sh b/gce/create-fuchsia-image.sh
deleted file mode 100755
index 0831426..0000000
--- a/gce/create-fuchsia-image.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -z $FUCHSIA_GCE_PROJECT ]]; then
-  source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/env.sh
-fi
-
-$FUCHSIA_DIR/scripts/gce/gce make-fuchsia-image "$@" || exit 1
-
-diskimage="$FUCHSIA_OUT_DIR/$FUCHSIA_GCE_IMAGE.img"
-
-tmp="$(mktemp -d)"
-if [[ ! -d $tmp ]]; then
-  echo "mktemp failed" >&2
-  exit 1
-fi
-trap "rm -rf '$tmp'" EXIT
-
-cd "$tmp"
-mv "$diskimage" disk.raw
-
-tar -Scf "$FUCHSIA_OUT_DIR/$FUCHSIA_GCE_IMAGE.tar" disk.raw
-if [ -x "$(command -v pigz)" ]; then
-  pigz -f "$FUCHSIA_OUT_DIR/$FUCHSIA_GCE_IMAGE.tar"
-else
-  gzip -f "$FUCHSIA_OUT_DIR/$FUCHSIA_GCE_IMAGE.tar"
-fi
-gsutil cp "$FUCHSIA_OUT_DIR/$FUCHSIA_GCE_IMAGE.tar.gz" "gs://$FUCHSIA_GCE_PROJECT/$FUCHSIA_GCE_USER/$FUCHSIA_GCE_IMAGE.tar.gz"
-gcloud -q compute images delete "$FUCHSIA_GCE_IMAGE"
-gcloud -q compute images create "$FUCHSIA_GCE_IMAGE" --source-uri "gs://$FUCHSIA_GCE_PROJECT/$FUCHSIA_GCE_USER/$FUCHSIA_GCE_IMAGE.tar.gz" --guest-os-features=UEFI_COMPATIBLE
diff --git a/gce/create-instance.sh b/gce/create-instance.sh
deleted file mode 100755
index 9ead3d2..0000000
--- a/gce/create-instance.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -z $FUCHSIA_GCE_PROJECT ]]; then
-  source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/env.sh
-fi
-
-# gcloud -q compute disks create "$FUCHSIA_GCE_DISK" --guest-os-features=UEFI_COMPATIBLE --image "$FUCHSIA_GCE_IMAGE" || exit
-# gcloud -q compute instances create "$FUCHSIA_GCE_INSTANCE" --metadata=serial-port-enable=1 --disk=auto-delete=yes,boot=yes,mode=rw,name="${FUCHSIA_GCE_DISK}" || exit
-gcloud -q compute instances create "$FUCHSIA_GCE_INSTANCE" --metadata=serial-port-enable=1 --image "${FUCHSIA_GCE_IMAGE}" || exit $?
diff --git a/gce/delete-instance.sh b/gce/delete-instance.sh
deleted file mode 100755
index eee7041..0000000
--- a/gce/delete-instance.sh
+++ /dev/null
@@ -1,10 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -z $FUCHSIA_GCE_PROJECT ]]; then
-  source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/env.sh
-fi
-
-gcloud compute instances delete $FUCHSIA_GCE_INSTANCE
diff --git a/gce/efi-cmdline.txt b/gce/efi-cmdline.txt
deleted file mode 100644
index 8aa533b..0000000
--- a/gce/efi-cmdline.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-bootloader.timeout=0
-bootloader.default=local
\ No newline at end of file
diff --git a/gce/env.sh b/gce/env.sh
deleted file mode 100755
index b4ad610..0000000
--- a/gce/env.sh
+++ /dev/null
@@ -1,38 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/../devshell/lib/vars.sh || exit $?
-fx-config-read
-
-get-gcloud-config() {
-  gcloud -q config get-value "$@" 2>/dev/null
-}
-
-makefile() {
-  local size=$1
-  local path=$2
-  case $(uname) in
-    Linux)
-      fallocate -l "$size" "$path"
-      ;;
-    Darwin)
-      mkfile -n "$size" "$path"
-      ;;
-    *)
-      echo "Unsupported platform $(uname)" >&2
-      exit 1
-      ;;
-  esac
-}
-
-FUCHSIA_GCE_PROJECT=${FUCHSIA_GCE_PROJECT:-$(get-gcloud-config project)}
-FUCHSIA_GCE_ZONE=${FUCHSIA_GCE_ZONE:-$(get-gcloud-config compute/zone)}
-FUCHSIA_GCE_USER=${FUCHSIA_GCE_USER:-"$USER"}
-FUCHSIA_GCE_INSTANCE=${FUCHSIA_GCE_INSTANCE:-"$FUCHSIA_GCE_USER-fuchsia"}
-FUCHSIA_GCE_IMAGE=${FUCHSIA_GCE_IMAGE:-"$FUCHSIA_GCE_INSTANCE-img"}
-FUCHSIA_GCE_DISK=${FUCHSIA_GCE_DISK:-"$FUCHSIA_GCE_INSTANCE-disk"}
-
-[[ -n $FUCHSIA_GCE_PROJECT ]] || (echo "Set a default gcloud config for project or set \$FUCHSIA_GCE_PROJECT" >&2 && exit 1)
-[[ -n $FUCHSIA_GCE_ZONE ]] || (echo "Set a default gcloud config for compute zone or set \$FUCHSIA_GCE_ZONE" >&2 && exit 1)
diff --git a/gce/gce b/gce/gce
deleted file mode 100755
index 0589fa1..0000000
--- a/gce/gce
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-mydir="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-if [[ -z $FUCHSIA_GCE_PROJECT ]]; then
-  source $mydir/env.sh
-fi
-
-usage() {
-	echo -n "$(basename $0) ["
-	(
-	cd $mydir;
-	ls *.sh | grep -v env.sh | cut -d . -f 1
-	) | tr "\n" "|"
-	echo "]"
-}
-
-if [[ $# = 0 ]] || [[ "help" = $1 ]]; then
-	usage
-	exit 1
-fi
-
-cmd=$1
-shift
-if [[ ! -f $mydir/$cmd.sh ]]; then
-	usage
-	exit 1
-fi
-
-exec $mydir/$cmd.sh "$@"
diff --git a/gce/kernel-cmdline.txt b/gce/kernel-cmdline.txt
deleted file mode 100644
index cd6220d..0000000
--- a/gce/kernel-cmdline.txt
+++ /dev/null
@@ -1 +0,0 @@
-kernel.serial=legacy
\ No newline at end of file
diff --git a/gce/make-fuchsia-image.sh b/gce/make-fuchsia-image.sh
deleted file mode 100755
index cf8a482..0000000
--- a/gce/make-fuchsia-image.sh
+++ /dev/null
@@ -1,22 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -z $FUCHSIA_GCE_PROJECT ]]; then
-  source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/env.sh
-fi
-
-mfv=$FUCHSIA_BUILD_DIR/tools/make-fuchsia-vol
-
-if [[ ! -x $mfv ]]; then
-	echo "You need to build the 'make-fuchsia-vol' package" >&2
-	exit 1
-fi
-
-diskimage="$FUCHSIA_OUT_DIR/$FUCHSIA_GCE_IMAGE.img"
-
-# TODO(raggi): look at size that sys part needs to be and use that.
-makefile 10g "$diskimage"
-
-$mfv "$@" "$diskimage" || exit 1
diff --git a/gce/serial.sh b/gce/serial.sh
deleted file mode 100755
index 3f40d90..0000000
--- a/gce/serial.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-if [[ -z $FUCHSIA_GCE_PROJECT ]]; then
-  source "$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"/env.sh
-fi
-
-# ctlpath frequently gets too long here, so instead, just ssh without it.
-# was: gcloud compute connect-to-serial-port $instance
-ssh -S none -p 9600 $FUCHSIA_GCE_PROJECT.$FUCHSIA_GCE_ZONE.$FUCHSIA_GCE_INSTANCE.$FUCHSIA_GCE_USER@ssh-serialport.googleapis.com
diff --git a/gdb/__init__.py b/gdb/__init__.py
deleted file mode 100644
index fa81ada..0000000
--- a/gdb/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-# empty file
diff --git a/gdb/build-gdb.sh b/gdb/build-gdb.sh
deleted file mode 100755
index 170879b..0000000
--- a/gdb/build-gdb.sh
+++ /dev/null
@@ -1,109 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-set -eo pipefail
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")"/.. >/dev/null 2>&1 && pwd)"
-
-# N.B. This must be an absolute path.
-readonly ROOT_DIR="$(dirname "${SCRIPT_DIR}")"
-
-readonly HOST_ARCH=$(uname -m)
-readonly HOST_OS=$(uname | tr '[:upper:]' '[:lower:]')
-readonly HOST_TRIPLE="${HOST_ARCH}-${HOST_OS}"
-
-if [[ "x${HOST_OS}" == "xlinux" ]]; then
-  readonly DEFAULT_JOBS=$(grep ^processor /proc/cpuinfo | wc -l)
-elif [[ "x${HOST_OS}" == "xdarwin" ]]; then
-  readonly DEFAULT_JOBS=$(sysctl -n hw.ncpu)
-else
-  echo "Unsupported system: ${HOST_OS}" 1>&2
-  exit 1
-fi
-
-[[ "${TRACE}" ]] && set -x
-
-usage() {
-  printf >&2 '%s: [-c] [-o outdir] [-d destdir] [-j jobs]\n' "$0"
-  echo >&2 "-c:         clean the build directories first"
-  echo >&2 "-o outdir:  build the tools here"
-  echo >&2 "-d destdir: install the tools here"
-  echo >&2 "-j jobs:    passed to make"
-  exit 1
-}
-
-build() {
-  local outdir="$1" destdir="$2" clean="$3" jobs="$4"
-  # This is where gdb will be installed.
-  # We don't pass it to --prefix however so as to not encode
-  # any path info in the install. Instead we pass --prefix=/
-  # and use DESTDIR=${prefix} during the make install.
-  local prefix="${destdir}/${HOST_TRIPLE}/gdb"
-  local builddir="${outdir}/build-gdb-${HOST_TRIPLE}"
-
-  if [[ "${clean}" = "true" ]]; then
-    rm -rf -- "${builddir}"
-  fi
-
-  rm -rf -- "${prefix}"
-
-  mkdir -p -- "${builddir}"
-  pushd "${builddir}"
-  # TODOs:
-  # Better separate debug dir?
-  # Require python? (instead of only using it if found)
-  # Require expat? (instead of only using it if found)
-  # Augment/change auto-load directories?
-  config_prefix="/"
-  # The // is a hack to preserve relocatability which doesn't handle prefix=/.
-  # The specified value is the default, but given that we specify a
-  # system.gdbinit file we set it explicitly to document the relationship.
-  # Specifying prefix=/ is already a hack, so this is coping with that hack.
-  config_datadir="//share/gdb"
-  [[ -f "${builddir}/Makefile" ]] || ${ROOT_DIR}/third_party/gdb/configure \
-    --prefix="$config_prefix" \
-    --enable-targets=arm-elf,aarch64-elf,aarch64-fuchsia,x86_64-elf,x86_64-fuchsia \
-    --disable-werror \
-    --disable-nls \
-    --with-gdb-datadir="$config_datadir" \
-    --with-system-gdbinit="$config_datadir/system-gdbinit/fuchsia.py"
-  make -j "${jobs}" all-gdb
-  make -j "${jobs}" install-gdb DESTDIR="${prefix}"
-  popd
-
-  local stamp="$(LC_ALL=POSIX cat $(find "${prefix}" -type f | sort) | shasum -a1  | awk '{print $1}')"
-  echo "${stamp}" > "${prefix}/.stamp"
-}
-
-declare CLEAN="${CLEAN:-false}"
-declare OUTDIR="${OUTDIR:-${ROOT_DIR}/out}"
-declare DESTDIR="${DESTDIR:-${OUTDIR}/toolchain}"
-declare JOBS="${DEFAULT_JOBS}"
-
-while getopts "cd:j:o:" opt; do
-  case "${opt}" in
-    c) CLEAN="true" ;;
-    d) DESTDIR="${OPTARG}" ;;
-    j) JOBS="${OPTARG}" ;;
-    o) OUTDIR="${OPTARG}" ;;
-    *) usage;;
-  esac
-done
-
-absolute_path() {
-  local -r path="$1"
-  case "$path" in
-    /*) echo "$path" ;;
-    *) echo "$(pwd)/$path" ;;
-  esac
-}
-
-# These must be absolute paths.
-OUTDIR=$(absolute_path "${OUTDIR}")
-DESTDIR=$(absolute_path "${DESTDIR}")
-
-readonly CLEAN OUTDIR DESTDIR JOBS
-
-build "${OUTDIR}" "${DESTDIR}" "${CLEAN}" "${JOBS}"
diff --git a/gdb/fuchsia/__init__.py b/gdb/fuchsia/__init__.py
deleted file mode 100644
index fa81ada..0000000
--- a/gdb/fuchsia/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-# empty file
diff --git a/gdb/fuchsia/gdb/__init__.py b/gdb/fuchsia/gdb/__init__.py
deleted file mode 100644
index 0e76a30..0000000
--- a/gdb/fuchsia/gdb/__init__.py
+++ /dev/null
@@ -1,287 +0,0 @@
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Initialize gdb for debugging Fuchsia code."""
-
-import gdb
-import glob
-import os
-
-
-# Extract the GDB version number so scripts can easily examine it.
-# We export four variables:
-# GDB_MAJOR_VERSION, GDB_MINOR_VERSION, GDB_PATCH_VERSION, GDB_GOOGLE_VERSION.
-# The first three are standard, e.g. 7.10.1.
-# GDB_GOOGLE_VERSION is the N in "-gfN" Fuchsia releases.
-# A value of zero means this isn't a Google Fuchsia gdb.
-_GDB_DASH_VERSION = gdb.VERSION.split("-")
-_GDB_DOT_VERSION = _GDB_DASH_VERSION[0].split(".")
-GDB_MAJOR_VERSION = int(_GDB_DOT_VERSION[0])
-GDB_MINOR_VERSION = int(_GDB_DOT_VERSION[1])
-if len(_GDB_DOT_VERSION) >= 3:
-    GDB_PATCH_VERSION = int(_GDB_DOT_VERSION[2])
-else:
-    GDB_PATCH_VERSION = 0
-GDB_GOOGLE_VERSION = 0
-if len(_GDB_DASH_VERSION) >= 2:
-    if _GDB_DASH_VERSION[1].startswith("gf"):
-        try:
-            GDB_GOOGLE_VERSION = int(_GDB_DASH_VERSION[1][2:])
-        except ValueError:
-            pass
-
-# The top level zircon build directory
-_TOP_ZIRCON_BUILD_DIR = "out/build-zircon"
-
-# Prefix of zircon build directories within _TOP_ZIRCON_BUILD_DIR.
-_ZIRCON_BUILD_SUBDIR_PREFIX = "build"
-
-# True if fuchsia support has been initialized.
-_INITIALIZED_FUCHSIA_SUPPORT = False
-
-# The prefix for fuchsia commands.
-_FUCHSIA_COMMAND_PREFIX = "fuchsia"
-
-
-class _FuchsiaPrefix(gdb.Command):
-    """Prefix command for Fuchsia-specific commands."""
-
-    def __init__(self):
-        super(_FuchsiaPrefix, self).__init__(
-            _FUCHSIA_COMMAND_PREFIX, gdb.COMMAND_USER, prefix=True)
-
-
-class _FuchsiaSetPrefix(gdb.Command):
-    """Prefix "set" command for Fuchsia parameters."""
-
-    def __init__(self):
-        super(_FuchsiaSetPrefix, self).__init__(
-            "set %s" % (_FUCHSIA_COMMAND_PREFIX), gdb.COMMAND_USER,
-            prefix=True)
-
-
-class _FuchsiaShowPrefix(gdb.Command):
-    """Prefix "show" command for Fuchsia parameters."""
-
-    def __init__(self):
-        super(_FuchsiaShowPrefix, self).__init__(
-            "show %s" % (_FUCHSIA_COMMAND_PREFIX), gdb.COMMAND_USER,
-            prefix=True)
-
-    def invoke(self, from_tty):
-        # TODO(dje): Show all the parameters, a la cmd_show_list.
-        pass
-
-
-class _FuchsiaVerbosity(gdb.Parameter):
-    """Verbosity for Fuchsia gdb support.
-
-    There are four levels of verbosity:
-    0 = off
-    1 = minimal
-    2 = what a typical user might want to see
-    3 = everything, intended for maintainers only
-    """
-    # Note: While not every verbosity level is exercised today, these levels
-    # are convention in Google's internal gdb.
-
-    set_doc = "Set level of Fuchsia verbosity."
-    show_doc = "Show level of Fuchsia verbosity."
-
-    def __init__(self):
-        super(_FuchsiaVerbosity, self).__init__(
-            "%s verbosity" % (_FUCHSIA_COMMAND_PREFIX),
-            gdb.COMMAND_FILES, gdb.PARAM_ZINTEGER)
-        # Default to basic informational messages to help users know
-        # what's going on.
-        self.value = 1
-
-    def get_show_string(self, pvalue):
-        return "Fuchsia verbosity is " + pvalue + "."
-
-    def get_set_string(self):
-        # Ugh.  There doesn't seem to be a way to implement a gdb parameter in
-        # Python that will be silent when the user changes the value.
-        return "Fuchsia verbosity has been set to %d." % (self.value)
-
-
-def _IsFuchsiaFile(objfile):
-    """Return True if objfile is a Fuchsia file."""
-    # TODO(dje): Not sure how to effectively achieve this.
-    # Assume we're always debugging a Fuchsia program for now.
-    # If the user wants to debug native programs, s/he can use native gdb.
-    return True
-
-
-def _ClearObjfilesHandler(event):
-    """Reset debug information tracking when all objfiles are unloaded."""
-    event.progspace.seen_exec = False
-
-
-def _FindSysroot(arch):
-    """Return the path to the sysroot for arch."""
-    if arch == "x64":
-        suffix = "-x64"
-    elif arch == "arm64":
-        suffix = "-arm64"
-    else:
-        assert(False)
-    print ("TRYING: %s/%s*%s" % (
-        _TOP_ZIRCON_BUILD_DIR, _ZIRCON_BUILD_SUBDIR_PREFIX, suffix))
-    for filename in glob.iglob("%s/%s*%s" % (
-            _TOP_ZIRCON_BUILD_DIR, _ZIRCON_BUILD_SUBDIR_PREFIX, suffix)):
-        return "%s/sysroot" % (filename)
-    return None
-
-
-def _NewObjfileHandler(event):
-    """Handle new objfiles being loaded."""
-    # TODO(dje): Use this hook to automagically fetch debug info.
-
-    verbosity = gdb.parameter(
-        "%s verbosity" % (_FUCHSIA_COMMAND_PREFIX))
-    if verbosity >= 3:
-        print "Hi, I'm the new_objfile event handler."
-
-    objfile = event.new_objfile
-    progspace = objfile.progspace
-    # Assume the first objfile we see is the main executable.
-    # There's nothing else we can do at this point.
-    seen_exec = hasattr(progspace, "seen_exec") and progspace.seen_exec
-
-    # Early exit if nothing to do.
-    # We don't handle multiple arches so we KISS.
-    if seen_exec:
-        if verbosity >= 3:
-            print "Already seen exec, ignoring: %s" % (
-                os.path.basename(objfile.username))
-        return
-    progspace.seen_exec = True
-
-    filename = objfile.username
-    basename = os.path.basename(filename)
-    if objfile.owner is not None:
-        if verbosity >= 3:
-            print "Separate debug file, ignoring: %s" % (basename)
-        return
-
-    # If we're debugging a native executable, unset the solib search path.
-    if not _IsFuchsiaFile(objfile):
-        if verbosity >= 3:
-            print "Debugging non-Fuchsia file: %s" % (basename)
-        print "Note: Unsetting solib-search-path."
-        gdb.execute("set solib-search-path")
-        return
-
-    # The sysroot to use is dependent on the architecture of the program.
-    # This is needed to find ld.so debug info.
-    # TODO(dje): IWBN to not need ld.so debug info.
-    # TODO(dje): IWBN if objfiles exposed their arch field.
-    arch_string = gdb.execute("show arch", to_string=True)
-    if arch_string.find("arm64") >= 0:
-        # Alas there are different directories for different arm64 builds
-        # (qemu, rpi3, etc.). Pick something, hopefully this can go away soon.
-        sysroot_dir = _FindSysroot("arm64")
-    elif arch_string.find("x64") >= 0:
-        sysroot_dir = _FindSysroot("x64")
-    else:
-        print "WARNING: unsupported architecture\n%s" % (arch_string)
-        return
-
-    # TODO(dje): We can't use sysroot to find ld.so.1 because it doesn't
-    # have a path on Fuchsia. Plus files in Fuchsia are intended to be
-    # "ephemeral" by nature. So we punt on setting sysroot for now, even
-    # though IWBN if we could use it.
-    if sysroot_dir:
-        solib_search_path = "%s/debug" % (sysroot_dir)
-        print "Note: Setting solib-search-path to %s" % (solib_search_path)
-        gdb.execute("set solib-search-path %s" % (solib_search_path))
-    else:
-        print "WARNING: could not find sysroot directory"
-
-
-def _InitializeFuchsiaObjfileTracking():
-    # We *need* solib-search-path set so that we can find debug info for
-    # ld.so.1. Otherwise it's game over for a usable debug session:
-    # We won't be able to set a breakpoint at the dynamic linker breakpoint
-    # and we won't be able to relocate the program (all Fuchsia executables
-    # are PIE). However, we don't necessarily know which architecture we're
-    # debugging yet so we don't know which directory to set the search path
-    # to. To solve this we hook into the "new objfile" event.
-    # This event can also let us automagically fetch debug info for files
-    # as they're loaded (TODO(dje)).
-    gdb.events.clear_objfiles.connect(_ClearObjfilesHandler)
-    gdb.events.new_objfile.connect(_NewObjfileHandler)
-
-
-class _SetFuchsiaDefaults(gdb.Command):
-    """Set GDB parameters to values useful for Fuchsia code.
-
-    Usage: set-fuchsia-defaults
-
-    These changes are made:
-      set non-stop on
-      set target-async on
-      set remotetimeout 10
-      set sysroot # (set to empty path)
-
-    Fuchsia gdbserver currently supports non-stop only (and even that support
-    is preliminary so heads up).
-    """
-
-    def __init__(self):
-        super(_SetFuchsiaDefaults, self).__init__(
-            "%s set-defaults" % (_FUCHSIA_COMMAND_PREFIX),
-            gdb.COMMAND_DATA)
-
-    # The name and parameters of this function are defined by GDB.
-    # pylint: disable=invalid-name
-    # pylint: disable=unused-argument
-    def invoke(self, arg, from_tty):
-        """GDB calls this to perform the command."""
-        # We don't need to tell the user about everything we do.
-        # But it's helpful to give a heads up for things s/he may trip over.
-        print "Note: Enabling non-stop, target-async."
-        gdb.execute("set non-stop on")
-        gdb.execute("set target-async on")
-        gdb.execute("set remotetimeout 10")
-
-        # The default is "target:" which will cause gdb to fetch every dso,
-        # which is ok sometimes, but for right now it's a nuisance.
-        print "Note: Unsetting sysroot."
-        gdb.execute("set sysroot")
-
-
-def _InstallFuchsiaCommands():
-    # We don't do anything with the result, we just need to call
-    # the constructor.
-    _FuchsiaPrefix()
-    _FuchsiaSetPrefix()
-    _FuchsiaShowPrefix()
-    _FuchsiaVerbosity()
-    _SetFuchsiaDefaults()
-
-
-def initialize():
-    """Set up GDB for debugging Fuchsia code.
-
-    This function is invoked via gdb's "system.gdbinit"
-    when it detects it is being started in a fuchsia tree.
-
-    It is ok to call this function multiple times, but only the first
-    is effective.
-
-    Returns:
-        Nothing.
-    """
-
-    global _INITIALIZED_FUCHSIA_SUPPORT
-    if _INITIALIZED_FUCHSIA_SUPPORT:
-        print "Fuchsia support already loaded."
-        return
-    _INITIALIZED_FUCHSIA_SUPPORT = True
-
-    _InstallFuchsiaCommands()
-    _InitializeFuchsiaObjfileTracking()
-    print "Setting fuchsia defaults. 'help fuchsia set-defaults' for details."
-    gdb.execute("fuchsia set-defaults")
diff --git a/generate-intellij-config.py b/generate-intellij-config.py
deleted file mode 100755
index 1842a68..0000000
--- a/generate-intellij-config.py
+++ /dev/null
@@ -1,112 +0,0 @@
-#!/usr/bin/env python
-
-from lxml import etree
-import os
-import re
-import subprocess
-import sys
-
-try:
-  fuchsia_dir = os.environ['FUCHSIA_DIR']
-except KeyError, e:
-  print 'Missing FUCHSIA_DIR environment variable. Please run this script as:'
-  print '  fx exec scripts/generate-intellij-config.py'
-  sys.exit(1)
-
-idea_dir = os.path.join(fuchsia_dir, '.idea')
-
-
-def find_dart_directories():
-  # Ask GN for all labels that depend on the target Dart SDK.
-  refs = subprocess.check_output([
-      os.path.join(fuchsia_dir, 'buildtools',
-                   'gn'), 'refs', os.environ['FUCHSIA_BUILD_DIR'],
-      '//dart:create_sdk(//build/toolchain:host_x64)'
-  ])
-  # Turn that into a set of unique directories for the labels.
-  label_dirs = {
-      os.path.join(fuchsia_dir, re.sub(r':.*', '', label[2:]))
-      for label in refs.split('\n') if len(label)
-  }
-  # Filter to just the ones that contain dart code.
-  dart_dirs = [
-      d for d in label_dirs if os.path.exists(os.path.join(d, 'pubspec.yaml'))
-  ]
-  # Sort them by the leaf name. This is how IntelliJ seems to sort them.
-  dart_dirs.sort(key=lambda d: os.path.basename(d))
-
-  return dart_dirs
-
-
-def write_dart_iml(dart_dir):
-  # TODO(ianloic): check if it already exists?
-  # TODO(ianloic): handle tests specially
-  relative_dart_dir = os.path.relpath(dart_dir, fuchsia_dir)
-  iml_file = os.path.join(idea_dir, 'modules', relative_dart_dir) + '.iml'
-  iml_dir = os.path.dirname(iml_file)
-  if not os.path.exists(iml_dir):
-    os.makedirs(iml_dir)
-  relative_source_dir = os.path.relpath(dart_dir, iml_dir)
-  module = etree.Element('module', type='WEB_MODULE', version='4')
-  component = etree.SubElement(module, 'component', name='NewModuleRootManager')
-  component.set('inherit-compiler-output', 'true')
-  etree.SubElement(component, 'exclude-output')
-  etree.SubElement(
-      component, 'content', url='file://$MODULE_DIR$/' + relative_source_dir)
-  etree.SubElement(component, 'orderEntry', type='inheritedJdk')
-  etree.SubElement(
-      component, 'orderEntry', type='sourceFolder', forTests='false')
-
-  with open(iml_file, 'w') as f:
-    f.write(
-        etree.tostring(
-            module, encoding='UTF-8', xml_declaration=True, pretty_print=True))
-
-  return iml_file
-
-
-def write_dart_modules(dart_dirs):
-  # TODO(ianloic): merge with existing file?
-  project = etree.Element('project', version='4')
-  component = etree.SubElement(
-      project, 'component', name='ProjectModuleManager')
-  modules = etree.SubElement(component, 'modules')
-  for dart_dir in dart_dirs:
-    iml_file = write_dart_iml(dart_dir)
-    relative_dart_dir = os.path.relpath(dart_dir, fuchsia_dir)
-    relative_iml_file = os.path.relpath(iml_file, fuchsia_dir)
-    etree.SubElement(
-        modules,
-        'module',
-        fileurl='file://$PROJECT_DIR$/' + relative_iml_file,
-        filepath='$PROJECT_DIR$/' + relative_iml_file,
-        group=os.path.dirname(relative_dart_dir))
-
-  with open(os.path.join(idea_dir, 'modules.xml'), 'w') as f:
-    f.write(
-        etree.tostring(
-            project, encoding='UTF-8', xml_declaration=True, pretty_print=True))
-
-
-def write_basic_project_files():
-  # Make the .idea directory if needed.
-  if not os.path.exists(idea_dir):
-    os.makedirs(idea_dir)
-
-  # Create skeleton misc.xml and workspace.xml.
-  skeleton = etree.tostring(
-      etree.Element('project', version='4'),
-      encoding='UTF-8',
-      xml_declaration=True,
-      pretty_print=True)
-  for fn in ('misc.xml', 'workspace.xml'):
-    path = os.path.join(idea_dir, fn)
-    if not os.path.exists(path):
-      with open(path, 'w') as f:
-        f.write(skeleton)
-
-
-if __name__ == '__main__':
-  dart_dirs = find_dart_directories()
-  write_basic_project_files()
-  write_dart_modules(dart_dirs)
diff --git a/git-file-tidy b/git-file-tidy
deleted file mode 100755
index 61f6629..0000000
--- a/git-file-tidy
+++ /dev/null
@@ -1,199 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-"""Runs clang-tidy on modified files.
-
-The tool uses `git diff-index` against the newest parent commit in the upstream
-branch (or against HEAD if no such commit is found) in order to find the files
-to be linted. As a result, the tool lints files that are locally modified,
-staged or touched by any commits introduced on the local branch.
-"""
-
-import argparse
-import multiprocessing
-import os
-import platform
-import re
-import subprocess
-import sys
-
-import git_utils
-import paths
-
-clang_os = "linux"
-if platform.platform().startswith("Darwin"):
-    clang_os = "mac"
-CLANG_TIDY_TOOL = os.path.join(paths.BUILDTOOLS_ROOT,
-                               "%s-x64" % clang_os, "clang", "bin",
-                               "clang-tidy")
-NINJA_TOOL = os.path.join(paths.BUILDTOOLS_ROOT, "ninja")
-
-
-def find_ancestor_with(filepath, relpath):
-    """Returns the lowest ancestor of |filepath| that contains |relpath|."""
-    cur_dir_path = os.path.abspath(os.path.dirname(filepath))
-    while True:
-        if os.path.exists(os.path.join(cur_dir_path, relpath)):
-            return cur_dir_path
-
-        next_dir_path = os.path.dirname(cur_dir_path)
-        if next_dir_path != cur_dir_path:
-            cur_dir_path = next_dir_path
-        else:
-            return None
-
-
-def get_out_dir(args):
-    if args.out_dir:
-        out_dir = args.out_dir
-
-        if not os.path.isabs(out_dir):
-            out_dir = os.path.join(paths.FUCHSIA_ROOT, out_dir)
-
-        if not os.path.isdir(out_dir):
-            print out_dir + " is not a directory"
-            sys.exit(-1)
-        return out_dir
-
-    if os.environ.get("FUCHSIA_BUILD_DIR"):
-        return os.environ.get("FUCHSIA_BUILD_DIR")
-
-    fuchsia_dir = os.environ.get("FUCHSIA_DIR", paths.FUCHSIA_ROOT)
-    fuchsia_config_file = os.path.join(fuchsia_dir, '.config')
-    if os.path.isfile(fuchsia_config_file):
-        fuchsia_config = open(fuchsia_config_file).read()
-        m = re.search(r'FUCHSIA_BUILD_DIR=[\'"]([^\s\'"]*)', fuchsia_config)
-        if m:
-            return os.path.join(fuchsia_dir, m.group(1))
-
-    print("Couldn't find the output directory, pass --out-dir " +
-          "(absolute or relative to Fuchsia root) or set FUCHSIA_BUILD_DIR.")
-    sys.exit(-1)
-
-
-def generate_db(out_dir):
-    cmd = [NINJA_TOOL, "-C", out_dir, "-t", "compdb", "cc", "cxx"]
-    db = subprocess.check_output(
-        cmd, cwd=paths.FUCHSIA_ROOT, universal_newlines=True)
-
-    # Strip away `gomacc` from the compile commands. This seems to fix problems
-    # with clang-tidy not being able to load system headers.
-    db = re.sub("\"/[\S]+/gomacc ", "\"", db)
-
-    with open(os.path.join(out_dir, "compile_commands.json"), "w+") as db_file:
-        db_file.write(db)
-
-
-def go(args):
-    out_dir = get_out_dir(args)
-
-    # generate the compilation database
-    generate_db(out_dir)
-
-    # Find the files to be checked.
-    if args.all:
-        files = git_utils.get_all_files()
-    else:
-        files = git_utils.get_diff_files()
-
-    filtered_files = []
-    for file_path in files:
-        # Skip deleted files.
-        if not os.path.isfile(file_path):
-            if args.verbose:
-                print "skipping " + file_path + " (deleted)"
-            continue
-
-        # Skip files with parent directories containing .nolint
-        if find_ancestor_with(file_path, ".nolint"):
-            if args.verbose:
-                print "skipping " + file_path + " (.nolint)"
-            continue
-        filtered_files.append(file_path)
-
-    if args.verbose:
-        print
-        print "Files to be checked:"
-        for file in filtered_files:
-            print " - " + file
-        if not filtered_files:
-            print " (no files)"
-        print
-
-    # change the working directory to Fuchsia root.
-    os.chdir(paths.FUCHSIA_ROOT)
-
-    # It's not safe to run in parallel with "--fix", as clang-tidy traverses and
-    # fixes header files, and we might end up with concurrent writes to the same
-    # header file.
-    if args.no_parallel or args.fix:
-        parallel_jobs = 1
-    else:
-        parallel_jobs = multiprocessing.cpu_count()
-        print("Running " + str(parallel_jobs) +
-              " jobs in parallel, pass --no-parallel to disable")
-
-    jobs = set()
-
-    for file_path in filtered_files:
-        _, extension = os.path.splitext(file_path)
-        if extension == ".cc":
-            relpath = os.path.relpath(file_path)
-            cmd = [CLANG_TIDY_TOOL, "-p", out_dir, relpath]
-            if args.checks:
-                cmd.append("-checks=" + args.checks)
-            if args.fix:
-                cmd.append("-fix")
-            if not args.verbose:
-                cmd.append("-quiet")
-
-            if args.verbose:
-                print "checking " + file_path + ": " + str(cmd)
-            jobs.add(subprocess.Popen(cmd))
-            if len(jobs) >= parallel_jobs:
-                os.wait()
-                jobs.difference_update(
-                    [job for job in jobs if job.poll() is not None])
-    for job in jobs:
-        if job.poll() is None:
-            job.wait()
-
-
-def main():
-    parser = argparse.ArgumentParser(description="Lint modified files.")
-    parser.add_argument(
-        "--all",
-        dest="all",
-        action="store_true",
-        default=False,
-        help="process all files in the repo under current working directory")
-    parser.add_argument(
-        "--fix",
-        dest="fix",
-        action="store_true",
-        default=False,
-        help="automatically generate fixes when possible")
-    parser.add_argument("--checks", help="overrides the list of checks to use")
-    parser.add_argument(
-        "--out-dir",
-        help="Output directory, needed to generate compilation db for clang.")
-    parser.add_argument(
-        "--no-parallel",
-        action="store_true",
-        default=False,
-        help="Process one file at a time")
-    parser.add_argument(
-        "--verbose",
-        dest="verbose",
-        action="store_true",
-        default=False,
-        help="tell me what you're doing")
-    args = parser.parse_args()
-    go(args)
-
-    return 0
-
-
-if __name__ == "__main__":
-    sys.exit(main())
diff --git a/git-fuchsia-review b/git-fuchsia-review
deleted file mode 100755
index bfda75e..0000000
--- a/git-fuchsia-review
+++ /dev/null
@@ -1,30 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Opens the given commit ref (or HEAD, if no commit ref is passed) in gerrit.
-#
-# Example: `git fuchsia-review <commit hash>`, `git fuchsia-review`.
-
-usage() {
-  printf 'usage: git fuchsia-review [<commit ref>]\n'
-  exit 0
-}
-
-set -e
-
-if [[ ($1 == "-h") || ($1 == "--help") ]]
-then
-  usage;
-fi
-
-ID=$(git show "$1" | grep -E 'Change-Id' | awk '{print $(NF);}')
-URL="https://fuchsia-review.googlesource.com/q/$ID"
-
-echo "Opening Change-Id $ID"
-if [[ "$OSTYPE" == "darwin"* ]]; then
-  open "$URL"
-else
-  xdg-open "$URL"
-fi
diff --git a/git_utils.py b/git_utils.py
deleted file mode 100644
index d215269..0000000
--- a/git_utils.py
+++ /dev/null
@@ -1,65 +0,0 @@
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import os
-import os.path
-import subprocess
-
-
-def _get_diff_base():
-    """Returns the newest local commit that is also in the upstream branch, or
-    "HEAD" if no such commit can be found. If no upstream branch is set, assumes
-    that origin/master is the upstream.
-    """
-    try:
-        with open(os.devnull, 'w') as devnull:
-            try:
-                upstream = subprocess.check_output([
-                    "git", "rev-parse", "--abbrev-ref", "--symbolic-full-name", "@{u}"
-                ], stderr = devnull).strip()
-            except subprocess.CalledProcessError:
-                upstream = "origin/master"
-            # Get local commits not in upstream.
-            local_commits = filter(
-                len,
-                subprocess.check_output(
-                    ["git", "rev-list", "HEAD", "^" + upstream, "--"]).split("\n"))
-            if not local_commits:
-                return "HEAD"
-
-            # Return parent of the oldest commit.
-            return subprocess.check_output(
-                ["git", "rev-parse", local_commits[-1] + "^"],
-                stderr = devnull).strip()
-
-    except subprocess.CalledProcessError:
-        return "HEAD"
-
-
-def get_git_root():
-    """Returns the path of the root of the git repository."""
-    return subprocess.check_output(["git", "rev-parse",
-                                    "--show-toplevel"]).strip()
-
-
-def get_diff_files():
-    """Returns absolute paths to files that are locally modified, staged or
-    touched by any commits introduced on the local branch.
-    """
-
-    list_command = [
-        "git", "diff-index", "--name-only",
-        _get_diff_base()
-    ]
-    git_root_path = get_git_root()
-    paths = filter(len, subprocess.check_output(list_command).split("\n"))
-    return [ os.path.join(git_root_path, x) for x in paths ]
-
-def get_all_files():
-    """Returns absolute paths to all files in the git repo under the current
-    working directory.
-    """
-    list_command = ["git", "ls-files"]
-    paths = filter(len, subprocess.check_output(list_command).split("\n"))
-    return [ os.path.abspath(x) for x in paths ]
diff --git a/gn_to_cmake.py b/gn_to_cmake.py
deleted file mode 100644
index 7f2f364..0000000
--- a/gn_to_cmake.py
+++ /dev/null
@@ -1,713 +0,0 @@
-#!/usr/bin/env python
-#
-# Copyright 2016 Google Inc.
-#
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-
-"""
-Usage: gn_to_cmake.py <json_file_name>
-
-gn gen out/config --ide=json --json-ide-script=../../gn/gn_to_cmake.py
-
-or
-
-gn gen out/config --ide=json
-python gn/gn_to_cmake.py out/config/project.json
-
-The first is recommended, as it will auto-update.
-"""
-
-
-import itertools
-import functools
-import json
-import posixpath
-import os
-import string
-import sys
-
-
-def CMakeStringEscape(a):
-  """Escapes the string 'a' for use inside a CMake string.
-
-  This means escaping
-  '\' otherwise it may be seen as modifying the next character
-  '"' otherwise it will end the string
-  ';' otherwise the string becomes a list
-
-  The following do not need to be escaped
-  '#' when the lexer is in string state, this does not start a comment
-  """
-  return a.replace('\\', '\\\\').replace(';', '\\;').replace('"', '\\"')
-
-
-def CMakeTargetEscape(a):
-  """Escapes the string 'a' for use as a CMake target name.
-
-  CMP0037 in CMake 3.0 restricts target names to "^[A-Za-z0-9_.:+-]+$"
-  The ':' is only allowed for imported targets.
-  """
-  def Escape(c):
-    if c in string.ascii_letters or c in string.digits or c in '_.+-':
-      return c
-    else:
-      return '__'
-  return ''.join(map(Escape, a))
-
-
-def SetVariable(out, variable_name, value):
-  """Sets a CMake variable."""
-  out.write('set("')
-  out.write(CMakeStringEscape(variable_name))
-  out.write('" "')
-  out.write(CMakeStringEscape(value))
-  out.write('")\n')
-
-
-def SetVariableList(out, variable_name, values):
-  """Sets a CMake variable to a list."""
-  if not values:
-    return SetVariable(out, variable_name, "")
-  if len(values) == 1:
-    return SetVariable(out, variable_name, values[0])
-  out.write('list(APPEND "')
-  out.write(CMakeStringEscape(variable_name))
-  out.write('"\n  "')
-  out.write('"\n  "'.join([CMakeStringEscape(value) for value in values]))
-  out.write('")\n')
-
-
-def SetFilesProperty(output, variable, property_name, values, sep):
-  """Given a set of source files, sets the given property on them."""
-  output.write('set_source_files_properties(')
-  WriteVariable(output, variable)
-  output.write(' PROPERTIES ')
-  output.write(property_name)
-  output.write(' "')
-  for value in values:
-    output.write(CMakeStringEscape(value))
-    output.write(sep)
-  output.write('")\n')
-
-
-def SetCurrentTargetProperty(out, property_name, values, sep=''):
-  """Given a target, sets the given property."""
-  out.write('set_target_properties("${target}" PROPERTIES ')
-  out.write(property_name)
-  out.write(' "')
-  for value in values:
-    out.write(CMakeStringEscape(value))
-    out.write(sep)
-  out.write('")\n')
-
-
-def WriteVariable(output, variable_name, prepend=None):
-  if prepend:
-    output.write(prepend)
-  output.write('${')
-  output.write(variable_name)
-  output.write('}')
-
-
-# See GetSourceFileType in gn
-source_file_types = {
-  '.cc': 'cxx',
-  '.cpp': 'cxx',
-  '.cxx': 'cxx',
-  '.c': 'c',
-  '.s': 'asm',
-  '.S': 'asm',
-  '.asm': 'asm',
-  '.o': 'obj',
-  '.obj': 'obj',
-}
-
-
-class CMakeTargetType(object):
-  def __init__(self, command, modifier, property_modifier, is_linkable):
-    self.command = command
-    self.modifier = modifier
-    self.property_modifier = property_modifier
-    self.is_linkable = is_linkable
-CMakeTargetType.custom = CMakeTargetType('add_custom_target', 'SOURCES',
-                                         None, False)
-
-# See GetStringForOutputType in gn
-cmake_target_types = {
-  'unknown': CMakeTargetType.custom,
-  'group': CMakeTargetType.custom,
-  'executable': CMakeTargetType('add_executable', None, 'RUNTIME', True),
-  'loadable_module': CMakeTargetType('add_library', 'MODULE', 'LIBRARY', True),
-  'shared_library': CMakeTargetType('add_library', 'SHARED', 'LIBRARY', True),
-  'static_library': CMakeTargetType('add_library', 'STATIC', 'ARCHIVE', False),
-  'source_set': CMakeTargetType('add_library', 'OBJECT', None, False),
-  'copy': CMakeTargetType.custom,
-  'action': CMakeTargetType.custom,
-  'action_foreach': CMakeTargetType.custom,
-  'bundle_data': CMakeTargetType.custom,
-  'create_bundle': CMakeTargetType.custom,
-}
-
-
-def FindFirstOf(s, a):
-  return min(s.find(i) for i in a if i in s)
-
-
-class Project(object):
-  def __init__(self, project_json):
-    self.targets = project_json['targets']
-    build_settings = project_json['build_settings']
-    self.root_path = build_settings['root_path']
-    self.build_path = posixpath.join(self.root_path,
-                                     build_settings['build_dir'][2:])
-
-  def GetAbsolutePath(self, path):
-    if path.startswith("//"):
-      return self.root_path + "/" + path[2:]
-    else:
-      return path
-
-  def GetObjectSourceDependencies(self, gn_target_name, object_dependencies):
-    """All OBJECT libraries whose sources have not been absorbed."""
-    dependencies = self.targets[gn_target_name].get('deps', [])
-    for dependency in dependencies:
-      dependency_type = self.targets[dependency].get('type', None)
-      if dependency_type == 'source_set':
-        object_dependencies.add(dependency)
-      if dependency_type not in gn_target_types_that_absorb_objects:
-        self.GetObjectSourceDependencies(dependency, object_dependencies)
-
-  def GetObjectLibraryDependencies(self, gn_target_name, object_dependencies):
-    """All OBJECT libraries whose libraries have not been absorbed."""
-    dependencies = self.targets[gn_target_name].get('deps', [])
-    for dependency in dependencies:
-      dependency_type = self.targets[dependency].get('type', None)
-      if dependency_type == 'source_set':
-        object_dependencies.add(dependency)
-        self.GetObjectLibraryDependencies(dependency, object_dependencies)
-
-  def GetCMakeTargetName(self, gn_target_name):
-    # See <chromium>/src/tools/gn/label.cc#Resolve
-    # //base/test:test_support(//build/toolchain/win:msvc)
-    path_separator = FindFirstOf(gn_target_name, (':', '('))
-    location = None
-    name = None
-    toolchain = None
-    if not path_separator:
-      location = gn_target_name[2:]
-    else:
-      location = gn_target_name[2:path_separator]
-      toolchain_separator = gn_target_name.find('(', path_separator)
-      if toolchain_separator == -1:
-        name = gn_target_name[path_separator + 1:]
-      else:
-        if toolchain_separator > path_separator:
-          name = gn_target_name[path_separator + 1:toolchain_separator]
-        assert gn_target_name.endswith(')')
-        toolchain = gn_target_name[toolchain_separator + 1:-1]
-    assert location or name
-
-    cmake_target_name = None
-    if location.endswith('/' + name):
-      cmake_target_name = location
-    elif location:
-      cmake_target_name = location + '_' + name
-    else:
-      cmake_target_name = name
-    if toolchain:
-      cmake_target_name += '--' + toolchain
-    return CMakeTargetEscape(cmake_target_name)
-
-
-class Target(object):
-  def __init__(self, gn_target_name, project):
-    self.gn_name = gn_target_name
-    self.properties = project.targets[self.gn_name]
-    self.cmake_name = project.GetCMakeTargetName(self.gn_name)
-    self.gn_type = self.properties.get('type', None)
-    self.cmake_type = cmake_target_types.get(self.gn_type, None)
-
-
-def WriteAction(out, target, project, sources, synthetic_dependencies):
-  outputs = []
-  output_directories = set()
-  for output in target.properties.get('outputs', []):
-    output_abs_path = project.GetAbsolutePath(output)
-    outputs.append(output_abs_path)
-    output_directory = posixpath.dirname(output_abs_path)
-    if output_directory:
-      output_directories.add(output_directory)
-  outputs_name = '${target}__output'
-  SetVariableList(out, outputs_name, outputs)
-
-  out.write('add_custom_command(OUTPUT ')
-  WriteVariable(out, outputs_name)
-  out.write('\n')
-
-  if output_directories:
-    out.write('  COMMAND ${CMAKE_COMMAND} -E make_directory "')
-    out.write('" "'.join(map(CMakeStringEscape, output_directories)))
-    out.write('"\n')
-
-  script = target.properties['script']
-  arguments = target.properties['args']
-  out.write('  COMMAND python "')
-  out.write(CMakeStringEscape(project.GetAbsolutePath(script)))
-  out.write('"')
-  if arguments:
-    out.write('\n    "')
-    out.write('"\n    "'.join(map(CMakeStringEscape, arguments)))
-    out.write('"')
-  out.write('\n')
-
-  out.write('  DEPENDS ')
-  for sources_type_name in sources.values():
-    WriteVariable(out, sources_type_name, ' ')
-  out.write('\n')
-
-  #TODO: CMake 3.7 is introducing DEPFILE
-
-  out.write('  WORKING_DIRECTORY "')
-  out.write(CMakeStringEscape(project.build_path))
-  out.write('"\n')
-
-  out.write('  COMMENT "Action: ${target}"\n')
-
-  out.write('  VERBATIM)\n')
-
-  synthetic_dependencies.add(outputs_name)
-
-
-def ExpandPlaceholders(source, a):
-  source_dir, source_file_part = posixpath.split(source)
-  source_name_part, _ = posixpath.splitext(source_file_part)
-  #TODO: {{source_gen_dir}}, {{source_out_dir}}, {{response_file_name}}
-  return a.replace('{{source}}', source) \
-          .replace('{{source_file_part}}', source_file_part) \
-          .replace('{{source_name_part}}', source_name_part) \
-          .replace('{{source_dir}}', source_dir) \
-          .replace('{{source_root_relative_dir}}', source_dir)
-
-
-def WriteActionForEach(out, target, project, sources, synthetic_dependencies):
-  all_outputs = target.properties.get('outputs', [])
-  inputs = target.properties.get('sources', [])
-  # TODO: consider expanding 'output_patterns' instead.
-  outputs_per_input = len(all_outputs) / len(inputs)
-  for count, source in enumerate(inputs):
-    source_abs_path = project.GetAbsolutePath(source)
-
-    outputs = []
-    output_directories = set()
-    for output in all_outputs[outputs_per_input *  count:
-                              outputs_per_input * (count+1)]:
-      output_abs_path = project.GetAbsolutePath(output)
-      outputs.append(output_abs_path)
-      output_directory = posixpath.dirname(output_abs_path)
-      if output_directory:
-        output_directories.add(output_directory)
-    outputs_name = '${target}__output_' + str(count)
-    SetVariableList(out, outputs_name, outputs)
-
-    out.write('add_custom_command(OUTPUT ')
-    WriteVariable(out, outputs_name)
-    out.write('\n')
-
-    if output_directories:
-      out.write('  COMMAND ${CMAKE_COMMAND} -E make_directory "')
-      out.write('" "'.join(map(CMakeStringEscape, output_directories)))
-      out.write('"\n')
-
-    script = target.properties['script']
-    # TODO: need to expand {{xxx}} in arguments
-    arguments = target.properties['args']
-    out.write('  COMMAND python "')
-    out.write(CMakeStringEscape(project.GetAbsolutePath(script)))
-    out.write('"')
-    if arguments:
-      out.write('\n    "')
-      expand = functools.partial(ExpandPlaceholders, source_abs_path)
-      out.write('"\n    "'.join(map(CMakeStringEscape, map(expand,arguments))))
-      out.write('"')
-    out.write('\n')
-
-    out.write('  DEPENDS')
-    if 'input' in sources:
-      WriteVariable(out, sources['input'], ' ')
-    out.write(' "')
-    out.write(CMakeStringEscape(source_abs_path))
-    out.write('"\n')
-
-    #TODO: CMake 3.7 is introducing DEPFILE
-
-    out.write('  WORKING_DIRECTORY "')
-    out.write(CMakeStringEscape(project.build_path))
-    out.write('"\n')
-
-    out.write('  COMMENT "Action ${target} on ')
-    out.write(CMakeStringEscape(source_abs_path))
-    out.write('"\n')
-
-    out.write('  VERBATIM)\n')
-
-    synthetic_dependencies.add(outputs_name)
-
-
-def WriteCopy(out, target, project, sources, synthetic_dependencies):
-  inputs = target.properties.get('sources', [])
-  raw_outputs = target.properties.get('outputs', [])
-
-  # TODO: consider expanding 'output_patterns' instead.
-  outputs = []
-  for output in raw_outputs:
-    output_abs_path = project.GetAbsolutePath(output)
-    outputs.append(output_abs_path)
-  outputs_name = '${target}__output'
-  SetVariableList(out, outputs_name, outputs)
-
-  out.write('add_custom_command(OUTPUT ')
-  WriteVariable(out, outputs_name)
-  out.write('\n')
-
-  for src, dst in zip(inputs, outputs):
-    out.write('  COMMAND ${CMAKE_COMMAND} -E copy "')
-    out.write(CMakeStringEscape(project.GetAbsolutePath(src)))
-    out.write('" "')
-    out.write(CMakeStringEscape(dst))
-    out.write('"\n')
-
-  out.write('  DEPENDS ')
-  for sources_type_name in sources.values():
-    WriteVariable(out, sources_type_name, ' ')
-  out.write('\n')
-
-  out.write('  WORKING_DIRECTORY "')
-  out.write(CMakeStringEscape(project.build_path))
-  out.write('"\n')
-
-  out.write('  COMMENT "Copy ${target}"\n')
-
-  out.write('  VERBATIM)\n')
-
-  synthetic_dependencies.add(outputs_name)
-
-
-def WriteCompilerFlags(out, target, project, sources):
-  # Hack, set linker language to c if no c or cxx files present.
-  if 'c' not in sources and 'cxx' not in sources:
-    SetCurrentTargetProperty(out, 'LINKER_LANGUAGE', ['C'])
-
-  # Mark uncompiled sources as uncompiled.
-  if 'input' in sources:
-    SetFilesProperty(out, sources['input'], 'HEADER_FILE_ONLY', ('True',), '')
-  if 'other' in sources:
-    SetFilesProperty(out, sources['other'], 'HEADER_FILE_ONLY', ('True',), '')
-
-  # Mark object sources as linkable.
-  if 'obj' in sources:
-    SetFilesProperty(out, sources['obj'], 'EXTERNAL_OBJECT', ('True',), '')
-
-  # TODO: 'output_name', 'output_dir', 'output_extension'
-  # This includes using 'source_outputs' to direct compiler output.
-
-  # Includes
-  includes = target.properties.get('include_dirs', [])
-  if includes:
-    out.write('set_property(TARGET "${target}" ')
-    out.write('APPEND PROPERTY INCLUDE_DIRECTORIES')
-    for include_dir in includes:
-      out.write('\n  "')
-      out.write(project.GetAbsolutePath(include_dir))
-      out.write('"')
-    out.write(')\n')
-
-  # Defines
-  defines = target.properties.get('defines', [])
-  if defines:
-    SetCurrentTargetProperty(out, 'COMPILE_DEFINITIONS', defines, ';')
-
-  # Compile flags
-  # "arflags", "asmflags", "cflags",
-  # "cflags_c", "cflags_cc", "cflags_objc", "cflags_objcc"
-  # CMake does not have per target lang compile flags.
-  # TODO: $<$<COMPILE_LANGUAGE:CXX>:cflags_cc style generator expression.
-  #       http://public.kitware.com/Bug/view.php?id=14857
-  flags = []
-  flags.extend(target.properties.get('cflags', []))
-  cflags_asm = target.properties.get('asmflags', [])
-  cflags_c = target.properties.get('cflags_c', [])
-  cflags_cxx = target.properties.get('cflags_cc', [])
-  if 'c' in sources and not any(k in sources for k in ('asm', 'cxx')):
-    flags.extend(cflags_c)
-  elif 'cxx' in sources and not any(k in sources for k in ('asm', 'c')):
-    flags.extend(cflags_cxx)
-  else:
-    # TODO: This is broken, one cannot generally set properties on files,
-    # as other targets may require different properties on the same files.
-    if 'asm' in sources and cflags_asm:
-      SetFilesProperty(out, sources['asm'], 'COMPILE_FLAGS', cflags_asm, ' ')
-    if 'c' in sources and cflags_c:
-      SetFilesProperty(out, sources['c'], 'COMPILE_FLAGS', cflags_c, ' ')
-    if 'cxx' in sources and cflags_cxx:
-      SetFilesProperty(out, sources['cxx'], 'COMPILE_FLAGS', cflags_cxx, ' ')
-  if flags:
-    SetCurrentTargetProperty(out, 'COMPILE_FLAGS', flags, ' ')
-
-  # Linker flags
-  ldflags = target.properties.get('ldflags', [])
-  if ldflags:
-    SetCurrentTargetProperty(out, 'LINK_FLAGS', ldflags, ' ')
-
-
-gn_target_types_that_absorb_objects = (
-  'executable',
-  'loadable_module',
-  'shared_library',
-  'static_library'
-)
-
-
-def WriteSourceVariables(out, target, project):
-  # gn separates the sheep from the goats based on file extensions.
-  # A full separation is done here because of flag handling (see Compile flags).
-  source_types = {'cxx':[], 'c':[], 'asm':[],
-                  'obj':[], 'obj_target':[], 'input':[], 'other':[]}
-
-  all_sources = target.properties.get('sources', [])
-
-  # As of cmake 3.11 add_library must have sources. If there are
-  # no sources, add empty.cpp as the file to compile.
-  if len(all_sources) == 0:
-    all_sources.append(posixpath.join(project.build_path, 'empty.cpp'))
-
-  # TODO .def files on Windows
-  for source in all_sources:
-    _, ext = posixpath.splitext(source)
-    source_abs_path = project.GetAbsolutePath(source)
-    source_types[source_file_types.get(ext, 'other')].append(source_abs_path)
-
-  for input_path in target.properties.get('inputs', []):
-    input_abs_path = project.GetAbsolutePath(input_path)
-    source_types['input'].append(input_abs_path)
-
-  # OBJECT library dependencies need to be listed as sources.
-  # Only executables and non-OBJECT libraries may reference an OBJECT library.
-  # https://gitlab.kitware.com/cmake/cmake/issues/14778
-  if target.gn_type in gn_target_types_that_absorb_objects:
-    object_dependencies = set()
-    project.GetObjectSourceDependencies(target.gn_name, object_dependencies)
-    for dependency in object_dependencies:
-      cmake_dependency_name = project.GetCMakeTargetName(dependency)
-      obj_target_sources = '$<TARGET_OBJECTS:' + cmake_dependency_name + '>'
-      source_types['obj_target'].append(obj_target_sources)
-
-  sources = {}
-  for source_type, sources_of_type in source_types.items():
-    if sources_of_type:
-      sources[source_type] = '${target}__' + source_type + '_srcs'
-      SetVariableList(out, sources[source_type], sources_of_type)
-  return sources
-
-
-def WriteTarget(out, target, project):
-  out.write('\n#')
-  out.write(target.gn_name)
-  out.write('\n')
-
-  if target.cmake_type is None:
-    print('Target %s has unknown target type %s, skipping.' %
-          (target.gn_name, target.gn_type))
-    return
-
-  SetVariable(out, 'target', target.cmake_name)
-
-  sources = WriteSourceVariables(out, target, project)
-
-  synthetic_dependencies = set()
-  if target.gn_type == 'action':
-    WriteAction(out, target, project, sources, synthetic_dependencies)
-  if target.gn_type == 'action_foreach':
-    WriteActionForEach(out, target, project, sources, synthetic_dependencies)
-  if target.gn_type == 'copy':
-    WriteCopy(out, target, project, sources, synthetic_dependencies)
-
-  out.write(target.cmake_type.command)
-  out.write('("${target}"')
-  if target.cmake_type.modifier is not None:
-    out.write(' ')
-    out.write(target.cmake_type.modifier)
-  for sources_type_name in sources.values():
-    WriteVariable(out, sources_type_name, ' ')
-  if synthetic_dependencies:
-    out.write(' DEPENDS')
-    for synthetic_dependency in synthetic_dependencies:
-      WriteVariable(out, synthetic_dependency, ' ')
-  out.write(')\n')
-
-  if target.cmake_type.command != 'add_custom_target':
-    WriteCompilerFlags(out, target, project, sources)
-
-  libraries = set()
-  nonlibraries = set()
-
-  dependencies = set(target.properties.get('deps', []))
-  # Transitive OBJECT libraries are in sources.
-  # Those sources are dependent on the OBJECT library dependencies.
-  # Those sources cannot bring in library dependencies.
-  object_dependencies = set()
-  if target.gn_type != 'source_set':
-    project.GetObjectLibraryDependencies(target.gn_name, object_dependencies)
-  for object_dependency in object_dependencies:
-    dependencies.update(project.targets.get(object_dependency).get('deps', []))
-
-  for dependency in dependencies:
-    gn_dependency_type = project.targets.get(dependency, {}).get('type', None)
-    cmake_dependency_type = cmake_target_types.get(gn_dependency_type, None)
-    cmake_dependency_name = project.GetCMakeTargetName(dependency)
-    if cmake_dependency_type.command != 'add_library':
-      nonlibraries.add(cmake_dependency_name)
-    elif cmake_dependency_type.modifier != 'OBJECT':
-      if target.cmake_type.is_linkable:
-        libraries.add(cmake_dependency_name)
-      else:
-        nonlibraries.add(cmake_dependency_name)
-
-  # Non-library dependencies.
-  if nonlibraries:
-    out.write('add_dependencies("${target}"')
-    for nonlibrary in nonlibraries:
-      out.write('\n  "')
-      out.write(nonlibrary)
-      out.write('"')
-    out.write(')\n')
-
-  # Non-OBJECT library dependencies.
-  external_libraries = target.properties.get('libs', [])
-  if target.cmake_type.is_linkable and (external_libraries or libraries):
-    library_dirs = target.properties.get('lib_dirs', [])
-    if library_dirs:
-      SetVariableList(out, '${target}__library_directories', library_dirs)
-
-    system_libraries = []
-    for external_library in external_libraries:
-      if '/' in external_library:
-        libraries.add(project.GetAbsolutePath(external_library))
-      else:
-        if external_library.endswith('.framework'):
-          external_library = external_library[:-len('.framework')]
-        system_library = 'library__' + external_library
-        if library_dirs:
-          system_library = system_library + '__for_${target}'
-        out.write('find_library("')
-        out.write(CMakeStringEscape(system_library))
-        out.write('" "')
-        out.write(CMakeStringEscape(external_library))
-        out.write('"')
-        if library_dirs:
-          out.write(' PATHS "')
-          WriteVariable(out, '${target}__library_directories')
-          out.write('"')
-        out.write(')\n')
-        system_libraries.append(system_library)
-    out.write('target_link_libraries("${target}"')
-    for library in libraries:
-      out.write('\n  "')
-      out.write(CMakeStringEscape(library))
-      out.write('"')
-    for system_library in system_libraries:
-      WriteVariable(out, system_library, '\n  "')
-      out.write('"')
-    out.write(')\n')
-
-
-def WriteProject(project):
-  out = open(posixpath.join(project.build_path, 'CMakeLists.txt'), 'w+')
-  extName = posixpath.join(project.build_path, 'CMakeLists.ext')
-  out.write('# Generated by gn_to_cmake.py.\n')
-  out.write('cmake_minimum_required(VERSION 2.8.8 FATAL_ERROR)\n')
-  out.write('cmake_policy(VERSION 2.8.8)\n\n')
-
-  out.write('file(WRITE "')
-  out.write(CMakeStringEscape(posixpath.join(project.build_path, "empty.cpp")))
-  out.write('")\n')
-
-  # Update the gn generated ninja build.
-  # If a build file has changed, this will update CMakeLists.ext if
-  # gn gen out/config --ide=json --json-ide-script=../../gn/gn_to_cmake.py
-  # style was used to create this config.
-  out.write('execute_process(COMMAND\n')
-  out.write('  ninja -C "')
-  out.write(CMakeStringEscape(project.build_path))
-  out.write('" build.ninja\n')
-  out.write('  RESULT_VARIABLE ninja_result)\n')
-  out.write('if (ninja_result)\n')
-  out.write('  message(WARNING ')
-  out.write('"Regeneration failed running ninja: ${ninja_result}")\n')
-  out.write('endif()\n')
-
-  out.write('include("')
-  out.write(CMakeStringEscape(extName))
-  out.write('")\n')
-  out.close()
-
-  out = open(extName, 'w+')
-  out.write('# Generated by gn_to_cmake.py.\n')
-  out.write('cmake_minimum_required(VERSION 2.8.8 FATAL_ERROR)\n')
-  out.write('cmake_policy(VERSION 2.8.8)\n')
-
-  # The following appears to be as-yet undocumented.
-  # http://public.kitware.com/Bug/view.php?id=8392
-  out.write('enable_language(ASM)\n\n')
-  # ASM-ATT does not support .S files.
-  # output.write('enable_language(ASM-ATT)\n')
-
-  # Current issues with automatic re-generation:
-  # The gn generated build.ninja target uses build.ninja.d
-  #   but build.ninja.d does not contain the ide or gn.
-  # Currently the ide is not run if the project.json file is not changed
-  #   but the ide needs to be run anyway if it has itself changed.
-  #   This can be worked around by deleting the project.json file.
-  out.write('file(READ "')
-  gn_deps_file = posixpath.join(project.build_path, 'build.ninja.d')
-  out.write(CMakeStringEscape(gn_deps_file))
-  out.write('" "gn_deps_string" OFFSET ')
-  out.write(str(len('build.ninja: ')))
-  out.write(')\n')
-  # One would think this would need to worry about escaped spaces
-  # but gn doesn't escape spaces here (it generates invalid .d files).
-  out.write('string(REPLACE " " ";" "gn_deps" ${gn_deps_string})\n')
-  out.write('foreach("gn_dep" ${gn_deps})\n')
-  out.write('  configure_file("')
-  out.write(CMakeStringEscape(project.build_path))
-  out.write('${gn_dep}" "CMakeLists.devnull" COPYONLY)\n')
-  out.write('endforeach("gn_dep")\n')
-
-  out.write('list(APPEND other_deps "')
-  out.write(CMakeStringEscape(os.path.abspath(__file__)))
-  out.write('")\n')
-  out.write('foreach("other_dep" ${other_deps})\n')
-  out.write('  configure_file("${other_dep}" "CMakeLists.devnull" COPYONLY)\n')
-  out.write('endforeach("other_dep")\n')
-
-  for target_name in project.targets.keys():
-    out.write('\n')
-    WriteTarget(out, Target(target_name, project), project)
-
-
-def main():
-  if len(sys.argv) != 2:
-    print('Usage: ' + sys.argv[0] + ' <json_file_name>')
-    exit(1)
-
-  json_path = sys.argv[1]
-  project = None
-  with open(json_path, 'r') as json_file:
-    project = json.loads(json_file.read())
-
-  WriteProject(Project(project))
-
-
-if __name__ == "__main__":
-  main()
diff --git a/list-available-packages.py b/list-available-packages.py
deleted file mode 100755
index df75b9e..0000000
--- a/list-available-packages.py
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import sys
-import json
-
-def main():
-    parser = argparse.ArgumentParser(
-        description='List all targets in the pushable available set')
-    parser.add_argument('--build-dir', action='store', required=True)
-
-    args = parser.parse_args()
-    with open(os.path.join(args.build_dir, "packages.json")) as f:
-      data = json.load(f)
-
-    available_build_packages = set(data["available"])
-
-    with open(os.path.join(args.build_dir, "amber-files", "repository", "targets.json")) as f:
-      data = json.load(f)
-
-    published_packages = set([s.split('/')[1] for s in data['signed']['targets'].keys()])
-
-    available_packages = published_packages & available_build_packages
-
-    for tgt in available_packages:
-      print(tgt)
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/manifest/minimal b/manifest/minimal
deleted file mode 100644
index 415de62..0000000
--- a/manifest/minimal
+++ /dev/null
@@ -1,14 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<manifest>
-  <projects>
-    <project name="scripts"
-             path="scripts"
-             remote="https://fuchsia.googlesource.com/scripts"
-             gerrithost="https://fuchsia-review.googlesource.com"/>
-  </projects>
-  <hooks>
-    <hook name="install-fx"
-          project="scripts"
-          action="devshell/lib/add_symlink_to_bin.sh"/>
-  </hooks>
-</manifest>
diff --git a/manifest/scripts b/manifest/scripts
deleted file mode 100644
index 9a328b0..0000000
--- a/manifest/scripts
+++ /dev/null
@@ -1,13 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<manifest>
-  <imports>
-    <localimport file="minimal"/>
-  </imports>
-  <projects>
-    <project name="third_party/pytoml"
-             path="third_party/pytoml"
-             remote="https://fuchsia.googlesource.com/third_party/pytoml"
-             gerrithost="https://fuchsia-review.googlesource.com"
-             revision="8641351699f44401232786cd9f320ac373c3a02e"/>
-  </projects>
-</manifest>
diff --git a/memory/log.sh b/memory/log.sh
deleted file mode 100755
index 39155b3..0000000
--- a/memory/log.sh
+++ /dev/null
@@ -1,15 +0,0 @@
-#!/usr/bin/env sh
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-set -e
-
-mkdir -p logs
-
-while true; do
-    now=$(date +%FT%T%z)
-    echo "${now}"
-    fx shell memgraph -tvH > "logs/${now}"
-    sleep 10m
-done
diff --git a/memory/treemap.py b/memory/treemap.py
deleted file mode 100755
index ff96c7a..0000000
--- a/memory/treemap.py
+++ /dev/null
@@ -1,519 +0,0 @@
-#!/usr/bin/env python
-#
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Visualizes the output of Zircon's "memgraph" tool.
-
-For usage, see
-https://fuchsia.googlesource.com/zircon/+/master/docs/memory.md#Visualize-memory-usage
-"""
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import cgi
-import collections
-import json
-import sys
-import os.path
-import textwrap
-
-FUCHSIA_DIR = os.path.abspath(os.path.join(__file__, os.pardir, os.pardir, os.pardir))
-
-# Magic value for nodes with empty names.
-UNNAMED_NAME = '<unnamed>'
-
-
-class Node(object):
-    """A generic node in the kernel/job/process/memory tree."""
-
-    def __init__(self):
-        self.type = ''
-        self.koid = 0
-        self.name = ''
-        self.area = 0
-        self.children = []
-
-    def html_label(self):
-        """Returns a safe HTML string that identifies this Node."""
-        tag = ''
-        if self.type:
-            tag = ('<span class="treemap-node-type '
-                   'treemap-node-type-{}">{}</span> ').format(
-                           cgi.escape(self.type[0]),
-                           cgi.escape(self.type[0].upper()))
-        if self.name in ('', UNNAMED_NAME):
-            name = '<i>UNNAMED</i> [koid {}]'.format(self.koid)
-        else:
-            name = cgi.escape(self.name)
-        return tag + name
-
-
-# Mapping of unique ID strings to Node objects.
-ids_to_nodes = {}
-
-
-def lookup(node_id):
-    """Returns or creates the Node associated with an ID string.
-
-    Args:
-        node_id: ID string to look up
-    Returns:
-        A Node object
-    """
-    node = ids_to_nodes.get(node_id)
-    if node is None:
-        node = Node()
-        ids_to_nodes[node_id] = node
-    return node
-
-
-def sum_area(node):
-    """Recursively calculates the Node.area values of a subtree.
-
-    Args:
-        node: The Node at the root of the tree to walk
-    Returns:
-        The final area of |node|
-    """
-    # Area should either be set explicitly or calculated from the children.
-    if node.children and (node.area != 0):
-        raise AssertionError(
-                'Node {} has {} children and non-zero area {}'.format(
-                        node.name, len(node.children), node.area))
-    node.area += sum(map(sum_area, node.children))
-    return node.area
-
-
-def format_size(nbytes):
-    """Formats a size as a human-readable string like "123.4k".
-
-    Units are in powers of 1024, so "k" is technically "kiB", etc.
-    Values smaller than "k" have the suffix "B".
-
-    Exact multiples of a unit are displayed without a decimal;
-    e.g., "17k" means the value is exactly 17 * 1024.
-
-    Otherwise, a decimal is present; e.g., "17.0k" means the value
-    is (17 * 1024) +/- epsilon.
-
-    Args:
-        nbytes: The value to format
-    Returns:
-        The formatted string
-    """
-    units = 'BkMGTPE'
-    ui = 0
-    r = 0
-    whole = True
-
-    while nbytes >= 10000 or (nbytes != 0 and (nbytes & 1023) == 0):
-        ui += 1
-        if nbytes & 1023:
-            whole = False
-        r = nbytes % 1024
-        nbytes //= 1024
-
-    if whole:
-        return '{}{}'.format(nbytes, units[ui])
-
-    round_up = (r % 100) >= 50
-    r = (r // 100) + round_up
-    if r == 10:
-        nbytes += 1
-        r = 0
-
-    return '{}.{}{}'.format(nbytes, r, units[ui])
-
-
-# Enum for tracking VMO reference types.
-VIA_HANDLE = 1
-VIA_MAPPING = 2
-
-
-def populate_process(process_node, process_record, hide_aggregated=True):
-    """Adds the process's child nodes.
-
-    Args:
-        process_node: A process's Node
-        process_record: The same process's input record
-        hide_aggregated: If true, do not create Nodes for individual VMOs
-                that have been aggregated by name into a single Node
-    """
-    # If there aren't any VMOs, use the sizes in the record.
-    if not process_record.get('vmo_refs', []):
-        # Get the breakdown.
-        priv = process_record.get('private_bytes', 0)
-        pss = process_record.get('pss_bytes', 0)
-        shared = max(0, pss - priv)  # Kernel calls this "scaled shared"
-
-        pid = process_record['id']
-        if priv:
-            node = lookup(pid + '/priv')
-            node.name = 'Private'
-            node.area = priv
-            process_node.children.append(node)
-        if shared:
-            node = lookup(pid + '/shared')
-            node.name = 'Proportional shared'
-            node.area = shared
-            process_node.children.append(node)
-        # The process's area will be set to the sum of the children.
-        return
-    # Otherwise, this entry has VMOs.
-
-    # Build the set of reference types from this process to its VMOs.
-    koid_to_ref_types = collections.defaultdict(set)
-    for vmo_ref in process_record.get('vmo_refs', []):
-        ref_types = koid_to_ref_types[vmo_ref['vmo_koid']]
-        if 'HANDLE' in vmo_ref['via']:
-            ref_types.update([VIA_HANDLE])
-        if 'MAPPING' in vmo_ref['via']:
-            ref_types.update([VIA_MAPPING])
-
-    # De-dup the set of VMOs known to the process, and group them by name. Each
-    # of these entries is equivalent, though some values may be different (like
-    # committed_bytes) because they were snapshotted at different times.
-    name_to_vmo = collections.defaultdict(list)
-    koid_to_vmo = dict()
-    id_prefix = '{}/vmo'.format(process_record['id'])
-    for vmo in process_record.get('vmos', []):
-        # Although multiple processes may point to the same VMO, we're building
-        # a tree and thus need to create unique IDs for VMOs under this process.
-        vmo_koid = vmo['koid']
-        vmo_id = '{}/{}'.format(id_prefix, vmo_koid)
-        vmo_node = lookup(vmo_id)
-        if vmo_node.name:
-            # This is a duplicate of a VMO we've already seen.
-            continue
-
-        vmo_node.type = 'vmo'
-        vmo_node.koid = vmo_koid
-        vmo_node.name = vmo['name'] if vmo['name'] else UNNAMED_NAME
-        name_to_vmo[vmo_node.name].append(vmo_node)
-        koid_to_vmo[vmo_koid] = vmo_node
-
-        # Figure out a size for the VMO.
-        ref_types = koid_to_ref_types[vmo_koid]
-        if VIA_MAPPING in ref_types:
-            # The VMO is already accounted for in the process's pss_bytes value.
-            # TODO(dbort): To make the VMO areas exactly line up with pss_bytes,
-            # we'd need sub-VMO mapping information like what 'vmaps' provides:
-            # this process may only map a subset of the VMO's committed pages,
-            # but we're counting all of them. This isn't necessarily wrong,
-            # just different.
-            vmo_node.area = int(
-                    float(vmo['committed_bytes']) / vmo['share_count'])
-            # NB: This counts as private memory if share_count is 1.
-        else:
-            # The process only has a handle to this VMO but does not map it: the
-            # process's pss_bytes value does not account for this VMO.
-            assert ref_types == set([VIA_HANDLE])
-            # Treat our handle reference as an increment to the VMO's
-            # share_count. This may over-estimate this process's share, because
-            # other processes could also have handle-only references that we
-            # don't know about.
-            vmo_node.area = int(float(vmo['committed_bytes']) /
-                                (float(vmo['share_count']) + 1))
-
-    # Create the aggregated VMO nodes.
-    children = []
-    for name, vmos in name_to_vmo.iteritems():
-        if len(vmos) == 1 or name == UNNAMED_NAME:
-            # Only one VMO with this name, or multiple VMOs with an empty name.
-            # Add them as direct children.
-            children.extend(vmos)
-        else:
-            # Create a parent VMO for all of these VMOs with the same name.
-            parent_id = '{}/{}'.format(id_prefix, name)
-            pnode = lookup(parent_id)
-            pnode.name = '{}[{}]'.format(name, len(vmos))
-            pnode.type = 'vmo'
-            if hide_aggregated:
-                pnode.area = sum(map(sum_area, vmos))
-                # And then drop the vmo nodes on the ground (by not adding
-                # them as children).
-            else:
-                # The area will be calculated from the children.
-                pnode.children.extend(vmos)
-            children.append(pnode)
-    # TODO(dbort): Call out VMOs/aggregates that are only reachable via handle?
-
-    process_node.children.extend(children)
-
-
-def build_webtreemap(node):
-    """Returns a JSON-able dict tree representing a Node tree.
-
-    See
-    https://github.com/evmar/webtreemap/blob/gh-pages/README.markdown#input-format
-    for a description of this data format.
-
-    Args:
-        node: The Node at the root of the tree to walk
-    Returns:
-        A webtreemap-compatible dict representing the tree
-    """
-    return {
-            'name': '{} ({})'.format(node.html_label(), format_size(node.area)),
-            'data': {
-                    '$area': node.area,
-                    # TODO(dbort): Turn this on and style different node types
-                    # if https://github.com/evmar/webtreemap/pull/15 is
-                    # accepted. Would define a class like
-                    # 'webtreemap-symbol-<type>' but there's a bug in
-                    # webtreemap.js.
-                    # '$symbol': node.type,
-            },
-            'children': map(build_webtreemap, node.children)
-    }
-
-
-def dump_html_table(node, depth=0, parent_area=None, total_area=None):
-    """Returns an HTML representation of the tree.
-
-    Args:
-        node: The root of the tree to dump
-        depth: The depth of the node. Use 0 for the root node.
-        parent_area: The size of the parent node; used to show fractions.
-                Use None for the root node.
-        total_area: The total size of the tree; used to show fractions.
-                Use None for the root node.
-    Returns:
-        A sequence of HTML lines, joinable by whitespace
-    """
-    lines = []
-
-    if not depth:
-        # We're the root node. Dump the headers.
-        lines.extend([
-                '<style>',
-                'table#tree {',
-                '    border-collapse: collapse;',
-                '    border-spacing: 0;',
-                '}',
-                'table#tree tr:nth-child(even) {',
-                '    background-color: #eee;',
-                '}',
-                'table#tree tr:nth-child(odd) {',
-                '    background-color: #fff;',
-                '}',
-                'table#tree tr:hover {',
-                '    background-color: #ff8;',
-                '}',
-                'table#tree td {',
-                '    text-align: right;',
-                '    padding-left: 1em;',
-                '    padding-right: 1em;',
-                '    font-family:Consolas,Monaco,Lucida Console,',
-                '        Liberation Mono,DejaVu Sans Mono,',
-                '        Bitstream Vera Sans Mono,Courier New,monospace;',
-                '}',
-                'table#tree td.name {',
-                '    text-align: left;',
-                '}',
-                '</style>',
-                '<table id="tree">',
-                '<tr>',
-                '<th>Name</th>',
-                '<th>Size<br/>(bytes/1024^n)</th>',
-                '<th>Size (bytes)</th>',
-                '<th>Fraction of parent</th>',
-                '<th>Fraction of total</th>',
-                '</tr>',
-        ])
-
-    lines.extend([
-            '<tr>',
-            # Indent the names based on depth.
-            '<td class="name"><span style="color:#bbb">{indent}</span>'
-            '{label}</td>'.format(
-                    indent=('|' + '&nbsp;' * 2) * depth,
-                    label=node.html_label()),
-            '<td>{fsize}</td>'.format(fsize=format_size(node.area)),
-            '<td>{size}</td>'.format(size=node.area),
-    ])
-
-    if depth:
-        # We're not the root node.
-        pfrac = node.area / float(parent_area) if parent_area else 0
-        tfrac = node.area / float(total_area) if total_area else 0
-        for frac in (pfrac, tfrac):
-            lines.extend([
-                    ('<td>{pct:.3f}%&nbsp;'
-                     '<progress value="{frac}"></progress></td>')
-                    .format(pct=frac * 100, frac=frac)
-            ])
-    else:
-        lines.append('<td></td>' * 2)
-    lines.append('</tr>')
-
-    if total_area is None:
-        total_area = node.area
-
-    # Append children by size, largest to smallest.
-    def dump_child(child):
-        return dump_html_table(child, depth=depth+1,
-                               parent_area=node.area, total_area=total_area)
-
-    children = sorted(node.children, reverse=True, key=lambda n: n.area)
-    for line in [dump_child(c) for c in children]:
-        lines.extend(line)
-
-    if not depth:
-        lines.append('</table>')
-
-    return lines
-
-
-def build_tree(dataset):
-    """Builds a Node tree from a set of memgraph records.
-
-    See
-    https://fuchsia.googlesource.com/zircon/+/master/docs/memory.md#Visualize-memory-usage
-    for an example of generating memgraph JSON data.
-
-    Args:
-        dataset: A sequence of memgraph records, typically parsed from JSON
-    Returns:
-        The root of the new Node tree
-    """
-    ids_to_nodes.clear()  # Clear out the global registry.
-    root_node = None
-    root_job = None
-
-    for record in dataset:
-        record_type = record['type']
-        # Only read certain types.
-        if record_type not in ('kernel', 'j', 'p'):
-            continue
-        node = lookup(record['id'])
-        node.type = record_type
-        node.koid = record.get('koid', 0)
-        node.name = record['name']
-        if record_type == 'kernel':
-            node.area = record.get('size_bytes', 0)
-        elif record_type == 'j':
-            if record['parent'].startswith('kernel/'):
-                assert not root_job, 'Found multiple root jobs'
-                root_job = node
-        elif record_type == 'p':
-            # Add the process's children, which will determine its area.
-            populate_process(node, record)
-        if not record['parent']:
-            # The root node has an empty parent.
-            assert not root_node, 'Found multiple root objects'
-            root_node = node
-        else:
-            parent_node = lookup(record['parent'])
-            parent_node.children.append(node)
-
-    assert root_node, 'Did not find root object'
-    assert root_job, 'Did not find root job'
-
-    # A better name for physmem.
-    lookup('kernel/physmem').name = 'All physical memory'
-
-    # Sum up the job tree. Don't touch kernel entries, which already have
-    # the correct sizes.
-    sum_area(root_job)
-
-    # The root job is usually named "root";
-    # make it more clear that it's a job.
-    root_job.name = 'root job'
-
-    # Give users a hint that processes live in the VMO entry.
-    kvmo_node = lookup('kernel/vmo')
-    kvmo_node.name = 'VMOs/processes'
-
-    # Create a fake entry to cover the portion of kernel/vmo that isn't
-    # covered by the job tree.
-    node = lookup('kernel/vmo/unknown')
-    node.name = 'unknown (kernel & unmapped)'
-    node.area = kvmo_node.area - root_job.area
-    kvmo_node.children.append(node)
-
-    return root_node
-
-
-def print_html_document(root_node):
-    """Prints to stdout an HTML document that visualizes a Node tree.
-
-    Args:
-        root_node: The Node at the root of the tree to walk
-    """
-    html = '''\
-    <!DOCTYPE html>
-    <title>Memory usage</title>
-    <script>
-    var kTree = %(json)s
-    </script>
-    <link rel='stylesheet' href='%(css)s'>
-    <style>
-    body {
-      font-family: sans-serif;
-      font-size: 0.8em;
-      margin: 2ex 4ex;
-    }
-    h1 {
-      font-weight: normal;
-    }
-    #map {
-      width: 800px;
-      height: 600px;
-      position: relative;
-      cursor: pointer;
-      -webkit-user-select: none;
-    }
-    .treemap-node-type { font-weight: bold; }
-    /* Colorblind-safe colors from http://mkweb.bcgsc.ca/colorblind/ */
-    .treemap-node-type-k { color: black; }
-    .treemap-node-type-j { color: RGB(213, 94, 0); } /* Vermillion */
-    .treemap-node-type-p { color: RGB(0, 114, 178); } /* Blue */
-    .treemap-node-type-v { color: RGB(0, 158, 115); } /* Bluish green */
-    </style>
-
-    <h1>Memory usage</h1>
-
-    <p>Click on a box to zoom in.  Click on the outermost box to zoom out.</p>
-
-    <div id='map'></div>
-
-    <script src='%(js)s'></script>
-    <script>
-    var map = document.getElementById('map');
-    appendTreemap(map, kTree);
-    </script>
-
-    <ul style="list-style: none">
-    <li><span class="treemap-node-type treemap-node-type-k">K</span>: Kernel memory
-    <li><span class="treemap-node-type treemap-node-type-j">J</span>: Job
-    <li><span class="treemap-node-type treemap-node-type-p">P</span>: Process
-    <li><span class="treemap-node-type treemap-node-type-v">V</span>: VMO
-    <ul style="list-style: none">
-        <li> VMO names with <b>[<i>n</i>]</b> suffixes are aggregates of <i>n</i>
-             VMOs that have the same name.
-    </ul>
-    </ul>
-
-    <hr>
-    %(table)s
-    ''' % {
-            'json': json.dumps(build_webtreemap(root_node)),
-            'table': ' '.join(dump_html_table(root_node)),
-            'css': os.path.join(FUCHSIA_DIR, 'scripts', 'third_party', 'webtreemap', 'webtreemap.css'),
-            'js': os.path.join(FUCHSIA_DIR, 'scripts', 'third_party', 'webtreemap', 'webtreemap.js'),
-    }
-    print(textwrap.dedent(html))
-
-
-def main():
-    root_node = build_tree(json.load(sys.stdin))
-    print_html_document(root_node)
-
-
-if __name__ == '__main__':
-    main()
diff --git a/packages/README.md b/packages/README.md
deleted file mode 100644
index 176a11e..0000000
--- a/packages/README.md
+++ /dev/null
@@ -1,47 +0,0 @@
-# Build package and products tools
-
-This folder contains a set of utilities to manage products and build packages,
-i.e. files defined under `//<layer>/{packages,products}`.
-
-## verify_layer
-
-This tool verifies that a given layer's `packages/` and `products/` directories
-are properly organized. It checks that:
-
--   all files in the directories are JSON files;
--   all package files are valid according to
-    [the package schema][package-schema];
--   all product files are valid according to
-    [the product schema][product-schema];
--   all package subdirectories (except a few canonical ones for which it does
-    not make sense) have a file named `all` which contains all files in that
-    subdirectory;
--   all package files listed as imports are valid files;
--   the root directories contain a set of canonical files.
-
-The tool relies on a JSON validator commonly built as part of the Fuchsia build.
-The validator can be found at:
-
-```sh
-out/<build_type>/<host_toolchain>/json_validator
-```
-
-## visualize_hierarchy
-
-This tool generates a visualization of the package hierarchy for a given package
-file. The resulting graph file uses the [DOT format][dot-format].
-
-```sh
-python packages/visualize_hierarchy.py --package <topaz/packages/file> --output <graph.dot>
-```
-
-In order to generate an image file from the graph file, use the following
-command:
-
-```sh
-dot -Tpng <graph.dot> -o graph.png
-```
-
-[package-schema]: package_schema.json
-[product-schema]: product_schema.json
-[dot-format]: https://en.wikipedia.org/wiki/DOT_(graph_description_language)
diff --git a/packages/common.py b/packages/common.py
deleted file mode 100644
index 2701e81..0000000
--- a/packages/common.py
+++ /dev/null
@@ -1,19 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import json
-import os
-
-
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # packages
-    os.path.abspath(__file__))))
-
-
-def get_package_imports(package):
-    with open(os.path.join(FUCHSIA_ROOT, package), 'r') as package_file:
-        data = json.load(package_file)
-    return data.get('imports', [])
diff --git a/packages/package_schema.json b/packages/package_schema.json
deleted file mode 100644
index 628244d..0000000
--- a/packages/package_schema.json
+++ /dev/null
@@ -1,43 +0,0 @@
-{
-  "description": "Schema for a build package file",
-  "type": "object",
-  "properties": {
-    "imports": {
-      "description": "A list of paths to other build package files to include",
-      "type": "array",
-      "items": {
-        "type": "string"
-      }
-    },
-    "labels": {
-      "description": "A list of free-form GN labels to build; useful for e.g. host tools",
-      "type": "array",
-      "items": {
-        "$ref": "#/definitions/gnLabel"
-      }
-    },
-    "packages": {
-      "description": "A list of GN labels representing Fuchsia packages",
-      "type": "array",
-      "items": {
-        "$ref": "#/definitions/gnLabel"
-      }
-    },
-    "host_tests": {
-      "description": "A list of GN labels to build, each representing a host test",
-      "type": "array",
-      "items": {
-        "$ref": "#/definitions/gnLabel"
-      }
-    }
-  },
-  "minProperties": 1,
-  "additionalProperties": false,
-  "definitions": {
-    "gnLabel": {
-      "description": "An absolute GN label",
-      "type": "string",
-      "pattern": "^/(/[^/]+)+(:[^/]+)?(\\(/(/[^/]+)+(:[^/]+)?\\))?$"
-    }
-  }
-}
diff --git a/packages/product_schema.json b/packages/product_schema.json
deleted file mode 100644
index 0ad90fe..0000000
--- a/packages/product_schema.json
+++ /dev/null
@@ -1,29 +0,0 @@
-{
-  "description": "Schema for a build product file",
-  "type": "object",
-  "properties": {
-    "monolith": {
-      "description": "A list of paths to build package files to include in the OTA and image builds",
-      "type": "array",
-      "items": {
-        "type": "string"
-      }
-    },
-    "preinstall": {
-      "description": "A list of paths to build package files to preload in images (but not OTA)",
-      "type": "array",
-      "items": {
-        "type": "string"
-      }
-    },
-    "available": {
-      "description": "A list of paths to build package files to build for installation",
-      "type": "array",
-      "items": {
-        "type": "string"
-      }
-    }
-  },
-  "minProperties": 1,
-  "additionalProperties": false
-}
diff --git a/packages/verify_layer.py b/packages/verify_layer.py
deleted file mode 100755
index b476b56..0000000
--- a/packages/verify_layer.py
+++ /dev/null
@@ -1,192 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-from common import FUCHSIA_ROOT, get_package_imports
-import json
-import os
-import subprocess
-import sys
-
-
-SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
-
-# Standard names for root packages in a layer.
-ROOT_CANONICAL_PACKAGES = [
-    'buildbot',
-    'default',
-    'kitchen_sink',
-]
-
-REQUIRED_PRODUCTS = [
-    'default'
-]
-
-# Standard names for packages in a layer.
-CANONICAL_PACKAGES = [
-    'all',
-]
-
-# Directories which do not require aggregation.
-NO_AGGREGATION_DIRECTORIES = [
-    'config',
-    'disabled',
-    'products',
-]
-
-
-def check_json(packages):
-    '''Verifies that all files in the list are JSON files.'''
-    all_json = True
-    for package in packages:
-        with open(package, 'r') as file:
-            try:
-                json.load(file)
-            except ValueError:
-                all_json = False
-                print('Non-JSON file: %s' % package)
-    return all_json
-
-
-def check_schema(packages, validator, schema):
-    '''Verifies that all files adhere to the schema.'''
-    all_valid = True
-    for package in packages:
-        if subprocess.call([validator, schema, package]) != 0:
-            all_valid = False
-    return all_valid
-
-
-def check_deps_exist(dep_map):
-    '''Verifies that all dependencies exist.'''
-    all_exist = True
-    for (package, deps) in dep_map.iteritems():
-        for dep in deps:
-            if not os.path.isfile(dep):
-                all_exist = False
-                print('Dependency of %s does not exist: %s' % (package, dep))
-    return all_exist
-
-
-def check_all(directory, dep_map, layer, is_root=True):
-    '''Verifies that each directory contains an "all" package and that this
-       package lists all the files in the directory.
-       '''
-    for dirpath, dirnames, filenames in os.walk(directory):
-        dirnames = [d for d in dirnames if d not in NO_AGGREGATION_DIRECTORIES]
-        is_clean = True
-        for dir in dirnames:
-            subdir = os.path.join(dirpath, dir)
-            if not check_all(subdir, dep_map, layer, is_root=False):
-                is_clean = False
-        if not is_clean:
-            return False
-        all_package = os.path.join(dirpath, 'all')
-        if not os.path.isfile(all_package):
-            print('Directory does not contain an "all" package: %s' % dirpath)
-            return False
-        known_deps = dep_map[all_package]
-        has_all_files = True
-        def verify(package):
-            if package not in known_deps:
-                print('The "all" package %s does not import package %s' % (all_package, package))
-                return False
-            return True
-        for file in filenames:
-            if is_root and (file in ROOT_CANONICAL_PACKAGES or file == layer):
-                continue
-            if file in CANONICAL_PACKAGES or file == 'README.md':
-                continue
-            package = os.path.join(dirpath, file)
-            if not verify(package):
-                has_all_files = False
-        for dir in dirnames:
-            package = os.path.join(dirpath, dir, 'all')
-            if not verify(package):
-                has_all_files = False
-        return has_all_files
-
-
-def check_no_fuchsia_packages_in_all(packages):
-    allowed_keys = {'imports'}
-    all_clear = True
-    for package in [p for p in packages if os.path.basename(p) == 'all']:
-        with open(package, 'r') as file:
-            data = json.load(file)
-            keys = set(data.keys())
-            if not keys.issubset(allowed_keys):
-                all_clear = False
-                print('"all" should only contain imports: %s' % package)
-    return all_clear
-
-
-def check_root(base, layer):
-    '''Verifies that all canonical packages are present at the root.'''
-    all_there = True
-    for file in ROOT_CANONICAL_PACKAGES + [layer]:
-        if not os.path.isfile(os.path.join(base, file)):
-            all_there = False
-            print('Missing root package: %s' % file)
-    return all_there
-
-
-def main():
-    parser = argparse.ArgumentParser(
-            description=('Checks that packages in a given layer are properly '
-                         'formatted and organized'))
-    layer_group = parser.add_mutually_exclusive_group(required=True)
-    layer_group.add_argument('--layer',
-                             help='Name of the layer to analyze',
-                             choices=['garnet', 'peridot', 'topaz'])
-    layer_group.add_argument('--vendor-layer',
-                             help='Name of the vendor layer to analyze')
-    parser.add_argument('--json-validator',
-                        help='Path to the JSON validation tool',
-                        required=True)
-    args = parser.parse_args()
-
-    os.chdir(FUCHSIA_ROOT)
-    if args.layer:
-        layer = args.layer
-        packages_base = os.path.join(layer, 'packages')
-    else:
-        layer = args.vendor_layer
-        packages_base = os.path.join('vendor', layer, 'packages')
-
-    # List all packages files.
-    packages = []
-    for dirpath, dirnames, filenames in os.walk(packages_base):
-        packages.extend([os.path.join(dirpath, f) for f in filenames if f != 'README.md'])
-
-    if not check_json(packages):
-        return False
-
-    schema = os.path.join(SCRIPT_DIR, 'package_schema.json')
-    if not check_schema(packages, args.json_validator, schema):
-        return False
-
-    deps = dict([(p, get_package_imports(p)) for p in packages])
-
-    if not check_deps_exist(deps):
-        return False
-
-    if not check_all(packages_base, deps, layer):
-        return False
-
-    if not check_no_fuchsia_packages_in_all(packages):
-        return False
-
-    if not check_root(packages_base, layer):
-        return False
-
-    return True
-
-
-if __name__ == '__main__':
-    return_code = 0
-    if not main():
-        print('Errors!')
-        return_code = 1
-    sys.exit(return_code)
diff --git a/packages/visualize_hierarchy.py b/packages/visualize_hierarchy.py
deleted file mode 100755
index d353520..0000000
--- a/packages/visualize_hierarchy.py
+++ /dev/null
@@ -1,83 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-from common import FUCHSIA_ROOT, get_package_imports
-import json
-import os
-import sys
-
-
-def get_package_id(package):
-    return package.replace('/', '_').replace('-', '_')
-
-
-def get_package_nick(package):
-    parts = package.split('/')
-    base = parts.index('packages')
-    return '/'.join(parts[base+1:])
-
-
-def main():
-    parser = argparse.ArgumentParser(
-            description=('Creates a graph of a build package hierarchy'))
-    group = parser.add_mutually_exclusive_group(required=True)
-    group.add_argument('--product',
-                        help='Path to the build product file to analyze')
-    group.add_argument('--package',
-                        help='Path to the build package file to analyze')
-    parser.add_argument('--output',
-                        help='Path to the generated .dot file',
-                        required=True)
-    args = parser.parse_args()
-
-    packages = []
-    if args.product:
-        with open(args.product) as product_file:
-            data = json.load(product_file)
-            packages.extend(data.get('monolith', []))
-            packages.extend(data.get('preinstall', []))
-            packages.extend(data.get('available', []))
-    else:
-        packages = [args.package]
-
-    # Build the dependency tree of packages.
-    deps = {}
-    while packages:
-        current = packages.pop(0)
-        imports = get_package_imports(current)
-        deps[current] = imports
-        new_packages = [p for p in imports if p not in deps]
-        packages.extend(new_packages)
-
-    layers = {}
-    for package in deps:
-        parts = package.split('/')
-        try:
-            package_base = parts.index('packages')
-        except ValueError:
-            raise Exception('Unexpected directory structure for %s' % package)
-        layer = '/'.join(parts[0:package_base])
-        layers.setdefault(layer, []).append(package)
-
-    with open(args.output, 'w') as out:
-        out.write('digraph fuchsia {\n')
-        for index, pair in enumerate(layers.iteritems()):
-            layer, packages = pair
-            out.write('subgraph cluster_%s {\n' % index)
-            out.write('label="%s";\n' % layer)
-            for package in packages:
-                out.write('%s [label="%s"];\n' % (get_package_id(package),
-                                                  get_package_nick(package)))
-            for package in packages:
-                dep_ids = [get_package_id(d) for d in deps[package]]
-                out.write('%s -> { %s }\n' % (get_package_id(package),
-                                              ' '.join(dep_ids)))
-            out.write('}\n')
-        out.write('}\n')
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/paths.py b/paths.py
deleted file mode 100644
index 7ca6b1f..0000000
--- a/paths.py
+++ /dev/null
@@ -1,25 +0,0 @@
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import os.path
-import platform
-
-FUCHSIA_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
-ZIRCON_ROOT = os.path.join(FUCHSIA_ROOT, "zircon")
-BUILDTOOLS_ROOT = os.path.join(FUCHSIA_ROOT, "buildtools")
-FLUTTER_ROOT = os.path.join(FUCHSIA_ROOT, "lib", "flutter")
-
-# This variable is set by fx, so it is only set if we are running this script
-# within fx.
-if "ZIRCON_TOOLS_DIR" in os.environ:
-    ZIRCON_TOOLS_ROOT = os.path.dirname(os.environ['ZIRCON_TOOLS_DIR'])
-
-DART_PLATFORM = {
-    "Linux": "linux-x64",
-    "Darwin": "mac-x64",
-    "Windows": "win-x64"
-}[platform.system()]
-
-DART_ROOT = os.path.join(FUCHSIA_ROOT, "topaz", "tools", "prebuilt-dart-sdk",
-                         DART_PLATFORM)
diff --git a/run-dart-action.py b/run-dart-action.py
deleted file mode 100755
index e30bea9..0000000
--- a/run-dart-action.py
+++ /dev/null
@@ -1,169 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import json
-import multiprocessing
-import os
-import paths
-import Queue
-import subprocess
-import sys
-import threading
-
-
-def gn_describe(out, path):
-    gn = os.path.join(paths.FUCHSIA_ROOT, 'buildtools', 'gn')
-    data = subprocess.check_output(
-        [gn, 'desc', out, path, '--format=json'], cwd=paths.FUCHSIA_ROOT)
-    return json.loads(data)
-
-
-class WorkerThread(threading.Thread):
-    '''
-    A worker thread to run scripts from a queue and return exit codes and output
-    on a queue.
-    '''
-
-    def __init__(self, script_queue, result_queue, args):
-        threading.Thread.__init__(self)
-        self.script_queue = script_queue
-        self.result_queue = result_queue
-        self.args = args
-        self.daemon = True
-
-    def run(self):
-        while True:
-            try:
-                script = self.script_queue.get(False)
-            except Queue.Empty:
-                # No more scripts to run.
-                return
-            if not os.path.exists(script):
-                self.result_queue.put((script, -1, 'Script does not exist.'))
-                continue
-            job = subprocess.Popen(
-                [script] + self.args,
-                stdout=subprocess.PIPE,
-                stderr=subprocess.PIPE)
-            stdout, stderr = job.communicate()
-            self.result_queue.put((script, job.returncode, stdout + stderr))
-
-
-def main():
-    parser = argparse.ArgumentParser(
-        '''Run Dart actions (analysis, test, target-test) for Dart build
-targets. Extra flags will be passed to the supporting Dart tool if applicable.
-''')
-    parser.add_argument(
-        '--out',
-        help='Path to the base output directory, e.g. out/debug-x64',
-        required=True)
-    parser.add_argument(
-        '--tree',
-        help='Restrict analysis to a source subtree, e.g. //topaz/shell/*',
-        default='*')
-    parser.add_argument(
-        '--jobs', '-j',
-        help='Number of concurrent instances to run',
-        type=int,
-        default=multiprocessing.cpu_count())
-    parser.add_argument(
-        '--verbose', '-v',
-        help='Show output from tests that pass',
-        action='store_true')
-    parser.add_argument(
-        'action',
-        help='Action to perform on the targets',
-        choices=('analyze', 'test', 'target-test'))
-    args, extras = parser.parse_known_args()
-
-    if not os.path.isdir(os.path.join(paths.FUCHSIA_ROOT, args.out)):
-        print 'Invalid output directory: %s' % args.out
-        return 1
-
-    tree = args.tree
-    if args.action == 'analyze':
-        tree = '%s(//build/dart:dartlang)' % tree
-
-    # Ask gn about all the dart analyzer scripts.
-    scripts = []
-    targets = gn_describe(args.out, tree)
-    if not targets:
-        print 'No targets found...'
-        exit(1)
-
-    for target_name, properties in targets.items():
-        if args.action == 'analyze':
-          script_valid = (
-              'script' in properties and properties['script'] ==
-              '//build/dart/gen_analyzer_invocation.py'
-          )
-        elif args.action == 'test':
-          script_valid = (
-              'script' in properties and
-              properties['script'] == '//build/dart/gen_test_invocation.py'
-          )
-        else:  # 'target-test'
-          script_valid = (
-              'script' in properties and
-              properties['script'] ==
-              '//build/dart/gen_remote_test_invocation.py'
-          )
-        if ('type' not in properties or
-                properties['type'] != 'action' or
-                'script' not in properties or
-                not script_valid or
-                'outputs' not in properties or
-                not len(properties['outputs'])):
-            continue
-        script_path = properties['outputs'][0]
-        script_path = script_path[2:]  # Remove the leading //
-        scripts.append(os.path.join(paths.FUCHSIA_ROOT, script_path))
-
-    # Put all the analyzer scripts in a queue that workers will work from
-    script_queue = Queue.Queue()
-    for script in scripts:
-        script_queue.put(script)
-    # Make a queue to receive results from workers.
-    result_queue = Queue.Queue()
-    # Track return codes from scripts.
-    script_results = []
-    failed_scripts = []
-
-    # Create a worker thread for each CPU on the machine.
-    for i in range(args.jobs):
-        WorkerThread(script_queue, result_queue, extras).start()
-
-    def print_progress():
-        sys.stdout.write('\rProgress: %d/%d\033[K' % (len(script_results),
-                                                      len(scripts)))
-        sys.stdout.flush()
-
-    print_progress()
-
-    # Handle results from workers.
-    while len(script_results) < len(scripts):
-        script, returncode, output = result_queue.get(True)
-        script_results.append(returncode)
-        print_progress()
-        if returncode != 0:
-            failed_scripts.append(script)
-        if args.verbose or returncode != 0:
-            print '\r----------------------------------------------------------'
-            print script
-            print output
-
-    print ''
-    if len(failed_scripts):
-        failed_scripts.sort()
-        print 'Failures in:'
-        for script in failed_scripts:
-            print '  %s' % script
-        exit(1)
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/run-zircon-arm64 b/run-zircon-arm64
deleted file mode 100755
index 6492034..0000000
--- a/run-zircon-arm64
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/usr/bin/env bash
-
-# Copyright 2016 The Fuchsia Authors
-#
-# Use of this source code is governed by a MIT-style
-# license that can be found in the LICENSE file or at
-# https://opensource.org/licenses/MIT
-
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-exec "$DIR/../zircon/scripts/run-zircon" -a arm64 \
-    -o "$DIR/../out/build-zircon/build-arm64" "$@"
diff --git a/run-zircon-x86 b/run-zircon-x86
deleted file mode 100755
index d26fe5d..0000000
--- a/run-zircon-x86
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/usr/bin/env bash
-
-# Copyright 2016 The Fuchsia Authors
-#
-# Use of this source code is governed by a MIT-style
-# license that can be found in the LICENSE file or at
-# https://opensource.org/licenses/MIT
-
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-exec "$DIR/../zircon/scripts/run-zircon" -a x64 \
-    -o "$DIR/../out/build-zircon/build-x64" "$@"
diff --git a/rust/build_cargo_vendor.sh b/rust/build_cargo_vendor.sh
deleted file mode 100755
index 27c0fae..0000000
--- a/rust/build_cargo_vendor.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# NOTE: building cargo-vendor manually is currently necessary as cargo-vendor
-# cannot be built from sources in the Fuchsia tree AND cannot be installed via
-# "cargo install"...
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly ROOT_DIR="$(dirname "$(dirname "${SCRIPT_DIR}")")"
-
-if [[ "$(uname -s)" = "Darwin" ]]; then
-  readonly PLATFORM="mac-x64"
-else
-  readonly PLATFORM="linux-x64"
-fi
-readonly RUST_BASE="$ROOT_DIR/buildtools/$PLATFORM/rust"
-readonly CARGO="$RUST_BASE/bin/cargo"
-
-if ! command -v cmake >/dev/null 2>&1
-then
-  echo "cmake not found, aborting"
-  exit 1
-fi
-
-export RUSTC="$RUST_BASE/bin/rustc"
-export CARGO_TARGET_DIR="$ROOT_DIR/out/cargo-vendor"
-
-mkdir -p "$CARGO_TARGET_DIR"
-cd "$ROOT_DIR/third_party/rust-mirrors/cargo-vendor"
-$CARGO build
diff --git a/rust/build_toolchain.py b/rust/build_toolchain.py
deleted file mode 100755
index 42ba17f..0000000
--- a/rust/build_toolchain.py
+++ /dev/null
@@ -1,164 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import subprocess
-import sys
-
-
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # rust
-    os.path.abspath(__file__))))
-
-BUILD_CONFIG = '''
-[llvm]
-optimize = true
-static-libstdcpp = true
-ninja = true
-targets = "X86;AArch64"
-
-[build]
-target = ["{target}-fuchsia"]
-docs = false
-extended = true
-openssl-static = true
-
-[install]
-prefix = "{prefix}"
-sysconfdir = "etc"
-
-[rust]
-optimize = true
-
-[target.{target}-fuchsia]
-cc = "{cc}"
-cxx = "{cxx}"
-ar = "{ar}"
-linker = "{cc}"
-
-[dist]
-'''
-
-CARGO_CONFIG = '''
-[target.{target}-fuchsia]
-linker = "{linker}"
-ar = "{ar}"
-rustflags = [
-    "-C", "link-arg=--target={target}-fuchsia",
-    "-C", "link-arg=--sysroot={sysroot}",
-    "-C", "link-arg=-L{shared_libs_root}",
-]
-'''
-
-def ensure_dir(dir):
-    if not os.path.exists(dir):
-        os.makedirs(dir)
-    return dir
-
-def main():
-    parser = argparse.ArgumentParser(description='Build a Rust toolchain for Fuchsia')
-    parser.add_argument(
-            '--rust-root',
-            help='root directory of Rust checkout',
-            required=True)
-    parser.add_argument(
-            '--sysroot',
-            help='zircon sysroot (possibly //out/release-x64/sdk/exported/zircon_sysroot/sysroot)',
-            required=True)
-    parser.add_argument(
-            '--shared-libs-root',
-            help='shared libs root (possibly //out/release-x64/x64-shared)',
-            required=True)
-    parser.add_argument(
-            '--host-os',
-            help='host operating system',
-            choices=['linux', 'mac'],
-            default='linux')
-    parser.add_argument(
-            '--target',
-            help='target architecture',
-            choices=['x86_64', 'aarch64'],
-            default='x86_64')
-    parser.add_argument(
-            '--staging-dir',
-            help='directory in which to stage Rust build configuration artifacts',
-            default='/tmp/fuchsia_rustc_staging')
-    parser.add_argument(
-            '--debug',
-            help='turn on debug mode, with extra logs',
-            action='store_true')
-    args = parser.parse_args()
-
-    rust_root = os.path.abspath(args.rust_root)
-    sysroot = os.path.abspath(args.sysroot)
-    shared_libs_root = os.path.abspath(args.shared_libs_root)
-    host_os = args.host_os
-    target = args.target
-    staging_dir = os.path.abspath(args.staging_dir)
-    debug = args.debug
-
-    build_dir = ensure_dir(os.path.join(staging_dir, 'build'))
-    toolchain_dir = ensure_dir(os.path.join(staging_dir, 'toolchain'))
-    clang_dir = os.path.join(FUCHSIA_ROOT, 'buildtools', host_os + '-x64',
-                             'clang')
-
-    config_file = os.path.join(build_dir, 'config.toml')
-    with open(config_file, 'w') as file:
-        file.write(BUILD_CONFIG.format(
-            target=target,
-            prefix=toolchain_dir,
-            cc=os.path.join(clang_dir, 'bin', 'clang'),
-            cxx=os.path.join(clang_dir, 'bin', 'clang++'),
-            ar=os.path.join(clang_dir, 'bin', 'llvm-ar'),
-        ))
-
-    cargo_dir = ensure_dir(os.path.join(staging_dir, '.cargo'))
-    with open(os.path.join(cargo_dir, 'config'), 'w') as file:
-        file.write(CARGO_CONFIG.format(
-            target=target,
-            linker=os.path.join(clang_dir, 'bin', 'clang'),
-            ar=os.path.join(clang_dir, 'bin', 'llvm-ar'),
-            sysroot=sysroot,
-            shared_libs_root=shared_libs_root,
-        ))
-
-    cflags_key = 'CFLAGS_%s-fuchsia' % target
-    cflags_val = '--target=%s-fuchsia --sysroot=%s' % (target, sysroot)
-
-    env = {
-        'CARGO_HOME': cargo_dir,
-        cflags_key: cflags_val,
-        'PATH': os.environ['PATH'],
-        'RUST_BACKTRACE': '1',
-    }
-
-    def run_build_command(command):
-        command_args = [
-            os.path.join(rust_root, 'x.py'),
-        ]
-        command_args += command
-        command_args += [
-            '--config',
-            config_file,
-            '--src',
-            rust_root,
-        ]
-        if debug:
-            command_args.append('--verbose')
-        print('Running: %s' % ' '.join(command_args))
-        # The builds need to run from a subdirectory of the staging dir
-        # otherwise the cargo config set up above will get clobbered by x.py.
-        out_dir = ensure_dir(os.path.join(staging_dir, 'out'))
-        subprocess.check_call(command_args, env=env, cwd=out_dir)
-
-    run_build_command(['install'])
-
-    print('The toolchain is ready at: %s' % toolchain_dir)
-
-
-if __name__ == '__main__':
-    main()
diff --git a/rust/check_rust_licenses.py b/rust/check_rust_licenses.py
deleted file mode 100755
index 1f4d0fb..0000000
--- a/rust/check_rust_licenses.py
+++ /dev/null
@@ -1,117 +0,0 @@
-#!/usr/bin/env python
-#
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# A script to fetch missing LICENSE files when building a vendored repo
-
-# Should be run from top-level of third_party/rust-crates repository.
-
-import argparse
-import os
-import re
-import sys
-import urllib2
-
-repo_re = re.compile('\s*repository\s*=\s*"(.*)"\s*$')
-
-def die(reason):
-    raise Exception(reason)
-
-def get_repo_path(subdir):
-    for line in open(os.path.join(subdir, 'Cargo.toml')):
-        m = repo_re.match(line)
-        if m:
-            return m.group(1)
-
-def find_github_blob_path(path):
-    s = path.split('/')
-    if s[2] == 'github.com':
-        s[2] = 'raw.githubusercontent.com'
-        # github redirects "github.com/$USER/$PROJECT.git" to
-        # "github.com/$USER/$PROJECT".
-        if s[4].endswith('.git'):
-          s[4] = s[4][:-len('.git')]
-    else:
-        die('don\'t know raw content path for ' + path)
-    if s[-1] == '':
-        del s[-1]
-    if len(s) >= 6 and s[5] == 'tree':
-        del s[5]
-    else:
-        s.append('master')
-    return '/'.join(s)
-
-def fetch_license(subdir):
-    repo_path = get_repo_path(subdir)
-    if repo_path is None:
-        die('can\'t find repo path for ' + subdir)
-    baseurl = find_github_blob_path(repo_path)
-    text = []
-    # 'LICENCE' is a British English spelling variant used in https://github.com/ebarnard/rust-plist
-    for license_filename in ('LICENSE', 'LICENSE-APACHE', 'LICENSE-MIT', 'COPYING', 'LICENCE', 'LICENSE.md'):
-        url = '/'.join((baseurl, license_filename))
-        try:
-            resp = urllib2.urlopen(url)
-            contents = resp.read()
-            if text: text.append('=' * 40 + '\n')
-            text.append(url + ':\n\n')
-            text.append(contents)
-        except urllib2.HTTPError:
-            pass
-    if not text:
-        die('no licenses found under ' + baseurl)
-    else:
-        license_out = open(os.path.join(subdir, 'LICENSE'), 'w')
-        license_out.write(''.join(text))
-
-
-def check_licenses(directory, verify=False):
-    success = True
-    os.chdir(directory)
-    for subdir in sorted(os.listdir(os.getcwd())):
-        # TODO(pylaligand): remove this temporary hack when a new version of
-        # the crate is published.
-        if (subdir.startswith('magenta-sys') or
-                subdir.startswith('fuchsia-zircon-sys')):
-            print 'IGNORED  %s' % subdir
-            continue
-        if subdir.startswith('.') or not os.path.isdir(subdir):
-            continue
-        license_files = [file for file in os.listdir(subdir)
-                         if file.startswith('LICENSE') or
-                         file.startswith('LICENCE') or
-                         file.startswith('license')]
-        if license_files:
-            print 'OK       %s' % subdir
-            continue
-        if verify:
-            print 'MISSING  %s' % subdir
-            success = False
-            continue
-        try:
-            fetch_license(subdir)
-            print 'FETCH    %s' % subdir
-        except Exception as err:
-            print 'ERROR    %s (%s)' % (subdir, err)
-            success = False
-    return success
-
-
-def main():
-    parser = argparse.ArgumentParser(
-        'Verifies licenses for third-party Rust crates')
-    parser.add_argument('--directory',
-                        help='Directory containing the crates',
-                        default=os.getcwd())
-    parser.add_argument('--verify',
-                        help='Simply check whether licenses are up-to-date',
-                        action='store_true')
-    args = parser.parse_args()
-    if not check_licenses(args.directory, verify=args.verify):
-        sys.exit(1)
-
-
-if __name__ == '__main__':
-    main()
diff --git a/rust/install_cargo_vendor.sh b/rust/install_cargo_vendor.sh
deleted file mode 100755
index 01e34bf..0000000
--- a/rust/install_cargo_vendor.sh
+++ /dev/null
@@ -1,23 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# NOTE: installing cargo-vendor manually is currently necessary as cargo-vendor
-# cannot be built from sources in the Fuchsia tree.
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly ROOT_DIR="$(dirname "$(dirname "${SCRIPT_DIR}")")"
-
-if [[ "$(uname -s)" = "Darwin" ]]; then
-  readonly TRIPLE="x86_64-apple-darwin"
-else
-  readonly TRIPLE="x86_64-unknown-linux-gnu"
-fi
-readonly RUST_BASE="$ROOT_DIR/buildtools/rust/rust-$TRIPLE"
-readonly CARGO="$RUST_BASE/bin/cargo"
-
-export PATH="$PATH:$ROOT_DIR/buildtools/cmake/bin"
-export RUSTC="$RUST_BASE/bin/rustc"
-
-$CARGO install cargo-vendor
diff --git a/rust/rustdoc_no_ld_library_path.sh b/rust/rustdoc_no_ld_library_path.sh
deleted file mode 100755
index bb38e23..0000000
--- a/rust/rustdoc_no_ld_library_path.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# This script is a thin wrapper for rustdoc that unsets LD_LIBRARY_PATH
-# to avoid the incorrect env provided by `cargo doc`.
-#
-# TODO(cramertj) remove pending fix to our builds of cargo doc to prevent this
-
-readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-readonly ROOT_DIR="$(dirname "$(dirname "${SCRIPT_DIR}")")"
-
-if [[ "$(uname -s)" = "Darwin" ]]; then
-  readonly PLATFORM="mac-x64"
-else
-  readonly PLATFORM="linux-x64"
-fi
-
-unset LD_LIBRARY_PATH
-"$ROOT_DIR/buildtools/$PLATFORM/rust/bin/rustdoc" --cap-lints=allow "$@"
diff --git a/sdk/MAINTAINERS b/sdk/MAINTAINERS
deleted file mode 100644
index 0b7024a..0000000
--- a/sdk/MAINTAINERS
+++ /dev/null
@@ -1,2 +0,0 @@
-alainv@google.com
-pylaligand@google.com
diff --git a/sdk/README.md b/sdk/README.md
deleted file mode 100644
index afec54a..0000000
--- a/sdk/README.md
+++ /dev/null
@@ -1,8 +0,0 @@
-SDK frontends
-=============
-
-This directory contains frontends to the SDK pipeline:
-- [`bazel/`](bazel): creates a C/C++/Dart/Flutter Bazel workspace.
-
-In addition, the `common/` directory provides plumbing shared by all frontends,
-and `tools/` contains various tools to work with SDK manifests.
diff --git a/sdk/bazel/README.md b/sdk/bazel/README.md
deleted file mode 100644
index fda925d..0000000
--- a/sdk/bazel/README.md
+++ /dev/null
@@ -1,152 +0,0 @@
-# Bazel SDK
-
-The Bazel SDK frontend produces a [Bazel](https://bazel.build/) workspace.
-
-## Directory structure
-
-- `generate.py`: the script that generates the SDK;
-- `templates`: Mako templates used to produce various SDK files;
-- `base`: SDK contents that are copied verbatim;
-- `tests`: various SDK tests, copied over to the test workspace.
-
-## Output layout
-
-```
-$root/
-    tools/                                 # host tools
-    dart/                                  # Dart packages
-        lorem/
-            BUILD
-            lib/
-    pkg/                                   # C++ package contents
-        foo/
-            BUILD                          # generated Bazel build file for this package
-            include/                       # headers
-            arch/                          # target-dependent prebuilts
-                x64/
-                    lib/
-                        libfoo.so          # ABI only, to link against
-                    dist/
-                        libfoo.so          # to include in Fuchsia packages
-                    debug/
-                        libfoo.so          # unstripped version
-                arm64/
-                    lib/
-                    dist/
-                    debug/
-        bar/
-            include/
-            src/                           # sources for a C++ library
-            BUILD
-    arch/
-        x64/
-            sysroot/                       # x64 sysroot (libc, libzircon, and friends)
-        arm64/
-            sysroot/                       # arm64 sysroot
-```
-
-## Generating
-
-In order to generate a Bazel workspace, point the `generate.py` script to an
-SDK archive, e.g.:
-```
-$ scripts/sdk/bazel/generate.py \
-    --archive my_sdk_archive.tar.gz \
-    --output my_workspace/
-```
-
-## Testing
-
-The `generate.py` script optionally creates a workspace for testing the
-generated SDK:
-```
-$ scripts/sdk/bazel/generate.py \
-    --archive my_sdk_archive.tar.gz \
-    --output my_workspace/ \
-    --tests my_test_workspace/
-```
-
-Tests are then run with:
-```
-$ my_test_workspace/run.py
-```
-
-To exclude a target from the suite, mark it as ignored with:
-```
-my_rule(
-    name = "foobar",
-    ...
-    tags = [
-        "ignored",
-    ],
-)
-```
-To force-build ignored targets, use the `--ignored` flag.
-
-The test runner also builds targets in the SDK itself. To bypass this step, use
-the `--no-sdk` flag.
-
-## Consuming
-
-### C++
-
-The produced Bazel SDK can be consumed by adding these lines to a Bazel
-`WORKSPACE`:
-
-```
-http_archive(
-  name = "fuchsia_sdk",
-  url = "<FUCHSIA_SDK_URL>",
-)
-
-load("@fuchsia_sdk//build_defs:fuchsia_setup.bzl", "fuchsia_setup")
-fuchsia_setup(with_toolchain = True)
-```
-
-This adds the Fuchsia SDK to the workspace and sets up the necessary toolchains
-for cross compilation.
-
-To reference the toolchains, add this to the .bazelrc file:
-
-```
-build:fuchsia --crosstool_top=@fuchsia_crosstool//:toolchain
-build:fuchsia --cpu=x86_64
-build:fuchsia --host_crosstool_top=@bazel_tools//tools/cpp:toolchain
-```
-
-Targets can then be built for Fuchsia with:
-
-```
-$ bazel build --config=fuchsia //...
-```
-
-### Dart & Flutter
-
-To build Dart & Flutter packages using the Bazel SDK, add these lines to the
-Bazel `WORKSPACE`:
-
-```
-http_archive(
-  name = "fuchsia_sdk",
-  url = "<FUCHSIA_SDK_URL>",
-)
-
-load("@fuchsia_sdk//build_defs:fuchsia_setup.bzl", "fuchsia_setup")
-fuchsia_setup(with_toolchain = False)
-
-http_archive(
-  name = "io_bazel_rules_dart",
-  url = "https://github.com/dart-lang/rules_dart/archive/master.zip",
-  strip_prefix = "rules_dart-master",
-)
-
-load("@io_bazel_rules_dart//dart/build_rules:repositories.bzl", "dart_repositories")
-dart_repositories()
-
-load("@fuchsia_sdk//build_defs:setup_dart.bzl", "setup_dart")
-setup_dart()
-
-load("@fuchsia_sdk//build_defs:setup_flutter.bzl", "setup_flutter")
-setup_flutter()
-```
diff --git a/sdk/bazel/base/cc/build_defs/cc_binary_component.bzl b/sdk/bazel/base/cc/build_defs/cc_binary_component.bzl
deleted file mode 100644
index 156f665..0000000
--- a/sdk/bazel/base/cc/build_defs/cc_binary_component.bzl
+++ /dev/null
@@ -1,100 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":package_info.bzl", "get_aggregate_info", "PackageGeneratedInfo", "PackageComponentInfo")
-
-"""
-Makes a cc_binary ready for inclusion in a fuchsia_package.
-
-Args:
-    name: The name of this build target.
-    deps: The cc_binary or cc_test target to include in the Fuchsia package.
-    **kwargs: Additional arguments, such as testonly.
-"""
-
-def _cc_contents_impl(target, context):
-    if context.rule.kind not in ["cc_binary", "cc_test"]:
-        return [
-            PackageGeneratedInfo(mappings = []),
-        ]
-    mappings = {}
-    for file in target[DefaultInfo].files.to_list():
-        if file.extension == "":
-            mappings["bin/" + file.basename] = file
-        elif file.extension == "so":
-            mappings["lib/" + file.basename] = file
-    return [
-        PackageGeneratedInfo(mappings = mappings.items()),
-    ]
-
-# This aspect looks for cc_binary targets in the dependency tree of the given
-# target. For each of these targets, it then generates package content mappings.
-_cc_contents_aspect = aspect(
-    implementation = _cc_contents_impl,
-    attr_aspects = [
-        "data",
-        "deps",
-        "srcs",
-    ],
-    provides = [
-        PackageGeneratedInfo,
-    ],
-)
-
-def _cc_binary_component_impl(context):
-    if len(context.attr.deps) != 1:
-        fail("'deps' attribute must have exactly one element.", "deps")
-    return [
-        PackageComponentInfo(
-            name = context.attr.component_name,
-            manifest = context.file.manifest,
-        ),
-    ]
-
-_cc_binary_component = rule(
-    implementation = _cc_binary_component_impl,
-    attrs = {
-        "deps": attr.label_list(
-            doc = "The cc_binary for the component",
-            mandatory = True,
-            allow_empty = False,
-            allow_files = False,
-            aspects = [_cc_contents_aspect],
-        ),
-        "component_name": attr.string(
-            doc = "The name of the component",
-            mandatory = True,
-        ),
-        "manifest": attr.label(
-            doc = "The component's manifest file (.cmx)",
-            mandatory = True,
-            allow_single_file = True,
-        )
-    },
-    provides = [PackageComponentInfo],
-)
-
-def cc_binary_component(name, deps, component_name, manifest, **kwargs):
-    packaged_name = name + "_packaged"
-
-    _cc_binary_component(
-        name = packaged_name,
-        deps = deps,
-        component_name = component_name,
-        manifest = manifest,
-        **kwargs
-    )
-
-    # The filegroup is needed so that the packaging can properly crawl all the
-    # dependencies and look for package contents.
-    native.filegroup(
-        name = name,
-        srcs = [
-            ":" + packaged_name,
-            Label("//build_defs/toolchain:dist"),
-            Label("//pkg/fdio"),
-            Label("//pkg/sysroot"),
-        ],
-        **kwargs
-    )
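A hypothetical BUILD usage of this macro (target and file names are illustrative, not from the original):

```
load("//build_defs:cc_binary_component.bzl", "cc_binary_component")

cc_binary_component(
    name = "hello_component",
    deps = [":hello_bin"],       # exactly one cc_binary or cc_test
    component_name = "hello",
    manifest = "meta/hello.cmx",
)
```

The macro wraps the component rule in a filegroup precisely so that the packaging aspect can crawl its dependencies for package contents.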
diff --git a/sdk/bazel/base/cc/build_defs/cc_fidl_library.bzl b/sdk/bazel/base/cc/build_defs/cc_fidl_library.bzl
deleted file mode 100644
index b103200..0000000
--- a/sdk/bazel/base/cc/build_defs/cc_fidl_library.bzl
+++ /dev/null
@@ -1,132 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":fidl_library.bzl", "FidlLibraryInfo")
-
-# A cc_library backed by a FIDL library.
-#
-# Parameters
-#
-#   library
-#     Label of the FIDL library.
-
-CodegenInfo = provider(fields=["impl"])
-
-def _codegen_impl(context):
-    ir = context.attr.library[FidlLibraryInfo].ir
-    name = context.attr.library[FidlLibraryInfo].name
-
-    base_path = context.attr.name + ".cc"
-    # This declaration is needed in order to get access to the full path.
-    output = context.actions.declare_directory(base_path)
-    stem = base_path + "/" + name.replace(".", "/") + "/cpp/fidl"
-    header = context.actions.declare_file(stem + ".h")
-    source = context.actions.declare_file(stem + ".cc")
-
-    context.actions.run(
-        executable = context.executable._fidlgen,
-        arguments = [
-            "--json",
-            ir.path,
-            "--output-base",
-            header.dirname + "/fidl",
-            "--include-base",
-            output.path,
-            "--generators",
-            "cpp",
-        ],
-        inputs = [
-            ir,
-        ],
-        outputs = [
-            header,
-            output,
-            source,
-        ],
-        mnemonic = "FidlGenCc",
-    )
-
-    return [
-        CodegenInfo(impl = source),
-        DefaultInfo(files = depset([header]))
-    ]
-
-def _impl_wrapper_impl(context):
-    file = context.attr.codegen[CodegenInfo].impl
-    return [DefaultInfo(files = depset([file]))]
-
-# Runs fidlgen to produce both the header file and the implementation file.
-# Only exposes the header as a source, as the two files need to be consumed by
-# the cc_library as two separate rules.
-_codegen = rule(
-    implementation = _codegen_impl,
-    # Files must be generated in genfiles in order for the header to be included
-    # anywhere.
-    output_to_genfiles = True,
-    attrs = {
-        "library": attr.label(
-            doc = "The FIDL library to generate code for",
-            mandatory = True,
-            allow_files = False,
-            providers = [FidlLibraryInfo],
-        ),
-        "_fidlgen": attr.label(
-            default = Label("//tools:fidlgen"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-    }
-)
-
-# Simply declares the implementation file generated by the codegen target as an
-# output.
-# This allows the implementation file to be exposed as a source in its own rule.
-_impl_wrapper = rule(
-    implementation = _impl_wrapper_impl,
-    output_to_genfiles = True,
-    attrs = {
-        "codegen": attr.label(
-            doc = "The codegen rules generating the implementation file",
-            mandatory = True,
-            allow_files = False,
-            providers = [CodegenInfo],
-        ),
-    }
-)
-
-def cc_fidl_library(name, library, deps=[], tags=[], visibility=None):
-    gen_name = "%s_codegen" % name
-    impl_name = "%s_impl" % name
-
-    _codegen(
-        name = gen_name,
-        library = library,
-    )
-
-    _impl_wrapper(
-        name = impl_name,
-        codegen = ":%s" % gen_name,
-    )
-
-    native.cc_library(
-        name = name,
-        hdrs = [
-            ":%s" % gen_name,
-        ],
-        srcs = [
-            ":%s" % impl_name,
-            # For the coding tables.
-            library,
-        ],
-        includes = [
-            # This is necessary in order to locate generated headers.
-            gen_name + ".cc",
-        ],
-        deps = deps + [
-            Label("//pkg/fidl_cpp"),
-        ],
-        tags = tags,
-        visibility = visibility,
-    )
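A hypothetical BUILD usage of the macro above (labels are illustrative):

```
load("//build_defs:cc_fidl_library.bzl", "cc_fidl_library")

cc_fidl_library(
    name = "echo_cc",
    library = ":echo_fidl",  # a fidl_library target
)
```

The macro expands into three targets: `echo_cc_codegen` (the generated header), `echo_cc_impl` (the generated implementation), and the final `cc_library` named `echo_cc`.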
diff --git a/sdk/bazel/base/cc/build_defs/toolchain/dist.bzl b/sdk/bazel/base/cc/build_defs/toolchain/dist.bzl
deleted file mode 100644
index 1913437..0000000
--- a/sdk/bazel/base/cc/build_defs/toolchain/dist.bzl
+++ /dev/null
@@ -1,24 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("//build_defs:package_info.bzl", "PackageLocalInfo")
-
-def _toolchain_dist_impl(context):
-    mappings = {}
-    for file in context.attr.files[DefaultInfo].files.to_list():
-        mappings["lib/" + file.basename] = file
-    return [
-        PackageLocalInfo(mappings = mappings.items()),
-    ]
-
-toolchain_dist = rule(
-    implementation = _toolchain_dist_impl,
-    attrs = {
-        "files": attr.label(
-            doc = "The filegroup target listing the toolchain libraries to include in packages",
-            mandatory = True,
-            allow_files = False,
-        ),
-    },
-)
diff --git a/sdk/bazel/base/common/WORKSPACE b/sdk/bazel/base/common/WORKSPACE
deleted file mode 100644
index de784fc..0000000
--- a/sdk/bazel/base/common/WORKSPACE
+++ /dev/null
@@ -1,3 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
diff --git a/sdk/bazel/base/common/build_defs/BUILD b/sdk/bazel/base/common/build_defs/BUILD
deleted file mode 100644
index 41860ad..0000000
--- a/sdk/bazel/base/common/build_defs/BUILD
+++ /dev/null
@@ -1,9 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-licenses(["notice"])
-
-exports_files(
-    glob(["*"]),
-)
diff --git a/sdk/bazel/base/common/build_defs/fidl_library.bzl b/sdk/bazel/base/common/build_defs/fidl_library.bzl
deleted file mode 100644
index 373fbe6..0000000
--- a/sdk/bazel/base/common/build_defs/fidl_library.bzl
+++ /dev/null
@@ -1,114 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# A FIDL library.
-#
-# Parameters
-#
-#   library
-#     Name of the FIDL library.
-#
-#   srcs
-#     List of source files.
-#
-#   deps
-#     List of labels for FIDL libraries this library depends on.
-
-FidlLibraryInfo = provider(
-    fields = {
-        # TODO(pylaligand): this should be a depset.
-        "info": "List of structs(name, files) representing the library's dependencies",
-        "name": "Name of the FIDL library",
-        "ir": "Path to the JSON file with the library's intermediate representation",
-    },
-)
-
-def _gather_dependencies(deps):
-    info = []
-    libs_added = []
-    for dep in deps:
-        for lib in dep[FidlLibraryInfo].info:
-            name = lib.name
-            if name in libs_added:
-                continue
-            libs_added.append(name)
-            info.append(lib)
-    return info
-
-def _fidl_library_impl(context):
-    ir = context.outputs.ir
-    tables = context.outputs.coding_tables
-    library_name = context.attr.library
-
-    info = _gather_dependencies(context.attr.deps)
-    info.append(struct(
-        name = library_name,
-        files = context.files.srcs,
-    ))
-
-    files_argument = []
-    inputs = []
-    for lib in info:
-        files_argument += ["--files"] + [f.path for f in lib.files]
-        inputs.extend(lib.files)
-
-    context.actions.run(
-        executable = context.executable._fidlc,
-        arguments = [
-            "--json",
-            ir.path,
-            "--name",
-            library_name,
-            "--tables",
-            tables.path,
-        ] + files_argument,
-        inputs = inputs,
-        outputs = [
-            ir,
-            tables,
-        ],
-        mnemonic = "Fidlc",
-    )
-
-    return [
-        # Exposing the coding tables here so that the target can be consumed as a
-        # C++ source.
-        DefaultInfo(files = depset([tables])),
-        # Passing library info for dependent libraries.
-        FidlLibraryInfo(info=info, name=library_name, ir=ir),
-    ]
-
-fidl_library = rule(
-    implementation = _fidl_library_impl,
-    attrs = {
-        "library": attr.string(
-            doc = "The name of the FIDL library",
-            mandatory = True,
-        ),
-        "srcs": attr.label_list(
-            doc = "The list of .fidl source files",
-            mandatory = True,
-            allow_files = True,
-            allow_empty = False,
-        ),
-        "deps": attr.label_list(
-            doc = "The list of libraries this library depends on",
-            mandatory = False,
-            providers = [FidlLibraryInfo],
-        ),
-        "_fidlc": attr.label(
-            default = Label("//tools:fidlc"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-    },
-    outputs = {
-        # The intermediate representation of the library, to be consumed by bindings
-        # generators.
-        "ir": "%{name}_ir.json",
-        # The C coding tables.
-        "coding_tables": "%{name}_tables.cc",
-    },
-)
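A hypothetical BUILD usage of this rule (library name and source files are illustrative):

```
load("//build_defs:fidl_library.bzl", "fidl_library")

fidl_library(
    name = "echo_fidl",
    library = "fidl.examples.echo",
    srcs = ["echo.fidl"],
)
```

Per the rule's `outputs`, this produces `echo_fidl_ir.json` (the intermediate representation) and `echo_fidl_tables.cc` (the C coding tables).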
diff --git a/sdk/bazel/base/common/build_defs/fuchsia_select.bzl b/sdk/bazel/base/common/build_defs/fuchsia_select.bzl
deleted file mode 100644
index 664d298..0000000
--- a/sdk/bazel/base/common/build_defs/fuchsia_select.bzl
+++ /dev/null
@@ -1,27 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Description:
-#   A variation of select() that prints a meaningful error message if
-#   --config=fuchsia is absent. This avoids cryptic build errors about
-#   missing attribute values.
-
-_ERROR = """
-***********************************************************
-* You have to specify a config in order to build Fuchsia. *
-*                                                         *
-* For example: --config=fuchsia.                          *
-***********************************************************
-"""
-
-def fuchsia_select(configs):
-  """ select() variant that prints a meaningful error.
-
-  Args:
-    configs: A dict of config name-value pairs.
-
-  Returns:
-    Selected attribute value depending on the config.
-  """
-  return select(configs, no_match_error = _ERROR)
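A hypothetical use of `fuchsia_select` in a BUILD file (the config setting label and file names are illustrative):

```
cc_library(
    name = "port",
    srcs = fuchsia_select({
        "//build_defs:fuchsia_config": ["port_fuchsia.cc"],
    }),
)
```

Without `--config=fuchsia`, the build fails with the banner above rather than a cryptic missing-attribute error.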
diff --git a/sdk/bazel/base/common/build_defs/fuchsia_setup.bzl b/sdk/bazel/base/common/build_defs/fuchsia_setup.bzl
deleted file mode 100644
index 53d78fe..0000000
--- a/sdk/bazel/base/common/build_defs/fuchsia_setup.bzl
+++ /dev/null
@@ -1,25 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""
-Sets up the Fuchsia SDK.
-
-Must be called even if all attributes are set to false.
-"""
-
-load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
-load("//build_defs/internal/crosstool:crosstool.bzl", "configure_crosstool")
-
-def fuchsia_setup(with_toolchain=False):
-    # Needed for the package component runner tool.
-    http_archive(
-        name = "subpar",
-        url = "https://github.com/google/subpar/archive/1.0.0.zip",
-        strip_prefix = "subpar-1.0.0",
-    )
-
-    if with_toolchain:
-        configure_crosstool(
-            name = "fuchsia_crosstool",
-        )
diff --git a/sdk/bazel/base/common/build_defs/internal/component_runner/BUILD b/sdk/bazel/base/common/build_defs/internal/component_runner/BUILD
deleted file mode 100644
index 820ddce..0000000
--- a/sdk/bazel/base/common/build_defs/internal/component_runner/BUILD
+++ /dev/null
@@ -1,14 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@subpar//:subpar.bzl", "par_binary")
-
-licenses(["notice"])
-
-package(default_visibility = ["//visibility:public"])
-
-par_binary(
-    name = "component_runner",
-    srcs = ["component_runner.py"],
-)
diff --git a/sdk/bazel/base/common/build_defs/internal/component_runner/component_runner.py b/sdk/bazel/base/common/build_defs/internal/component_runner/component_runner.py
deleted file mode 100644
index a067ab8..0000000
--- a/sdk/bazel/base/common/build_defs/internal/component_runner/component_runner.py
+++ /dev/null
@@ -1,129 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import re
-import shutil
-from subprocess import Popen, PIPE
-import sys
-import tempfile
-
-
-def run_command(*args, **kwargs):
-    no_redirect = kwargs.pop('no_redirect', False)
-    output = None if no_redirect else PIPE
-    process = Popen(args, stdout=output, stderr=output)
-    stdout, stderr = process.communicate()
-    if process.returncode:
-        if no_redirect:
-            raise Exception('Command %s failed' % args)
-        else:
-            raise Exception('Command %s failed: %s' % (args, stdout + stderr))
-    return stdout
-
-
-def get_device_addresses(dev_finder):
-    # Find a target device.
-    stdout = run_command(dev_finder, 'list', '-device_limit', '1', '-full')
-    match = re.match(r'^([^\s]+)\s+([^\s]+)$', stdout.strip())
-    if not match:
-        raise Exception('Could not parse target parameters in %s' % stdout)
-    target_address = match.group(1)
-    target_name = match.group(2)
-
-    # Get the matching host address for that device.
-    stdout = run_command(dev_finder, 'resolve', '-local', target_name)
-    host_address = stdout.strip()
-    return target_address, host_address
-
-
-def serve_package(pm, package, directory):
-    # Set up the package repository.
-    run_command(pm, 'newrepo', '-repo', directory)
-    run_command(pm, 'publish', '-a', '-r', directory, '-f', package)
-
-    # Start the server.
-    server = Popen([pm, 'serve', '-repo', directory+'/repository'], stdout=PIPE,
-                   stderr=PIPE)
-    return lambda: server.kill()
-
-
-class MyParser(argparse.ArgumentParser):
-
-    def error(self, message):
-        print('Usage: bazel run <package label> -- <component name> --ssh-key '
-              '<path to private key>')
-        sys.exit(1)
-
-
-def main():
-    parser = MyParser()
-    parser.add_argument('--config',
-                        help='The path to the list of components in the package',
-                        required=True)
-    parser.add_argument('--package-name',
-                        help='The name of the Fuchsia package',
-                        required=True)
-    parser.add_argument('--package',
-                        help='The path to the Fuchsia package',
-                        required=True)
-    parser.add_argument('--dev-finder',
-                        help='The path to the dev_finder tool',
-                        required=True)
-    parser.add_argument('--pm',
-                        help='The path to the pm tool',
-                        required=True)
-    subparse = parser.add_subparsers().add_parser('run')
-    subparse.add_argument('component',
-                          nargs=1)
-    subparse.add_argument('--ssh-key',
-                          help='The absolute path to a private SSH key',
-                          required=True)
-    args = parser.parse_args()
-
-    if not os.path.isabs(args.ssh_key):
-        print('Path to SSH key must be absolute, got %s' % args.ssh_key)
-        return 1
-
-    with open(args.config, 'r') as config_file:
-        components = [line.strip() for line in config_file]
-
-    component = args.component[0]
-    if component not in components:
-        print('Error: "%s" not in %s' % (component, components))
-        return 1
-
-    staging_dir = tempfile.mkdtemp(prefix='fuchsia-run')
-
-    try:
-        target_address, host_address = get_device_addresses(args.dev_finder)
-        stop_server = serve_package(args.pm, args.package, staging_dir)
-        try:
-            def run_ssh_command(*params, **kwargs):
-                base = [
-                    'ssh', '-i', args.ssh_key,
-                    'fuchsia@' + target_address,
-                    '-o', 'StrictHostKeyChecking=no',
-                    '-o', 'UserKnownHostsFile=/dev/null',
-                ]
-                run_command(*(base + list(params)), **kwargs)
-            server_address = 'http://%s:8083/config.json' % host_address
-            run_ssh_command('amber_ctl', 'add_src', '-x', '-f', server_address)
-            component_uri = "fuchsia-pkg://fuchsia.com/%s#meta/%s.cmx" % (
-                    args.package_name, component)
-            run_ssh_command('run', component_uri, no_redirect=True)
-        finally:
-            stop_server()
-    except Exception as e:
-        print(e)
-        return 1
-    finally:
-        shutil.rmtree(staging_dir, ignore_errors=True)
-
-    return 0
-
-
-if __name__ == '__main__':
-    sys.exit(main())
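The device-discovery step in `get_device_addresses` above hinges on a two-column regex over the `dev_finder list` output; a minimal standalone sketch of that parse, using hypothetical sample output, is:

```python
import re

# Hypothetical `dev_finder list -device_limit 1 -full` output: the tool
# prints "<target address> <device name>" on a single line.
stdout = "192.168.42.17 fuchsia-device"

match = re.match(r'^([^\s]+)\s+([^\s]+)$', stdout.strip())
if not match:
    raise Exception('Could not parse target parameters in %s' % stdout)
target_address = match.group(1)  # first column: the device address
target_name = match.group(2)     # second column: the device name
```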
diff --git a/sdk/bazel/base/common/build_defs/internal/crosstool/BUILD b/sdk/bazel/base/common/build_defs/internal/crosstool/BUILD
deleted file mode 100644
index 41860ad..0000000
--- a/sdk/bazel/base/common/build_defs/internal/crosstool/BUILD
+++ /dev/null
@@ -1,9 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-licenses(["notice"])
-
-exports_files(
-    glob(["*"]),
-)
diff --git a/sdk/bazel/base/common/build_defs/package.bzl b/sdk/bazel/base/common/build_defs/package.bzl
deleted file mode 100644
index 11144c2..0000000
--- a/sdk/bazel/base/common/build_defs/package.bzl
+++ /dev/null
@@ -1,279 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":package_info.bzl", "PackageAggregateInfo", "PackageComponentInfo",
-     "PackageGeneratedInfo", "PackageInfo", "PackageLocalInfo",
-     "get_aggregate_info")
-
-"""
-Defines a Fuchsia package
-
-The package template is used to define a unit of related code and data.
-
-Parameters
-
-    name(string, required)
-        The name of the package
-
-    deps(list, required)
-        The list of targets to be built into this package
-"""
-
-# The attributes along which the aspect propagates.
-_ASPECT_ATTRIBUTES = [
-    "data",
-    "deps",
-    "srcs",
-]
-
-def _info_impl(target, context):
-    components = []
-    mappings = []
-    if PackageComponentInfo in target:
-        info = target[PackageComponentInfo]
-        components += [(info.name, info.manifest)]
-    if PackageLocalInfo in target:
-        mappings += target[PackageLocalInfo].mappings
-    if PackageGeneratedInfo in target:
-        mappings += target[PackageGeneratedInfo].mappings
-    deps = []
-    for attribute in _ASPECT_ATTRIBUTES:
-        if hasattr(context.rule.attr, attribute):
-            value = getattr(context.rule.attr, attribute)
-            deps += value
-    return [
-        get_aggregate_info(components, mappings, deps),
-    ]
-
-# An aspect which turns PackageLocalInfo providers into a PackageAggregateInfo
-# provider to identify all elements which need to be included in the package.
-_info_aspect = aspect(
-    implementation = _info_impl,
-    attr_aspects = _ASPECT_ATTRIBUTES,
-    provides = [
-        PackageAggregateInfo,
-    ],
-    # If any other aspect is applied to produce package mappings, let the result
-    # of that process be visible to the present aspect.
-    required_aspect_providers = [
-        PackageGeneratedInfo,
-    ],
-)
-
-def _fuchsia_package_impl(context):
-    # List all the files that need to be included in the package.
-    info = get_aggregate_info([], [], context.attr.deps)
-    manifest_file_contents = ""
-    package_contents = []
-
-    # Generate the manifest file with a script: this helps ignore empty files.
-    base = context.attr.name + "_pkg/"
-    manifest_file = context.actions.declare_file(base + "package_manifest")
-
-    content = "#!/bin/bash\n"
-    for dest, source in info.mappings.to_list():
-        # Only add file to the manifest if not empty.
-        content += "if [[ -s %s ]]; then\n" % source.path
-        content += "  echo '%s=%s' >> %s\n" % (dest, source.path,
-                                               manifest_file.path)
-        content += "fi\n"
-        package_contents.append(source)
-
-    # Add cmx file for each component.
-    for name, cmx in info.components.to_list():
-        content += "echo 'meta/%s.cmx=%s' >> %s\n" % (name, cmx.path,
-                                                      manifest_file.path)
-        package_contents.append(cmx)
-
-    # Add the meta/package file to the manifest.
-    meta_package = context.actions.declare_file(base + "meta/package")
-    content += "echo 'meta/package=%s' >> %s\n" % (meta_package.path,
-                                                   manifest_file.path)
-
-    # Write the manifest file.
-    manifest_script = context.actions.declare_file(base + "package_manifest.sh")
-    context.actions.write(
-        output = manifest_script,
-        content = content,
-        is_executable = True,
-    )
-    context.actions.run(
-        executable = manifest_script,
-        inputs = package_contents,
-        outputs = [
-            manifest_file,
-        ],
-        mnemonic = "FuchsiaManifest",
-    )
-
-    # Initialize the package's meta directory.
-    package_dir = manifest_file.dirname
-    context.actions.run(
-        executable = context.executable._pm,
-        arguments = [
-            "-o",
-            package_dir,
-            "-n",
-            context.attr.name,
-            "init",
-        ],
-        outputs = [
-            meta_package,
-        ],
-        mnemonic = "PmInit",
-    )
-
-    # TODO(pylaligand): figure out how to specify this key.
-    # Generate a signing key.
-    signing_key = context.actions.declare_file(base + "development.key")
-    context.actions.run(
-        executable = context.executable._pm,
-        arguments = [
-            "-o",
-            package_dir,
-            "-k",
-            signing_key.path,
-            "genkey",
-        ],
-        inputs = [
-            meta_package,
-        ],
-        outputs = [
-            signing_key,
-        ],
-        mnemonic = "PmGenkey",
-    )
-
-    # Build the package metadata.
-    meta_far = context.actions.declare_file(base + "meta.far")
-    context.actions.run(
-        executable = context.executable._pm,
-        arguments = [
-            "-o",
-            package_dir,
-            "-k",
-            signing_key.path,
-            "-m",
-            manifest_file.path,
-            "build",
-        ],
-        inputs = package_contents + [
-            manifest_file,
-            meta_package,
-            signing_key,
-        ],
-        outputs = [
-            meta_far,
-        ],
-        mnemonic = "PmBuild",
-    )
-
-    # Create the package archive.
-    package_archive = context.actions.declare_file(base + context.attr.name + "-0.far")
-    context.actions.run(
-        executable = context.executable._pm,
-        arguments = [
-            "-o",
-            package_dir,
-            "-k",
-            signing_key.path,
-            "-m",
-            manifest_file.path,
-            "archive",
-        ],
-        inputs = [
-            manifest_file,
-            signing_key,
-            meta_far,
-        ] + package_contents,
-        outputs = [
-            package_archive,
-        ],
-        mnemonic = "PmArchive",
-    )
-
-    components_file = context.actions.declare_file(context.attr.name + "_components.txt")
-    components_contents = "\n".join([n for n, _ in info.components.to_list()])
-    context.actions.write(
-        output = components_file,
-        content = components_contents,
-    )
-
-    executable_file = context.actions.declare_file(context.attr.name + "_run.sh")
-    executable_contents = """#!/bin/sh\n
-%s \\
-    --config %s \\
-    --package-name %s \\
-    --package %s \\
-    --dev-finder %s \\
-    --pm %s \\
-    run \\
-    \"$@\"
-""" % (
-        context.executable._runner.short_path,
-        components_file.short_path,
-        context.attr.name,
-        package_archive.short_path,
-        context.executable._dev_finder.short_path,
-        context.executable._pm.short_path,
-    )
-    context.actions.write(
-        output = executable_file,
-        content = executable_contents,
-        is_executable = True,
-    )
-
-    runfiles = context.runfiles(files = [
-        components_file,
-        context.executable._dev_finder,
-        context.executable._pm,
-        context.executable._runner,
-        executable_file,
-        package_archive,
-    ])
-
-    return [
-        DefaultInfo(
-            files = depset([package_archive]),
-            executable = executable_file,
-            runfiles = runfiles,
-        ),
-        PackageInfo(
-            name = context.attr.name,
-            archive = package_archive,
-        ),
-    ]
-
-fuchsia_package = rule(
-    implementation = _fuchsia_package_impl,
-    attrs = {
-        "deps": attr.label_list(
-            doc = "The objects to include in the package",
-            aspects = [
-                _info_aspect,
-            ],
-            mandatory = True,
-        ),
-        "_pm": attr.label(
-            default = Label("//tools:pm"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-        "_dev_finder": attr.label(
-            default = Label("//tools:dev_finder"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-        "_runner": attr.label(
-            default = Label("//build_defs/internal/component_runner:component_runner.par"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-    },
-    provides = [PackageInfo],
-    executable = True,
-)
diff --git a/sdk/bazel/base/common/build_defs/package_files.bzl b/sdk/bazel/base/common/build_defs/package_files.bzl
deleted file mode 100644
index 00b5294..0000000
--- a/sdk/bazel/base/common/build_defs/package_files.bzl
+++ /dev/null
@@ -1,37 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":package_info.bzl", "PackageLocalInfo")
-
-"""
-Declares some files to be included in a Fuchsia package
-
-Parameters
-
-    name(string, required)
-        The name of the target
-
-    contents(dict, required)
-        The mappings of source file to path in package
-"""
-
-def _package_files_impl(context):
-    mappings = {}
-    for label, dest in context.attr.contents.items():
-        source = label.files.to_list()[0]
-        mappings[dest] = source
-    return [
-        PackageLocalInfo(mappings = mappings.items()),
-    ]
-
-package_files = rule(
-    implementation = _package_files_impl,
-    attrs = {
-        "contents": attr.label_keyed_string_dict(
-            doc = "Mappings of source file to path in package",
-            mandatory = True,
-            allow_files = True,
-        )
-    }
-)
diff --git a/sdk/bazel/base/common/build_defs/package_info.bzl b/sdk/bazel/base/common/build_defs/package_info.bzl
deleted file mode 100644
index 2a62bfb..0000000
--- a/sdk/bazel/base/common/build_defs/package_info.bzl
+++ /dev/null
@@ -1,60 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""
-Some utilities to declare and aggregate package contents.
-"""
-
-# Identifies a component added to a package.
-PackageComponentInfo = provider(
-    fields = {
-        "name": "name of the component",
-        "manifest": "path to the component manifest file",
-    },
-)
-
-# Represents a set of files to be added to a package.
-PackageLocalInfo = provider(
-    fields = {
-        "mappings": "list of (package dest, source) pairs",
-    },
-)
-
-# Identical to PackageLocalInfo, but a different type is needed when that
-# information is generated from an aspect so that it does not collide with any
-# PackageLocalInfo provider already returned by the underlying rule.
-PackageGeneratedInfo = provider(
-    fields = {
-        "mappings": "list of (package dest, source) pairs",
-    },
-)
-
-# Aggregates the information provided by the above providers.
-PackageAggregateInfo = provider(
-    fields = {
-        "components": "depset of (name, manifest) pairs",
-        "mappings": "depset of (package dest, source) pairs",
-    },
-)
-
-def get_aggregate_info(components, mappings, deps):
-    transitive_components = []
-    transitive_mappings = []
-    for dep in deps:
-        if PackageAggregateInfo not in dep:
-            continue
-        transitive_components.append(dep[PackageAggregateInfo].components)
-        transitive_mappings.append(dep[PackageAggregateInfo].mappings)
-    return PackageAggregateInfo(
-        components = depset(components, transitive = transitive_components),
-        mappings = depset(mappings, transitive = transitive_mappings),
-    )
-
-# Contains information about a built Fuchsia package.
-PackageInfo = provider(
-    fields = {
-        "name": "name of the package",
-        "archive": "archive file",
-    },
-)
diff --git a/sdk/bazel/base/common/build_defs/target_cpu/BUILD b/sdk/bazel/base/common/build_defs/target_cpu/BUILD
deleted file mode 100644
index 3cec764..0000000
--- a/sdk/bazel/base/common/build_defs/target_cpu/BUILD
+++ /dev/null
@@ -1,23 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-licenses(["notice"])
-
-package(
-    default_visibility = ["//visibility:public"],
-)
-
-config_setting(
-    name = "arm64",
-    values = {
-        "cpu": "aarch64",
-    },
-)
-
-config_setting(
-    name = "x64",
-    values = {
-        "cpu": "x86_64",
-    },
-)
diff --git a/sdk/bazel/base/dart/build_defs/BUILD.flutter b/sdk/bazel/base/dart/build_defs/BUILD.flutter
deleted file mode 100644
index 891ae50..0000000
--- a/sdk/bazel/base/dart/build_defs/BUILD.flutter
+++ /dev/null
@@ -1,19 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@io_bazel_rules_dart//dart/build_rules:core.bzl", "dart_library")
-
-package(default_visibility = ["//visibility:public"])
-
-dart_library(
-    name = "flutter",
-    pub_pkg_name = "flutter",
-    srcs = glob(["lib/**/*.dart"]),
-    deps = [
-        "@vendor_collection//:collection",
-        "@vendor_meta//:meta",
-        "@vendor_typed_data//:typed_data",
-        "@vendor_vector_math//:vector_math",
-    ],
-)
diff --git a/sdk/bazel/base/dart/build_defs/BUILD.flutter_root b/sdk/bazel/base/dart/build_defs/BUILD.flutter_root
deleted file mode 100644
index e258ba0..0000000
--- a/sdk/bazel/base/dart/build_defs/BUILD.flutter_root
+++ /dev/null
@@ -1,10 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-package(default_visibility = ["//visibility:public"])
-
-alias(
-    name = "flutter",
-    actual = "//packages/flutter:flutter",
-)
diff --git a/sdk/bazel/base/dart/build_defs/dart.bzl b/sdk/bazel/base/dart/build_defs/dart.bzl
deleted file mode 100644
index a226746..0000000
--- a/sdk/bazel/base/dart/build_defs/dart.bzl
+++ /dev/null
@@ -1,215 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(
-    "@io_bazel_rules_dart//dart/build_rules/common:context.bzl",
-    "collect_dart_context",
-    "make_dart_context",
-)
-load(
-    "@io_bazel_rules_dart//dart/build_rules/internal:common.bzl",
-    "package_spec_action",
-)
-
-"""Common attributes used by the `compile_kernel_action`."""
-COMMON_COMPILE_KERNEL_ACTION_ATTRS = {
-    "main": attr.label(
-        doc = "The main script file",
-        mandatory = True,
-        allow_single_file = True,
-    ),
-    "srcs": attr.label_list(
-        doc = "Additional source files",
-        allow_files = True,
-    ),
-    "package_name": attr.string(
-        doc = "The Dart package name",
-        mandatory = True,
-    ),
-    "deps": attr.label_list(
-        doc = "The list of libraries this app depends on",
-        mandatory = False,
-        providers = ["dart"],
-    ),
-    "space_dart": attr.bool(
-        doc = "Whether or not to use SpaceDart (defaults to true)",
-        default = True,
-    ),
-    "_dart": attr.label(
-        default = Label("//tools:dart"),
-        allow_single_file = True,
-        executable = True,
-        cfg = "host",
-    ),
-    "_kernel_compiler": attr.label(
-        default = Label("//tools/dart_prebuilts:kernel_compiler.snapshot"),
-        allow_single_file = True,
-        cfg = "host",
-    ),
-}
-
-def compile_kernel_action(
-        context,
-        package_name,
-        dest_dir,
-        dart_exec,
-        kernel_compiler,
-        sdk_root,
-        main,
-        srcs,
-        deps,
-        kernel_snapshot_file,
-        manifest_file,
-        main_dilp_file,
-        dilp_list_file):
-    """Creates an action that generates the Dart kernel and its dependencies.
-
-    Args:
-        context: The rule context.
-        package_name: The Dart package name.
-        dest_dir: The directory under data/ where to install compiled files.
-        dart_exec: The Dart executable `File`.
-        kernel_compiler: The kernel compiler snapshot `File`.
-        sdk_root: The Dart SDK root `File` (Dart or Flutter's platform libs).
-        main: The main `File`.
-        srcs: Additional list of source `File`.
-        deps: A list of `Label`s this app depends on.
-        kernel_snapshot_file: The kernel snapshot `File` output.
-        manifest_file: The Fuchsia manifest `File` output.
-        main_dilp_file: The compiled main dilp `File` output.
-        dilp_list_file: The dilplist `File` output.
-
-    Returns:
-        Mapping `dict` to be used for packaging.
-    """
-    build_dir = context.label.name + ".build/"
-    dart_ctx = make_dart_context(
-        ctx = context,
-        package = package_name,
-        deps = deps,
-    )
-    additional_args = []
-
-    # 1. Create the .packages file.
-    package_spec_path = context.label.name + ".packages"
-    package_spec = context.actions.declare_file(package_spec_path)
-    package_spec_action(
-        ctx = context,
-        output = package_spec,
-        dart_ctx = dart_ctx,
-    )
-
-    # 2. Declare *.dilp files for all dependencies.
-    data_root = "data/%s/" % dest_dir
-    mappings = {}
-    dart_ctxs = collect_dart_context(dart_ctx).values()
-    for dc in dart_ctxs:
-        dilp_file = context.actions.declare_file(
-            context.label.name + "_kernel.dil-" + dc.package + ".dilp",
-        )
-        mappings[data_root + dc.package + ".dilp"] = dilp_file
-
-    # 3. Create a wrapper script around the kernel compiler.
-    #    The kernel compiler only generates .dilp files for libraries that are
-    #    actually used by the app. However, we declare a .dilp file for all packages
-    #    in the dependency graph: not creating that file would yield a Bazel error.
-    content = "#!/bin/bash\n"
-    content += dart_exec.path
-    content += " $@ || exit $?\n"
-    for dilp in mappings.values():
-        content += "if ! [[ -f %s ]]; then\n" % dilp.path
-        content += "  echo 'Warning: %s is not needed, generating empty file.' >&2\n" % dilp.path
-        content += "  touch %s\n" % dilp.path
-        content += "fi\n"
-
-    kernel_script = context.actions.declare_file(context.label.name + "_compile_kernel.sh")
-    context.actions.write(
-        output = kernel_script,
-        content = content,
-        is_executable = True,
-    )
-
-    # 4. Find all possible roots for multi-root scheme
-    roots_dict = {}
-    for dc in dart_ctxs:
-        dart_srcs = list(dc.dart_srcs)
-        if len(dart_srcs) == 0:
-            continue
-        src = dart_srcs[0]
-        index = src.path.find(dc.lib_root)
-        if index > 0:
-            root = src.path[:index]
-            roots_dict[root] = True
-
-    # Include the root for the package spec file
-    roots_dict[package_spec.root.path] = True
-
-    # Also include the current directory, as it was ignored in the previous logic
-    roots_dict["."] = True
-
-    for root in roots_dict.keys():
-        additional_args += ["--filesystem-root", root]
-
-    if context.attr.space_dart:
-        additional_args += ["--gen-bytecode"]
-
-    # 5. Compile the kernel.
-    multi_root_scheme = "main-root"
-    context.actions.run(
-        executable = kernel_script,
-        arguments = [
-            kernel_compiler.path,
-            "--data-dir",
-            dest_dir,
-            "--target",
-            "dart_runner",
-            "--platform",
-            sdk_root.path,
-            "--filesystem-scheme",
-            multi_root_scheme,
-        ] + additional_args + [
-            "--packages",
-            "%s:///%s" % (multi_root_scheme, package_spec.short_path),
-            "--no-link-platform",
-            "--split-output-by-packages",
-            "--manifest",
-            manifest_file.path,
-            "--output",
-            kernel_snapshot_file.path,
-            "%s:///%s" % (multi_root_scheme, main.short_path),
-        ],
-        inputs = dart_ctx.transitive_srcs.files + srcs + [
-            kernel_compiler,
-            sdk_root,
-            package_spec,
-            main,
-            dart_exec,
-        ],
-        outputs = [
-            main_dilp_file,
-            dilp_list_file,
-            kernel_snapshot_file,
-            manifest_file,
-        ] + mappings.values(),
-        mnemonic = "DartKernelCompiler",
-    )
-    mappings[data_root + "main.dilp"] = main_dilp_file
-    mappings[data_root + "app.dilplist"] = dilp_list_file
-
-    if context.attr.space_dart:
-        enable_interpreter = context.actions.declare_file(
-            context.label.name + "_enable_interpreter",
-        )
-        context.actions.write(
-            output = enable_interpreter,
-            # The existence of this file is enough to enable SpaceDart; we add
-            # a random string to prevent the `package` rule from removing this
-            # file when empty.
-            # See:
-            #   https://fuchsia.googlesource.com/topaz/+/2a6073f931edc4136761c5b8dcfd2245efc79d45/runtime/flutter_runner/component.cc#57
-            content = "No content",
-        )
-        mappings["data/enable_interpreter"] = enable_interpreter
-
-    return mappings
diff --git a/sdk/bazel/base/dart/build_defs/dart_app.bzl b/sdk/bazel/base/dart/build_defs/dart_app.bzl
deleted file mode 100644
index bf51608..0000000
--- a/sdk/bazel/base/dart/build_defs/dart_app.bzl
+++ /dev/null
@@ -1,71 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":dart.bzl", "COMMON_COMPILE_KERNEL_ACTION_ATTRS", "compile_kernel_action")
-load(":package_info.bzl", "PackageComponentInfo", "PackageLocalInfo")
-
-# A Fuchsia Dart application
-#
-# Parameters
-#
-#   main_dart
-#     The main script file.
-#
-#   deps
-#     List of libraries to link to this application.
-
-def _dart_app_impl(context):
-    kernel_snapshot_file = context.outputs.kernel_snapshot
-    manifest_file = context.outputs.manifest
-    component_name = context.files.component_manifest[0].basename.split(".")[0]
-    mappings = compile_kernel_action(
-        context = context,
-        package_name = context.attr.package_name,
-        dest_dir = component_name,
-        dart_exec = context.executable._dart,
-        kernel_compiler = context.files._kernel_compiler[0],
-        sdk_root = context.files._platform_lib[0],
-        main = context.files.main[0],
-        srcs = context.files.srcs,
-        deps = context.attr.deps,
-        kernel_snapshot_file = kernel_snapshot_file,
-        manifest_file = manifest_file,
-        main_dilp_file = context.outputs.main_dilp,
-        dilp_list_file = context.outputs.dilp_list,
-    )
-    outs = [kernel_snapshot_file, manifest_file]
-    return [
-        DefaultInfo(files = depset(outs), runfiles = context.runfiles(files = outs)),
-        PackageLocalInfo(mappings = mappings.items()),
-        PackageComponentInfo(
-            name = component_name,
-            manifest = context.files.component_manifest[0],
-        ),
-    ]
-
-dart_app = rule(
-    implementation = _dart_app_impl,
-    attrs = dict({
-        "component_manifest": attr.label(
-            doc = "The dart component's cmx",
-            mandatory = True,
-            allow_single_file = True,
-        ),
-        "_platform_lib": attr.label(
-            default = Label("//tools/dart_prebuilts/dart_runner:platform_strong.dill"),
-            allow_single_file = True,
-            cfg = "host",
-        ),
-    }.items() + COMMON_COMPILE_KERNEL_ACTION_ATTRS.items()),
-    outputs = {
-        # Kernel snapshot.
-        "kernel_snapshot": "%{name}_kernel.dil",
-        # Main dilp file.
-        "main_dilp": "%{name}_kernel.dil-main.dilp",
-        # Dilp list.
-        "dilp_list": "%{name}_kernel.dilpmanifest.dilplist",
-        # Fuchsia package manifest file.
-        "manifest": "%{name}_kernel.dilpmanifest",
-    },
-)
diff --git a/sdk/bazel/base/dart/build_defs/dart_fidl_library.bzl b/sdk/bazel/base/dart/build_defs/dart_fidl_library.bzl
deleted file mode 100644
index 27545a4..0000000
--- a/sdk/bazel/base/dart/build_defs/dart_fidl_library.bzl
+++ /dev/null
@@ -1,124 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(
-    "@io_bazel_rules_dart//dart/build_rules/common:context.bzl",
-    "make_dart_context"
-)
-load(
-    "@io_bazel_rules_dart//dart/build_rules/internal:analyze.bzl",
-    "summary_action",
-)
-load(":fidl_library.bzl", "FidlLibraryInfo")
-
-# A Dart library backed by a FIDL library.
-#
-# Parameters
-#
-#   library
-#     Label of the FIDL library.
-
-def _dart_codegen_impl(target, context):
-    ir = target[FidlLibraryInfo].ir
-    library_name = target[FidlLibraryInfo].name
-
-    package_root_dir = context.rule.attr.name + "_fidl_dart/lib"
-    package_root = context.actions.declare_directory(package_root_dir)
-    fidl_dart_file = context.new_file(package_root_dir + "/fidl.dart")
-    fidl_async_dart_file = context.new_file(
-        package_root_dir + "/fidl_async.dart")
-
-    context.actions.run(
-        executable = context.executable._fidlgen,
-        arguments = [
-            "--json",
-            ir.path,
-            "--output-base",
-            package_root.path,
-            "--include-base",
-            "this_is_a_bogus_value",
-        ],
-        inputs = [
-            ir,
-        ],
-        outputs = [
-            package_root,
-            fidl_dart_file,
-            fidl_async_dart_file,
-        ],
-        mnemonic = "FidlGenDart",
-    )
-
-    package_name = "fidl_" + library_name.replace(".", "_")
-    deps = context.rule.attr.deps + context.attr._deps
-
-    dart_ctx = make_dart_context(
-        context,
-        generated_srcs = [
-            fidl_dart_file,
-            fidl_async_dart_file,
-        ],
-        lib_root = context.label.package + "/" + package_root_dir,
-        deps = deps,
-        enable_summaries = True,
-        package = package_name,
-    )
-
-    summary_action(context, dart_ctx)
-    files_provider = depset([dart_ctx.strong_summary])
-
-    return struct(
-        dart = dart_ctx,
-        files_provider = files_provider,
-    )
-
-# This aspect runs the FIDL code generator on a given FIDL library.
-_dart_codegen = aspect(
-    implementation = _dart_codegen_impl,
-    attr_aspects = [
-        # Propagate the aspect to every dependency of the library.
-        "deps",
-    ],
-    attrs = {
-        "_fidlgen": attr.label(
-            default = Label("//tools:fidlgen_dart"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-        "_analyzer": attr.label(
-            default = Label("@dart_sdk//:analyzer"),
-            executable = True,
-            cfg = "host",
-        ),
-        "_deps": attr.label_list(
-            default = [
-                Label("//dart/fidl"),
-            ],
-        ),
-    },
-)
-
-def _dart_fidl_library_impl(context):
-    if len(context.attr.deps) != 1:
-        fail("'deps' attribute must have exactly one element.", "deps")
-    library = context.attr.deps[0]
-    return struct(
-        dart = library.dart,
-        files = library.files_provider,
-    )
-
-dart_fidl_library = rule(
-    implementation = _dart_fidl_library_impl,
-    attrs = {
-        "deps": attr.label_list(
-            doc = "The FIDL library to generate code for",
-            mandatory = True,
-            allow_empty = False,
-            allow_files = False,
-            providers = [FidlLibraryInfo],
-            aspects = [_dart_codegen],
-        ),
-    },
-)
diff --git a/sdk/bazel/base/dart/build_defs/dart_library.bzl b/sdk/bazel/base/dart/build_defs/dart_library.bzl
deleted file mode 100644
index aacff85..0000000
--- a/sdk/bazel/base/dart/build_defs/dart_library.bzl
+++ /dev/null
@@ -1,36 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":dart.bzl", "DartLibraryInfo", "produce_package_info")
-
-def _dart_library_impl(context):
-    return [
-        produce_package_info(context.attr.package_name,
-                             context.files.source_dir[0],
-                             context.attr.deps),
-    ]
-
-dart_library = rule(
-    implementation = _dart_library_impl,
-    attrs = {
-        "package_name": attr.string(
-            doc = "The name of the Dart package",
-            mandatory = True,
-        ),
-        "source_dir": attr.label(
-            # TODO(pylaligand): set a default value to "lib".
-            doc = "The directory containing the library sources",
-            mandatory = True,
-            allow_single_file = True,
-        ),
-        "deps": attr.label_list(
-            doc = "The list of libraries this library depends on",
-            mandatory = False,
-            providers = [DartLibraryInfo],
-        ),
-    },
-    provides = [
-        DartLibraryInfo,
-    ],
-)
diff --git a/sdk/bazel/base/dart/build_defs/flutter_app.bzl b/sdk/bazel/base/dart/build_defs/flutter_app.bzl
deleted file mode 100644
index 4f65758..0000000
--- a/sdk/bazel/base/dart/build_defs/flutter_app.bzl
+++ /dev/null
@@ -1,94 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load(":dart.bzl", "COMMON_COMPILE_KERNEL_ACTION_ATTRS", "compile_kernel_action")
-load(":package_info.bzl", "PackageComponentInfo", "PackageLocalInfo")
-
-# A Fuchsia Flutter application
-#
-# Parameters
-#
-#   main_dart
-#     The main script file.
-#
-#   deps
-#     List of libraries to link to this application.
-
-def _flutter_app_impl(context):
-    kernel_snapshot_file = context.outputs.kernel_snapshot
-    manifest_file = context.outputs.manifest
-    component_name = context.files.component_manifest[0].basename.split(".")[0]
-    mappings = compile_kernel_action(
-        context = context,
-        package_name = context.attr.package_name,
-        dest_dir = component_name,
-        dart_exec = context.executable._dart,
-        kernel_compiler = context.files._kernel_compiler[0],
-        sdk_root = context.files._platform_lib[0],
-        main = context.files.main[0],
-        srcs = context.files.srcs,
-        deps = context.attr.deps,
-        kernel_snapshot_file = kernel_snapshot_file,
-        manifest_file = manifest_file,
-        main_dilp_file = context.outputs.main_dilp,
-        dilp_list_file = context.outputs.dilp_list,
-    )
-
-    # Package the assets.
-    data_root = "data/%s/" % component_name
-    asset_manifest_dict = {}
-    package_name_len = len(context.label.package)
-    for asset in context.files.assets:
-        # Remove the package name from the path.
-        short_path = asset.short_path[package_name_len + 1:]
-
-        mappings[data_root + short_path] = asset
-        asset_manifest_dict[short_path] = [short_path]
-
-    asset_manifest = context.actions.declare_file("AssetManifest.json")
-    context.actions.write(
-        output = asset_manifest,
-        content = "%s" % asset_manifest_dict,
-    )
-
-    mappings[data_root + "AssetManifest.json"] = asset_manifest
-    outs = [kernel_snapshot_file, manifest_file]
-    return [
-        DefaultInfo(files = depset(outs), runfiles = context.runfiles(files = outs)),
-        PackageLocalInfo(mappings = mappings.items()),
-        PackageComponentInfo(
-            name = component_name,
-            manifest = context.files.component_manifest[0],
-        ),
-    ]
-
-flutter_app = rule(
-    implementation = _flutter_app_impl,
-    attrs = dict({
-        "assets": attr.label_list(
-            doc = "The app's assets",
-            allow_files = True,
-        ),
-        "component_manifest": attr.label(
-            doc = "The flutter component's cmx",
-            mandatory = True,
-            allow_single_file = True,
-        ),
-        "_platform_lib": attr.label(
-            default = Label("//tools/dart_prebuilts/flutter_runner:platform_strong.dill"),
-            allow_single_file = True,
-            cfg = "host",
-        ),
-    }.items() + COMMON_COMPILE_KERNEL_ACTION_ATTRS.items()),
-    outputs = {
-        # Kernel snapshot.
-        "kernel_snapshot": "%{name}_kernel.dil",
-        # Main dilp file.
-        "main_dilp": "%{name}_kernel.dil-main.dilp",
-        # Dilp list.
-        "dilp_list": "%{name}_kernel.dilpmanifest.dilplist",
-        # Fuchsia package manifest file.
-        "manifest": "%{name}_kernel.dilpmanifest",
-    },
-)
diff --git a/sdk/bazel/base/dart/build_defs/setup_flutter.bzl b/sdk/bazel/base/dart/build_defs/setup_flutter.bzl
deleted file mode 100644
index c862440..0000000
--- a/sdk/bazel/base/dart/build_defs/setup_flutter.bzl
+++ /dev/null
@@ -1,57 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@io_bazel_rules_dart//dart/build_rules/internal:pub.bzl", "pub_repository")
-
-FLUTTER_DOWNLOAD_URL = (
-    "https://github.com/flutter/flutter/archive/v0.5.5.zip"
-)
-
-FLUTTER_SHA256 = (
-    "abb107da9b933ee6355b5a4fad983ff135e612a9a46defda08e03aea647b972c"
-)
-
-# TODO(alainv|DX-314): Pull dependencies automatically from
-#     //third_party/dart-pkg/git/flutter/packages/flutter:flutter.
-FLUTTER_DEPENDENCIES = {
-    "collection": "1.14.6",
-    "meta": "1.1.5",
-    "typed_data": "1.1.5",
-    "vector_math": "2.0.8",
-}
-
-def _install_flutter_dependencies():
-    """Installs Flutter's dependencies."""
-    for name, version in FLUTTER_DEPENDENCIES.items():
-        pub_repository(
-            name = "vendor_" + name,
-            output = ".",
-            package = name,
-            version = version,
-            pub_deps = [],
-        )
-
-def _install_flutter_impl(repository_ctx):
-    """Installs the flutter repository."""
-    # Download Flutter.
-    repository_ctx.download_and_extract(
-        url = FLUTTER_DOWNLOAD_URL,
-        output = ".",
-        sha256 = FLUTTER_SHA256,
-        type = "zip",
-        stripPrefix = "flutter-0.5.5",
-    )
-    # Set up the BUILD file from the Fuchsia SDK.
-    repository_ctx.symlink(
-        Label("@fuchsia_sdk//build_defs:BUILD.flutter_root"), "BUILD")
-    repository_ctx.symlink(
-        Label("@fuchsia_sdk//build_defs:BUILD.flutter"), "packages/flutter/BUILD")
-
-_install_flutter = repository_rule(
-    implementation = _install_flutter_impl,
-)
-
-def setup_flutter():
-    _install_flutter_dependencies()
-    _install_flutter(name = "vendor_flutter")
diff --git a/sdk/bazel/create_test_workspace.py b/sdk/bazel/create_test_workspace.py
deleted file mode 100644
index 96be119..0000000
--- a/sdk/bazel/create_test_workspace.py
+++ /dev/null
@@ -1,94 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-from collections import defaultdict
-import os
-import shutil
-import stat
-import sys
-
-SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # sdk
-    SCRIPT_DIR)))                # bazel
-
-sys.path += [os.path.join(FUCHSIA_ROOT, 'third_party', 'mako')]
-from mako.lookup import TemplateLookup
-from mako.template import Template
-sys.path += [os.path.join(FUCHSIA_ROOT, 'scripts', 'sdk', 'common')]
-from files import copy_tree, make_dir
-import template_model as model
-
-
-class SdkWorkspaceInfo(object):
-    '''Gathers information about an SDK workspace that is necessary to generate
-    tests for it.
-    '''
-
-    def __init__(self):
-        # Map of target to list of header files.
-        # Used to verify that including said headers works properly.
-        self.headers = defaultdict(list)
-        # Whether the workspace has C/C++ content.
-        self.with_cc = False
-        # Whether the workspace has Dart content.
-        self.with_dart = False
-        # Supported target arches.
-        self.target_arches = []
-
-
-def write_file(path, template_name, data, is_executable=False):
-    '''Writes a file based on a Mako template.'''
-    base = os.path.join(SCRIPT_DIR, 'templates')
-    lookup = TemplateLookup(directories=[base, os.path.join(base, 'tests')])
-    template = lookup.get_template(template_name + '.mako')
-    with open(path, 'w') as file:
-        file.write(template.render(data=data))
-    if is_executable:
-        st = os.stat(path)
-        os.chmod(path, st.st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
-
-
-def create_test_workspace(sdk, output, workspace_info):
-    # Remove any existing output.
-    shutil.rmtree(output, True)
-
-    # Copy the base tests.
-    copy_tree(os.path.join(SCRIPT_DIR, 'tests', 'common'), output)
-    if workspace_info.with_cc:
-        copy_tree(os.path.join(SCRIPT_DIR, 'tests', 'cc'), output)
-    if workspace_info.with_dart:
-        copy_tree(os.path.join(SCRIPT_DIR, 'tests', 'dart'), output)
-
-    # WORKSPACE file.
-    workspace = model.TestWorkspace(os.path.relpath(sdk, output),
-                                    workspace_info.with_cc,
-                                    workspace_info.with_dart)
-    write_file(os.path.join(output, 'WORKSPACE'), 'workspace', workspace)
-
-    # .bazelrc file.
-    crosstool = model.Crosstool(workspace_info.target_arches)
-    write_file(os.path.join(output, '.bazelrc'), 'bazelrc', crosstool)
-
-    # run.py file
-    write_file(os.path.join(output, 'run.py'), 'run_py', crosstool,
-               is_executable=True)
-
-    if workspace_info.with_cc:
-        # Generate test to verify that headers compile fine.
-        headers = workspace_info.headers
-        header_base = os.path.join(output, 'headers')
-        write_file(make_dir(os.path.join(header_base, 'BUILD')),
-                   'headers_build', {
-            'deps': list(filter(lambda k: headers[k], headers.keys())),
-        })
-        write_file(make_dir(os.path.join(header_base, 'headers.cc')),
-                   'headers', {
-            'headers': headers,
-        })
-
-    return True
diff --git a/sdk/bazel/generate.py b/sdk/bazel/generate.py
deleted file mode 100755
index a3a3b6d..0000000
--- a/sdk/bazel/generate.py
+++ /dev/null
@@ -1,336 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import shutil
-import stat
-import sys
-
-SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # sdk
-    SCRIPT_DIR)))                # bazel
-
-sys.path += [os.path.join(FUCHSIA_ROOT, 'third_party', 'mako')]
-from mako.lookup import TemplateLookup
-from mako.template import Template
-sys.path += [os.path.join(FUCHSIA_ROOT, 'scripts', 'sdk', 'common')]
-from frontend import Frontend
-from files import copy_tree
-
-from create_test_workspace import create_test_workspace, SdkWorkspaceInfo
-import template_model as model
-
-
-def sanitize(name):
-    return name.replace('-', '_').replace('.', '_')
-
-
-class BazelBuilder(Frontend):
-
-    def __init__(self, **kwargs):
-        super(BazelBuilder, self).__init__(**kwargs)
-        self.has_dart = False
-        self.has_cc = False
-        self.dart_vendor_packages = {}
-        self.target_arches = []
-        self.workspace_info = SdkWorkspaceInfo()
-
-
-    def _copy_file(self, file, root='', destination='', result=[]):
-        '''Copies the file from a given root directory and writes the
-        resulting relative paths to a list.
-        '''
-        if os.path.commonprefix([root, file]) != root:
-            raise Exception('%s is not within %s' % (file, root))
-        relative_path = os.path.relpath(file, root)
-        dest = self.dest(destination, relative_path)
-        shutil.copy2(self.source(file), dest)
-        result.append(relative_path)
-
-
-    def _copy_files(self, files, root='', destination='', result=[]):
-        '''Copies some files from a given root directory and writes the
-        resulting relative paths to a list.
-        '''
-        for file in files:
-            self._copy_file(file, root, destination, result)
-
-
-    def local(self, *args):
-        '''Builds a path in the current directory.'''
-        return os.path.join(SCRIPT_DIR, *args)
-
-
-    def write_file(self, path, template_name, data):
-        '''Writes a file based on a Mako template.'''
-        lookup = TemplateLookup(directories=[self.local('templates')])
-        template = lookup.get_template(template_name + '.mako')
-        with open(path, 'w') as file:
-            file.write(template.render(data=data))
-
-
-    def add_dart_vendor_package(self, name, version):
-        '''Adds a reference to a new Dart third-party package.'''
-        if name == 'flutter' and version == 'flutter_sdk':
-            # The Flutter SDK is set up separately.
-            return
-        if name in self.dart_vendor_packages:
-            existing_version = self.dart_vendor_packages[name]
-            if existing_version != version:
-                raise Exception('Dart package %s can only have one version; '
-                                '%s and %s requested.' % (name, version,
-                                                          existing_version))
-        else:
-            self.dart_vendor_packages[name] = version
-
-
-    def prepare(self, arch, types):
-        self.target_arches = arch['target']
-
-        # Copy the common files.
-        shutil.copytree(self.local('base', 'common'), self.output)
-        # Copy C/C++ files if needed.
-        if ('sysroot' in types or 'cc_source_library' in types or
-            'cc_prebuilt_library' in types):
-            self.has_cc = True
-            copy_tree(self.local('base', 'cc'), self.output)
-        # Copy Dart files if needed.
-        if 'dart_library' in types:
-            self.has_dart = True
-            copy_tree(self.local('base', 'dart'), self.output)
-
-        self.install_crosstool(arch)
-
-        self.workspace_info.with_cc = self.has_cc
-        self.workspace_info.with_dart = self.has_dart
-        self.workspace_info.target_arches = self.target_arches
-
-
-    def finalize(self, arch, types):
-        self.install_tools()
-        self.install_dart()
-
-
-    def install_tools(self):
-        '''Write BUILD files for tools directories.'''
-        tools_root = os.path.join(self.output, 'tools')
-        for directory, _, _ in os.walk(tools_root, topdown=True):
-            self.write_file(os.path.join(directory, 'BUILD'), 'tools', {})
-
-
-    def install_crosstool(self, arches):
-        '''Generates crosstool.bzl based on the availability of sysroot
-        versions.
-        '''
-        if not self.has_cc:
-            return
-        crosstool = model.Crosstool(arches['target'])
-        crosstool_base = self.dest('build_defs', 'internal', 'crosstool')
-        self.write_file(self.dest(crosstool_base, 'crosstool.bzl'),
-                        'crosstool_bzl', crosstool)
-        self.write_file(self.dest(crosstool_base, 'BUILD.crosstool'),
-                        'crosstool', crosstool)
-        self.write_file(self.dest(crosstool_base, 'CROSSTOOL.in'),
-                        'crosstool_in', crosstool)
-        self.write_file(self.dest('build_defs', 'toolchain', 'BUILD'),
-                        'toolchain_build', crosstool)
-
-
-    def install_dart(self):
-        if not self.has_dart:
-            return
-        # Write the rule for setting up Dart packages.
-        # TODO(pylaligand): this process currently does not capture dependencies
-        # between vendor packages.
-        self.write_file(self.dest('build_defs', 'setup_dart.bzl'),
-                       'setup_dart_bzl', self.dart_vendor_packages)
-
-
-    def install_dart_library_atom(self, atom):
-        package_name = atom['name']
-        name = sanitize(package_name)
-        library = model.DartLibrary(name, package_name)
-        base = self.dest('dart', name)
-
-        self._copy_files(atom['sources'], atom['root'], base)
-
-        for dep in atom['deps']:
-            library.deps.append('//dart/' + sanitize(dep))
-
-        for dep in atom['fidl_deps']:
-            san_dep = sanitize(dep)
-            library.deps.append('//fidl/%s:%s_dart' % (san_dep, san_dep))
-
-        for dep in atom['third_party_deps']:
-            name = dep['name']
-            library.deps.append('@vendor_%s//:%s' % (name, name))
-            self.add_dart_vendor_package(name, dep['version'])
-
-        self.write_file(os.path.join(base, 'BUILD'), 'dart_library', library)
-
-
-    def install_cc_prebuilt_library_atom(self, atom):
-        name = sanitize(atom['name'])
-        library = model.CppPrebuiltLibrary(name)
-        library.is_static = atom['format'] == 'static'
-        base = self.dest('pkg', name)
-
-        self._copy_files(atom['headers'], atom['root'], base, library.hdrs)
-
-        for arch in self.target_arches:
-            def _copy_prebuilt(path, category):
-                relative_dest = os.path.join('arch', arch, category,
-                                             os.path.basename(path))
-                dest = self.dest(base, relative_dest)
-                shutil.copy2(self.source(path), dest)
-                return relative_dest
-
-            binaries = atom['binaries'][arch]
-            prebuilt_set = model.CppPrebuiltSet(_copy_prebuilt(binaries['link'],
-                                                              'lib'))
-            if 'dist' in binaries:
-                dist = binaries['dist']
-                prebuilt_set.dist_lib = _copy_prebuilt(dist, 'dist')
-                prebuilt_set.dist_path = 'lib/' + os.path.basename(dist)
-
-            if 'debug' in binaries:
-                self._copy_file(binaries['debug'])
-
-            library.prebuilts[arch] = prebuilt_set
-
-        for dep in atom['deps']:
-            library.deps.append('//pkg/' + sanitize(dep))
-
-        library.includes = os.path.relpath(atom['include_dir'], atom['root'])
-
-        include_paths = map(lambda h: os.path.relpath(h, atom['include_dir']),
-                            atom['headers'])
-        self.workspace_info.headers['//pkg/' + name] = include_paths
-
-        self.write_file(os.path.join(base, 'BUILD'), 'cc_prebuilt_library',
-                        library)
-
-
-    def install_cc_source_library_atom(self, atom):
-        name = sanitize(atom['name'])
-        library = model.CppSourceLibrary(name)
-        base = self.dest('pkg', name)
-
-        self._copy_files(atom['headers'], atom['root'], base, library.hdrs)
-        self._copy_files(atom['sources'], atom['root'], base, library.srcs)
-
-        for dep in atom['deps']:
-            library.deps.append('//pkg/' + sanitize(dep))
-
-        for dep in atom['fidl_deps']:
-            dep_name = sanitize(dep)
-            library.deps.append('//fidl/%s:%s_cc' % (dep_name, dep_name))
-
-        library.includes = os.path.relpath(atom['include_dir'], atom['root'])
-
-        include_paths = map(lambda h: os.path.relpath(h, atom['include_dir']),
-                            atom['headers'])
-        self.workspace_info.headers['//pkg/' + name] = include_paths
-
-        self.write_file(os.path.join(base, 'BUILD'), 'cc_library', library)
-
-
-    def install_sysroot_atom(self, atom):
-        for arch in self.target_arches:
-            base = self.dest('arch', arch, 'sysroot')
-            arch_data = atom['versions'][arch]
-            self._copy_files(arch_data['headers'], arch_data['root'], base)
-            self._copy_files(arch_data['link_libs'], arch_data['root'], base)
-            # We maintain debug files in their original location.
-            self._copy_files(arch_data['debug_libs'])
-            dist_libs = []
-            self._copy_files(arch_data['dist_libs'], arch_data['root'], base,
-                             dist_libs)
-            version = {}
-            for lib in dist_libs:
-                version['lib/' + os.path.basename(lib)] = lib
-            self.write_file(os.path.join(base, 'BUILD'), 'sysroot_version',
-                            version)
-
-        self.write_file(self.dest('pkg', 'sysroot', 'BUILD'),
-                        'sysroot_pkg', self.target_arches)
-
-
-    def install_host_tool_atom(self, atom):
-        self._copy_files(atom['files'], atom['root'], 'tools')
-
-
-    def install_fidl_library_atom(self, atom):
-        name = sanitize(atom['name'])
-        data = model.FidlLibrary(name, atom['name'])
-        data.with_cc = self.has_cc
-        data.with_dart = self.has_dart
-        base = self.dest('fidl', name)
-        self._copy_files(atom['sources'], atom['root'], base, data.srcs)
-        for dep in atom['deps']:
-            data.deps.append(sanitize(dep))
-        self.write_file(os.path.join(base, 'BUILD'), 'fidl', data)
-
-
-    def install_image_atom(self, atom):
-        # 'image_file' contains a relative path that is good enough to be stored
-        # under the top-level SDK directory. No 'root' or 'destination' is
-        # needed.
-        root = ''
-        dest = ''
-        target_architectures = {}
-        for image_file in atom['file'].itervalues():
-            self._copy_file(image_file, root, dest)
-            # The image file looks like this: target/x64/fuchsia.zbi
-            # where x64 is the architecture
-            target, arch = os.path.split(os.path.dirname(image_file))
-            if target not in target_architectures:
-                target_architectures[target] = set()
-            target_architectures[target].add(arch)
-
-        for target in target_architectures:
-            data = model.Images(list(target_architectures[target]))
-            self.write_file(self.dest(os.path.join(target, 'BUILD')), 'images',
-                            data)
-
-
-def main():
-    parser = argparse.ArgumentParser(
-            description='Lays out a Bazel workspace for a given SDK tarball.')
-    source_group = parser.add_mutually_exclusive_group(required=True)
-    source_group.add_argument('--archive',
-                              help='Path to the SDK archive to ingest',
-                              default='')
-    source_group.add_argument('--directory',
-                              help='Path to the SDK directory to ingest',
-                              default='')
-    parser.add_argument('--output',
-                        help='Path to the directory where to install the SDK',
-                        required=True)
-    parser.add_argument('--tests',
-                        help='Path to the directory where to generate tests')
-    args = parser.parse_args()
-
-    # Remove any existing output.
-    shutil.rmtree(args.output, ignore_errors=True)
-
-    builder = BazelBuilder(archive=args.archive,
-                           directory=args.directory,
-                           output=args.output)
-    if not builder.run():
-        return 1
-
-    if args.tests and not create_test_workspace(args.output, args.tests,
-                                                builder.workspace_info):
-        return 1
-
-    return 0
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/sdk/bazel/template_model.py b/sdk/bazel/template_model.py
deleted file mode 100644
index 2cad4e0..0000000
--- a/sdk/bazel/template_model.py
+++ /dev/null
@@ -1,98 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-'''A collection of storage classes to use with Mako templates.'''
-
-
-ARCH_MAP = {
-    'arm64': 'aarch64',
-    'x64': 'x86_64',
-}
-
-
-class _CppLibrary(object):
-
-    def __init__(self, name):
-        self.name = name
-        self.hdrs = []
-        self.deps = []
-        self.includes = ''
-
-
-class CppSourceLibrary(_CppLibrary):
-
-    def __init__(self, name):
-        super(CppSourceLibrary, self).__init__(name)
-        self.srcs = []
-
-
-class CppPrebuiltSet(object):
-
-    def __init__(self, link):
-        self.link_lib = link
-        self.dist_lib = ''
-        self.dist_path = ''
-
-
-class CppPrebuiltLibrary(_CppLibrary):
-
-    def __init__(self, name):
-        super(CppPrebuiltLibrary, self).__init__(name)
-        self.prebuilt = ""
-        self.is_static = False
-        self.packaged_files = {}
-        self.prebuilts = {}
-
-
-class FidlLibrary(object):
-
-    def __init__(self, name, library):
-        self.name = name
-        self.library = library
-        self.srcs = []
-        self.deps = []
-        self.with_cc = False
-        self.with_dart = False
-
-
-class Arch(object):
-
-    def __init__(self, short, long):
-        self.short_name = short
-        self.long_name = long
-
-
-class Crosstool(object):
-
-    def __init__(self, arches=[]):
-        self.arches = []
-        for arch in arches:
-            if arch in ARCH_MAP:
-                self.arches.append(Arch(arch, ARCH_MAP[arch]))
-            else:
-                print('Unknown target arch: %s' % arch)
-
-
-
-class DartLibrary(object):
-
-    def __init__(self, name, package):
-        self.name = name
-        self.package_name = package
-        self.deps = []
-
-
-class Images(object):
-
-    def __init__(self, arches):
-        self.arches = arches
-
-
-class TestWorkspace(object):
-
-    def __init__(self, sdk_path, with_cc, with_dart):
-        self.sdk_path = sdk_path
-        self.with_cc = with_cc
-        self.with_dart = with_dart
diff --git a/sdk/bazel/templates/cc_library.mako b/sdk/bazel/templates/cc_library.mako
deleted file mode 100644
index 27454e9..0000000
--- a/sdk/bazel/templates/cc_library.mako
+++ /dev/null
@@ -1,23 +0,0 @@
-<%include file="header.mako" />
-
-package(default_visibility = ["//visibility:public"])
-
-cc_library(
-    name = "${data.name}",
-    srcs = [
-        % for source in sorted(data.srcs):
-        "${source}",
-        % endfor
-    ],
-    hdrs = [
-        % for header in sorted(data.hdrs):
-        "${header}",
-        % endfor
-    ],
-    deps = [
-        % for dep in sorted(data.deps):
-        "${dep}",
-        % endfor
-    ],
-    strip_include_prefix = "${data.includes}",
-)
diff --git a/sdk/bazel/templates/cc_prebuilt_library.mako b/sdk/bazel/templates/cc_prebuilt_library.mako
deleted file mode 100644
index 60a9065..0000000
--- a/sdk/bazel/templates/cc_prebuilt_library.mako
+++ /dev/null
@@ -1,56 +0,0 @@
-<%include file="header.mako" />
-
-package(default_visibility = ["//visibility:public"])
-
-load("//build_defs:package_files.bzl", "package_files")
-load("//build_defs:fuchsia_select.bzl", "fuchsia_select")
-
-# Note: the cc_library / cc_import combo serves two purposes:
-#  - it allows the use of a select clause to target the proper architecture;
-#  - it works around an issue with cc_import which does not have an "includes"
-#    nor a "deps" attribute.
-cc_library(
-    name = "${data.name}",
-    hdrs = [
-        % for header in sorted(data.hdrs):
-        "${header}",
-        % endfor
-    ],
-    deps = fuchsia_select({
-        % for arch in sorted(data.prebuilts.keys()):
-        "//build_defs/target_cpu:${arch}": [":${arch}_prebuilts"],
-        % endfor
-    }) + [
-        % for dep in sorted(data.deps):
-        "${dep}",
-        % endfor
-    ],
-    strip_include_prefix = "${data.includes}",
-    data = fuchsia_select({
-        % for arch in sorted(data.prebuilts.keys()):
-        "//build_defs/target_cpu:${arch}": [":${arch}_dist"],
-        % endfor
-    }),
-)
-
-# Architecture-specific targets
-
-% for arch, contents in sorted(data.prebuilts.iteritems()):
-cc_import(
-    name = "${arch}_prebuilts",
-    % if data.is_static:
-    static_library = "${contents.link_lib}",
-    % else:
-    shared_library = "${contents.link_lib}",
-    % endif
-)
-
-package_files(
-    name = "${arch}_dist",
-    contents = {
-        % if contents.dist_lib:
-        "${contents.dist_lib}": "${contents.dist_path}",
-        % endif
-    },
-)
-% endfor
diff --git a/sdk/bazel/templates/crosstool.mako b/sdk/bazel/templates/crosstool.mako
deleted file mode 100644
index 3bfb27a..0000000
--- a/sdk/bazel/templates/crosstool.mako
+++ /dev/null
@@ -1,123 +0,0 @@
-<%include file="header.mako" />
-
-package(default_visibility = ["//visibility:public"])
-
-cc_toolchain_suite(
-    name = "toolchain",
-    toolchains = {
-        % for arch in data.arches:
-        "${arch.long_name}|llvm": ":cc-compiler-${arch.long_name}",
-        "${arch.long_name}": ":cc-compiler-${arch.long_name}",
-        % endfor
-    },
-)
-
-TARGET_CPUS = [
-    % for arch in data.arches:
-    "${arch.long_name}",
-    % endfor
-]
-
-filegroup(
-    name = "empty",
-)
-
-filegroup(
-    name = "cc-compiler-prebuilts",
-    srcs = [
-        "clang/bin/clang",
-        "clang/bin/clang-8",
-        "clang/bin/llvm-ar",
-        "clang/bin/clang++",
-        "clang/bin/ld.lld",
-        "clang/bin/lld",
-        "clang/bin/llvm-nm",
-        "clang/bin/llvm-objdump",
-        "clang/bin/llvm-strip",
-        "clang/bin/llvm-objcopy",
-    ] + glob([
-        "clang/lib/clang/8.0.0/include/**",
-    ]),
-)
-
-filegroup(
-    name = "compile",
-    srcs = [
-        ":cc-compiler-prebuilts",
-    ],
-)
-
-filegroup(
-    name = "objcopy",
-    srcs = [
-        "clang/bin/llvm-objcopy",
-    ],
-)
-
-[
-    filegroup(
-        name = "every-file-" + cpu,
-        srcs = [
-            ":cc-compiler-prebuilts",
-            ":runtime-" + cpu,
-        ],
-    )
-    for cpu in TARGET_CPUS
-]
-
-[
-    filegroup(
-        name = "link-" + cpu,
-        srcs = [
-            ":cc-compiler-prebuilts",
-            ":runtime-" + cpu,
-        ],
-    )
-    for cpu in TARGET_CPUS
-]
-
-[
-    filegroup(
-        name = "runtime-" + cpu,
-        srcs = [
-            "clang/lib/clang/8.0.0/" + cpu + "-fuchsia/lib/libclang_rt.builtins.a",
-        ],
-    )
-    for cpu in TARGET_CPUS
-]
-
-[
-    cc_toolchain(
-        name = "cc-compiler-" + cpu,
-        toolchain_identifier = "crosstool-1.x.x-llvm-fuchsia-" + cpu,
-        all_files = ":every-file-" + cpu,
-        compiler_files = ":compile",
-        cpu = cpu,
-        dwp_files = ":empty",
-        dynamic_runtime_libs = [":runtime-" + cpu],
-        linker_files = ":link-" + cpu,
-        objcopy_files = ":objcopy",
-        static_runtime_libs = [":runtime-" + cpu],
-        strip_files = ":runtime-" + cpu,
-        supports_param_files = 1,
-    )
-    for cpu in TARGET_CPUS
-]
-
-cc_library(
-    name = "sources",
-    srcs = glob(["src/**"]),
-    visibility = ["//visibility:public"],
-)
-
-[
-    filegroup(
-        name = "dist-" + cpu,
-        srcs = [
-            "clang/lib/clang/8.0.0/" + cpu + "-fuchsia/lib/libc++.so.2",
-            "clang/lib/clang/8.0.0/" + cpu + "-fuchsia/lib/libc++abi.so.1",
-            "clang/lib/clang/8.0.0/" + cpu + "-fuchsia/lib/libunwind.so.1",
-        ],
-    )
-    for cpu in TARGET_CPUS
-]
diff --git a/sdk/bazel/templates/crosstool_bzl.mako b/sdk/bazel/templates/crosstool_bzl.mako
deleted file mode 100644
index 4f41138..0000000
--- a/sdk/bazel/templates/crosstool_bzl.mako
+++ /dev/null
@@ -1,70 +0,0 @@
-<%include file="header_no_license.mako" />
-
-"""
-Defines a Fuchsia crosstool workspace.
-"""
-
-# TODO(alainv): Do not hardcode download URLs but export the URL from the
-#               the one used in //buildtools, using the CIPD APIs.
-CLANG_LINUX_DOWNLOAD_URL = (
-    "https://storage.googleapis.com/fuchsia/clang/linux-amd64/2a605accf10c22e7905d2cabec22ca317869f85d"
-)
-CLANG_LINUX_SHA256 = (
-    "776b8b7b47da73199f095fc0cabca85e9d9ced7e3bbfc7707e0a4afaacc6544b"
-)
-
-CLANG_MAC_DOWNLOAD_URL = (
-    "https://storage.googleapis.com/fuchsia/clang/mac-amd64/c0cea0a0fa8cff6d286e46aa4530ffc6b85baaaf"
-)
-CLANG_MAC_SHA256 = (
-    "f06e6cf9bcb09963a3042ff8c8bbfe998f43bf4d61dc5bbb1f1c496792d6bea2"
-)
-
-
-def _configure_crosstool_impl(repository_ctx):
-    """
-    Configures the Fuchsia crosstool repository.
-    """
-    if repository_ctx.os.name == "linux":
-      clang_download_url = CLANG_LINUX_DOWNLOAD_URL
-      clang_sha256 = CLANG_LINUX_SHA256
-    elif repository_ctx.os.name == "mac os x":
-      clang_download_url = CLANG_MAC_DOWNLOAD_URL
-      clang_sha256 = CLANG_MAC_SHA256
-    else:
-      fail("Unsupported platform: %s" % repository_ctx.os.name)
-
-    # Download the toolchain.
-    repository_ctx.download_and_extract(
-        url = clang_download_url,
-        output = "clang",
-        sha256 = clang_sha256,
-        type = "zip",
-    )
-    # Set up the BUILD file from the Fuchsia SDK.
-    repository_ctx.symlink(
-        Label("@fuchsia_sdk//build_defs/internal/crosstool:BUILD.crosstool"),
-        "BUILD",
-    )
-    # Hack to get the path to the sysroot directory, see
-    # https://github.com/bazelbuild/bazel/issues/3901
-    % for arch in data.arches:
-    sysroot_${arch.short_name} = repository_ctx.path(
-        Label("@fuchsia_sdk//arch/${arch.short_name}/sysroot:BUILD")).dirname
-    % endfor
-    # Set up the CROSSTOOL file from the template.
-    repository_ctx.template(
-        "CROSSTOOL",
-        Label("@fuchsia_sdk//build_defs/internal/crosstool:CROSSTOOL.in"),
-        substitutions = {
-            % for arch in data.arches:
-            "%{SYSROOT_${arch.short_name.upper()}}": str(sysroot_${arch.short_name}),
-            % endfor
-            "%{CROSSTOOL_ROOT}": str(repository_ctx.path("."))
-        },
-    )
-
-
-configure_crosstool = repository_rule(
-    implementation = _configure_crosstool_impl,
-)
diff --git a/sdk/bazel/templates/crosstool_in.mako b/sdk/bazel/templates/crosstool_in.mako
deleted file mode 100644
index 51d96b8..0000000
--- a/sdk/bazel/templates/crosstool_in.mako
+++ /dev/null
@@ -1,72 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-major_version: "1.x.x"
-minor_version: "llvm:7.x.x"
-default_target_cpu: "${data.arches[0].short_name}"
-
-% for arch in data.arches:
-toolchain {
-  abi_version: "local"
-  abi_libc_version: "local"
-
-  builtin_sysroot: "%{SYSROOT_${arch.short_name.upper()}}"
-  compiler: "llvm"
-  default_python_top: "/dev/null"
-  default_python_version: "python2.7"
-  host_system_name: "x86_64-unknown-linux-gnu"
-  needsPic: true
-  supports_gold_linker: false
-  supports_incremental_linker: false
-  supports_fission: false
-  supports_interface_shared_objects: false
-  supports_normalizing_ar: true
-  supports_start_end_lib: false
-  target_libc: "fuchsia"
-  target_cpu: "${arch.long_name}"
-  target_system_name: "${arch.long_name}-fuchsia"
-  toolchain_identifier: "crosstool-1.x.x-llvm-fuchsia-${arch.long_name}"
-  cc_target_os: "fuchsia"
-
-  tool_path { name: "ar" path: "clang/bin/llvm-ar" }
-  tool_path { name: "cpp" path: "clang/bin/clang++" }
-  tool_path { name: "gcc" path: "clang/bin/clang" }
-  tool_path { name: "lld" path: "clang/bin/lld" }
-  tool_path { name: "objdump" path: "clang/bin/llvm-objdump" }
-  tool_path { name: "strip" path: "clang/bin/llvm-strip" }
-  tool_path { name: "nm" path: "clang/bin/llvm-nm" }
-  tool_path { name: "objcopy" path: "clang/bin/llvm-objcopy" }
-  tool_path { name: "dwp" path: "/not_available/dwp" }        # Not used but required
-  tool_path { name: "compat-ld" path: "/not_available/compat-ld" }  # Not used but required
-  tool_path { name: "gcov" path: "/not_available/gcov" }       # Not used but required
-  tool_path { name: "gcov-tool" path: "/not_available/gcov-tool" }  # Not used but required
-  tool_path { name: "ld" path: "clang/bin/ld.lld" }
-
-  compiler_flag: "--target=${arch.long_name}-fuchsia"
-  linker_flag: "--target=${arch.long_name}-fuchsia"
-
-  # Use C++14 by default.
-  cxx_flag: "-std=c++14"
-
-  cxx_flag: "-xc++"
-
-  linker_flag: "--driver-mode=g++"
-  linker_flag: "-lzircon"
-
-  ### start 'sdk' portion
-  # The following are to make the various files in runtimes/sdk available
-
-  # Implicit dependencies for Fuchsia system functionality
-  cxx_builtin_include_directory: "%{SYSROOT_${arch.short_name.upper()}}/include" # Platform parts of libc.
-  cxx_builtin_include_directory: "%{CROSSTOOL_ROOT}/clang/lib/${arch.long_name}-fuchsia/include/c++/v1" # Platform libc++.
-  cxx_builtin_include_directory: "%{CROSSTOOL_ROOT}/clang/lib/clang/8.0.0/include" # Platform libc++.
-  ### end
-
-  #### Common compiler options. ####
-
-  compiler_flag: "-Wall"
-  compiler_flag: "-Werror"
-}
-
-% endfor
diff --git a/sdk/bazel/templates/dart_library.mako b/sdk/bazel/templates/dart_library.mako
deleted file mode 100644
index 7a2176b..0000000
--- a/sdk/bazel/templates/dart_library.mako
+++ /dev/null
@@ -1,17 +0,0 @@
-<%include file="header.mako" />
-
-load("@io_bazel_rules_dart//dart/build_rules:core.bzl", "dart_library")
-
-package(default_visibility = ["//visibility:public"])
-
-dart_library(
-    name = "${data.name}",
-    pub_pkg_name = "${data.package_name}",
-    enable_ddc = False,
-    srcs = glob(["lib/**"]),
-    deps = [
-        % for dep in sorted(data.deps):
-        "${dep}",
-        % endfor
-    ],
-)
diff --git a/sdk/bazel/templates/fidl.mako b/sdk/bazel/templates/fidl.mako
deleted file mode 100644
index 12c8158..0000000
--- a/sdk/bazel/templates/fidl.mako
+++ /dev/null
@@ -1,45 +0,0 @@
-<%include file="header.mako" />
-
-load("//build_defs:fidl_library.bzl", "fidl_library")
-
-package(default_visibility = ["//visibility:public"])
-
-fidl_library(
-    name = "${data.name}",
-    library = "${data.library}",
-    srcs = [
-        % for source in sorted(data.srcs):
-        "${source}",
-        % endfor
-    ],
-    deps = [
-        % for dep in sorted(data.deps):
-        "//fidl/${dep}",
-        % endfor
-    ],
-)
-
-% if data.with_cc:
-load("//build_defs:cc_fidl_library.bzl", "cc_fidl_library")
-
-cc_fidl_library(
-    name = "${data.name}_cc",
-    library = ":${data.name}",
-    # TODO(DX-288): remove explicit deps once C++ compilation API is available
-    #     in Skylark and generated through the cc_fidl_library rule.
-    deps = [
-        % for dep in sorted(data.deps):
-        "//fidl/${dep}:${dep}_cc",
-        % endfor
-    ],
-)
-% endif
-
-% if data.with_dart:
-load("//build_defs:dart_fidl_library.bzl", "dart_fidl_library")
-
-dart_fidl_library(
-    name = "${data.name}_dart",
-    deps = [":${data.name}"],
-)
-% endif
diff --git a/sdk/bazel/templates/header.mako b/sdk/bazel/templates/header.mako
deleted file mode 100644
index 5fe103e..0000000
--- a/sdk/bazel/templates/header.mako
+++ /dev/null
@@ -1,2 +0,0 @@
-<%include file="header_no_license.mako" />
-licenses(["notice"])
diff --git a/sdk/bazel/templates/header_no_license.mako b/sdk/bazel/templates/header_no_license.mako
deleted file mode 100644
index 389edce..0000000
--- a/sdk/bazel/templates/header_no_license.mako
+++ /dev/null
@@ -1,6 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# DO NOT MANUALLY EDIT!
-# Generated by //scripts/sdk/bazel/generate.py.
diff --git a/sdk/bazel/templates/images.mako b/sdk/bazel/templates/images.mako
deleted file mode 100644
index b01b74f..0000000
--- a/sdk/bazel/templates/images.mako
+++ /dev/null
@@ -1,10 +0,0 @@
-<%include file="header.mako" />
-package(default_visibility = ["//visibility:public"])
-
-% for arch in sorted(data.arches):
-filegroup(
-    name = "${arch}",
-    srcs = glob(["${arch}/*"]),
-)
-
-% endfor
\ No newline at end of file
diff --git a/sdk/bazel/templates/setup_dart_bzl.mako b/sdk/bazel/templates/setup_dart_bzl.mako
deleted file mode 100644
index d466c5e..0000000
--- a/sdk/bazel/templates/setup_dart_bzl.mako
+++ /dev/null
@@ -1,18 +0,0 @@
-<%include file="header_no_license.mako" />
-
-load("@io_bazel_rules_dart//dart/build_rules/internal:pub.bzl", "pub_repository")
-
-def setup_dart():
-    % if data:
-      % for name, version in data.iteritems():
-    pub_repository(
-        name = "vendor_${name}",
-        output = ".",
-        package = "${name}",
-        version = "${version}",
-        pub_deps = [],
-    )
-      % endfor
-    % else:
-    pass
-    % endif
diff --git a/sdk/bazel/templates/sysroot_arch.mako b/sdk/bazel/templates/sysroot_arch.mako
deleted file mode 100644
index 66e5b89..0000000
--- a/sdk/bazel/templates/sysroot_arch.mako
+++ /dev/null
@@ -1,19 +0,0 @@
-<%include file="header.mako" />
-
-load("//build_defs:package_files.bzl", "package_files")
-
-exports_files(
-    glob(["**"]),
-)
-
-package_files(
-    name = "dist",
-    contents = {
-        % for path, source in sorted(data.packaged_files.iteritems()):
-        "${source}": "${path}",
-        % endfor
-    },
-    visibility = [
-        "//visibility:public",
-    ],
-)
diff --git a/sdk/bazel/templates/sysroot_pkg.mako b/sdk/bazel/templates/sysroot_pkg.mako
deleted file mode 100644
index 9648a4f..0000000
--- a/sdk/bazel/templates/sysroot_pkg.mako
+++ /dev/null
@@ -1,16 +0,0 @@
-<%include file="header.mako" />
-
-load("//build_defs:fuchsia_select.bzl", "fuchsia_select")
-
-# This target exists solely for packaging purposes.
-alias(
-    name = "sysroot",
-    actual = fuchsia_select({
-        % for arch in sorted(data):
-        "//build_defs/target_cpu:${arch}": "//arch/${arch}/sysroot:dist",
-        % endfor
-    }),
-    visibility = [
-        "//visibility:public",
-    ],
-)
diff --git a/sdk/bazel/templates/sysroot_version.mako b/sdk/bazel/templates/sysroot_version.mako
deleted file mode 100644
index 455c64b..0000000
--- a/sdk/bazel/templates/sysroot_version.mako
+++ /dev/null
@@ -1,19 +0,0 @@
-<%include file="header.mako" />
-
-load("//build_defs:package_files.bzl", "package_files")
-
-exports_files(
-    glob(["**"]),
-)
-
-package_files(
-    name = "dist",
-    contents = {
-        % for path, source in sorted(data.iteritems()):
-        "${source}": "${path}",
-        % endfor
-    },
-    visibility = [
-        "//visibility:public",
-    ],
-)
diff --git a/sdk/bazel/templates/tests/bazelrc.mako b/sdk/bazel/templates/tests/bazelrc.mako
deleted file mode 100644
index 313c1df..0000000
--- a/sdk/bazel/templates/tests/bazelrc.mako
+++ /dev/null
@@ -1,5 +0,0 @@
-% for arch in data.arches:
-build:fuchsia_${arch.short_name} --crosstool_top=@fuchsia_crosstool//:toolchain
-build:fuchsia_${arch.short_name} --cpu=${arch.long_name}
-build:fuchsia_${arch.short_name} --host_crosstool_top=@bazel_tools//tools/cpp:toolchain
-% endfor
diff --git a/sdk/bazel/templates/tests/header_slash.mako b/sdk/bazel/templates/tests/header_slash.mako
deleted file mode 100644
index 6e2a554..0000000
--- a/sdk/bazel/templates/tests/header_slash.mako
+++ /dev/null
@@ -1,6 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-// DO NOT MANUALLY EDIT!
-// Generated by //scripts/sdk/bazel/generate.py.
diff --git a/sdk/bazel/templates/tests/headers.mako b/sdk/bazel/templates/tests/headers.mako
deleted file mode 100644
index ca2547d..0000000
--- a/sdk/bazel/templates/tests/headers.mako
+++ /dev/null
@@ -1,14 +0,0 @@
-<%include file="header_slash.mako" />
-
-// This file verifies that all headers included in an SDK are valid.
-
-% for dep, headers in sorted(data['headers'].iteritems()):
-  % if headers:
-// ${dep}
-    % for header in headers:
-#include "${header}"
-    % endfor
-  % endif
-% endfor
-
-int main(int argc, const char** argv) {}
diff --git a/sdk/bazel/templates/tests/headers_build.mako b/sdk/bazel/templates/tests/headers_build.mako
deleted file mode 100644
index 32d3557..0000000
--- a/sdk/bazel/templates/tests/headers_build.mako
+++ /dev/null
@@ -1,13 +0,0 @@
-<%include file="header_no_license.mako" />
-
-cc_binary(
-    name = "headers",
-    srcs = [
-        "headers.cc",
-    ],
-    deps = [
-        % for dep in sorted(data['deps']):
-        "@fuchsia_sdk${dep}",
-        % endfor
-    ],
-)
diff --git a/sdk/bazel/templates/tests/run_py.mako b/sdk/bazel/templates/tests/run_py.mako
deleted file mode 100755
index 621c73f..0000000
--- a/sdk/bazel/templates/tests/run_py.mako
+++ /dev/null
@@ -1,151 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-from subprocess import check_output, Popen
-import sys
-
-
-SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
-
-
-ARCHES = [
-% for arch in data.arches:
-    '${arch.short_name}',
-% endfor
-]
-
-
-def program_exists(name):
-    """Returns True if an executable with the name exists"""
-    if len(name) > 0 and name[0] == '/':
-        return os.path.isfile(name) and os.access(name, os.X_OK)
-    for path in os.environ["PATH"].split(os.pathsep):
-        fname = os.path.join(path, name)
-        if os.path.isfile(fname) and os.access(fname, os.X_OK):
-            return True
-    return False
-
-
-class BazelTester(object):
-
-    def __init__(self, without_sdk, with_ignored, bazel_bin,
-                 optional_flags=[]):
-        self.without_sdk = without_sdk
-        self.with_ignored = with_ignored
-        self.bazel_bin = bazel_bin
-        self.optional_flags = optional_flags
-
-
-    def _invoke_bazel(self, command, targets):
-        command = [self.bazel_bin, command, '--keep_going']
-        command += self.optional_flags
-        command += targets
-        job = Popen(command, cwd=SCRIPT_DIR)
-        job.communicate()
-        return job.returncode
-
-
-    def _build(self, targets):
-        return self._invoke_bazel('build', targets)
-
-
-    def _test(self, targets):
-        return self._invoke_bazel('test', targets)
-
-
-    def _query(self, query):
-        command = [self.bazel_bin, 'query', query]
-        return set(check_output(command, cwd=SCRIPT_DIR).splitlines())
-
-
-    def run(self):
-        if not self.without_sdk:
-            # Build the SDK contents.
-            print('Building SDK contents')
-            if self._build(['@fuchsia_sdk//...']):
-                return False
-
-        targets = ['//...']
-        if not self.with_ignored:
-            # Identify and remove ignored targets.
-            all_targets = self._query('//...')
-            ignored_targets = self._query('attr("tags", "ignored", //...)')
-            if ignored_targets:
-                # Targets which depend on an ignored target should be ignored too.
-                all_ignored_targets = set()
-                for target in ignored_targets:
-                    all_ignored_targets.add(target)
-                    dep_query = 'rdeps("//...", "{}")'.format(target)
-                    dependent_targets = self._query(dep_query)
-                    all_ignored_targets.update(dependent_targets)
-                print('Ignored targets:')
-                for target in sorted(all_ignored_targets):
-                    print(' - ' + target)
-                targets = list(all_targets - all_ignored_targets)
-
-        # Build the tests targets.
-        print('Building test targets')
-        if self._build(targets):
-            return False
-
-        # Run tests.
-        args = ('attr("tags", "^((?!compile-only).)*$",' +
-                ' kind(".*test rule", //...))')
-        test_targets = list(self._query(args))
-        print('Running test targets')
-        return self._test(test_targets) == 0
-
-
-def main():
-    parser = argparse.ArgumentParser(
-        description='Runs the SDK tests')
-    parser.add_argument('--no-sdk',
-                        help='If set, SDK targets are not built.',
-                        action='store_true')
-    parser.add_argument('--ignored',
-                        help='If set, ignored tests are run too.',
-                        action='store_true')
-    parser.add_argument('--bazel',
-                        help='Path to the Bazel tool',
-                        default='bazel')
-    parser.add_argument('--once',
-                        help='Whether to only run tests once',
-                        action='store_true')
-    args = parser.parse_args()
-
-    if not program_exists(args.bazel):
-        print('"%s": command not found' % (args.bazel))
-        return 1
-
-    def print_test_start(arch, cpp_version):
-        print('')
-        print('-----------------------------------')
-        print('| Testing %s / %s' % (arch, cpp_version))
-        print('-----------------------------------')
-
-    for arch in ARCHES:
-        print_test_start(arch, 'C++14')
-        config_flags = ['--config=fuchsia_%s' % arch]
-        cpp14_flags = ['--cxxopt=-Wc++14-compat', '--cxxopt=-Wc++17-extensions']
-        if not BazelTester(args.no_sdk, args.ignored, args.bazel,
-                           optional_flags=config_flags + cpp14_flags).run():
-            return 1
-
-        if args.once:
-            print('Single iteration requested, done.')
-            break
-
-        print_test_start(arch, 'C++17')
-        cpp17_flags = ['--cxxopt=-std=c++17', '--cxxopt=-Wc++17-compat']
-        if not BazelTester(args.no_sdk, args.ignored, args.bazel,
-                           optional_flags=config_flags + cpp17_flags).run():
-            return 1
-    return 0
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/sdk/bazel/templates/tests/workspace.mako b/sdk/bazel/templates/tests/workspace.mako
deleted file mode 100644
index bd633e7..0000000
--- a/sdk/bazel/templates/tests/workspace.mako
+++ /dev/null
@@ -1,30 +0,0 @@
-<%include file="header_no_license.mako" />
-
-load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
-
-local_repository(
-    name = "fuchsia_sdk",
-    path = "${data.sdk_path}",
-)
-
-load("@fuchsia_sdk//build_defs:fuchsia_setup.bzl", "fuchsia_setup")
-fuchsia_setup(
-    with_toolchain = ${data.with_cc},
-)
-
-% if data.with_dart:
-http_archive(
-    name = "io_bazel_rules_dart",
-    url = "https://github.com/dart-lang/rules_dart/archive/master.zip",
-    strip_prefix = "rules_dart-master",
-)
-
-load("@io_bazel_rules_dart//dart/build_rules:repositories.bzl", "dart_repositories")
-dart_repositories()
-
-load("@fuchsia_sdk//build_defs:setup_dart.bzl", "setup_dart")
-setup_dart()
-
-load("@fuchsia_sdk//build_defs:setup_flutter.bzl", "setup_flutter")
-setup_flutter()
-% endif
diff --git a/sdk/bazel/templates/toolchain_build.mako b/sdk/bazel/templates/toolchain_build.mako
deleted file mode 100644
index 9fa7c66..0000000
--- a/sdk/bazel/templates/toolchain_build.mako
+++ /dev/null
@@ -1,19 +0,0 @@
-<%include file="header.mako" />
-
-load(":dist.bzl", "toolchain_dist")
-load("//build_defs:fuchsia_select.bzl", "fuchsia_select")
-
-# TODO(pylaligand): make this target configurable by developers. This is blocked
-# by current work on Bazel build configuration (design at
-# https://docs.google.com/document/d/1vc8v-kXjvgZOdQdnxPTaV0rrLxtP2XwnD2tAZlYJOqw/edit?usp=sharing)
-toolchain_dist(
-    name = "dist",
-    files = fuchsia_select({
-        % for arch in data.arches:
-        "//build_defs/target_cpu:${arch.short_name}": "@fuchsia_crosstool//:dist-${arch.long_name}",
-        % endfor
-    }),
-    visibility = [
-        "//visibility:public",
-    ],
-)
diff --git a/sdk/bazel/templates/tools.mako b/sdk/bazel/templates/tools.mako
deleted file mode 100644
index 7226726..0000000
--- a/sdk/bazel/templates/tools.mako
+++ /dev/null
@@ -1,7 +0,0 @@
-<%include file="header.mako" />
-
-package(default_visibility = ["//visibility:public"])
-
-exports_files(
-    glob(["*"]),
-)
diff --git a/sdk/bazel/tests/cc/cc/BUILD b/sdk/bazel/tests/cc/cc/BUILD
deleted file mode 100644
index 35c8341..0000000
--- a/sdk/bazel/tests/cc/cc/BUILD
+++ /dev/null
@@ -1,121 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:package.bzl", "fuchsia_package")
-load("@fuchsia_sdk//build_defs:cc_binary_component.bzl", "cc_binary_component")
-
-load("//build_defs:verify_package.bzl", "verify_package")
-
-# Vanilla C++ program.
-cc_binary(
-    name = "compilation",
-    srcs = [
-        "compilation.cc",
-    ],
-)
-
-# Local shared library for packaging.
-cc_binary(
-    name = "libshared.so",
-    srcs = [
-        "library.cc",
-        "library.h",
-    ],
-    linkshared = True,
-)
-
-cc_library(
-    name = "shared_library",
-    hdrs = [
-        "library.h",
-    ],
-    srcs = [
-        ":libshared.so",
-    ],
-    includes = [
-        ".",
-    ],
-)
-
-# C++ program with dependency on a Fuchsia library.
-cc_binary(
-    name = "pkg_dep",
-    srcs = [
-        "pkg_dep.cc",
-    ],
-    deps = [
-        ":shared_library",
-        "@fuchsia_sdk//pkg/svc",
-    ],
-)
-
-# Prepare the binary for inclusion in a package.
-cc_binary_component(
-    name = "packageable",
-    deps = [":pkg_dep"],
-    component_name = "packageable",
-    manifest = "manifest.cmx",
-)
-
-# C++ program in a Fuchsia package.
-fuchsia_package(
-    name = "package",
-    deps = [
-        ":packageable",
-    ],
-)
-
-# Verify that the package contains all the expected files.
-verify_package(
-    name = "package_verify",
-    package = ":package",
-    files = [
-        "bin/pkg_dep",
-        "lib/ld.so.1",
-        "lib/libshared.so",
-        "lib/libsvc.so",
-        "meta/packageable.cmx",
-    ],
-)
-
-# Test the testonly attribute.
-
-cc_test(
-    name = "pkg_dep_test",
-    srcs = [
-        "pkg_dep.cc",
-    ],
-    deps = [
-        ":shared_library",
-        "@fuchsia_sdk//pkg/svc",
-    ],
-    tags = [
-        "compile-only"
-    ],
-)
-
-cc_binary_component(
-    name = "packageable_testonly",
-    deps = [":pkg_dep_test"],
-    component_name = "packageable_testonly",
-    manifest = "manifest.cmx",
-    testonly = 1,
-)
-
-fuchsia_package(
-    name = "package_test",
-    deps = [
-        ":packageable_testonly",
-    ],
-    testonly = 1,
-)
-
-verify_package(
-    name = "package_test_verify",
-    package = ":package_test",
-    files = [
-        "bin/pkg_dep_test",
-    ],
-    testonly = 1,
-)
diff --git a/sdk/bazel/tests/cc/cc/compilation.cc b/sdk/bazel/tests/cc/cc/compilation.cc
deleted file mode 100644
index 33f3acb..0000000
--- a/sdk/bazel/tests/cc/cc/compilation.cc
+++ /dev/null
@@ -1,10 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-#include <stdio.h>
-
-int main(int argc, const char** argv) {
-  printf("It works\n");
-  return 0;
-}
diff --git a/sdk/bazel/tests/cc/cc/library.cc b/sdk/bazel/tests/cc/cc/library.cc
deleted file mode 100644
index c72f367..0000000
--- a/sdk/bazel/tests/cc/cc/library.cc
+++ /dev/null
@@ -1,13 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-#include "library.h"
-
-namespace library {
-
-void do_something() {
-  // Maybe not...
-}
-
-}  // namespace library
diff --git a/sdk/bazel/tests/cc/cc/library.h b/sdk/bazel/tests/cc/cc/library.h
deleted file mode 100644
index cea815b..0000000
--- a/sdk/bazel/tests/cc/cc/library.h
+++ /dev/null
@@ -1,9 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-namespace library {
-
-void do_something();
-
-}  // namespace library
diff --git a/sdk/bazel/tests/cc/cc/manifest.cmx b/sdk/bazel/tests/cc/cc/manifest.cmx
deleted file mode 100644
index 100886a..0000000
--- a/sdk/bazel/tests/cc/cc/manifest.cmx
+++ /dev/null
@@ -1,11 +0,0 @@
-{
-    "program": {
-        "binary": "bin/pkg_dep"
-    },
-    "sandbox": {
-        "services": [
-            "fuchsia.sys.Environment",
-            "fuchsia.sys.Loader"
-        ]
-    }
-}
diff --git a/sdk/bazel/tests/cc/cc/pkg_dep.cc b/sdk/bazel/tests/cc/cc/pkg_dep.cc
deleted file mode 100644
index de86e3a..0000000
--- a/sdk/bazel/tests/cc/cc/pkg_dep.cc
+++ /dev/null
@@ -1,15 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-#include <stdio.h>
-
-#include <lib/svc/dir.h>
-
-#include "library.h"
-
-int main(int argc, const char** argv) {
-  printf("This is only the beginning!\n");
-  svc_dir_destroy(NULL);
-  library::do_something();
-}
diff --git a/sdk/bazel/tests/cc/fidl-cc/BUILD b/sdk/bazel/tests/cc/fidl-cc/BUILD
deleted file mode 100644
index 9987963..0000000
--- a/sdk/bazel/tests/cc/fidl-cc/BUILD
+++ /dev/null
@@ -1,31 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:cc_fidl_library.bzl", "cc_fidl_library")
-
-cc_fidl_library(
-    name = "simple",
-    library = "//fidl:simple",
-)
-
-cc_binary(
-    name = "simple-user",
-    srcs = [
-        "simple_user.cc",
-    ],
-    deps = [
-        ":simple",
-    ],
-)
-
-cc_fidl_library(
-    name = "local_deps",
-    library = "//fidl:local_deps",
-    # Currently failing as C++ bindings do not handle dependencies between FIDL
-    # libraries yet. This is blocked on the availability of C++ compilation
-    # support in Skylark (coming soon).
-    tags = [
-        "ignored",
-    ],
-)
diff --git a/sdk/bazel/tests/cc/fidl-cc/simple_user.cc b/sdk/bazel/tests/cc/fidl-cc/simple_user.cc
deleted file mode 100644
index 00f53a0..0000000
--- a/sdk/bazel/tests/cc/fidl-cc/simple_user.cc
+++ /dev/null
@@ -1,12 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-#include <bazel/examples/simple/cpp/fidl.h>
-
-int main(int argc, const char** argv) {
-  bazel::examples::simple::Hello object;
-  object.world = 314;
-  bazel::examples::simple::Hello other_object;
-  return object == other_object ? 0 : 1;
-}
diff --git a/sdk/bazel/tests/common/build_defs/BUILD b/sdk/bazel/tests/common/build_defs/BUILD
deleted file mode 100644
index f99e954..0000000
--- a/sdk/bazel/tests/common/build_defs/BUILD
+++ /dev/null
@@ -1,16 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-licenses(["notice"])
-
-package(default_visibility = ["//visibility:public"])
-
-exports_files(
-    glob(["*.bzl"]),
-)
-
-py_binary(
-    name = "package_verifier",
-    srcs = ["package_verifier.py"],
-)
diff --git a/sdk/bazel/tests/common/build_defs/package_verifier.py b/sdk/bazel/tests/common/build_defs/package_verifier.py
deleted file mode 100644
index 2b660f7..0000000
--- a/sdk/bazel/tests/common/build_defs/package_verifier.py
+++ /dev/null
@@ -1,52 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import sys
-
-
-def main():
-    parser = argparse.ArgumentParser()
-    parser.add_argument('--meta',
-                        help='The path to the package\'s meta directory',
-                        required=True)
-    parser.add_argument('--files',
-                        help='The list of expected files in the package',
-                        default=[],
-                        nargs='*')
-    parser.add_argument('--stamp',
-                        help='The path to the stamp file in case of success',
-                        required=True)
-    args = parser.parse_args()
-
-    all_files = []
-    # List the files in the meta directory itself.
-    for root, dirs, files in os.walk(args.meta):
-        all_files += map(
-            lambda f: os.path.relpath(os.path.join(root, f), args.meta), files)
-    # Add the files outside of the meta directory, which are listed in
-    # meta/contents.
-    with open(os.path.join(args.meta, 'meta', 'contents')) as contents_file:
-        all_files += map(lambda l: l.strip().split('=', 1)[0],
-                         contents_file.readlines())
-
-    has_errors = False
-    for file in args.files:
-        if file not in all_files:
-            print('Missing ' + file)
-            has_errors = True
-    if has_errors:
-        print('Known files:')
-        print(all_files)
-        return 1
-
-    with open(args.stamp, 'w') as stamp_file:
-        stamp_file.write('Success!')
-
-    return 0
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/sdk/bazel/tests/common/build_defs/verify_package.bzl b/sdk/bazel/tests/common/build_defs/verify_package.bzl
deleted file mode 100644
index f29d3b1..0000000
--- a/sdk/bazel/tests/common/build_defs/verify_package.bzl
+++ /dev/null
@@ -1,94 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:package_info.bzl", "PackageInfo")
-
-def _verify_package_impl(context):
-    # Unpack the package archive.
-    archive = context.attr.package[PackageInfo].archive
-    archive_dir = context.actions.declare_directory(context.attr.name)
-    context.actions.run(
-        executable = context.executable._far,
-        arguments = [
-            "extract",
-            "--archive=" + archive.path,
-            "--output=" + archive_dir.path,
-        ],
-        inputs = [
-            archive,
-        ],
-        outputs = [
-            archive_dir,
-        ],
-        mnemonic = "UnpackArchive",
-    )
-
-    # Unpack the meta.far archive.
-    meta_dir = context.actions.declare_directory(context.attr.name + "_meta")
-    context.actions.run(
-        executable = context.executable._far,
-        arguments = [
-            "extract",
-            "--archive=" + archive_dir.path + "/meta.far",
-            "--output=" + meta_dir.path,
-        ],
-        inputs = [
-            archive_dir,
-        ],
-        outputs = [
-            meta_dir,
-        ],
-        mnemonic = "UnpackMeta",
-    )
-
-    # Read meta/contents and verify that it contains the expected files.
-    success_stamp = context.actions.declare_file(context.attr.name + "_success")
-    context.actions.run(
-        executable = context.executable._verifier,
-        arguments = [
-            "--meta",
-            meta_dir.path,
-            "--stamp",
-            success_stamp.path,
-            "--files",
-        ] + context.attr.files,
-        inputs = [
-            meta_dir,
-        ],
-        outputs = [
-            success_stamp,
-        ],
-    )
-    return [
-        DefaultInfo(files = depset([success_stamp])),
-    ]
-
-verify_package = rule(
-    implementation = _verify_package_impl,
-    attrs = {
-        "package": attr.label(
-            doc = "The label of the package to verify",
-            mandatory = True,
-            allow_files = False,
-            providers = [PackageInfo],
-        ),
-        "files": attr.string_list(
-            doc = "The files expected to exist in the package",
-            mandatory = False,
-            allow_empty = True,
-        ),
-        "_far": attr.label(
-            default = Label("@fuchsia_sdk//tools:far"),
-            allow_single_file = True,
-            executable = True,
-            cfg = "host",
-        ),
-        "_verifier": attr.label(
-            default = Label("//build_defs:package_verifier"),
-            allow_files = True,
-            executable = True,
-            cfg = "host",
-        ),
-    },
-)
diff --git a/sdk/bazel/tests/common/fidl/BUILD b/sdk/bazel/tests/common/fidl/BUILD
deleted file mode 100644
index 0f77f76..0000000
--- a/sdk/bazel/tests/common/fidl/BUILD
+++ /dev/null
@@ -1,37 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:fidl_library.bzl", "fidl_library")
-
-package(default_visibility = ["//visibility:public"])
-
-fidl_library(
-    name = "simple",
-    library = "bazel.examples.simple",
-    srcs = [
-        "simple.fidl",
-    ],
-)
-
-fidl_library(
-    name = "local_deps",
-    library = "bazel.examples.localdeps",
-    srcs = [
-        "local_deps.fidl",
-    ],
-    deps = [
-        ":simple",
-    ],
-)
-
-fidl_library(
-    name = "fuchsia_deps",
-    library = "bazel.examples.fuchsiadeps",
-    srcs = [
-        "fuchsia_deps.fidl",
-    ],
-    deps = [
-        "@fuchsia_sdk//fidl/fuchsia_sys",
-    ],
-)
diff --git a/sdk/bazel/tests/common/fidl/fuchsia_deps.fidl b/sdk/bazel/tests/common/fidl/fuchsia_deps.fidl
deleted file mode 100644
index b28ca90..0000000
--- a/sdk/bazel/tests/common/fidl/fuchsia_deps.fidl
+++ /dev/null
@@ -1,11 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-library bazel.examples.fuchsiadeps;
-
-using fuchsia.sys;
-
-interface Fuchsia {
-  1: UseThat(request<fuchsia.sys.Loader> loader);
-};
diff --git a/sdk/bazel/tests/common/fidl/local_deps.fidl b/sdk/bazel/tests/common/fidl/local_deps.fidl
deleted file mode 100644
index 41659cf..0000000
--- a/sdk/bazel/tests/common/fidl/local_deps.fidl
+++ /dev/null
@@ -1,11 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-library bazel.examples.localdeps;
-
-using bazel.examples.simple;
-
-interface Foobar {
-  1: DoSomething(bazel.examples.simple.Hello world);
-};
diff --git a/sdk/bazel/tests/common/fidl/simple.fidl b/sdk/bazel/tests/common/fidl/simple.fidl
deleted file mode 100644
index a7e2c0a..0000000
--- a/sdk/bazel/tests/common/fidl/simple.fidl
+++ /dev/null
@@ -1,9 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-library bazel.examples.simple;
-
-struct Hello {
-  uint32 world;
-};
diff --git a/sdk/bazel/tests/common/package/BUILD b/sdk/bazel/tests/common/package/BUILD
deleted file mode 100644
index c264b12..0000000
--- a/sdk/bazel/tests/common/package/BUILD
+++ /dev/null
@@ -1,21 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:package.bzl", "fuchsia_package")
-load("@fuchsia_sdk//build_defs:package_files.bzl", "package_files")
-
-package_files(
-    name = "files",
-    contents = {
-        "file1.txt": "somedir/file.txt",
-        "file2.txt": "base.txt",
-    }
-)
-
-fuchsia_package(
-    name = "package",
-    deps = [
-        ":files",
-    ],
-)
diff --git a/sdk/bazel/tests/common/package/file1.txt b/sdk/bazel/tests/common/package/file1.txt
deleted file mode 100644
index ce01362..0000000
--- a/sdk/bazel/tests/common/package/file1.txt
+++ /dev/null
@@ -1 +0,0 @@
-hello
diff --git a/sdk/bazel/tests/common/package/file2.txt b/sdk/bazel/tests/common/package/file2.txt
deleted file mode 100644
index cc628cc..0000000
--- a/sdk/bazel/tests/common/package/file2.txt
+++ /dev/null
@@ -1 +0,0 @@
-world
diff --git a/sdk/bazel/tests/common/target_image/BUILD b/sdk/bazel/tests/common/target_image/BUILD
deleted file mode 100644
index 8c09195..0000000
--- a/sdk/bazel/tests/common/target_image/BUILD
+++ /dev/null
@@ -1,6 +0,0 @@
-
-sh_test(
-    name = "image_test",
-    srcs = ["image_test.sh"],
-    data = ["@fuchsia_sdk//target:x64"],
-)
diff --git a/sdk/bazel/tests/common/target_image/image_test.sh b/sdk/bazel/tests/common/target_image/image_test.sh
deleted file mode 100755
index d977aeb..0000000
--- a/sdk/bazel/tests/common/target_image/image_test.sh
+++ /dev/null
@@ -1,5 +0,0 @@
-#!/usr/bin/env bash
-
-set -e
-
-stat ${TEST_SRCDIR}/fuchsia_sdk/target/x64/qemu-kernel.bin
\ No newline at end of file
diff --git a/sdk/bazel/tests/dart/dart/BUILD b/sdk/bazel/tests/dart/dart/BUILD
deleted file mode 100644
index 6c313da..0000000
--- a/sdk/bazel/tests/dart/dart/BUILD
+++ /dev/null
@@ -1,39 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:dart_app.bzl", "dart_app")
-load("@fuchsia_sdk//build_defs:package.bzl", "fuchsia_package")
-
-dart_app(
-    name = "dart",
-    component_manifest = "meta/dart.cmx",
-    package_name = "tests.dart_app",
-    main = "lib/main.dart",
-    srcs = glob(["lib/*.dart"]),
-)
-
-fuchsia_package(
-    name = "hello_dart_package",
-    deps = [
-        ":dart",
-    ],
-)
-
-dart_app(
-    name = "dart_with_unused_dependencies",
-    component_manifest = "meta/dart_with_unused_dependencies.cmx",
-    package_name = "tests.dart_app",
-    main = "lib/main.dart",
-    srcs = glob(["lib/*.dart"]),
-    deps = [
-        "@vendor_meta//:meta",
-    ],
-)
-
-fuchsia_package(
-    name = "hello_unused_package",
-    deps = [
-        ":dart_with_unused_dependencies",
-    ],
-)
diff --git a/sdk/bazel/tests/dart/dart/lib/hello.dart b/sdk/bazel/tests/dart/dart/lib/hello.dart
deleted file mode 100644
index 1666c20..0000000
--- a/sdk/bazel/tests/dart/dart/lib/hello.dart
+++ /dev/null
@@ -1,7 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-String getString() {
-  return "Hello World!";
-}
\ No newline at end of file
diff --git a/sdk/bazel/tests/dart/dart/lib/main.dart b/sdk/bazel/tests/dart/dart/lib/main.dart
deleted file mode 100644
index 0a6d5bf..0000000
--- a/sdk/bazel/tests/dart/dart/lib/main.dart
+++ /dev/null
@@ -1,9 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-import "hello.dart" as hello;
-
-void main(List<String> args) {
-  print(hello.getString());
-}
diff --git a/sdk/bazel/tests/dart/dart/meta/dart.cmx b/sdk/bazel/tests/dart/dart/meta/dart.cmx
deleted file mode 100644
index f45bd12..0000000
--- a/sdk/bazel/tests/dart/dart/meta/dart.cmx
+++ /dev/null
@@ -1,6 +0,0 @@
-{
-    "program": {
-        "data": "data/dart"
-    },
-    "runner": "fuchsia-pkg://fuchsia.com/dart_jit_runner#meta/dart_jit_runner.cmx"
-}
diff --git a/sdk/bazel/tests/dart/dart/meta/dart_with_unused_dependencies.cmx b/sdk/bazel/tests/dart/dart/meta/dart_with_unused_dependencies.cmx
deleted file mode 100644
index 62fbc7d..0000000
--- a/sdk/bazel/tests/dart/dart/meta/dart_with_unused_dependencies.cmx
+++ /dev/null
@@ -1,6 +0,0 @@
-{
-    "program": {
-        "data": "data/dart_with_unused_dependencies"
-    },
-    "runner": "fuchsia-pkg://fuchsia.com/dart_jit_runner#meta/dart_jit_runner.cmx"
-}
diff --git a/sdk/bazel/tests/dart/fidl-dart/BUILD b/sdk/bazel/tests/dart/fidl-dart/BUILD
deleted file mode 100644
index 7b55ae0..0000000
--- a/sdk/bazel/tests/dart/fidl-dart/BUILD
+++ /dev/null
@@ -1,41 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:dart_app.bzl", "dart_app")
-load("@fuchsia_sdk//build_defs:dart_fidl_library.bzl", "dart_fidl_library")
-load("@fuchsia_sdk//build_defs:package.bzl", "fuchsia_package")
-
-package(default_visibility = ["//visibility:public"])
-
-dart_fidl_library(
-    name = "simple",
-    deps = ["//fidl:simple"],
-)
-
-dart_fidl_library(
-    name = "local_deps",
-    deps = ["//fidl:local_deps"],
-)
-
-dart_fidl_library(
-    name = "fuchsia_deps",
-    deps = ["//fidl:fuchsia_deps"],
-)
-
-dart_app(
-    name = "compile",
-    component_manifest = "meta/compile.cmx",
-    package_name = "tests.fidl_dart_compile",
-    main = "instrument_bindings.dart",
-    deps = [
-        ":simple",
-    ],
-)
-
-fuchsia_package(
-    name = "fidl_dart_package",
-    deps = [
-        ":compile",
-    ],
-)
diff --git a/sdk/bazel/tests/dart/fidl-dart/instrument_bindings.dart b/sdk/bazel/tests/dart/fidl-dart/instrument_bindings.dart
deleted file mode 100644
index fc3a9db..0000000
--- a/sdk/bazel/tests/dart/fidl-dart/instrument_bindings.dart
+++ /dev/null
@@ -1,10 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-import 'package:fidl_bazel_examples_simple/fidl.dart';
-
-void main(List<String> args) {
-  final Hello hello = const Hello(world: 314);
-  print('Hello: $hello');
-}
diff --git a/sdk/bazel/tests/dart/fidl-dart/meta/compile.cmx b/sdk/bazel/tests/dart/fidl-dart/meta/compile.cmx
deleted file mode 100644
index 18646ac..0000000
--- a/sdk/bazel/tests/dart/fidl-dart/meta/compile.cmx
+++ /dev/null
@@ -1,6 +0,0 @@
-{
-    "program": {
-        "data": "data/compile"
-    },
-    "runner": "fuchsia-pkg://fuchsia.com/dart_jit_runner#meta/dart_jit_runner.cmx"
-}
diff --git a/sdk/bazel/tests/dart/flutter/BUILD b/sdk/bazel/tests/dart/flutter/BUILD
deleted file mode 100644
index a013905..0000000
--- a/sdk/bazel/tests/dart/flutter/BUILD
+++ /dev/null
@@ -1,26 +0,0 @@
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-load("@fuchsia_sdk//build_defs:flutter_app.bzl", "flutter_app")
-load("@fuchsia_sdk//build_defs:package.bzl", "fuchsia_package")
-
-flutter_app(
-    name = "app",
-    component_manifest = "meta/app.cmx",
-    main = "main.dart",
-    assets = [
-        "assets/logo.png",
-    ],
-    package_name = "tests.flutter_app",
-    deps = [
-        "@vendor_flutter//:flutter",
-    ],
-)
-
-fuchsia_package(
-    name = "package",
-    deps = [
-        ":app",
-    ],
-)
diff --git a/sdk/bazel/tests/dart/flutter/assets/logo.png b/sdk/bazel/tests/dart/flutter/assets/logo.png
deleted file mode 100644
index 72f753f..0000000
--- a/sdk/bazel/tests/dart/flutter/assets/logo.png
+++ /dev/null
Binary files differ
diff --git a/sdk/bazel/tests/dart/flutter/main.dart b/sdk/bazel/tests/dart/flutter/main.dart
deleted file mode 100644
index 95acca4..0000000
--- a/sdk/bazel/tests/dart/flutter/main.dart
+++ /dev/null
@@ -1,57 +0,0 @@
-// Copyright 2018 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-import 'package:flutter/material.dart';
-
-void main() {
-  runApp(new _MyApp());
-}
-
-class _MyApp extends StatelessWidget {
-  @override
-  Widget build(BuildContext context) {
-    return new MaterialApp(
-      title: 'Hello Material',
-      home: const _MyHomePage(title: 'Hello Material!'),
-    );
-  }
-}
-
-class _MyHomePage extends StatefulWidget {
-  const _MyHomePage({Key key, this.title}) : super(key: key);
-
-  final String title;
-
-  @override
-  _MyHomePageState createState() => new _MyHomePageState();
-}
-
-class _MyHomePageState extends State<_MyHomePage> {
-  int _counter = 0;
-
-  void _incrementCounter() {
-    setState(() {
-      _counter++;
-    });
-  }
-
-  @override
-  Widget build(BuildContext context) {
-    return new Scaffold(
-      appBar: new AppBar(
-        title: new Text(widget.title),
-      ),
-      body: new Center(
-        child: new Text(
-          'Button tapped $_counter time${ _counter == 1 ? '' : 's' }.',
-        ),
-      ),
-      floatingActionButton: new FloatingActionButton(
-        onPressed: _incrementCounter,
-        tooltip: 'Increment',
-        child: new Image.asset('assets/logo.png'),
-      ),
-    );
-  }
-}
diff --git a/sdk/bazel/tests/dart/flutter/meta/app.cmx b/sdk/bazel/tests/dart/flutter/meta/app.cmx
deleted file mode 100644
index 520652d..0000000
--- a/sdk/bazel/tests/dart/flutter/meta/app.cmx
+++ /dev/null
@@ -1,19 +0,0 @@
-{
-    "program": {
-        "data": "data/app"
-    },
-    "sandbox": {
-        "services": [
-            "fuchsia.fonts.Provider",
-            "fuchsia.modular.Clipboard",
-            "fuchsia.modular.ContextWriter",
-            "fuchsia.logger.LogSink",
-            "fuchsia.sys.Environment",
-            "fuchsia.ui.input.ImeService",
-            "fuchsia.ui.policy.Presenter",
-            "fuchsia.ui.scenic.Scenic",
-            "fuchsia.ui.viewsv1.ViewManager"
-        ]
-    },
-    "runner": "fuchsia-pkg://fuchsia.com/flutter_jit_runner#meta/flutter_jit_runner.cmx"
-}
diff --git a/sdk/common/files.py b/sdk/common/files.py
deleted file mode 100644
index 9ac6b32..0000000
--- a/sdk/common/files.py
+++ /dev/null
@@ -1,44 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import errno
-import os
-import shutil
-
-
-def make_dir(file_path):
-    '''Creates the directory hierarchy for the given file and returns the
-    given path.
-    '''
-    target = os.path.dirname(file_path)
-    try:
-        os.makedirs(target)
-    except OSError as exception:
-        if exception.errno == errno.EEXIST and os.path.isdir(target):
-            pass
-        else:
-            raise
-    return file_path
-
-
-def copy_tree(src, dst):
-    '''Recursively copies a directory into another.
-    Differs with shutil.copytree in that it won't fail if the destination
-    directory already exists.
-    '''
-    if not os.path.isdir(dst):
-        os.mkdir(dst)
-    for path, directories, files in os.walk(src):
-        def get_path(name):
-            source_path = os.path.join(path, name)
-            dest_path = os.path.join(dst, os.path.relpath(source_path, src))
-            return (source_path, dest_path)
-        for dir in directories:
-            source, dest = get_path(dir)
-            if not os.path.isdir(dest):
-                os.mkdir(dest)
-        for file in files:
-            source, dest = get_path(file)
-            shutil.copy2(source, dest)
diff --git a/sdk/common/frontend.py b/sdk/common/frontend.py
deleted file mode 100644
index 0f64955..0000000
--- a/sdk/common/frontend.py
+++ /dev/null
@@ -1,109 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import contextlib
-import json
-import os
-import shutil
-import tarfile
-import tempfile
-
-from files import make_dir
-
-
-class Frontend(object):
-    '''Processes the contents of an SDK tarball and runs them through various
-    transformation methods.
-
-    In order to process atoms of type "foo", a frontend needs to define a
-    `install_foo_atom` method that accepts a single argument representing
-    the atom's metadata in JSON format.
-    '''
-
-    def __init__(self, output='', archive='', directory=''):
-        self._archive = archive
-        self._directory = directory
-        self.output = os.path.realpath(output)
-        self._source_dir = ''
-
-    def source(self, *args):
-        '''Builds a path to a source file.
-        Only available while the frontend is running.
-        '''
-        if not self._source_dir:
-            raise Exception('Error: accessing sources while inactive')
-        return os.path.join(self._source_dir, *args)
-
-    def dest(self, *args):
-        '''Builds a path in the output directory.
-        This method also ensures that the directory hierarchy exists in the
-        output directory.
-        Behaves correctly if the first argument is already within the output
-        directory.
-        '''
-        if (os.path.commonprefix([os.path.realpath(args[0]), self.output]) ==
-            self.output):
-          path = os.path.join(*args)
-        else:
-          path = os.path.join(self.output, *args)
-        return make_dir(path)
-
-    def prepare(self, arch, atom_types):
-        '''Called before elements are processed.'''
-        pass
-
-    def finalize(self, arch, atom_types):
-        '''Called after all elements have been processed.'''
-        pass
-
-    def run(self):
-        '''Runs this frontend through the contents of the archive.
-        Returns true if successful.
-        '''
-        with self._create_archive_dir() as archive_dir:
-            self._source_dir = archive_dir
-
-            # Convenience for loading metadata files below.
-            def load_metadata(*args):
-                with open(self.source(*args), 'r') as meta_file:
-                    return json.load(meta_file)
-            manifest = load_metadata('meta', 'manifest.json')
-            atoms = [load_metadata(p) for p in manifest['parts']]
-            types = set([a['type'] for a in atoms])
-
-            self.prepare(manifest['arch'], types)
-
-            # Process each SDK atom.
-            for atom in atoms:
-                type = atom['type']
-                getattr(self, 'install_%s_atom' % type, self._handle_atom)(atom)
-
-            self.finalize(manifest['arch'], types)
-
-            # Reset the source directory, which may be about to disappear.
-            self._source_dir = ''
-        return True
-
-    def _handle_atom(self, atom):
-        '''Default atom handler.'''
-        print('Ignored %s (%s)' % (atom['name'], atom['type']))
-
-    @contextlib.contextmanager
-    def _create_archive_dir(self):
-        if self._directory:
-            yield self._directory
-        elif self._archive:
-            temp_dir = tempfile.mkdtemp(prefix='fuchsia-bazel')
-            # Extract the tarball into the temporary directory.
-            # This is vastly more efficient than accessing files one by one via
-            # the tarfile API.
-            with tarfile.open(self._archive) as archive:
-                archive.extractall(temp_dir)
-            try:
-                yield temp_dir
-            finally:
-                shutil.rmtree(temp_dir, ignore_errors=True)
-        else:
-            raise Exception('Error: archive or directory must be set')
diff --git a/sdk/merger/merge.py b/sdk/merger/merge.py
deleted file mode 100755
index f894aa6..0000000
--- a/sdk/merger/merge.py
+++ /dev/null
@@ -1,323 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import contextlib
-import errno
-import json
-import os
-import shutil
-import sys
-import tarfile
-import tempfile
-
-
-@contextlib.contextmanager
-def _open_archive(archive, directory):
-    '''Manages a directory in which an existing SDK is laid out.'''
-    if directory:
-        yield directory
-    elif archive:
-        temp_dir = tempfile.mkdtemp(prefix='fuchsia-merger')
-        # Extract the tarball into the temporary directory.
-        # This is vastly more efficient than accessing files one by one via
-        # the tarfile API.
-        with tarfile.open(archive) as archive_file:
-            archive_file.extractall(temp_dir)
-        try:
-            yield temp_dir
-        finally:
-            shutil.rmtree(temp_dir, ignore_errors=True)
-    else:
-        raise Exception('Error: archive or directory must be set')
-
-
-@contextlib.contextmanager
-def _open_output(archive, directory):
-    '''Manages the output of this script.'''
-    if directory:
-        # Remove any existing output.
-        shutil.rmtree(directory, ignore_errors=True)
-        yield directory
-    elif archive:
-        temp_dir = tempfile.mkdtemp(prefix='fuchsia-merger')
-        try:
-            yield temp_dir
-            # Write the archive file.
-            with tarfile.open(archive, "w:gz") as archive_file:
-                archive_file.add(temp_dir, arcname='')
-        finally:
-            shutil.rmtree(temp_dir, ignore_errors=True)
-    else:
-        raise Exception('Error: archive or directory must be set')
-
-
-def _get_manifest(sdk_dir):
-    '''Returns the set of elements in the given SDK.'''
-    with open(os.path.join(sdk_dir, 'meta', 'manifest.json'), 'r') as manifest:
-        return json.load(manifest)
-
-
-def _get_meta(element, sdk_dir):
-    '''Returns the contents of the given element's manifest in a given SDK.'''
-    with open(os.path.join(sdk_dir, element), 'r') as meta:
-        return json.load(meta)
-
-
-def _get_files(element_meta):
-    '''Extracts the files associated with the given element.
-    Returns a 2-tuple containing:
-     - the set of arch-independent files;
-     - the sets of arch-dependent files, indexed by architecture.
-    '''
-    type = element_meta['type']
-    common_files = set()
-    arch_files = {}
-    if type == 'cc_prebuilt_library':
-        common_files.update(element_meta['headers'])
-        for arch, binaries in element_meta['binaries'].iteritems():
-            contents = set()
-            contents.add(binaries['link'])
-            if 'dist' in binaries:
-                contents.add(binaries['dist'])
-            if 'debug' in binaries:
-                contents.add(binaries['debug'])
-            arch_files[arch] = contents
-    elif type == 'cc_source_library':
-        common_files.update(element_meta['headers'])
-        common_files.update(element_meta['sources'])
-    elif type == 'dart_library':
-        common_files.update(element_meta['sources'])
-    elif type == 'fidl_library':
-        common_files.update(element_meta['sources'])
-    elif type == 'host_tool':
-        common_files.update(element_meta['files'])
-    elif type == 'image':
-        for arch, file in element_meta['file'].iteritems():
-            arch_files[arch] = set([file])
-    elif type == 'loadable_module':
-        common_files.update(element_meta['resources'])
-        arch_files.update(element_meta['binaries'])
-    elif type == 'sysroot':
-        for arch, version in element_meta['versions'].iteritems():
-            contents = set()
-            contents.update(version['headers'])
-            contents.update(version['link_libs'])
-            contents.update(version['dist_libs'])
-            contents.update(version['debug_libs'])
-            arch_files[arch] = contents
-    elif type == 'documentation':
-        common_files.update(element_meta['docs'])
-    else:
-        raise Exception('Unknown element type: ' + type)
-    return (common_files, arch_files)
-
-
-def _ensure_directory(path):
-    '''Ensures that the directory hierarchy of the given path exists.'''
-    target_dir = os.path.dirname(path)
-    try:
-        os.makedirs(target_dir)
-    except OSError as exception:
-        if exception.errno == errno.EEXIST and os.path.isdir(target_dir):
-            pass
-        else:
-            raise
-
-
-def _copy_file(file, source_dir, dest_dir):
-    '''Copies a file to a given path, taking care of creating directories if
-    needed.
-    '''
-    source = os.path.join(source_dir, file)
-    destination = os.path.join(dest_dir, file)
-    _ensure_directory(destination)
-    shutil.copy2(source, destination)
-
-
-def _copy_files(files, source_dir, dest_dir):
-    '''Copies a set of files to a given directory.'''
-    for file in files:
-        _copy_file(file, source_dir, dest_dir)
-
-
-def _copy_identical_files(set_one, source_dir_one, set_two, source_dir_two,
-                          dest_dir):
-    '''Verifies that two sets of files are absolutely identical and then copies
-    them to the output directory.
-    '''
-    if set_one != set_two:
-        return False
-    # Not verifying that the contents of the files are the same, as builds are
-    # not exactly stable at the moment.
-    _copy_files(set_one, source_dir_one, dest_dir)
-    return True
-
-
-def _copy_element(element, source_dir, dest_dir):
-    '''Copy an entire SDK element to a given directory.'''
-    meta = _get_meta(element, source_dir)
-    common_files, arch_files = _get_files(meta)
-    files = common_files
-    for more_files in arch_files.itervalues():
-        files.update(more_files)
-    _copy_files(files, source_dir, dest_dir)
-    # Copy the metadata file as well.
-    _copy_file(element, source_dir, dest_dir)
-
-
-def _write_meta(element, source_dir_one, source_dir_two, dest_dir):
-    '''Writes a meta file for the given element, resulting from the merge of the
-    meta files for that element in the two given SDK directories.
-    '''
-    meta_one = _get_meta(element, source_dir_one)
-    meta_two = _get_meta(element, source_dir_two)
-    # TODO(DX-495): verify that the common parts of the metadata files are in
-    # fact identical.
-    type = meta_one['type']
-    meta = {}
-    if type == 'cc_prebuilt_library' or type == 'loadable_module':
-        meta = meta_one
-        meta['binaries'].update(meta_two['binaries'])
-    elif type == 'image':
-        meta = meta_one
-        meta['file'].update(meta_two['file'])
-    elif type == 'sysroot':
-        meta = meta_one
-        meta['versions'].update(meta_two['versions'])
-    elif (type == 'cc_source_library' or type == 'dart_library' or
-          type == 'fidl_library' or type == 'host_tool' or
-          type == 'documentation'):
-        # These elements are arch-independent, the metadata does not need any
-        # update.
-        meta = meta_one
-    else:
-        raise Exception('Unknown element type: ' + type)
-    meta_path = os.path.join(dest_dir, element)
-    _ensure_directory(meta_path)
-    with open(meta_path, 'w') as meta_file:
-        json.dump(meta, meta_file, indent=2, sort_keys=True)
-    return True
-
-
-def _write_manifest(source_dir_one, source_dir_two, dest_dir):
-    '''Writes a manifest file resulting from the merge of the manifest files for
-    the two given SDK directories.
-    '''
-    manifest_one = _get_manifest(source_dir_one)
-    manifest_two = _get_manifest(source_dir_two)
-
-    # Host architecture.
-    if manifest_one['arch']['host'] != manifest_two['arch']['host']:
-        print('Error: mismatching host architecture')
-        return False
-    manifest = {
-        'arch': {
-            'host': manifest_one['arch']['host'],
-        }
-    }
-
-    # Target architectures.
-    manifest['arch']['target'] = sorted(set(manifest_one['arch']['target']) |
-                                        set(manifest_two['arch']['target']))
-
-    # Parts.
-    manifest['parts'] = sorted(set(manifest_one['parts']) |
-                               set(manifest_two['parts']))
-
-    manifest_path = os.path.join(dest_dir, 'meta', 'manifest.json')
-    _ensure_directory(manifest_path)
-    with open(manifest_path, 'w') as manifest_file:
-        json.dump(manifest, manifest_file, indent=2, sort_keys=True)
-    return True
-
-
-def main():
-    parser = argparse.ArgumentParser(
-            description=('Merges the contents of two SDKs'))
-    alpha_group = parser.add_mutually_exclusive_group(required=True)
-    alpha_group.add_argument('--alpha-archive',
-                             help='Path to the first SDK - as an archive',
-                             default='')
-    alpha_group.add_argument('--alpha-directory',
-                             help='Path to the first SDK - as a directory',
-                             default='')
-    beta_group = parser.add_mutually_exclusive_group(required=True)
-    beta_group.add_argument('--beta-archive',
-                            help='Path to the second SDK - as an archive',
-                            default='')
-    beta_group.add_argument('--beta-directory',
-                            help='Path to the second SDK - as a directory',
-                            default='')
-    output_group = parser.add_mutually_exclusive_group(required=True)
-    output_group.add_argument('--output-archive',
-                              help='Path to the merged SDK - as an archive',
-                              default='')
-    output_group.add_argument('--output-directory',
-                              help='Path to the merged SDK - as a directory',
-                              default='')
-    args = parser.parse_args()
-
-    has_errors = False
-
-    with _open_archive(args.alpha_archive, args.alpha_directory) as alpha_dir, \
-         _open_archive(args.beta_archive, args.beta_directory) as beta_dir, \
-         _open_output(args.output_archive, args.output_directory) as out_dir:
-
-        alpha_elements = set(_get_manifest(alpha_dir)['parts'])
-        beta_elements = set(_get_manifest(beta_dir)['parts'])
-        common_elements = alpha_elements & beta_elements
-
-        # Copy elements that appear in a single SDK.
-        for element in sorted(alpha_elements - common_elements):
-            _copy_element(element, alpha_dir, out_dir)
-        for element in (beta_elements - common_elements):
-            _copy_element(element, beta_dir, out_dir)
-
-        # Verify and merge elements which are common to both SDKs.
-        for element in sorted(common_elements):
-            alpha_meta = _get_meta(element, alpha_dir)
-            beta_meta = _get_meta(element, beta_dir)
-            alpha_common, alpha_arch = _get_files(alpha_meta)
-            beta_common, beta_arch = _get_files(beta_meta)
-
-            # Common files should not vary.
-            if not _copy_identical_files(alpha_common, alpha_dir, beta_common,
-                                         beta_dir, out_dir):
-                print('Error: different common files for ' + element)
-                has_errors = True
-                continue
-
-            # Arch-dependent files need to be merged in the metadata.
-            all_arches = set(alpha_arch.keys()) | set(beta_arch.keys())
-            for arch in all_arches:
-                if arch in alpha_arch and arch in beta_arch:
-                    if not _copy_identical_files(alpha_arch[arch], alpha_dir,
-                                                 beta_arch[arch], beta_dir,
-                                                 out_dir):
-                        print('Error: different %s files for %s' % (arch,
-                                                                   element))
-                        has_errors = True
-                        continue
-                elif arch in alpha_arch:
-                    _copy_files(alpha_arch[arch], alpha_dir, out_dir)
-                elif arch in beta_arch:
-                    _copy_files(beta_arch[arch], beta_dir, out_dir)
-
-            if not _write_meta(element, alpha_dir, beta_dir, out_dir):
-                print('Error: unable to merge meta for ' + element)
-                has_errors = True
-
-        if not _write_manifest(alpha_dir, beta_dir, out_dir):
-            print('Error: could not write manifest file')
-            has_errors = True
-
-        # TODO(DX-495): verify that metadata files are valid.
-
-    return 1 if has_errors else 0
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/sdk/tools/visualize_manifest.py b/sdk/tools/visualize_manifest.py
deleted file mode 100755
index 4f1029c..0000000
--- a/sdk/tools/visualize_manifest.py
+++ /dev/null
@@ -1,65 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import json
-import os
-import sys
-
-
-def sanitize_name(name):
-    '''Makes a given string usable in a label.'''
-    return name.replace('-', '_')
-
-
-def get_atom_id(id):
-    '''Returns a string representing a sanitized version of the given id.'''
-    return '%s__%s' % (sanitize_name(id['domain']), sanitize_name(id['name']))
-
-
-def main():
-    parser = argparse.ArgumentParser(
-            description=('Generates a DOT file representing the contents of an '
-                         'SDK manifest'))
-    parser.add_argument('--manifest',
-                        help='Path to the SDK manifest',
-                        required=True)
-    parser.add_argument('--output',
-                        help=('Path to the DOT file to produce, '
-                               'defaults to <manifest_name>.dot'))
-    args = parser.parse_args()
-
-    with open(args.manifest, 'r') as manifest_file:
-        manifest = json.load(manifest_file)
-
-    all_atoms = manifest['atoms']
-    domains = set([a['id']['domain'] for a in all_atoms])
-
-    if args.output is not None:
-        output = args.output
-    else:
-        output = '%s.dot' % os.path.basename(args.manifest).split('.', 1)[0]
-
-    with open(output, 'w') as out:
-        out.write('digraph fuchsia {\n')
-        for index, domain in enumerate(domains):
-            out.write('subgraph cluster_%s {\n' % index)
-            out.write('label="%s";\n' % domain)
-            atoms = [a for a in all_atoms if a['id']['domain'] == domain]
-            for atom in atoms:
-                out.write('%s [label="%s"];\n' % (get_atom_id(atom['id']),
-                                                  atom['id']['name']))
-            out.write('}\n')
-        for atom in all_atoms:
-            if not atom['deps']:
-                continue
-            id = get_atom_id(atom['id'])
-            dep_ids = [get_atom_id(d) for d in atom['deps']]
-            out.write('%s -> { %s }\n' % (id, ' '.join(dep_ids)));
-        out.write('}\n')
-
-
-if __name__ == '__main__':
-    sys.exit(main())
diff --git a/start-dhcp-server.sh b/start-dhcp-server.sh
deleted file mode 100755
index 90447cd..0000000
--- a/start-dhcp-server.sh
+++ /dev/null
@@ -1,172 +0,0 @@
-#!/bin/bash
-# Copyright 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# This script takes a network interface (eg: tap0) as its only argument. It sets
-# up that interface for running a Fuchsia device. It runs dnsmasq to provide
-# DHCP and DNS to the Fuchsia device. It configures NAT. It will do its best to
-# kill old instances of dnsmasq from previous runs of the script for this
-# interface.
-#
-# It can be passed with -u as a start-up script to run-zircon-* or frun to
-# bring up a network for a new qemu instance
-#
-# If the environment variable FUCHSIA_IP is set it will give that IP to the
-# Fuchsia device, otherwise, for historical reasons it will allocate
-# 192.168.3.53.
-
-set -eo pipefail; [[ "$TRACE" ]] && set -x
-
-INTERFACE=$1
-LEASE_FILE=/tmp/fuchsia-dhcp-$INTERFACE.leases
-PID_FILE=/tmp/fuchsia-dhcp-$INTERFACE.pid
-LOG_FILE=/tmp/fuchsia-dhcp-$INTERFACE.log
-
-if [[ -z "$INTERFACE" ]]
-then
-  echo "Missing interface name."
-  exit 1
-fi
-
-FUCHSIA_IP=${FUCHSIA_IP:-192.168.3.53} # is this a good default?
-if [[ ! $FUCHSIA_IP =~ (^[0-9]+\.[0-9]+\.[0-9]+)\.[0-9]+$ ]]
-then
-  echo "Invalid FUCHSIA_IP '$FUCHSIA_IP'. Must be a valid IPv4 address."
-  exit 1
-fi
-
-SUBNET_PREFIX=${BASH_REMATCH[1]}
-FUCHSIA_NETWORK=${SUBNET_PREFIX}.0/24
-HOST_IP=${SUBNET_PREFIX}.1
-
-DARWIN=false
-if [[ $(uname -s) == Darwin ]]
-then
-  DARWIN=true
-fi
-
-# Find the dnsmasq binary.
-DNSMASQ=$(which dnsmasq) || DNSMASQ=$(brew --prefix)/sbin/dnsmasq
-if [[ ! -x "$DNSMASQ" ]]
-then
-  echo "dnsmasq not found."
-  if $DARWIN
-  then
-    echo " brew install dnsmasq"
-  else
-    echo " apt-get install dnsmasq"
-  fi
-  exit 1
-fi
-
-if [[ $DARWIN == false && $(pidof NetworkManager) ]]
-then
-  nmstat=$(nmcli d status | awk "/$INTERFACE / { print \$3 }")
-  if [[ -n $nmstat && $nmstat != unmanaged ]]; then
-    echo "$INTERFACE is managed by NetworkManager so can't be configured by this script."
-    echo ""
-    echo "If you DON'T want this, create a file /etc/network/interfaces.d/$INTERFACE.conf containing:"
-    echo "iface $INTERFACE inet manual"
-    echo ""
-    echo "Then restart Network manager with: sudo killall NetworkManager"
-    exit 1
-  fi
-fi
-
-# Check if dnsmasq is running.
-if [[ -r $PID_FILE ]]
-then
-  # Read the PID file.
-  DNSMASQ_PID=$(<$PID_FILE)
-
-  # Check that the PID file actually refers to a dnsmasq process.
-  if $DARWIN
-  then
-    DNSMASQ_PID_NAME=$( (ps -A -o comm $DNSMASQ_PID || true) | tail +2)
-    if [[ "$DNSMASQ_PID_NAME" != "$DNSMASQ" ]]
-    then
-      # There's a PID file but the process name isn't right.
-      unset DNSMASQ_PID
-    fi
-  else
-    if ! [[ /proc/$DNSMASQ_PID/exe -ef $DNSMASQ ]]
-    then
-      # There's a PID file but the process name isn't right.
-      unset DNSMASQ_PID
-    fi
-  fi
-
-  if [[ -n "$DNSMASQ_PID" ]]
-  then
-    echo "Killing the old dnsmasq (pid: $DNSMASQ_PID)..."
-    sudo kill $DNSMASQ_PID || true
-    sudo rm -f $PID_FILE
-  fi
-fi
-
-if [[ -f "$LEASE_FILE" ]]
-then
-  echo "Removing the old dnsmasq lease file $LEASE_FILE ..."
-  sudo rm $LEASE_FILE
-fi
-
-# Bring up the network.
-echo "Bringing up the network interface: $INTERFACE"
-sudo ifconfig $INTERFACE inet $HOST_IP
-
-if $DARWIN
-then
-  LOOPBACK=lo0
-else
-  LOOPBACK=lo
-fi
-
-echo Starting dnsmasq...
-# TODO: can we use --dhcp-host instead of --dhcp-range
-sudo $DNSMASQ \
-  --conf-file=/dev/null \
-  --bind-interfaces \
-  --interface=$INTERFACE \
-  --except-interface=$LOOPBACK \
-  --dhcp-range=$INTERFACE,$FUCHSIA_IP,$FUCHSIA_IP,24h \
-  --dhcp-leasefile=$LEASE_FILE \
-  --pid-file=$PID_FILE \
-  --log-facility=$LOG_FILE \
-  --listen-address=$HOST_IP
-
-if $DARWIN
-then
-  # OSX will not bring up ipv6 until an ipv6 address is assigned, but as soon
-  # as an address is assigned, it will also assign a link-local address. Here
-  # we assign the same address as used by the zircon ifup script, and let OSX
-  # assign the link-local address. Previously we computed and assigned a
-  # link-local address, but this resulted in duplicate addresses assigned to
-  # the interface, and TAP just duplicated that traffic to applications.
-  # This is configured after dnsmasq is started, as dnsmasq has no need to
-  # listen on ipv6, and fails to bind fc00.
-  sudo ifconfig $INTERFACE inet6 fc00::/7 up
-
-  DEFAULT_INTERFACE=$(route -n get default | awk '/interface:/ { print $2 }') || true
-else
-  DEFAULT_INTERFACE=$(ip route get 8.8.8.8 | awk '/^8.8.8.8/ { print $5 }')
-fi
-if [[ -z "$DEFAULT_INTERFACE" ]]
-then
-  echo "No default route, not enabling forwarding."
-else
-  echo "Enable IP forwarding..."
-  if $DARWIN
-  then
-    sudo sysctl -q net.inet.ip.forwarding=1
-    echo "
-    nat on $DEFAULT_INTERFACE from $FUCHSIA_NETWORK to any -> ($DEFAULT_INTERFACE)
-    pass out on $DEFAULT_INTERFACE inet from $FUCHSIA_NETWORK to any
-    " | sudo pfctl -q -ef - >& /dev/null || true
-  else
-    sudo /bin/bash -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
-    sudo iptables -t nat -A POSTROUTING -o $DEFAULT_INTERFACE -j MASQUERADE
-    sudo iptables -A FORWARD -i $DEFAULT_INTERFACE -o $INTERFACE -m state --state RELATED,ESTABLISHED -j ACCEPT
-    sudo iptables -A FORWARD -i $INTERFACE -o $DEFAULT_INTERFACE -j ACCEPT
-  fi
-fi
diff --git a/style/check-header-guards.py b/style/check-header-guards.py
deleted file mode 100755
index 5615922..0000000
--- a/style/check-header-guards.py
+++ /dev/null
@@ -1,249 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-"""Script to check C and C++ file header guards.
-
-This script accepts a list of file or directory arguments. If a given
-path is a file, it runs the checker on it. If the path is a directory,
-it runs the checker on all files in that directory.
-
-In addition, this script checks for potential header guard
-collisions. This is useful since we munge / to _, and so
-    lib/abc/xyz/xyz.h
-and
-    lib/abc_xyz/xyz.h
-both want to use LIB_ABC_XYZ_XYZ_H_ as a header guard.
-
-"""
-
-
-import argparse
-import collections
-import fileinput
-import os.path
-import re
-import string
-import sys
-
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # style
-    os.path.realpath(
-    os.path.abspath(__file__)))))
-
-PUBLIC_PREFIXES = [
-    'ZIRCON_SYSTEM_ULIB_.*_INCLUDE',
-    'ZIRCON_SYSTEM_PUBLIC',
-    'GARNET_PUBLIC',
-    'PERIDOT_PUBLIC',
-    'TOPAZ_PUBLIC',
-]
-public_prefix = re.compile('^(' + '|'.join(PUBLIC_PREFIXES) + ')_')
-
-all_header_guards = collections.defaultdict(list)
-
-pragma_once = re.compile('^#pragma once$')
-disallowed_header_characters = re.compile('[^a-zA-Z0-9_]')
-
-def adjust_for_layer(header_guard):
-    """Remove internal layer prefix from public headers if applicable."""
-    return public_prefix.sub('', header_guard, 1)
-
-def check_file(path, fix_guards=False):
-    """Check whether the file has a correct header guard.
-
-    A header guard can either be a #pragma once, or else a matching set of
-        #ifndef PATH_TO_FILE_
-        #define PATH_TO_FILE_
-        ...
-        #endif  // PATH_TO_FILE_
-    preprocessor directives, where both '.' and '/' in the path are
-    mapped to '_', and a trailing '_' is appended.
-
-    In either the #pragma once case or the header guard case, it is
-    assumed that there is no trailing or leading whitespace.
-
-    """
-
-    # Only check .h files
-    if path[-2:] != '.h':
-        return True
-
-    assert(path.startswith(FUCHSIA_ROOT))
-    relative_path = path[len(FUCHSIA_ROOT):].strip('/')
-    upper_path = relative_path.upper()
-    header_guard = re.sub(disallowed_header_characters, '_', upper_path) + '_'
-    header_guard = adjust_for_layer(header_guard)
-    all_header_guards[header_guard].append(path)
-
-    ifndef = re.compile('^#ifndef %s$' % header_guard)
-    define = re.compile('^#define %s$' % header_guard)
-    endif = re.compile('^#endif +// *%s$' % header_guard)
-
-    found_pragma_once = False
-    found_ifndef = False
-    found_define = False
-    found_endif = False
-
-    with open(path, 'r') as f:
-        for line in f.readlines():
-            match = pragma_once.match(line)
-            if match:
-                if found_pragma_once:
-                    print('%s contains multiple #pragma once' % path)
-                    return False
-                found_pragma_once = True
-
-            match = ifndef.match(line)
-            if match:
-                if found_ifndef:
-                    print('%s contains multiple ifndef header guards' % path)
-                    return False
-                found_ifndef = True
-
-            match = define.match(line)
-            if match:
-                if found_define:
-                    print('%s contains multiple define header guards' % path)
-                    return False
-                found_define = True
-
-            match = endif.match(line)
-            if match:
-                if found_endif:
-                    print('%s contains multiple endif header guards' % path)
-                    return False
-                found_endif = True
-
-    if found_pragma_once:
-        if found_ifndef or found_define or found_endif:
-            print('%s contains both #pragma once and header guards' % path)
-            return False
-        if not fix_guards:
-            return True
-
-    if found_ifndef and found_define and found_endif:
-        return True
-
-    if not found_ifndef:
-        print('%s did not contain ifndef part of its header guard' % path)
-    elif not found_define:
-        print('%s did not contain define part of its header guard' % path)
-    elif not found_endif:
-        print('%s did not contain endif part of its header guard' % path)
-    elif fix_guards:
-        if found_pragma_once:
-            print('%s contained #pragma once instead of a header guard' % path)
-        else:
-            print('%s did not contain a header guard or the header guard did '
-                  'not match the file path' % path)
-    else:
-        print('%s contained neither a proper header guard nor #pragma once' %
-              path)
-
-    header_guards_fixed = False
-    if fix_guards:
-        header_guards_fixed = fix_header_guard(path, header_guard)
-
-    if not header_guards_fixed:
-        print('Allowable header guard values are %s' % list(all_header_guards.keys()))
-
-    return False
-
-
-def fix_header_guard(path, header_guard):
-    """Attempt to fix the header guard in the given file."""
-    ifndef = re.compile(r'^#ifndef [^\s]+_H_$')
-    define = re.compile(r'^#define [^\s]+_H_$')
-    endif = re.compile(r'^#endif +// *[^\s]+_H_$')
-    fixed_ifndef = False
-    fixed_define = False
-    fixed_endif = False
-    fixed_pragma_once = False
-
-    for line in fileinput.input(path, inplace=1):
-        (new_line, changes) = re.subn(ifndef,
-                                      '#ifndef %s' % header_guard,
-                                      line)
-        if changes:
-            fixed_ifndef = True
-            sys.stdout.write(new_line)
-            continue
-        (new_line, changes) = re.subn(define,
-                                      '#define %s' % header_guard,
-                                      line)
-        if changes:
-            fixed_define = True
-            sys.stdout.write(new_line)
-            continue
-        (new_line, changes) = re.subn(endif,
-                                      '#endif  // %s' % header_guard,
-                                      line)
-        if changes:
-            fixed_endif = True
-            sys.stdout.write(new_line)
-            continue
-        if pragma_once.match(line):
-            fixed_pragma_once = True
-            sys.stdout.write('#ifndef %s\n' % header_guard)
-            sys.stdout.write('#define %s\n' % header_guard)
-            continue
-        sys.stdout.write(line)
-
-    if fixed_pragma_once:
-        with open(path, 'a') as file:
-            file.write('\n')
-            file.write('#endif  // %s\n' % header_guard)
-
-    if (fixed_ifndef and fixed_define and fixed_endif) or fixed_pragma_once:
-        print('Fixed!')
-        return True
-
-    print('Not fixed...')
-    return False
-
-
-def check_dir(p, fix_guards=False):
-    """Walk recursively over a directory checking .h files"""
-
-    def prune(d):
-        if d[0] == '.' or d == 'third_party':
-            return True
-        return False
-
-    for root, dirs, paths in os.walk(p):
-        # Prune dot directories like .git
-        dirs[:] = [d for d in dirs if not prune(d)]
-        for path in paths:
-            check_file(os.path.join(root, path), fix_guards=fix_guards)
-
-
-def check_collisions():
-    for header_guard, paths in all_header_guards.items():
-        if len(paths) == 1:
-            continue
-        print('Multiple files could use %s as a header guard:' % header_guard)
-        for path in paths:
-            print('    %s' % path)
-
-
-def main():
-    parser = argparse.ArgumentParser()
-    parser.add_argument('--fix',
-                        help='Correct wrong header guards',
-                        action='store_true')
-    (arg_results, other_args) = parser.parse_known_args()
-    fix_guards = arg_results.fix
-    for p in other_args:
-        p = os.path.realpath(os.path.abspath(p))
-        if os.path.isdir(p):
-            check_dir(p, fix_guards=fix_guards)
-        else:
-            check_file(p, fix_guards=fix_guards)
-    check_collisions()
-
-
-if __name__ == "__main__":
-    sys.exit(main())
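The path-to-guard munging that check-header-guards.py performs, and the collision its docstring warns about, can be demonstrated in isolation. A minimal sketch (the helper name `path_to_guard` is illustrative, not part of the script):

```python
import re

def path_to_guard(relative_path):
    # Mirror the checker: uppercase the path, map every character outside
    # [A-Za-z0-9_] (such as '/' and '.') to '_', then append a trailing '_'.
    upper = relative_path.upper()
    return re.sub(r'[^A-Z0-9_]', '_', upper) + '_'

print(path_to_guard('lib/abc/xyz/xyz.h'))  # LIB_ABC_XYZ_XYZ_H_
print(path_to_guard('lib/abc_xyz/xyz.h'))  # LIB_ABC_XYZ_XYZ_H_ (collision)
```

Because both `/` and `_` collapse to the same character, the two distinct paths produce an identical guard, which is exactly what `check_collisions()` reports.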
diff --git a/style/json-fmt.py b/style/json-fmt.py
deleted file mode 100755
index 8b39985..0000000
--- a/style/json-fmt.py
+++ /dev/null
@@ -1,40 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-"""Script to format JSON files.
-
-This script accepts a list of files as arguments, and for each of them attempts
-to parse it as JSON, and update it in-place with a pretty-printed version. Stops
-on the first error.
-"""
-
-import argparse
-import json
-import sys
-
-
-def main():
-    parser = argparse.ArgumentParser()
-    parser.add_argument(
-        'file',
-        type=argparse.FileType('r+'),
-        nargs='+',
-        help='JSON file to be pretty-printed.')
-    args = parser.parse_args()
-    for json_file in args.file:
-        with json_file:
-            data = json.load(json_file)
-            json_file.seek(0)
-            json_file.truncate()
-            json.dump(
-                data,
-                json_file,
-                indent=4,
-                sort_keys=True,
-                separators=(',', ': '))
-            json_file.write('\n')
-
-
-if __name__ == "__main__":
-    main()
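The exact formatting json-fmt.py applies (4-space indent, sorted keys, `(',', ': ')` separators, trailing newline) can be reproduced on an in-memory buffer rather than a file; a small sketch:

```python
import io
import json

# Same dump settings as json-fmt.py, applied to a StringIO instead of a file.
data = json.loads('{"b": 1, "a": [2, 3]}')
buf = io.StringIO()
json.dump(data, buf, indent=4, sort_keys=True, separators=(',', ': '))
buf.write('\n')
print(buf.getvalue(), end='')
```

This prints the keys in sorted order (`"a"` before `"b"`) with each nesting level indented by four spaces.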
diff --git a/style/verify-fidl-libraries.py b/style/verify-fidl-libraries.py
deleted file mode 100755
index 88d263b..0000000
--- a/style/verify-fidl-libraries.py
+++ /dev/null
@@ -1,70 +0,0 @@
-#!/usr/bin/env python
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-import argparse
-import os
-import re
-import subprocess
-import sys
-
-
-FUCHSIA_ROOT = os.path.dirname(  # $root
-    os.path.dirname(             # scripts
-    os.path.dirname(             # style
-    os.path.abspath(__file__))))
-
-NAMESPACES = [
-    'fidl',
-    'fuchsia',
-    'test',
-]
-
-
-def main():
-    parser = argparse.ArgumentParser(
-            description=('Checks that FIDL libraries in a given layer are '
-                         'properly namespaced'))
-    layer_group = parser.add_mutually_exclusive_group(required=True)
-    layer_group.add_argument('--layer',
-                             help='Name of the layer to analyze',
-                             choices=['zircon', 'garnet', 'peridot', 'topaz'])
-    layer_group.add_argument('--vendor-layer',
-                             help='Name of the vendor layer to analyze')
-    parser.add_argument('--namespaces',
-                        help='The list of allowed namespaces, defaults to '
-                             '[%s]' % ', '.join(NAMESPACES),
-                        nargs='*',
-                        default=NAMESPACES)
-    args = parser.parse_args()
-
-    if args.layer:
-        base = os.path.join(FUCHSIA_ROOT, args.layer)
-    else:
-        base = os.path.join(FUCHSIA_ROOT, 'vendor', args.vendor_layer)
-
-    files = subprocess.check_output(['git', '-C', base, 'ls-files', '*.fidl'])
-
-    has_errors = False
-    for file in files.splitlines():
-        with open(os.path.join(base, file), 'r') as fidl:
-            contents = fidl.read()
-            result = re.search(r'^library ([^\.;]+)[^;]*;$', contents,
-                               re.MULTILINE)
-            if not result:
-                print('Missing library declaration (%s)' % file)
-                has_errors = True
-                continue
-            namespace = result.group(1)
-            if namespace not in args.namespaces:
-                print(
-                    'Invalid namespace %s (%s), namespace must begin with one of [%s].'
-                    % (namespace, file, ', '.join(args.namespaces)))
-                has_errors = True
-
-    return 1 if has_errors else 0
-
-
-if __name__ == '__main__':
-    sys.exit(main())
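The library-declaration pattern used by verify-fidl-libraries.py captures only the first dot-separated component of the library name, which is what gets checked against the allowed namespaces. A standalone illustration of that regex:

```python
import re

# The same declaration pattern the checker uses: group(1) is the first
# dot-separated component of the library name.
LIBRARY_RE = re.compile(r'^library ([^\.;]+)[^;]*;$', re.MULTILINE)

good = 'library fuchsia.media;\n'
bad = 'library mylib;\n'

print(LIBRARY_RE.search(good).group(1))  # fuchsia
print(LIBRARY_RE.search(bad).group(1))   # mylib (not an allowed namespace)
```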
diff --git a/tests/common_term_styles-test-visually b/tests/common_term_styles-test-visually
deleted file mode 100755
index 32d8f37..0000000
--- a/tests/common_term_styles-test-visually
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Visual tests for //scripts/devshell/lib/common_term_styles.sh
-
-# This is not an automated unit test.
-# It prints stylized text to demonstrate the styles on a terminal.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")"/../devshell/lib >/dev/null 2>&1 && pwd)"/style.sh || exit $?
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")"/../devshell/lib >/dev/null 2>&1 && pwd)"/common_term_styles.sh || exit $?
-
-runtest() {
-  command="$1"; shift
-  echo "${command}" "$@"
-  ${command} "$@"
-}
-
-runtest info 'This is informational'
-runtest warn 'This is your last warning'
-runtest error 'Danger! Danger Will Robinson!'
-runtest details <<EOF
-This detail will be
-indented from the error.
-and could have a link like $(link 'https://some/url/here')
-EOF
-runtest code <<EOF
-for ( line in lines_of_code ) {
-  print "this is the demon of code style"
-end
-EOF
diff --git a/tests/common_term_styles-tests b/tests/common_term_styles-tests
deleted file mode 100755
index b51550d..0000000
--- a/tests/common_term_styles-tests
+++ /dev/null
@@ -1,40 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Automated tests for //scripts/devshell/lib/common_term_styles.sh
-#
-# Usage: common_term_styles-tests
-#
-#   Returns: Error status if actual output does not match expected.
-
-TEST_NAME="$(basename "${BASH_SOURCE[0]}")"
-TESTS_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-
-verbose() {
-  echo
-  echo "======================================================="
-  echo
-  echo "$@"
-  echo
-  "$@"
-  echo
-}
-
-test_main() {
-  local expected_out="${TESTS_DIR}/expected/${TEST_NAME}.out"
-  local expected_err="${TESTS_DIR}/expected/${TEST_NAME}.err"
-  local capture_dir=$(mktemp -d)
-  local actual_out="${capture_dir}/${TEST_NAME}.out"
-  local actual_err="${capture_dir}/${TEST_NAME}.err"
-  ${TESTS_DIR}/common_term_styles-test-visually 1> "${actual_out}" 2> "${actual_err}"
-
-  local status=0
-  verbose diff "${expected_out}" "${actual_out}" || status=$?
-  verbose diff "${expected_err}" "${actual_err}" || status=$?
-
-  return $status
-}
-
-test_main "$@" || exit $?
diff --git a/tests/expected/common_term_styles-tests.err b/tests/expected/common_term_styles-tests.err
deleted file mode 100644
index e69de29..0000000
--- a/tests/expected/common_term_styles-tests.err
+++ /dev/null
diff --git a/tests/expected/common_term_styles-tests.out b/tests/expected/common_term_styles-tests.out
deleted file mode 100644
index 4cd26e1..0000000
--- a/tests/expected/common_term_styles-tests.out
+++ /dev/null
@@ -1,20 +0,0 @@
-info This is informational
-
-INFO: This is informational
-
-warn This is your last warning
-
-WARNING: This is your last warning
-
-error Danger! Danger Will Robinson!
-
-ERROR: Danger! Danger Will Robinson!
-
-details
-  This detail will be
-  indented from the error.
-  and could have a link like https://some/url/here
-code
-    for ( line in lines_of_code ) {
-      print "this is the demon of code style"
-    end
diff --git a/tests/expected/style-tests.err b/tests/expected/style-tests.err
deleted file mode 100644
index c7e4d76..0000000
--- a/tests/expected/style-tests.err
+++ /dev/null
@@ -1,104 +0,0 @@
-
-Usage: style::stylize <command> [style options] [command parameters]
-
-<command> is any command with output to stylize, followed by style options,
-and then the command's normal parameters.
-
-style options include:
-  --bold, --faint, --underline, etc.
-  --color <color_name>
-  --background <color_name>
-  --indent <spaces_count>
-  --stderr (output to standard error instead of standard out)
-
-  echo "This is $(style::echo -f --bold LOUD) and soft."
-
-command parameters are those supported by the stylized command.
-
-Use style::stylize --help colors for a list of colors or backgrounds
-Use style::stylize --help attributes for a list of style attribute flags
-
-Usage: style::printf [style options] [command parameters]
-
-style options include:
-  --bold, --faint, --underline, etc.
-  --color <color_name>
-  --background <color_name>
-  --indent <spaces_count>
-  --stderr (output to standard error instead of standard out)
-
-  echo "This is $(style::echo -f --bold LOUD) and soft."
-
-command parameters are those supported by the 'printf' command.
-
-Use style::printf --help colors for a list of colors or backgrounds
-Use style::printf --help attributes for a list of style attribute flags
-
-Usage: style::error [style options] [command parameters]
-
-Default style options for style::error:
-  "--stderr --bold --color red"
-
-style options include:
-  --bold, --faint, --underline, etc.
-  --color <color_name>
-  --background <color_name>
-  --indent <spaces_count>
-  --stderr (output to standard error instead of standard out)
-
-  echo "This is $(style::echo -f --bold LOUD) and soft."
-
-command parameters are those supported by the echo command.
-
-Use style::error --help colors for a list of colors or backgrounds
-Use style::error --help attributes for a list of style attribute flags
-
-Usage: style::link [style options] [command parameters]
-
-Default style options for style::link:
-  "--underline --color dark_blue"
-
-style options include:
-  --bold, --faint, --underline, etc.
-  --color <color_name>
-  --background <color_name>
-  --indent <spaces_count>
-  --stderr (output to standard error instead of standard out)
-
-  echo "This is $(style::echo -f --bold LOUD) and soft."
-
-command parameters are those supported by the echo command.
-
-Use style::link --help colors for a list of colors or backgrounds
-Use style::link --help attributes for a list of style attribute flags
-black
-blue
-cyan
-dark_blue
-dark_cyan
-dark_green
-dark_magenta
-dark_red
-dark_yellow
-default
-gray
-green
-light_gray
-magenta
-pink
-purple
-red
-white
-yellow
---blink
---bold
---faint
---italic
---reset
---underline
-INFO: Info here
-WARNING: Watch out!
-ERROR: What went wrong now?
-WARNING: Customized warning style, still to stderr! :-)
-
-This should still display in bold red, but on stderr
diff --git a/tests/expected/style-tests.out b/tests/expected/style-tests.out
deleted file mode 100644
index 2652af8..0000000
--- a/tests/expected/style-tests.out
+++ /dev/null
@@ -1,42 +0,0 @@
-------------------
-------------------
-------------------
------ colors -----
---- attributes ---
-------------------
-style::echo --bold
-style::echo --bold --color cyan
-style::echo --faint --color green
-style::echo --italic --color magenta
-style::echo --underline --color dark_blue
-style::echo --blink --color light_gray
-style::echo --pink --background dark_cyan
-italic this style may not work in some terminals: style::echo --italic --dark_magenta italic this style may not work in some terminals:
-      Item   Cost
-      ----   ----
-     beans $  2.90
-    franks $  9.35
-      cola $  7.99
-  tiramasu $ 24.50
-Now is the time for all good
-people to come to the
-aid of their country and world.
-    Now is the time for all good
-    people to come to the
-    aid of their country and world.
-http://wikipedia.com
-STYLE_TO_TTY_ONLY=true
-
-This will not be styled. It doesn't print directly to the tty
-
-This will not be styled. stderr doesn't print directly to the tty
-STYLE_TO_TTY_ONLY=false
-
-This will be styled even though it doesn't print directly to the tty.
-
-This will be styled even though stderr doesn't print directly to the tty
-This is -f --bold --yellow LOUD and soft.
-This is --force --bold --yellow LOUD and soft.
-This is --tty --bold --yellow LOUD and soft.
-Bad style, Error status: 2
-No orange! Error status: 2
diff --git a/tests/pave-prebuilt-tests b/tests/pave-prebuilt-tests
deleted file mode 100755
index bbdf7d6..0000000
--- a/tests/pave-prebuilt-tests
+++ /dev/null
@@ -1,180 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Unit tests for //scripts/pave-prebuilt.
-#
-# Usage: pave-prebuilt-tests
-
-set -o errexit
-set -o nounset
-set -o pipefail
-
-# Read in the tool under test. Set TESTING=1 to avoid invoking main().
-readonly PARENT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-TESTING=1 source "${PARENT_DIR}/../pave-prebuilt"
-
-# expect_archive_produces_id <archive-path> <expected-id>
-function expect_archive_produces_id {
-  local archive="$1"
-  local expected_id="$2"
-
-  # archive_to_unique_id expects the file to live in the filesystem,
-  # so create an empty file at a related path under a temp directory.
-  local tmpdir
-  tmpdir="$(mktemp -d)"
-  local full_archive="${tmpdir}/${archive}"
-  mkdir -p "$(dirname "${full_archive}")"
-  touch "${full_archive}"
-
-  EXPECT_EQ \
-    "${expected_id}" \
-    "$(archive_to_unique_id "${full_archive}")" \
-    "input filename '${full_archive}'"
-
-  rm -rf "${tmpdir}"
-}
-
-function test::archive_to_unique_id {
-  # Expected ID when falling back to the "hash the archive contents" case.
-  # Hash is the SHA-1 sum of an empty file.
-  local fallback_hash='hash-da39a3ee5e6b4b0d3255bfef95601890afd80709'
-
-  # Stem tests #
-
-  # A long hex hash followed by an extension is a stem.
-  expect_archive_produces_id \
-    '/home/USER/Downloads/625284e057cc25a77cdab944d46ab3de692bade1.tar' \
-    'stem-625284e057cc25a77cdab944d46ab3de692bade1'
-
-  # Extension doesn't matter.
-  expect_archive_produces_id \
-    '/home/USER/Downloads/625284e057cc25a77cdab944d46ab3de692bade1.png' \
-    'stem-625284e057cc25a77cdab944d46ab3de692bade1'
-
-  # A short hex hash is not a stem, and falls back to hashing.
-  expect_archive_produces_id \
-    '/home/USER/Downloads/625284e0.tar' \
-    "${fallback_hash}"
-
-  # A long hex hash followed by non-hex before the extension is not a stem,
-  # and falls back to hashing.
-  expect_archive_produces_id \
-    '/home/USER/Downloads/625284e057cc25a77cdab944d46ab3de692bade1__ZZZZZ.tar' \
-    "${fallback_hash}"
-
-  # Build ID tests #
-
-  # A "fuchsia.*" file living in a numbered directory is a build ID.
-  expect_archive_produces_id \
-    '/home/USER/.cache/builds/8937296116261653552/fuchsia.tar.gz' \
-    'build-8937296116261653552'
-
-  # Extension doesn't matter.
-  expect_archive_produces_id \
-    '/home/USER/.cache/builds/8937296116261653552/fuchsia.png' \
-    'build-8937296116261653552'
-
-  # A "fuchsia.*" file living in a small-numbered directory is not a build ID,
-  # and falls back to hashing.
-  expect_archive_produces_id \
-    '/home/USER/.cache/builds/552/fuchsia.tar.gz' \
-    "${fallback_hash}"
-
-  # A "fuchsia.*" file living in a number-prefixed directory is not a build ID,
-  # and falls back to hashing.
-  expect_archive_produces_id \
-    '/home/USER/.cache/builds/8937296116261653552__ZZZZZ/fuchsia.tar.gz' \
-    "${fallback_hash}"
-
-  # A "fuchsia.*" file living in a non-numbered directory is not a build ID,
-  # and falls back to hashing.
-  expect_archive_produces_id \
-    '/home/USER/Downloads/fuchsia.tar.gz' \
-    "${fallback_hash}"
-}
-
-# Build ID is sniffed even when archive is passed as a relative path.
-function test::archive_to_unique_id_relative {
-  local tmpdir
-  tmpdir="$(mktemp -d)"
-  local archive='/home/USER/.cache/builds/8937296116261653552/fuchsia.tar.gz'
-  local full_archive="${tmpdir}/${archive}"
-  mkdir -p "$(dirname "${full_archive}")"
-  touch "${full_archive}"
-
-  local unique_id
-  unique_id="$(
-    cd "$(dirname "${full_archive}")"
-    archive_to_unique_id ./fuchsia.tar.gz
-  )"
-
-  EXPECT_EQ \
-    'build-8937296116261653552' \
-    "${unique_id}" \
-    "input filename './fuchsia.tar.gz'"
-
-  rm -rf "${tmpdir}"
-}
-
-#
-# Test framework.
-# TODO(dbort): Move this to a common file if other tools want to use it.
-#
-
-# Any failing assert/expect should set this to 1.
-FAILED=0
-
-function EXPECT_EQ {
-  local v1="$1"
-  shift
-  local v2="$1"
-  shift
-  if [[ "${v1}" != "${v2}" ]]; then
-    local msg="$@"
-    if [[ -n "${msg}" ]]; then
-      msg=": ${msg}"
-    fi
-    echo "TEST FAILURE: '${v1}' != '${v2}'${msg}"
-    FAILED=1
-    return 1
-  fi
-}
-
-# Prints the names of all functions with a test:: prefix.
-function print_test_functions {
-  # "declare -F" prints all declared function names, with lines like
-  # "declare -f funcname".
-  declare -F \
-    | grep -E '^declare -f test::' \
-    | sed -e 's/^declare -f //'
-}
-
-function test_main {
-  local num_tests=0
-  local num_failures=0
-  for t in $(print_test_functions); do
-    num_tests=$(( num_tests + 1 ))
-    FAILED=0
-    echo "RUNNING: ${t}"
-    "${t}" || FAILED=1
-    if (( FAILED )); then
-      num_failures=$(( num_failures + 1 ))
-      echo "FAILED: ${t}"
-    else
-      echo "PASSED: ${t}"
-    fi
-  done
-  if (( num_failures == 0 )); then
-    echo "All ${num_tests} tests passed!"
-    echo "PASS"
-    return 0
-  else
-    echo "${num_failures}/${num_tests} tests failed"
-    echo "FAIL"
-    return 1
-  fi
-}
-
-test_main "$@"
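The pave-prebuilt tool itself is not part of this diff, but the ID-sniffing behavior its tests encode can be approximated. The sketch below is a hypothetical re-implementation inferred only from the expectations above; in particular, the minimum digit count for a "build ID" directory is a guess, since the tests only show that 3 digits is too few and 19 is enough:

```python
import hashlib
import os
import re

def archive_to_unique_id(path):
    # Hypothetical sketch inferred from pave-prebuilt-tests; not the real tool.
    name = os.path.basename(path)
    # Stem case: a 40-char hex hash immediately followed by an extension.
    m = re.match(r'^([0-9a-f]{40})\.', name)
    if m:
        return 'stem-' + m.group(1)
    # Build-ID case: "fuchsia.*" inside an all-digit directory (threshold assumed).
    parent = os.path.basename(os.path.dirname(os.path.abspath(path)))
    if name.startswith('fuchsia.') and re.fullmatch(r'[0-9]{10,}', parent):
        return 'build-' + parent
    # Fallback: hash the archive contents.
    with open(path, 'rb') as f:
        return 'hash-' + hashlib.sha1(f.read()).hexdigest()

print(archive_to_unique_id(
    '/home/USER/Downloads/625284e057cc25a77cdab944d46ab3de692bade1.tar'))
# stem-625284e057cc25a77cdab944d46ab3de692bade1
```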
diff --git a/tests/style-test-visually b/tests/style-test-visually
deleted file mode 100755
index 1209d0e..0000000
--- a/tests/style-test-visually
+++ /dev/null
@@ -1,93 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Visual tests for //scripts/devshell/lib/style.sh
-
-# This is not an automated unit test.
-# It prints stylized text describing the style to be shown, so a tester
-# can validate the expected style is rendered.
-
-# Note that some terminals do not support all terminal styles.
-# For instance, italic may not render as italic on MacOS.
-
-source "$(cd "$(dirname "${BASH_SOURCE[0]}")"/../devshell/lib >/dev/null 2>&1 && pwd)"/style.sh || exit $?
-
-runtest() {
-  command="$1"; shift
-  "${command}" "$@" "${command} $*"
-}
-
-style::stylize --help
-style::echo --blue $'------------------'
-style::printf --help
-style::echo --blue $'------------------'
-style::error --help
-style::echo --blue $'------------------'
-style::link --help
-style::echo --blue $'----- colors -----'
-style::echo --help colors
-style::echo --blue $'--- attributes ---'
-style::echo --help attributes
-style::echo --blue $'------------------'
-
-runtest style::echo --bold
-runtest style::echo --bold --color cyan
-runtest style::echo --faint --color green
-runtest style::echo --italic --color magenta
-runtest style::echo --underline --color dark_blue
-runtest style::echo --blink --color light_gray
-runtest style::echo --pink --background dark_cyan
-runtest style::echo --italic --dark_magenta italic "this style may not work in some terminals:"
-
-style::printf --bold '%10s %6s\n' Item Cost
-style::printf '%10s %6s\n'        ---- ----
-style::printf --purple --background white  '%10s $%6.2f\n' beans 2.90 franks 9.35 cola 7.99 tiramasu 24.50
-
-style::cat --background dark_yellow --black << EOF
-Now is the time for all good
-people to come to the
-aid of their country and world.
-EOF
-
-style::cat --background cyan --color black --indent 4 << EOF
-Now is the time for all good
-people to come to the
-aid of their country and world.
-EOF
-
-style::info 'INFO: Info here'
-style::warning 'WARNING: Watch out!'
-style::error 'ERROR: What went wrong now?'
-style::link 'http://wikipedia.com'
-
-STYLE_WARNING='--stderr --blink --dark_yellow'
-style::warning 'WARNING: Customized warning style, still to stderr! :-)'
-
-STYLE_TO_TTY_ONLY=true  # default is false
-style::echo --bold --red "STYLE_TO_TTY_ONLY=$STYLE_TO_TTY_ONLY"
-style::echo --stderr --bold --red '
-This should still display in bold red, but on stderr' >/dev/null
-
-style::echo --color cyan --faint "
-This will not be styled. It doesn't print directly to the tty" | cat
-
-style::echo --stderr --color cyan --faint "
-This will not be styled. stderr doesn't print directly to the tty" 2>&1 | cat
-
-STYLE_TO_TTY_ONLY=false
-style::echo --bold --red "STYLE_TO_TTY_ONLY=$STYLE_TO_TTY_ONLY"
-style::echo --color cyan --faint "
-This will be styled even though it doesn't print directly to the tty." | cat
-
-style::echo --stderr --color cyan --faint "
-This will be styled even though stderr doesn't print directly to the tty" 2>&1 | cat
-
-# Three flags for the same thing:
-echo "This is $(style::echo -f      --bold --yellow LOUD) and soft."
-echo "This is $(style::echo --force --bold --yellow LOUD) and soft."
-echo "This is $(style::echo --tty   --bold --yellow LOUD) and soft."
-
-style::printf --blod --green 'Bad style' 2>/dev/null || echo "Bad style, Error status: $?"
-style::printf --faint --orange 'No orange' 2>/dev/null || echo "No orange! Error status: $?"
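Under the hood, style.sh works by emitting ANSI SGR escape sequences around the text. The core idea, stripped of the option parsing, looks roughly like this in Python (function and constant names are illustrative, not part of style.sh):

```python
# Illustrative only: ANSI SGR codes of the kind style.sh emits.
BOLD = '\033[1m'
RED = '\033[31m'
UNDERLINE = '\033[4m'
RESET = '\033[0m'

def stylize(text, *codes):
    # Prefix the selected codes, then reset so later output is unaffected.
    return ''.join(codes) + text + RESET

print(stylize('ERROR: What went wrong now?', BOLD, RED))
print(stylize('http://wikipedia.com', UNDERLINE))
```

The `STYLE_TO_TTY_ONLY` behavior exercised above corresponds to checking whether stdout/stderr is a terminal before emitting these codes.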
diff --git a/tests/style-tests b/tests/style-tests
deleted file mode 100755
index 3ddf5da..0000000
--- a/tests/style-tests
+++ /dev/null
@@ -1,40 +0,0 @@
-#!/bin/bash
-# Copyright 2018 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Automated tests for //scripts/devshell/lib/style.sh
-#
-# Usage: style-tests
-#
-#   Returns: Error status if actual output does not match expected.
-
-TEST_NAME="$(basename "${BASH_SOURCE[0]}")"
-TESTS_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd)"
-
-verbose() {
-  echo
-  echo "======================================================="
-  echo
-  echo "$@"
-  echo
-  "$@"
-  echo
-}
-
-test_main() {
-  local expected_out="${TESTS_DIR}/expected/${TEST_NAME}.out"
-  local expected_err="${TESTS_DIR}/expected/${TEST_NAME}.err"
-  local capture_dir=$(mktemp -d)
-  local actual_out="${capture_dir}/${TEST_NAME}.out"
-  local actual_err="${capture_dir}/${TEST_NAME}.err"
-  ${TESTS_DIR}/style-test-visually 1> "${actual_out}" 2> "${actual_err}"
-
-  local status=0
-  verbose diff "${expected_out}" "${actual_out}" || status=$?
-  verbose diff "${expected_err}" "${actual_err}" || status=$?
-
-  return $status
-}
-
-test_main "$@" || exit $?
diff --git a/third_party/d3/LICENSE b/third_party/d3/LICENSE
deleted file mode 100644
index 8301346..0000000
--- a/third_party/d3/LICENSE
+++ /dev/null
@@ -1,26 +0,0 @@
-Copyright (c) 2010-2014, Michael Bostock
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-
-* Redistributions of source code must retain the above copyright notice, this
-  list of conditions and the following disclaimer.
-
-* Redistributions in binary form must reproduce the above copyright notice,
-  this list of conditions and the following disclaimer in the documentation
-  and/or other materials provided with the distribution.
-
-* The name Michael Bostock may not be used to endorse or promote products
-  derived from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
-AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL MICHAEL BOSTOCK BE LIABLE FOR ANY DIRECT,
-INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
-BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
-NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
-EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/third_party/d3/README.fuchsia b/third_party/d3/README.fuchsia
deleted file mode 100644
index c291585..0000000
--- a/third_party/d3/README.fuchsia
+++ /dev/null
@@ -1,15 +0,0 @@
-Name: d3
-URL: https://github.com/d3/d3
-License: BSD-3-Clause
-Date: Mon Mar 24 20:45:44 2014 -0700
-Revision: fa55eead411a3c1b01703cb1ddfd59ccc0b23124
-
-Javascript library for manipulating documents based on data.
-
-Local Modifications:
-Deleted everything except for:
-* d3.js         the standalone non-minified library
-* LICENSE       the BSD-style 3-Clause license
-* README.md     the readme file from github, for basic information
-
-Disabled unsafe html() function
diff --git a/third_party/d3/README.md b/third_party/d3/README.md
deleted file mode 100644
index eb334e2..0000000
--- a/third_party/d3/README.md
+++ /dev/null
@@ -1,9 +0,0 @@
-# Data-Driven Documents
-
-<a href="http://d3js.org"><img src="http://d3js.org/logo.svg" align="left" hspace="10" vspace="6"></a>
-
-**D3.js** is a JavaScript library for manipulating documents based on data. **D3** helps you bring data to life using HTML, SVG and CSS. D3’s emphasis on web standards gives you the full capabilities of modern browsers without tying yourself to a proprietary framework, combining powerful visualization components and a data-driven approach to DOM manipulation.
-
-Want to learn more? [See the wiki.](https://github.com/mbostock/d3/wiki)
-
-For examples, [see the gallery](https://github.com/mbostock/d3/wiki/Gallery) and [mbostock’s bl.ocks](http://bl.ocks.org/mbostock).
diff --git a/third_party/d3/d3.js b/third_party/d3/d3.js
deleted file mode 100644
index 654740c..0000000
--- a/third_party/d3/d3.js
+++ /dev/null
@@ -1,9293 +0,0 @@
-!function() {
-  var d3 = {
-    version: "3.4.4"
-  };
-  if (!Date.now) Date.now = function() {
-    return +new Date();
-  };
-  var d3_arraySlice = [].slice, d3_array = function(list) {
-    return d3_arraySlice.call(list);
-  };
-  var d3_document = document, d3_documentElement = d3_document.documentElement, d3_window = window;
-  try {
-    d3_array(d3_documentElement.childNodes)[0].nodeType;
-  } catch (e) {
-    d3_array = function(list) {
-      var i = list.length, array = new Array(i);
-      while (i--) array[i] = list[i];
-      return array;
-    };
-  }
-  try {
-    d3_document.createElement("div").style.setProperty("opacity", 0, "");
-  } catch (error) {
-    var d3_element_prototype = d3_window.Element.prototype, d3_element_setAttribute = d3_element_prototype.setAttribute, d3_element_setAttributeNS = d3_element_prototype.setAttributeNS, d3_style_prototype = d3_window.CSSStyleDeclaration.prototype, d3_style_setProperty = d3_style_prototype.setProperty;
-    d3_element_prototype.setAttribute = function(name, value) {
-      d3_element_setAttribute.call(this, name, value + "");
-    };
-    d3_element_prototype.setAttributeNS = function(space, local, value) {
-      d3_element_setAttributeNS.call(this, space, local, value + "");
-    };
-    d3_style_prototype.setProperty = function(name, value, priority) {
-      d3_style_setProperty.call(this, name, value + "", priority);
-    };
-  }
-  d3.ascending = d3_ascending;
-  function d3_ascending(a, b) {
-    return a < b ? -1 : a > b ? 1 : a >= b ? 0 : NaN;
-  }
-  d3.descending = function(a, b) {
-    return b < a ? -1 : b > a ? 1 : b >= a ? 0 : NaN;
-  };
-  d3.min = function(array, f) {
-    var i = -1, n = array.length, a, b;
-    if (arguments.length === 1) {
-      while (++i < n && !((a = array[i]) != null && a <= a)) a = undefined;
-      while (++i < n) if ((b = array[i]) != null && a > b) a = b;
-    } else {
-      while (++i < n && !((a = f.call(array, array[i], i)) != null && a <= a)) a = undefined;
-      while (++i < n) if ((b = f.call(array, array[i], i)) != null && a > b) a = b;
-    }
-    return a;
-  };
-  d3.max = function(array, f) {
-    var i = -1, n = array.length, a, b;
-    if (arguments.length === 1) {
-      while (++i < n && !((a = array[i]) != null && a <= a)) a = undefined;
-      while (++i < n) if ((b = array[i]) != null && b > a) a = b;
-    } else {
-      while (++i < n && !((a = f.call(array, array[i], i)) != null && a <= a)) a = undefined;
-      while (++i < n) if ((b = f.call(array, array[i], i)) != null && b > a) a = b;
-    }
-    return a;
-  };
-  d3.extent = function(array, f) {
-    var i = -1, n = array.length, a, b, c;
-    if (arguments.length === 1) {
-      while (++i < n && !((a = c = array[i]) != null && a <= a)) a = c = undefined;
-      while (++i < n) if ((b = array[i]) != null) {
-        if (a > b) a = b;
-        if (c < b) c = b;
-      }
-    } else {
-      while (++i < n && !((a = c = f.call(array, array[i], i)) != null && a <= a)) a = undefined;
-      while (++i < n) if ((b = f.call(array, array[i], i)) != null) {
-        if (a > b) a = b;
-        if (c < b) c = b;
-      }
-    }
-    return [ a, c ];
-  };
-  d3.sum = function(array, f) {
-    var s = 0, n = array.length, a, i = -1;
-    if (arguments.length === 1) {
-      while (++i < n) if (!isNaN(a = +array[i])) s += a;
-    } else {
-      while (++i < n) if (!isNaN(a = +f.call(array, array[i], i))) s += a;
-    }
-    return s;
-  };
-  function d3_number(x) {
-    return x != null && !isNaN(x);
-  }
-  d3.mean = function(array, f) {
-    var n = array.length, a, m = 0, i = -1, j = 0;
-    if (arguments.length === 1) {
-      while (++i < n) if (d3_number(a = array[i])) m += (a - m) / ++j;
-    } else {
-      while (++i < n) if (d3_number(a = f.call(array, array[i], i))) m += (a - m) / ++j;
-    }
-    return j ? m : undefined;
-  };
-  d3.quantile = function(values, p) {
-    var H = (values.length - 1) * p + 1, h = Math.floor(H), v = +values[h - 1], e = H - h;
-    return e ? v + e * (values[h] - v) : v;
-  };
-  d3.median = function(array, f) {
-    if (arguments.length > 1) array = array.map(f);
-    array = array.filter(d3_number);
-    return array.length ? d3.quantile(array.sort(d3_ascending), .5) : undefined;
-  };
-  function d3_bisector(compare) {
-    return {
-      left: function(a, x, lo, hi) {
-        if (arguments.length < 3) lo = 0;
-        if (arguments.length < 4) hi = a.length;
-        while (lo < hi) {
-          var mid = lo + hi >>> 1;
-          if (compare(a[mid], x) < 0) lo = mid + 1; else hi = mid;
-        }
-        return lo;
-      },
-      right: function(a, x, lo, hi) {
-        if (arguments.length < 3) lo = 0;
-        if (arguments.length < 4) hi = a.length;
-        while (lo < hi) {
-          var mid = lo + hi >>> 1;
-          if (compare(a[mid], x) > 0) hi = mid; else lo = mid + 1;
-        }
-        return lo;
-      }
-    };
-  }
-  var d3_bisect = d3_bisector(d3_ascending);
-  d3.bisectLeft = d3_bisect.left;
-  d3.bisect = d3.bisectRight = d3_bisect.right;
-  d3.bisector = function(f) {
-    return d3_bisector(f.length === 1 ? function(d, x) {
-      return d3_ascending(f(d), x);
-    } : f);
-  };
-  d3.shuffle = function(array) {
-    var m = array.length, t, i;
-    while (m) {
-      i = Math.random() * m-- | 0;
-      t = array[m], array[m] = array[i], array[i] = t;
-    }
-    return array;
-  };
-  d3.permute = function(array, indexes) {
-    var i = indexes.length, permutes = new Array(i);
-    while (i--) permutes[i] = array[indexes[i]];
-    return permutes;
-  };
-  d3.pairs = function(array) {
-    var i = 0, n = array.length - 1, p0, p1 = array[0], pairs = new Array(n < 0 ? 0 : n);
-    while (i < n) pairs[i] = [ p0 = p1, p1 = array[++i] ];
-    return pairs;
-  };
-  d3.zip = function() {
-    if (!(n = arguments.length)) return [];
-    for (var i = -1, m = d3.min(arguments, d3_zipLength), zips = new Array(m); ++i < m; ) {
-      for (var j = -1, n, zip = zips[i] = new Array(n); ++j < n; ) {
-        zip[j] = arguments[j][i];
-      }
-    }
-    return zips;
-  };
-  function d3_zipLength(d) {
-    return d.length;
-  }
-  d3.transpose = function(matrix) {
-    return d3.zip.apply(d3, matrix);
-  };
-  d3.keys = function(map) {
-    var keys = [];
-    for (var key in map) keys.push(key);
-    return keys;
-  };
-  d3.values = function(map) {
-    var values = [];
-    for (var key in map) values.push(map[key]);
-    return values;
-  };
-  d3.entries = function(map) {
-    var entries = [];
-    for (var key in map) entries.push({
-      key: key,
-      value: map[key]
-    });
-    return entries;
-  };
-  d3.merge = function(arrays) {
-    var n = arrays.length, m, i = -1, j = 0, merged, array;
-    while (++i < n) j += arrays[i].length;
-    merged = new Array(j);
-    while (--n >= 0) {
-      array = arrays[n];
-      m = array.length;
-      while (--m >= 0) {
-        merged[--j] = array[m];
-      }
-    }
-    return merged;
-  };
-  var abs = Math.abs;
-  d3.range = function(start, stop, step) {
-    if (arguments.length < 3) {
-      step = 1;
-      if (arguments.length < 2) {
-        stop = start;
-        start = 0;
-      }
-    }
-    if ((stop - start) / step === Infinity) throw new Error("infinite range");
-    var range = [], k = d3_range_integerScale(abs(step)), i = -1, j;
-    start *= k, stop *= k, step *= k;
-    if (step < 0) while ((j = start + step * ++i) > stop) range.push(j / k); else while ((j = start + step * ++i) < stop) range.push(j / k);
-    return range;
-  };
-  function d3_range_integerScale(x) {
-    var k = 1;
-    while (x * k % 1) k *= 10;
-    return k;
-  }
-  function d3_class(ctor, properties) {
-    try {
-      for (var key in properties) {
-        Object.defineProperty(ctor.prototype, key, {
-          value: properties[key],
-          enumerable: false
-        });
-      }
-    } catch (e) {
-      ctor.prototype = properties;
-    }
-  }
-  d3.map = function(object) {
-    var map = new d3_Map();
-    if (object instanceof d3_Map) object.forEach(function(key, value) {
-      map.set(key, value);
-    }); else for (var key in object) map.set(key, object[key]);
-    return map;
-  };
-  function d3_Map() {}
-  d3_class(d3_Map, {
-    has: d3_map_has,
-    get: function(key) {
-      return this[d3_map_prefix + key];
-    },
-    set: function(key, value) {
-      return this[d3_map_prefix + key] = value;
-    },
-    remove: d3_map_remove,
-    keys: d3_map_keys,
-    values: function() {
-      var values = [];
-      this.forEach(function(key, value) {
-        values.push(value);
-      });
-      return values;
-    },
-    entries: function() {
-      var entries = [];
-      this.forEach(function(key, value) {
-        entries.push({
-          key: key,
-          value: value
-        });
-      });
-      return entries;
-    },
-    size: d3_map_size,
-    empty: d3_map_empty,
-    forEach: function(f) {
-      for (var key in this) if (key.charCodeAt(0) === d3_map_prefixCode) f.call(this, key.substring(1), this[key]);
-    }
-  });
-  var d3_map_prefix = "\x00", d3_map_prefixCode = d3_map_prefix.charCodeAt(0);
-  function d3_map_has(key) {
-    return d3_map_prefix + key in this;
-  }
-  function d3_map_remove(key) {
-    key = d3_map_prefix + key;
-    return key in this && delete this[key];
-  }
-  function d3_map_keys() {
-    var keys = [];
-    this.forEach(function(key) {
-      keys.push(key);
-    });
-    return keys;
-  }
-  function d3_map_size() {
-    var size = 0;
-    for (var key in this) if (key.charCodeAt(0) === d3_map_prefixCode) ++size;
-    return size;
-  }
-  function d3_map_empty() {
-    for (var key in this) if (key.charCodeAt(0) === d3_map_prefixCode) return false;
-    return true;
-  }
-  d3.nest = function() {
-    var nest = {}, keys = [], sortKeys = [], sortValues, rollup;
-    function map(mapType, array, depth) {
-      if (depth >= keys.length) return rollup ? rollup.call(nest, array) : sortValues ? array.sort(sortValues) : array;
-      var i = -1, n = array.length, key = keys[depth++], keyValue, object, setter, valuesByKey = new d3_Map(), values;
-      while (++i < n) {
-        if (values = valuesByKey.get(keyValue = key(object = array[i]))) {
-          values.push(object);
-        } else {
-          valuesByKey.set(keyValue, [ object ]);
-        }
-      }
-      if (mapType) {
-        object = mapType();
-        setter = function(keyValue, values) {
-          object.set(keyValue, map(mapType, values, depth));
-        };
-      } else {
-        object = {};
-        setter = function(keyValue, values) {
-          object[keyValue] = map(mapType, values, depth);
-        };
-      }
-      valuesByKey.forEach(setter);
-      return object;
-    }
-    function entries(map, depth) {
-      if (depth >= keys.length) return map;
-      var array = [], sortKey = sortKeys[depth++];
-      map.forEach(function(key, keyMap) {
-        array.push({
-          key: key,
-          values: entries(keyMap, depth)
-        });
-      });
-      return sortKey ? array.sort(function(a, b) {
-        return sortKey(a.key, b.key);
-      }) : array;
-    }
-    nest.map = function(array, mapType) {
-      return map(mapType, array, 0);
-    };
-    nest.entries = function(array) {
-      return entries(map(d3.map, array, 0), 0);
-    };
-    nest.key = function(d) {
-      keys.push(d);
-      return nest;
-    };
-    nest.sortKeys = function(order) {
-      sortKeys[keys.length - 1] = order;
-      return nest;
-    };
-    nest.sortValues = function(order) {
-      sortValues = order;
-      return nest;
-    };
-    nest.rollup = function(f) {
-      rollup = f;
-      return nest;
-    };
-    return nest;
-  };
-  d3.set = function(array) {
-    var set = new d3_Set();
-    if (array) for (var i = 0, n = array.length; i < n; ++i) set.add(array[i]);
-    return set;
-  };
-  function d3_Set() {}
-  d3_class(d3_Set, {
-    has: d3_map_has,
-    add: function(value) {
-      this[d3_map_prefix + value] = true;
-      return value;
-    },
-    remove: function(value) {
-      value = d3_map_prefix + value;
-      return value in this && delete this[value];
-    },
-    values: d3_map_keys,
-    size: d3_map_size,
-    empty: d3_map_empty,
-    forEach: function(f) {
-      for (var value in this) if (value.charCodeAt(0) === d3_map_prefixCode) f.call(this, value.substring(1));
-    }
-  });
-  d3.behavior = {};
-  d3.rebind = function(target, source) {
-    var i = 1, n = arguments.length, method;
-    while (++i < n) target[method = arguments[i]] = d3_rebind(target, source, source[method]);
-    return target;
-  };
-  function d3_rebind(target, source, method) {
-    return function() {
-      var value = method.apply(source, arguments);
-      return value === source ? target : value;
-    };
-  }
-  function d3_vendorSymbol(object, name) {
-    if (name in object) return name;
-    name = name.charAt(0).toUpperCase() + name.substring(1);
-    for (var i = 0, n = d3_vendorPrefixes.length; i < n; ++i) {
-      var prefixName = d3_vendorPrefixes[i] + name;
-      if (prefixName in object) return prefixName;
-    }
-  }
-  var d3_vendorPrefixes = [ "webkit", "ms", "moz", "Moz", "o", "O" ];
-  function d3_noop() {}
-  d3.dispatch = function() {
-    var dispatch = new d3_dispatch(), i = -1, n = arguments.length;
-    while (++i < n) dispatch[arguments[i]] = d3_dispatch_event(dispatch);
-    return dispatch;
-  };
-  function d3_dispatch() {}
-  d3_dispatch.prototype.on = function(type, listener) {
-    var i = type.indexOf("."), name = "";
-    if (i >= 0) {
-      name = type.substring(i + 1);
-      type = type.substring(0, i);
-    }
-    if (type) return arguments.length < 2 ? this[type].on(name) : this[type].on(name, listener);
-    if (arguments.length === 2) {
-      if (listener == null) for (type in this) {
-        if (this.hasOwnProperty(type)) this[type].on(name, null);
-      }
-      return this;
-    }
-  };
-  function d3_dispatch_event(dispatch) {
-    var listeners = [], listenerByName = new d3_Map();
-    function event() {
-      var z = listeners, i = -1, n = z.length, l;
-      while (++i < n) if (l = z[i].on) l.apply(this, arguments);
-      return dispatch;
-    }
-    event.on = function(name, listener) {
-      var l = listenerByName.get(name), i;
-      if (arguments.length < 2) return l && l.on;
-      if (l) {
-        l.on = null;
-        listeners = listeners.slice(0, i = listeners.indexOf(l)).concat(listeners.slice(i + 1));
-        listenerByName.remove(name);
-      }
-      if (listener) listeners.push(listenerByName.set(name, {
-        on: listener
-      }));
-      return dispatch;
-    };
-    return event;
-  }
-  d3.event = null;
-  function d3_eventPreventDefault() {
-    d3.event.preventDefault();
-  }
-  function d3_eventSource() {
-    var e = d3.event, s;
-    while (s = e.sourceEvent) e = s;
-    return e;
-  }
-  function d3_eventDispatch(target) {
-    var dispatch = new d3_dispatch(), i = 0, n = arguments.length;
-    while (++i < n) dispatch[arguments[i]] = d3_dispatch_event(dispatch);
-    dispatch.of = function(thiz, argumentz) {
-      return function(e1) {
-        try {
-          var e0 = e1.sourceEvent = d3.event;
-          e1.target = target;
-          d3.event = e1;
-          dispatch[e1.type].apply(thiz, argumentz);
-        } finally {
-          d3.event = e0;
-        }
-      };
-    };
-    return dispatch;
-  }
-  d3.requote = function(s) {
-    return s.replace(d3_requote_re, "\\$&");
-  };
-  var d3_requote_re = /[\\\^\$\*\+\?\|\[\]\(\)\.\{\}]/g;
-  var d3_subclass = {}.__proto__ ? function(object, prototype) {
-    object.__proto__ = prototype;
-  } : function(object, prototype) {
-    for (var property in prototype) object[property] = prototype[property];
-  };
-  function d3_selection(groups) {
-    d3_subclass(groups, d3_selectionPrototype);
-    return groups;
-  }
-  var d3_select = function(s, n) {
-    return n.querySelector(s);
-  }, d3_selectAll = function(s, n) {
-    return n.querySelectorAll(s);
-  }, d3_selectMatcher = d3_documentElement[d3_vendorSymbol(d3_documentElement, "matchesSelector")], d3_selectMatches = function(n, s) {
-    return d3_selectMatcher.call(n, s);
-  };
-  if (typeof Sizzle === "function") {
-    d3_select = function(s, n) {
-      return Sizzle(s, n)[0] || null;
-    };
-    d3_selectAll = Sizzle;
-    d3_selectMatches = Sizzle.matchesSelector;
-  }
-  d3.selection = function() {
-    return d3_selectionRoot;
-  };
-  var d3_selectionPrototype = d3.selection.prototype = [];
-  d3_selectionPrototype.select = function(selector) {
-    var subgroups = [], subgroup, subnode, group, node;
-    selector = d3_selection_selector(selector);
-    for (var j = -1, m = this.length; ++j < m; ) {
-      subgroups.push(subgroup = []);
-      subgroup.parentNode = (group = this[j]).parentNode;
-      for (var i = -1, n = group.length; ++i < n; ) {
-        if (node = group[i]) {
-          subgroup.push(subnode = selector.call(node, node.__data__, i, j));
-          if (subnode && "__data__" in node) subnode.__data__ = node.__data__;
-        } else {
-          subgroup.push(null);
-        }
-      }
-    }
-    return d3_selection(subgroups);
-  };
-  function d3_selection_selector(selector) {
-    return typeof selector === "function" ? selector : function() {
-      return d3_select(selector, this);
-    };
-  }
-  d3_selectionPrototype.selectAll = function(selector) {
-    var subgroups = [], subgroup, node;
-    selector = d3_selection_selectorAll(selector);
-    for (var j = -1, m = this.length; ++j < m; ) {
-      for (var group = this[j], i = -1, n = group.length; ++i < n; ) {
-        if (node = group[i]) {
-          subgroups.push(subgroup = d3_array(selector.call(node, node.__data__, i, j)));
-          subgroup.parentNode = node;
-        }
-      }
-    }
-    return d3_selection(subgroups);
-  };
-  function d3_selection_selectorAll(selector) {
-    return typeof selector === "function" ? selector : function() {
-      return d3_selectAll(selector, this);
-    };
-  }
-  var d3_nsPrefix = {
-    svg: "http://www.w3.org/2000/svg",
-    xhtml: "http://www.w3.org/1999/xhtml",
-    xlink: "http://www.w3.org/1999/xlink",
-    xml: "http://www.w3.org/XML/1998/namespace",
-    xmlns: "http://www.w3.org/2000/xmlns/"
-  };
-  d3.ns = {
-    prefix: d3_nsPrefix,
-    qualify: function(name) {
-      var i = name.indexOf(":"), prefix = name;
-      if (i >= 0) {
-        prefix = name.substring(0, i);
-        name = name.substring(i + 1);
-      }
-      return d3_nsPrefix.hasOwnProperty(prefix) ? {
-        space: d3_nsPrefix[prefix],
-        local: name
-      } : name;
-    }
-  };
-  d3_selectionPrototype.attr = function(name, value) {
-    if (arguments.length < 2) {
-      if (typeof name === "string") {
-        var node = this.node();
-        name = d3.ns.qualify(name);
-        return name.local ? node.getAttributeNS(name.space, name.local) : node.getAttribute(name);
-      }
-      for (value in name) this.each(d3_selection_attr(value, name[value]));
-      return this;
-    }
-    return this.each(d3_selection_attr(name, value));
-  };
-  function d3_selection_attr(name, value) {
-    name = d3.ns.qualify(name);
-    function attrNull() {
-      this.removeAttribute(name);
-    }
-    function attrNullNS() {
-      this.removeAttributeNS(name.space, name.local);
-    }
-    function attrConstant() {
-      this.setAttribute(name, value);
-    }
-    function attrConstantNS() {
-      this.setAttributeNS(name.space, name.local, value);
-    }
-    function attrFunction() {
-      var x = value.apply(this, arguments);
-      if (x == null) this.removeAttribute(name); else this.setAttribute(name, x);
-    }
-    function attrFunctionNS() {
-      var x = value.apply(this, arguments);
-      if (x == null) this.removeAttributeNS(name.space, name.local); else this.setAttributeNS(name.space, name.local, x);
-    }
-    return value == null ? name.local ? attrNullNS : attrNull : typeof value === "function" ? name.local ? attrFunctionNS : attrFunction : name.local ? attrConstantNS : attrConstant;
-  }
-  function d3_collapse(s) {
-    return s.trim().replace(/\s+/g, " ");
-  }
-  d3_selectionPrototype.classed = function(name, value) {
-    if (arguments.length < 2) {
-      if (typeof name === "string") {
-        var node = this.node(), n = (name = d3_selection_classes(name)).length, i = -1;
-        if (value = node.classList) {
-          while (++i < n) if (!value.contains(name[i])) return false;
-        } else {
-          value = node.getAttribute("class");
-          while (++i < n) if (!d3_selection_classedRe(name[i]).test(value)) return false;
-        }
-        return true;
-      }
-      for (value in name) this.each(d3_selection_classed(value, name[value]));
-      return this;
-    }
-    return this.each(d3_selection_classed(name, value));
-  };
-  function d3_selection_classedRe(name) {
-    return new RegExp("(?:^|\\s+)" + d3.requote(name) + "(?:\\s+|$)", "g");
-  }
-  function d3_selection_classes(name) {
-    return name.trim().split(/^|\s+/);
-  }
-  function d3_selection_classed(name, value) {
-    name = d3_selection_classes(name).map(d3_selection_classedName);
-    var n = name.length;
-    function classedConstant() {
-      var i = -1;
-      while (++i < n) name[i](this, value);
-    }
-    function classedFunction() {
-      var i = -1, x = value.apply(this, arguments);
-      while (++i < n) name[i](this, x);
-    }
-    return typeof value === "function" ? classedFunction : classedConstant;
-  }
-  function d3_selection_classedName(name) {
-    var re = d3_selection_classedRe(name);
-    return function(node, value) {
-      if (c = node.classList) return value ? c.add(name) : c.remove(name);
-      var c = node.getAttribute("class") || "";
-      if (value) {
-        re.lastIndex = 0;
-        if (!re.test(c)) node.setAttribute("class", d3_collapse(c + " " + name));
-      } else {
-        node.setAttribute("class", d3_collapse(c.replace(re, " ")));
-      }
-    };
-  }
-  d3_selectionPrototype.style = function(name, value, priority) {
-    var n = arguments.length;
-    if (n < 3) {
-      if (typeof name !== "string") {
-        if (n < 2) value = "";
-        for (priority in name) this.each(d3_selection_style(priority, name[priority], value));
-        return this;
-      }
-      if (n < 2) return d3_window.getComputedStyle(this.node(), null).getPropertyValue(name);
-      priority = "";
-    }
-    return this.each(d3_selection_style(name, value, priority));
-  };
-  function d3_selection_style(name, value, priority) {
-    function styleNull() {
-      this.style.removeProperty(name);
-    }
-    function styleConstant() {
-      this.style.setProperty(name, value, priority);
-    }
-    function styleFunction() {
-      var x = value.apply(this, arguments);
-      if (x == null) this.style.removeProperty(name); else this.style.setProperty(name, x, priority);
-    }
-    return value == null ? styleNull : typeof value === "function" ? styleFunction : styleConstant;
-  }
-  d3_selectionPrototype.property = function(name, value) {
-    if (arguments.length < 2) {
-      if (typeof name === "string") return this.node()[name];
-      for (value in name) this.each(d3_selection_property(value, name[value]));
-      return this;
-    }
-    return this.each(d3_selection_property(name, value));
-  };
-  function d3_selection_property(name, value) {
-    function propertyNull() {
-      delete this[name];
-    }
-    function propertyConstant() {
-      this[name] = value;
-    }
-    function propertyFunction() {
-      var x = value.apply(this, arguments);
-      if (x == null) delete this[name]; else this[name] = x;
-    }
-    return value == null ? propertyNull : typeof value === "function" ? propertyFunction : propertyConstant;
-  }
-  d3_selectionPrototype.text = function(value) {
-    return arguments.length ? this.each(typeof value === "function" ? function() {
-      var v = value.apply(this, arguments);
-      this.textContent = v == null ? "" : v;
-    } : value == null ? function() {
-      this.textContent = "";
-    } : function() {
-      this.textContent = value;
-    }) : this.node().textContent;
-  };
-  d3_selectionPrototype.html = function(value) {
-    throw "disallowed by chromium security";
-    return arguments.length ? this.each(typeof value === "function" ? function() {
-      var v = value.apply(this, arguments);
-      this.innerHTML = v == null ? "" : v;
-    } : value == null ? function() {
-      this.innerHTML = "";
-    } : function() {
-      this.innerHTML = value;
-    }) : this.node().innerHTML;
-  };
-  d3_selectionPrototype.append = function(name) {
-    name = d3_selection_creator(name);
-    return this.select(function() {
-      return this.appendChild(name.apply(this, arguments));
-    });
-  };
-  function d3_selection_creator(name) {
-    return typeof name === "function" ? name : (name = d3.ns.qualify(name)).local ? function() {
-      return this.ownerDocument.createElementNS(name.space, name.local);
-    } : function() {
-      return this.ownerDocument.createElementNS(this.namespaceURI, name);
-    };
-  }
-  d3_selectionPrototype.insert = function(name, before) {
-    name = d3_selection_creator(name);
-    before = d3_selection_selector(before);
-    return this.select(function() {
-      return this.insertBefore(name.apply(this, arguments), before.apply(this, arguments) || null);
-    });
-  };
-  d3_selectionPrototype.remove = function() {
-    return this.each(function() {
-      var parent = this.parentNode;
-      if (parent) parent.removeChild(this);
-    });
-  };
-  d3_selectionPrototype.data = function(value, key) {
-    var i = -1, n = this.length, group, node;
-    if (!arguments.length) {
-      value = new Array(n = (group = this[0]).length);
-      while (++i < n) {
-        if (node = group[i]) {
-          value[i] = node.__data__;
-        }
-      }
-      return value;
-    }
-    function bind(group, groupData) {
-      var i, n = group.length, m = groupData.length, n0 = Math.min(n, m), updateNodes = new Array(m), enterNodes = new Array(m), exitNodes = new Array(n), node, nodeData;
-      if (key) {
-        var nodeByKeyValue = new d3_Map(), dataByKeyValue = new d3_Map(), keyValues = [], keyValue;
-        for (i = -1; ++i < n; ) {
-          keyValue = key.call(node = group[i], node.__data__, i);
-          if (nodeByKeyValue.has(keyValue)) {
-            exitNodes[i] = node;
-          } else {
-            nodeByKeyValue.set(keyValue, node);
-          }
-          keyValues.push(keyValue);
-        }
-        for (i = -1; ++i < m; ) {
-          keyValue = key.call(groupData, nodeData = groupData[i], i);
-          if (node = nodeByKeyValue.get(keyValue)) {
-            updateNodes[i] = node;
-            node.__data__ = nodeData;
-          } else if (!dataByKeyValue.has(keyValue)) {
-            enterNodes[i] = d3_selection_dataNode(nodeData);
-          }
-          dataByKeyValue.set(keyValue, nodeData);
-          nodeByKeyValue.remove(keyValue);
-        }
-        for (i = -1; ++i < n; ) {
-          if (nodeByKeyValue.has(keyValues[i])) {
-            exitNodes[i] = group[i];
-          }
-        }
-      } else {
-        for (i = -1; ++i < n0; ) {
-          node = group[i];
-          nodeData = groupData[i];
-          if (node) {
-            node.__data__ = nodeData;
-            updateNodes[i] = node;
-          } else {
-            enterNodes[i] = d3_selection_dataNode(nodeData);
-          }
-        }
-        for (;i < m; ++i) {
-          enterNodes[i] = d3_selection_dataNode(groupData[i]);
-        }
-        for (;i < n; ++i) {
-          exitNodes[i] = group[i];
-        }
-      }
-      enterNodes.update = updateNodes;
-      enterNodes.parentNode = updateNodes.parentNode = exitNodes.parentNode = group.parentNode;
-      enter.push(enterNodes);
-      update.push(updateNodes);
-      exit.push(exitNodes);
-    }
-    var enter = d3_selection_enter([]), update = d3_selection([]), exit = d3_selection([]);
-    if (typeof value === "function") {
-      while (++i < n) {
-        bind(group = this[i], value.call(group, group.parentNode.__data__, i));
-      }
-    } else {
-      while (++i < n) {
-        bind(group = this[i], value);
-      }
-    }
-    update.enter = function() {
-      return enter;
-    };
-    update.exit = function() {
-      return exit;
-    };
-    return update;
-  };
-  function d3_selection_dataNode(data) {
-    return {
-      __data__: data
-    };
-  }
-  d3_selectionPrototype.datum = function(value) {
-    return arguments.length ? this.property("__data__", value) : this.property("__data__");
-  };
-  d3_selectionPrototype.filter = function(filter) {
-    var subgroups = [], subgroup, group, node;
-    if (typeof filter !== "function") filter = d3_selection_filter(filter);
-    for (var j = 0, m = this.length; j < m; j++) {
-      subgroups.push(subgroup = []);
-      subgroup.parentNode = (group = this[j]).parentNode;
-      for (var i = 0, n = group.length; i < n; i++) {
-        if ((node = group[i]) && filter.call(node, node.__data__, i, j)) {
-          subgroup.push(node);
-        }
-      }
-    }
-    return d3_selection(subgroups);
-  };
-  function d3_selection_filter(selector) {
-    return function() {
-      return d3_selectMatches(this, selector);
-    };
-  }
-  d3_selectionPrototype.order = function() {
-    for (var j = -1, m = this.length; ++j < m; ) {
-      for (var group = this[j], i = group.length - 1, next = group[i], node; --i >= 0; ) {
-        if (node = group[i]) {
-          if (next && next !== node.nextSibling) next.parentNode.insertBefore(node, next);
-          next = node;
-        }
-      }
-    }
-    return this;
-  };
-  d3_selectionPrototype.sort = function(comparator) {
-    comparator = d3_selection_sortComparator.apply(this, arguments);
-    for (var j = -1, m = this.length; ++j < m; ) this[j].sort(comparator);
-    return this.order();
-  };
-  function d3_selection_sortComparator(comparator) {
-    if (!arguments.length) comparator = d3_ascending;
-    return function(a, b) {
-      return a && b ? comparator(a.__data__, b.__data__) : !a - !b;
-    };
-  }
-  d3_selectionPrototype.each = function(callback) {
-    return d3_selection_each(this, function(node, i, j) {
-      callback.call(node, node.__data__, i, j);
-    });
-  };
-  function d3_selection_each(groups, callback) {
-    for (var j = 0, m = groups.length; j < m; j++) {
-      for (var group = groups[j], i = 0, n = group.length, node; i < n; i++) {
-        if (node = group[i]) callback(node, i, j);
-      }
-    }
-    return groups;
-  }
-  d3_selectionPrototype.call = function(callback) {
-    var args = d3_array(arguments);
-    callback.apply(args[0] = this, args);
-    return this;
-  };
-  d3_selectionPrototype.empty = function() {
-    return !this.node();
-  };
-  d3_selectionPrototype.node = function() {
-    for (var j = 0, m = this.length; j < m; j++) {
-      for (var group = this[j], i = 0, n = group.length; i < n; i++) {
-        var node = group[i];
-        if (node) return node;
-      }
-    }
-    return null;
-  };
-  d3_selectionPrototype.size = function() {
-    var n = 0;
-    this.each(function() {
-      ++n;
-    });
-    return n;
-  };
-  function d3_selection_enter(selection) {
-    d3_subclass(selection, d3_selection_enterPrototype);
-    return selection;
-  }
-  var d3_selection_enterPrototype = [];
-  d3.selection.enter = d3_selection_enter;
-  d3.selection.enter.prototype = d3_selection_enterPrototype;
-  d3_selection_enterPrototype.append = d3_selectionPrototype.append;
-  d3_selection_enterPrototype.empty = d3_selectionPrototype.empty;
-  d3_selection_enterPrototype.node = d3_selectionPrototype.node;
-  d3_selection_enterPrototype.call = d3_selectionPrototype.call;
-  d3_selection_enterPrototype.size = d3_selectionPrototype.size;
-  d3_selection_enterPrototype.select = function(selector) {
-    var subgroups = [], subgroup, subnode, upgroup, group, node;
-    for (var j = -1, m = this.length; ++j < m; ) {
-      upgroup = (group = this[j]).update;
-      subgroups.push(subgroup = []);
-      subgroup.parentNode = group.parentNode;
-      for (var i = -1, n = group.length; ++i < n; ) {
-        if (node = group[i]) {
-          subgroup.push(upgroup[i] = subnode = selector.call(group.parentNode, node.__data__, i, j));
-          subnode.__data__ = node.__data__;
-        } else {
-          subgroup.push(null);
-        }
-      }
-    }
-    return d3_selection(subgroups);
-  };
-  d3_selection_enterPrototype.insert = function(name, before) {
-    if (arguments.length < 2) before = d3_selection_enterInsertBefore(this);
-    return d3_selectionPrototype.insert.call(this, name, before);
-  };
-  function d3_selection_enterInsertBefore(enter) {
-    var i0, j0;
-    return function(d, i, j) {
-      var group = enter[j].update, n = group.length, node;
-      if (j != j0) j0 = j, i0 = 0;
-      if (i >= i0) i0 = i + 1;
-      while (!(node = group[i0]) && ++i0 < n) ;
-      return node;
-    };
-  }
-  d3_selectionPrototype.transition = function() {
-    var id = d3_transitionInheritId || ++d3_transitionId, subgroups = [], subgroup, node, transition = d3_transitionInherit || {
-      time: Date.now(),
-      ease: d3_ease_cubicInOut,
-      delay: 0,
-      duration: 250
-    };
-    for (var j = -1, m = this.length; ++j < m; ) {
-      subgroups.push(subgroup = []);
-      for (var group = this[j], i = -1, n = group.length; ++i < n; ) {
-        if (node = group[i]) d3_transitionNode(node, i, id, transition);
-        subgroup.push(node);
-      }
-    }
-    return d3_transition(subgroups, id);
-  };
-  d3_selectionPrototype.interrupt = function() {
-    return this.each(d3_selection_interrupt);
-  };
-  function d3_selection_interrupt() {
-    var lock = this.__transition__;
-    if (lock) ++lock.active;
-  }
-  d3.select = function(node) {
-    var group = [ typeof node === "string" ? d3_select(node, d3_document) : node ];
-    group.parentNode = d3_documentElement;
-    return d3_selection([ group ]);
-  };
-  d3.selectAll = function(nodes) {
-    var group = d3_array(typeof nodes === "string" ? d3_selectAll(nodes, d3_document) : nodes);
-    group.parentNode = d3_documentElement;
-    return d3_selection([ group ]);
-  };
-  var d3_selectionRoot = d3.select(d3_documentElement);
-  d3_selectionPrototype.on = function(type, listener, capture) {
-    var n = arguments.length;
-    if (n < 3) {
-      if (typeof type !== "string") {
-        if (n < 2) listener = false;
-        for (capture in type) this.each(d3_selection_on(capture, type[capture], listener));
-        return this;
-      }
-      if (n < 2) return (n = this.node()["__on" + type]) && n._;
-      capture = false;
-    }
-    return this.each(d3_selection_on(type, listener, capture));
-  };
-  function d3_selection_on(type, listener, capture) {
-    var name = "__on" + type, i = type.indexOf("."), wrap = d3_selection_onListener;
-    if (i > 0) type = type.substring(0, i);
-    var filter = d3_selection_onFilters.get(type);
-    if (filter) type = filter, wrap = d3_selection_onFilter;
-    function onRemove() {
-      var l = this[name];
-      if (l) {
-        this.removeEventListener(type, l, l.$);
-        delete this[name];
-      }
-    }
-    function onAdd() {
-      var l = wrap(listener, d3_array(arguments));
-      onRemove.call(this);
-      this.addEventListener(type, this[name] = l, l.$ = capture);
-      l._ = listener;
-    }
-    function removeAll() {
-      var re = new RegExp("^__on([^.]+)" + d3.requote(type) + "$"), match;
-      for (var name in this) {
-        if (match = name.match(re)) {
-          var l = this[name];
-          this.removeEventListener(match[1], l, l.$);
-          delete this[name];
-        }
-      }
-    }
-    return i ? listener ? onAdd : onRemove : listener ? d3_noop : removeAll;
-  }
-  var d3_selection_onFilters = d3.map({
-    mouseenter: "mouseover",
-    mouseleave: "mouseout"
-  });
-  d3_selection_onFilters.forEach(function(k) {
-    if ("on" + k in d3_document) d3_selection_onFilters.remove(k);
-  });
-  function d3_selection_onListener(listener, argumentz) {
-    return function(e) {
-      var o = d3.event;
-      d3.event = e;
-      argumentz[0] = this.__data__;
-      try {
-        listener.apply(this, argumentz);
-      } finally {
-        d3.event = o;
-      }
-    };
-  }
-  function d3_selection_onFilter(listener, argumentz) {
-    var l = d3_selection_onListener(listener, argumentz);
-    return function(e) {
-      var target = this, related = e.relatedTarget;
-      if (!related || related !== target && !(related.compareDocumentPosition(target) & 8)) {
-        l.call(target, e);
-      }
-    };
-  }
-  var d3_event_dragSelect = "onselectstart" in d3_document ? null : d3_vendorSymbol(d3_documentElement.style, "userSelect"), d3_event_dragId = 0;
-  function d3_event_dragSuppress() {
-    var name = ".dragsuppress-" + ++d3_event_dragId, click = "click" + name, w = d3.select(d3_window).on("touchmove" + name, d3_eventPreventDefault).on("dragstart" + name, d3_eventPreventDefault).on("selectstart" + name, d3_eventPreventDefault);
-    if (d3_event_dragSelect) {
-      var style = d3_documentElement.style, select = style[d3_event_dragSelect];
-      style[d3_event_dragSelect] = "none";
-    }
-    return function(suppressClick) {
-      w.on(name, null);
-      if (d3_event_dragSelect) style[d3_event_dragSelect] = select;
-      if (suppressClick) {
-        function off() {
-          w.on(click, null);
-        }
-        w.on(click, function() {
-          d3_eventPreventDefault();
-          off();
-        }, true);
-        setTimeout(off, 0);
-      }
-    };
-  }
-  d3.mouse = function(container) {
-    return d3_mousePoint(container, d3_eventSource());
-  };
-  function d3_mousePoint(container, e) {
-    if (e.changedTouches) e = e.changedTouches[0];
-    var svg = container.ownerSVGElement || container;
-    if (svg.createSVGPoint) {
-      var point = svg.createSVGPoint();
-      point.x = e.clientX, point.y = e.clientY;
-      point = point.matrixTransform(container.getScreenCTM().inverse());
-      return [ point.x, point.y ];
-    }
-    var rect = container.getBoundingClientRect();
-    return [ e.clientX - rect.left - container.clientLeft, e.clientY - rect.top - container.clientTop ];
-  }
-  d3.touches = function(container, touches) {
-    if (arguments.length < 2) touches = d3_eventSource().touches;
-    return touches ? d3_array(touches).map(function(touch) {
-      var point = d3_mousePoint(container, touch);
-      point.identifier = touch.identifier;
-      return point;
-    }) : [];
-  };
-  d3.behavior.drag = function() {
-    var event = d3_eventDispatch(drag, "drag", "dragstart", "dragend"), origin = null, mousedown = dragstart(d3_noop, d3.mouse, d3_behavior_dragMouseSubject, "mousemove", "mouseup"), touchstart = dragstart(d3_behavior_dragTouchId, d3.touch, d3_behavior_dragTouchSubject, "touchmove", "touchend");
-    function drag() {
-      this.on("mousedown.drag", mousedown).on("touchstart.drag", touchstart);
-    }
-    function dragstart(id, position, subject, move, end) {
-      return function() {
-        var that = this, target = d3.event.target, parent = that.parentNode, dispatch = event.of(that, arguments), dragged = 0, dragId = id(), dragName = ".drag" + (dragId == null ? "" : "-" + dragId), dragOffset, dragSubject = d3.select(subject()).on(move + dragName, moved).on(end + dragName, ended), dragRestore = d3_event_dragSuppress(), position0 = position(parent, dragId);
-        if (origin) {
-          dragOffset = origin.apply(that, arguments);
-          dragOffset = [ dragOffset.x - position0[0], dragOffset.y - position0[1] ];
-        } else {
-          dragOffset = [ 0, 0 ];
-        }
-        dispatch({
-          type: "dragstart"
-        });
-        function moved() {
-          var position1 = position(parent, dragId), dx, dy;
-          if (!position1) return;
-          dx = position1[0] - position0[0];
-          dy = position1[1] - position0[1];
-          dragged |= dx | dy;
-          position0 = position1;
-          dispatch({
-            type: "drag",
-            x: position1[0] + dragOffset[0],
-            y: position1[1] + dragOffset[1],
-            dx: dx,
-            dy: dy
-          });
-        }
-        function ended() {
-          if (!position(parent, dragId)) return;
-          dragSubject.on(move + dragName, null).on(end + dragName, null);
-          dragRestore(dragged && d3.event.target === target);
-          dispatch({
-            type: "dragend"
-          });
-        }
-      };
-    }
-    drag.origin = function(x) {
-      if (!arguments.length) return origin;
-      origin = x;
-      return drag;
-    };
-    return d3.rebind(drag, event, "on");
-  };
-  function d3_behavior_dragTouchId() {
-    return d3.event.changedTouches[0].identifier;
-  }
-  function d3_behavior_dragTouchSubject() {
-    return d3.event.target;
-  }
-  function d3_behavior_dragMouseSubject() {
-    return d3_window;
-  }
-  var π = Math.PI, τ = 2 * π, halfπ = π / 2, ε = 1e-6, ε2 = ε * ε, d3_radians = π / 180, d3_degrees = 180 / π;
-  function d3_sgn(x) {
-    return x > 0 ? 1 : x < 0 ? -1 : 0;
-  }
-  function d3_cross2d(a, b, c) {
-    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
-  }
-  function d3_acos(x) {
-    return x > 1 ? 0 : x < -1 ? π : Math.acos(x);
-  }
-  function d3_asin(x) {
-    return x > 1 ? halfπ : x < -1 ? -halfπ : Math.asin(x);
-  }
-  function d3_sinh(x) {
-    return ((x = Math.exp(x)) - 1 / x) / 2;
-  }
-  function d3_cosh(x) {
-    return ((x = Math.exp(x)) + 1 / x) / 2;
-  }
-  function d3_tanh(x) {
-    return ((x = Math.exp(2 * x)) - 1) / (x + 1);
-  }
-  function d3_haversin(x) {
-    return (x = Math.sin(x / 2)) * x;
-  }
-  var ρ = Math.SQRT2, ρ2 = 2, ρ4 = 4;
-  d3.interpolateZoom = function(p0, p1) {
-    var ux0 = p0[0], uy0 = p0[1], w0 = p0[2], ux1 = p1[0], uy1 = p1[1], w1 = p1[2];
-    var dx = ux1 - ux0, dy = uy1 - uy0, d2 = dx * dx + dy * dy, d1 = Math.sqrt(d2), b0 = (w1 * w1 - w0 * w0 + ρ4 * d2) / (2 * w0 * ρ2 * d1), b1 = (w1 * w1 - w0 * w0 - ρ4 * d2) / (2 * w1 * ρ2 * d1), r0 = Math.log(Math.sqrt(b0 * b0 + 1) - b0), r1 = Math.log(Math.sqrt(b1 * b1 + 1) - b1), dr = r1 - r0, S = (dr || Math.log(w1 / w0)) / ρ;
-    function interpolate(t) {
-      var s = t * S;
-      if (dr) {
-        var coshr0 = d3_cosh(r0), u = w0 / (ρ2 * d1) * (coshr0 * d3_tanh(ρ * s + r0) - d3_sinh(r0));
-        return [ ux0 + u * dx, uy0 + u * dy, w0 * coshr0 / d3_cosh(ρ * s + r0) ];
-      }
-      return [ ux0 + t * dx, uy0 + t * dy, w0 * Math.exp(ρ * s) ];
-    }
-    interpolate.duration = S * 1e3;
-    return interpolate;
-  };
-  d3.behavior.zoom = function() {
-    var view = {
-      x: 0,
-      y: 0,
-      k: 1
-    }, translate0, center, size = [ 960, 500 ], scaleExtent = d3_behavior_zoomInfinity, mousedown = "mousedown.zoom", mousemove = "mousemove.zoom", mouseup = "mouseup.zoom", mousewheelTimer, touchstart = "touchstart.zoom", touchtime, event = d3_eventDispatch(zoom, "zoomstart", "zoom", "zoomend"), x0, x1, y0, y1;
-    function zoom(g) {
-      g.on(mousedown, mousedowned).on(d3_behavior_zoomWheel + ".zoom", mousewheeled).on(mousemove, mousewheelreset).on("dblclick.zoom", dblclicked).on(touchstart, touchstarted);
-    }
-    zoom.event = function(g) {
-      g.each(function() {
-        var dispatch = event.of(this, arguments), view1 = view;
-        if (d3_transitionInheritId) {
-          d3.select(this).transition().each("start.zoom", function() {
-            view = this.__chart__ || {
-              x: 0,
-              y: 0,
-              k: 1
-            };
-            zoomstarted(dispatch);
-          }).tween("zoom:zoom", function() {
-            var dx = size[0], dy = size[1], cx = dx / 2, cy = dy / 2, i = d3.interpolateZoom([ (cx - view.x) / view.k, (cy - view.y) / view.k, dx / view.k ], [ (cx - view1.x) / view1.k, (cy - view1.y) / view1.k, dx / view1.k ]);
-            return function(t) {
-              var l = i(t), k = dx / l[2];
-              this.__chart__ = view = {
-                x: cx - l[0] * k,
-                y: cy - l[1] * k,
-                k: k
-              };
-              zoomed(dispatch);
-            };
-          }).each("end.zoom", function() {
-            zoomended(dispatch);
-          });
-        } else {
-          this.__chart__ = view;
-          zoomstarted(dispatch);
-          zoomed(dispatch);
-          zoomended(dispatch);
-        }
-      });
-    };
-    zoom.translate = function(_) {
-      if (!arguments.length) return [ view.x, view.y ];
-      view = {
-        x: +_[0],
-        y: +_[1],
-        k: view.k
-      };
-      rescale();
-      return zoom;
-    };
-    zoom.scale = function(_) {
-      if (!arguments.length) return view.k;
-      view = {
-        x: view.x,
-        y: view.y,
-        k: +_
-      };
-      rescale();
-      return zoom;
-    };
-    zoom.scaleExtent = function(_) {
-      if (!arguments.length) return scaleExtent;
-      scaleExtent = _ == null ? d3_behavior_zoomInfinity : [ +_[0], +_[1] ];
-      return zoom;
-    };
-    zoom.center = function(_) {
-      if (!arguments.length) return center;
-      center = _ && [ +_[0], +_[1] ];
-      return zoom;
-    };
-    zoom.size = function(_) {
-      if (!arguments.length) return size;
-      size = _ && [ +_[0], +_[1] ];
-      return zoom;
-    };
-    zoom.x = function(z) {
-      if (!arguments.length) return x1;
-      x1 = z;
-      x0 = z.copy();
-      view = {
-        x: 0,
-        y: 0,
-        k: 1
-      };
-      return zoom;
-    };
-    zoom.y = function(z) {
-      if (!arguments.length) return y1;
-      y1 = z;
-      y0 = z.copy();
-      view = {
-        x: 0,
-        y: 0,
-        k: 1
-      };
-      return zoom;
-    };
-    function location(p) {
-      return [ (p[0] - view.x) / view.k, (p[1] - view.y) / view.k ];
-    }
-    function point(l) {
-      return [ l[0] * view.k + view.x, l[1] * view.k + view.y ];
-    }
-    function scaleTo(s) {
-      view.k = Math.max(scaleExtent[0], Math.min(scaleExtent[1], s));
-    }
-    function translateTo(p, l) {
-      l = point(l);
-      view.x += p[0] - l[0];
-      view.y += p[1] - l[1];
-    }
-    function rescale() {
-      if (x1) x1.domain(x0.range().map(function(x) {
-        return (x - view.x) / view.k;
-      }).map(x0.invert));
-      if (y1) y1.domain(y0.range().map(function(y) {
-        return (y - view.y) / view.k;
-      }).map(y0.invert));
-    }
-    function zoomstarted(dispatch) {
-      dispatch({
-        type: "zoomstart"
-      });
-    }
-    function zoomed(dispatch) {
-      rescale();
-      dispatch({
-        type: "zoom",
-        scale: view.k,
-        translate: [ view.x, view.y ]
-      });
-    }
-    function zoomended(dispatch) {
-      dispatch({
-        type: "zoomend"
-      });
-    }
-    function mousedowned() {
-      var that = this, target = d3.event.target, dispatch = event.of(that, arguments), dragged = 0, subject = d3.select(d3_window).on(mousemove, moved).on(mouseup, ended), location0 = location(d3.mouse(that)), dragRestore = d3_event_dragSuppress();
-      d3_selection_interrupt.call(that);
-      zoomstarted(dispatch);
-      function moved() {
-        dragged = 1;
-        translateTo(d3.mouse(that), location0);
-        zoomed(dispatch);
-      }
-      function ended() {
-        subject.on(mousemove, d3_window === that ? mousewheelreset : null).on(mouseup, null);
-        dragRestore(dragged && d3.event.target === target);
-        zoomended(dispatch);
-      }
-    }
-    function touchstarted() {
-      var that = this, dispatch = event.of(that, arguments), locations0 = {}, distance0 = 0, scale0, zoomName = ".zoom-" + d3.event.changedTouches[0].identifier, touchmove = "touchmove" + zoomName, touchend = "touchend" + zoomName, target = d3.select(d3.event.target).on(touchmove, moved).on(touchend, ended), subject = d3.select(that).on(mousedown, null).on(touchstart, started), dragRestore = d3_event_dragSuppress();
-      d3_selection_interrupt.call(that);
-      started();
-      zoomstarted(dispatch);
-      function relocate() {
-        var touches = d3.touches(that);
-        scale0 = view.k;
-        touches.forEach(function(t) {
-          if (t.identifier in locations0) locations0[t.identifier] = location(t);
-        });
-        return touches;
-      }
-      function started() {
-        var changed = d3.event.changedTouches;
-        for (var i = 0, n = changed.length; i < n; ++i) {
-          locations0[changed[i].identifier] = null;
-        }
-        var touches = relocate(), now = Date.now();
-        if (touches.length === 1) {
-          if (now - touchtime < 500) {
-            var p = touches[0], l = locations0[p.identifier];
-            scaleTo(view.k * 2);
-            translateTo(p, l);
-            d3_eventPreventDefault();
-            zoomed(dispatch);
-          }
-          touchtime = now;
-        } else if (touches.length > 1) {
-          var p = touches[0], q = touches[1], dx = p[0] - q[0], dy = p[1] - q[1];
-          distance0 = dx * dx + dy * dy;
-        }
-      }
-      function moved() {
-        var touches = d3.touches(that), p0, l0, p1, l1;
-        for (var i = 0, n = touches.length; i < n; ++i, l1 = null) {
-          p1 = touches[i];
-          if (l1 = locations0[p1.identifier]) {
-            if (l0) break;
-            p0 = p1, l0 = l1;
-          }
-        }
-        if (l1) {
-          var distance1 = (distance1 = p1[0] - p0[0]) * distance1 + (distance1 = p1[1] - p0[1]) * distance1, scale1 = distance0 && Math.sqrt(distance1 / distance0);
-          p0 = [ (p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2 ];
-          l0 = [ (l0[0] + l1[0]) / 2, (l0[1] + l1[1]) / 2 ];
-          scaleTo(scale1 * scale0);
-        }
-        touchtime = null;
-        translateTo(p0, l0);
-        zoomed(dispatch);
-      }
-      function ended() {
-        if (d3.event.touches.length) {
-          var changed = d3.event.changedTouches;
-          for (var i = 0, n = changed.length; i < n; ++i) {
-            delete locations0[changed[i].identifier];
-          }
-          for (var identifier in locations0) {
-            return void relocate();
-          }
-        }
-        target.on(zoomName, null);
-        subject.on(mousedown, mousedowned).on(touchstart, touchstarted);
-        dragRestore();
-        zoomended(dispatch);
-      }
-    }
-    function mousewheeled() {
-      var dispatch = event.of(this, arguments);
-      if (mousewheelTimer) clearTimeout(mousewheelTimer); else d3_selection_interrupt.call(this), 
-      zoomstarted(dispatch);
-      mousewheelTimer = setTimeout(function() {
-        mousewheelTimer = null;
-        zoomended(dispatch);
-      }, 50);
-      d3_eventPreventDefault();
-      var point = center || d3.mouse(this);
-      if (!translate0) translate0 = location(point);
-      scaleTo(Math.pow(2, d3_behavior_zoomDelta() * .002) * view.k);
-      translateTo(point, translate0);
-      zoomed(dispatch);
-    }
-    function mousewheelreset() {
-      translate0 = null;
-    }
-    function dblclicked() {
-      var dispatch = event.of(this, arguments), p = d3.mouse(this), l = location(p), k = Math.log(view.k) / Math.LN2;
-      zoomstarted(dispatch);
-      scaleTo(Math.pow(2, d3.event.shiftKey ? Math.ceil(k) - 1 : Math.floor(k) + 1));
-      translateTo(p, l);
-      zoomed(dispatch);
-      zoomended(dispatch);
-    }
-    return d3.rebind(zoom, event, "on");
-  };
-  var d3_behavior_zoomInfinity = [ 0, Infinity ];
-  var d3_behavior_zoomDelta, d3_behavior_zoomWheel = "onwheel" in d3_document ? (d3_behavior_zoomDelta = function() {
-    return -d3.event.deltaY * (d3.event.deltaMode ? 120 : 1);
-  }, "wheel") : "onmousewheel" in d3_document ? (d3_behavior_zoomDelta = function() {
-    return d3.event.wheelDelta;
-  }, "mousewheel") : (d3_behavior_zoomDelta = function() {
-    return -d3.event.detail;
-  }, "MozMousePixelScroll");
-  function d3_Color() {}
-  d3_Color.prototype.toString = function() {
-    return this.rgb() + "";
-  };
-  d3.hsl = function(h, s, l) {
-    return arguments.length === 1 ? h instanceof d3_Hsl ? d3_hsl(h.h, h.s, h.l) : d3_rgb_parse("" + h, d3_rgb_hsl, d3_hsl) : d3_hsl(+h, +s, +l);
-  };
-  function d3_hsl(h, s, l) {
-    return new d3_Hsl(h, s, l);
-  }
-  function d3_Hsl(h, s, l) {
-    this.h = h;
-    this.s = s;
-    this.l = l;
-  }
-  var d3_hslPrototype = d3_Hsl.prototype = new d3_Color();
-  d3_hslPrototype.brighter = function(k) {
-    k = Math.pow(.7, arguments.length ? k : 1);
-    return d3_hsl(this.h, this.s, this.l / k);
-  };
-  d3_hslPrototype.darker = function(k) {
-    k = Math.pow(.7, arguments.length ? k : 1);
-    return d3_hsl(this.h, this.s, k * this.l);
-  };
-  d3_hslPrototype.rgb = function() {
-    return d3_hsl_rgb(this.h, this.s, this.l);
-  };
-  function d3_hsl_rgb(h, s, l) {
-    var m1, m2;
-    h = isNaN(h) ? 0 : (h %= 360) < 0 ? h + 360 : h;
-    s = isNaN(s) ? 0 : s < 0 ? 0 : s > 1 ? 1 : s;
-    l = l < 0 ? 0 : l > 1 ? 1 : l;
-    m2 = l <= .5 ? l * (1 + s) : l + s - l * s;
-    m1 = 2 * l - m2;
-    function v(h) {
-      if (h > 360) h -= 360; else if (h < 0) h += 360;
-      if (h < 60) return m1 + (m2 - m1) * h / 60;
-      if (h < 180) return m2;
-      if (h < 240) return m1 + (m2 - m1) * (240 - h) / 60;
-      return m1;
-    }
-    function vv(h) {
-      return Math.round(v(h) * 255);
-    }
-    return d3_rgb(vv(h + 120), vv(h), vv(h - 120));
-  }
-  d3.hcl = function(h, c, l) {
-    return arguments.length === 1 ? h instanceof d3_Hcl ? d3_hcl(h.h, h.c, h.l) : h instanceof d3_Lab ? d3_lab_hcl(h.l, h.a, h.b) : d3_lab_hcl((h = d3_rgb_lab((h = d3.rgb(h)).r, h.g, h.b)).l, h.a, h.b) : d3_hcl(+h, +c, +l);
-  };
-  function d3_hcl(h, c, l) {
-    return new d3_Hcl(h, c, l);
-  }
-  function d3_Hcl(h, c, l) {
-    this.h = h;
-    this.c = c;
-    this.l = l;
-  }
-  var d3_hclPrototype = d3_Hcl.prototype = new d3_Color();
-  d3_hclPrototype.brighter = function(k) {
-    return d3_hcl(this.h, this.c, Math.min(100, this.l + d3_lab_K * (arguments.length ? k : 1)));
-  };
-  d3_hclPrototype.darker = function(k) {
-    return d3_hcl(this.h, this.c, Math.max(0, this.l - d3_lab_K * (arguments.length ? k : 1)));
-  };
-  d3_hclPrototype.rgb = function() {
-    return d3_hcl_lab(this.h, this.c, this.l).rgb();
-  };
-  function d3_hcl_lab(h, c, l) {
-    if (isNaN(h)) h = 0;
-    if (isNaN(c)) c = 0;
-    return d3_lab(l, Math.cos(h *= d3_radians) * c, Math.sin(h) * c);
-  }
-  d3.lab = function(l, a, b) {
-    return arguments.length === 1 ? l instanceof d3_Lab ? d3_lab(l.l, l.a, l.b) : l instanceof d3_Hcl ? d3_hcl_lab(l.l, l.c, l.h) : d3_rgb_lab((l = d3.rgb(l)).r, l.g, l.b) : d3_lab(+l, +a, +b);
-  };
-  function d3_lab(l, a, b) {
-    return new d3_Lab(l, a, b);
-  }
-  function d3_Lab(l, a, b) {
-    this.l = l;
-    this.a = a;
-    this.b = b;
-  }
-  var d3_lab_K = 18;
-  var d3_lab_X = .95047, d3_lab_Y = 1, d3_lab_Z = 1.08883;
-  var d3_labPrototype = d3_Lab.prototype = new d3_Color();
-  d3_labPrototype.brighter = function(k) {
-    return d3_lab(Math.min(100, this.l + d3_lab_K * (arguments.length ? k : 1)), this.a, this.b);
-  };
-  d3_labPrototype.darker = function(k) {
-    return d3_lab(Math.max(0, this.l - d3_lab_K * (arguments.length ? k : 1)), this.a, this.b);
-  };
-  d3_labPrototype.rgb = function() {
-    return d3_lab_rgb(this.l, this.a, this.b);
-  };
-  function d3_lab_rgb(l, a, b) {
-    var y = (l + 16) / 116, x = y + a / 500, z = y - b / 200;
-    x = d3_lab_xyz(x) * d3_lab_X;
-    y = d3_lab_xyz(y) * d3_lab_Y;
-    z = d3_lab_xyz(z) * d3_lab_Z;
-    return d3_rgb(d3_xyz_rgb(3.2404542 * x - 1.5371385 * y - .4985314 * z), d3_xyz_rgb(-.969266 * x + 1.8760108 * y + .041556 * z), d3_xyz_rgb(.0556434 * x - .2040259 * y + 1.0572252 * z));
-  }
-  function d3_lab_hcl(l, a, b) {
-    return l > 0 ? d3_hcl(Math.atan2(b, a) * d3_degrees, Math.sqrt(a * a + b * b), l) : d3_hcl(NaN, NaN, l);
-  }
-  function d3_lab_xyz(x) {
-    return x > .206893034 ? x * x * x : (x - 4 / 29) / 7.787037;
-  }
-  function d3_xyz_lab(x) {
-    return x > .008856 ? Math.pow(x, 1 / 3) : 7.787037 * x + 4 / 29;
-  }
-  function d3_xyz_rgb(r) {
-    return Math.round(255 * (r <= .00304 ? 12.92 * r : 1.055 * Math.pow(r, 1 / 2.4) - .055));
-  }
-  d3.rgb = function(r, g, b) {
-    return arguments.length === 1 ? r instanceof d3_Rgb ? d3_rgb(r.r, r.g, r.b) : d3_rgb_parse("" + r, d3_rgb, d3_hsl_rgb) : d3_rgb(~~r, ~~g, ~~b);
-  };
-  function d3_rgbNumber(value) {
-    return d3_rgb(value >> 16, value >> 8 & 255, value & 255);
-  }
-  function d3_rgbString(value) {
-    return d3_rgbNumber(value) + "";
-  }
-  function d3_rgb(r, g, b) {
-    return new d3_Rgb(r, g, b);
-  }
-  function d3_Rgb(r, g, b) {
-    this.r = r;
-    this.g = g;
-    this.b = b;
-  }
-  var d3_rgbPrototype = d3_Rgb.prototype = new d3_Color();
-  d3_rgbPrototype.brighter = function(k) {
-    k = Math.pow(.7, arguments.length ? k : 1);
-    var r = this.r, g = this.g, b = this.b, i = 30;
-    if (!r && !g && !b) return d3_rgb(i, i, i);
-    if (r && r < i) r = i;
-    if (g && g < i) g = i;
-    if (b && b < i) b = i;
-    return d3_rgb(Math.min(255, ~~(r / k)), Math.min(255, ~~(g / k)), Math.min(255, ~~(b / k)));
-  };
-  d3_rgbPrototype.darker = function(k) {
-    k = Math.pow(.7, arguments.length ? k : 1);
-    return d3_rgb(~~(k * this.r), ~~(k * this.g), ~~(k * this.b));
-  };
-  d3_rgbPrototype.hsl = function() {
-    return d3_rgb_hsl(this.r, this.g, this.b);
-  };
-  d3_rgbPrototype.toString = function() {
-    return "#" + d3_rgb_hex(this.r) + d3_rgb_hex(this.g) + d3_rgb_hex(this.b);
-  };
-  function d3_rgb_hex(v) {
-    return v < 16 ? "0" + Math.max(0, v).toString(16) : Math.min(255, v).toString(16);
-  }
-  function d3_rgb_parse(format, rgb, hsl) {
-    var r = 0, g = 0, b = 0, m1, m2, color;
-    m1 = /([a-z]+)\((.*)\)/i.exec(format);
-    if (m1) {
-      m2 = m1[2].split(",");
-      switch (m1[1]) {
-       case "hsl":
-        {
-          return hsl(parseFloat(m2[0]), parseFloat(m2[1]) / 100, parseFloat(m2[2]) / 100);
-        }
-
-       case "rgb":
-        {
-          return rgb(d3_rgb_parseNumber(m2[0]), d3_rgb_parseNumber(m2[1]), d3_rgb_parseNumber(m2[2]));
-        }
-      }
-    }
-    if (color = d3_rgb_names.get(format)) return rgb(color.r, color.g, color.b);
-    if (format != null && format.charAt(0) === "#" && !isNaN(color = parseInt(format.substring(1), 16))) {
-      if (format.length === 4) {
-        r = (color & 3840) >> 4;
-        r = r >> 4 | r;
-        g = color & 240;
-        g = g >> 4 | g;
-        b = color & 15;
-        b = b << 4 | b;
-      } else if (format.length === 7) {
-        r = (color & 16711680) >> 16;
-        g = (color & 65280) >> 8;
-        b = color & 255;
-      }
-    }
-    return rgb(r, g, b);
-  }
-  function d3_rgb_hsl(r, g, b) {
-    var min = Math.min(r /= 255, g /= 255, b /= 255), max = Math.max(r, g, b), d = max - min, h, s, l = (max + min) / 2;
-    if (d) {
-      s = l < .5 ? d / (max + min) : d / (2 - max - min);
-      if (r == max) h = (g - b) / d + (g < b ? 6 : 0); else if (g == max) h = (b - r) / d + 2; else h = (r - g) / d + 4;
-      h *= 60;
-    } else {
-      h = NaN;
-      s = l > 0 && l < 1 ? 0 : h;
-    }
-    return d3_hsl(h, s, l);
-  }
-  function d3_rgb_lab(r, g, b) {
-    r = d3_rgb_xyz(r);
-    g = d3_rgb_xyz(g);
-    b = d3_rgb_xyz(b);
-    var x = d3_xyz_lab((.4124564 * r + .3575761 * g + .1804375 * b) / d3_lab_X), y = d3_xyz_lab((.2126729 * r + .7151522 * g + .072175 * b) / d3_lab_Y), z = d3_xyz_lab((.0193339 * r + .119192 * g + .9503041 * b) / d3_lab_Z);
-    return d3_lab(116 * y - 16, 500 * (x - y), 200 * (y - z));
-  }
-  function d3_rgb_xyz(r) {
-    return (r /= 255) <= .04045 ? r / 12.92 : Math.pow((r + .055) / 1.055, 2.4);
-  }
-  function d3_rgb_parseNumber(c) {
-    var f = parseFloat(c);
-    return c.charAt(c.length - 1) === "%" ? Math.round(f * 2.55) : f;
-  }
-  var d3_rgb_names = d3.map({
-    aliceblue: 15792383,
-    antiquewhite: 16444375,
-    aqua: 65535,
-    aquamarine: 8388564,
-    azure: 15794175,
-    beige: 16119260,
-    bisque: 16770244,
-    black: 0,
-    blanchedalmond: 16772045,
-    blue: 255,
-    blueviolet: 9055202,
-    brown: 10824234,
-    burlywood: 14596231,
-    cadetblue: 6266528,
-    chartreuse: 8388352,
-    chocolate: 13789470,
-    coral: 16744272,
-    cornflowerblue: 6591981,
-    cornsilk: 16775388,
-    crimson: 14423100,
-    cyan: 65535,
-    darkblue: 139,
-    darkcyan: 35723,
-    darkgoldenrod: 12092939,
-    darkgray: 11119017,
-    darkgreen: 25600,
-    darkgrey: 11119017,
-    darkkhaki: 12433259,
-    darkmagenta: 9109643,
-    darkolivegreen: 5597999,
-    darkorange: 16747520,
-    darkorchid: 10040012,
-    darkred: 9109504,
-    darksalmon: 15308410,
-    darkseagreen: 9419919,
-    darkslateblue: 4734347,
-    darkslategray: 3100495,
-    darkslategrey: 3100495,
-    darkviolet: 9699539,
-    deeppink: 16716947,
-    deepskyblue: 49151,
-    dimgray: 6908265,
-    dimgrey: 6908265,
-    dodgerblue: 2003199,
-    firebrick: 11674146,
-    floralwhite: 16775920,
-    forestgreen: 2263842,
-    fuchsia: 16711935,
-    gainsboro: 14474460,
-    ghostwhite: 16316671,
-    gold: 16766720,
-    goldenrod: 14329120,
-    gray: 8421504,
-    green: 32768,
-    greenyellow: 11403055,
-    grey: 8421504,
-    honeydew: 15794160,
-    hotpink: 16738740,
-    indianred: 13458524,
-    indigo: 4915330,
-    ivory: 16777200,
-    khaki: 15787660,
-    lavender: 15132410,
-    lavenderblush: 16773365,
-    lawngreen: 8190976,
-    lemonchiffon: 16775885,
-    lightblue: 11393254,
-    lightcoral: 15761536,
-    lightcyan: 14745599,
-    lightgoldenrodyellow: 16448210,
-    lightgray: 13882323,
-    lightgreen: 9498256,
-    lightgrey: 13882323,
-    lightpink: 16758465,
-    lightsalmon: 16752762,
-    lightseagreen: 2142890,
-    lightskyblue: 8900346,
-    lightslategray: 7833753,
-    lightslategrey: 7833753,
-    lightsteelblue: 11584734,
-    lightyellow: 16777184,
-    lime: 65280,
-    limegreen: 3329330,
-    linen: 16445670,
-    magenta: 16711935,
-    maroon: 8388608,
-    mediumaquamarine: 6737322,
-    mediumblue: 205,
-    mediumorchid: 12211667,
-    mediumpurple: 9662683,
-    mediumseagreen: 3978097,
-    mediumslateblue: 8087790,
-    mediumspringgreen: 64154,
-    mediumvioletred: 13047173,
-    midnightblue: 1644912,
-    mintcream: 16121850,
-    mistyrose: 16770273,
-    moccasin: 16770229,
-    navajowhite: 16768685,
-    navy: 128,
-    oldlace: 16643558,
-    olive: 8421376,
-    olivedrab: 7048739,
-    orange: 16753920,
-    orangered: 16729344,
-    orchid: 14315734,
-    palegoldenrod: 15657130,
-    palegreen: 10025880,
-    palevioletred: 14381203,
-    papayawhip: 16773077,
-    peachpuff: 16767673,
-    peru: 13468991,
-    pink: 16761035,
-    plum: 14524637,
-    powderblue: 11591910,
-    purple: 8388736,
-    red: 16711680,
-    rosybrown: 12357519,
-    royalblue: 4286945,
-    saddlebrown: 9127187,
-    salmon: 16416882,
-    sandybrown: 16032864,
-    seagreen: 3050327,
-    seashell: 16774638,
-    sienna: 10506797,
-    silver: 12632256,
-    skyblue: 8900331,
-    slateblue: 6970061,
-    slategray: 7372944,
-    slategrey: 7372944,
-    snow: 16775930,
-    springgreen: 65407,
-    steelblue: 4620980,
-    tan: 13808780,
-    teal: 32896,
-    thistle: 14204888,
-    tomato: 16737095,
-    violet: 15631086,
-    wheat: 16113331,
-    white: 16777215,
-    whitesmoke: 16119285,
-    yellow: 16776960,
-    yellowgreen: 10145074
-  });
-  d3_rgb_names.forEach(function(key, value) {
-    d3_rgb_names.set(key, d3_rgbNumber(value));
-  });
-  function d3_functor(v) {
-    return typeof v === "function" ? v : function() {
-      return v;
-    };
-  }
-  d3.functor = d3_functor;
-  function d3_identity(d) {
-    return d;
-  }
-  d3.xhr = d3_xhrType(d3_identity);
-  function d3_xhrType(response) {
-    return function(url, mimeType, callback) {
-      if (arguments.length === 2 && typeof mimeType === "function") callback = mimeType, 
-      mimeType = null;
-      return d3_xhr(url, mimeType, response, callback);
-    };
-  }
-  function d3_xhr(url, mimeType, response, callback) {
-    var xhr = {}, dispatch = d3.dispatch("beforesend", "progress", "load", "error"), headers = {}, request = new XMLHttpRequest(), responseType = null;
-    if (d3_window.XDomainRequest && !("withCredentials" in request) && /^(http(s)?:)?\/\//.test(url)) request = new XDomainRequest();
-    "onload" in request ? request.onload = request.onerror = respond : request.onreadystatechange = function() {
-      request.readyState > 3 && respond();
-    };
-    function respond() {
-      var status = request.status, result;
-      if (!status && request.responseText || status >= 200 && status < 300 || status === 304) {
-        try {
-          result = response.call(xhr, request);
-        } catch (e) {
-          dispatch.error.call(xhr, e);
-          return;
-        }
-        dispatch.load.call(xhr, result);
-      } else {
-        dispatch.error.call(xhr, request);
-      }
-    }
-    request.onprogress = function(event) {
-      var o = d3.event;
-      d3.event = event;
-      try {
-        dispatch.progress.call(xhr, request);
-      } finally {
-        d3.event = o;
-      }
-    };
-    xhr.header = function(name, value) {
-      name = (name + "").toLowerCase();
-      if (arguments.length < 2) return headers[name];
-      if (value == null) delete headers[name]; else headers[name] = value + "";
-      return xhr;
-    };
-    xhr.mimeType = function(value) {
-      if (!arguments.length) return mimeType;
-      mimeType = value == null ? null : value + "";
-      return xhr;
-    };
-    xhr.responseType = function(value) {
-      if (!arguments.length) return responseType;
-      responseType = value;
-      return xhr;
-    };
-    xhr.response = function(value) {
-      response = value;
-      return xhr;
-    };
-    [ "get", "post" ].forEach(function(method) {
-      xhr[method] = function() {
-        return xhr.send.apply(xhr, [ method ].concat(d3_array(arguments)));
-      };
-    });
-    xhr.send = function(method, data, callback) {
-      if (arguments.length === 2 && typeof data === "function") callback = data, data = null;
-      request.open(method, url, true);
-      if (mimeType != null && !("accept" in headers)) headers["accept"] = mimeType + ",*/*";
-      if (request.setRequestHeader) for (var name in headers) request.setRequestHeader(name, headers[name]);
-      if (mimeType != null && request.overrideMimeType) request.overrideMimeType(mimeType);
-      if (responseType != null) request.responseType = responseType;
-      if (callback != null) xhr.on("error", callback).on("load", function(request) {
-        callback(null, request);
-      });
-      dispatch.beforesend.call(xhr, request);
-      request.send(data == null ? null : data);
-      return xhr;
-    };
-    xhr.abort = function() {
-      request.abort();
-      return xhr;
-    };
-    d3.rebind(xhr, dispatch, "on");
-    return callback == null ? xhr : xhr.get(d3_xhr_fixCallback(callback));
-  }
-  function d3_xhr_fixCallback(callback) {
-    return callback.length === 1 ? function(error, request) {
-      callback(error == null ? request : null);
-    } : callback;
-  }
-  d3.dsv = function(delimiter, mimeType) {
-    var reFormat = new RegExp('["' + delimiter + "\n]"), delimiterCode = delimiter.charCodeAt(0);
-    function dsv(url, row, callback) {
-      if (arguments.length < 3) callback = row, row = null;
-      var xhr = d3_xhr(url, mimeType, row == null ? response : typedResponse(row), callback);
-      xhr.row = function(_) {
-        return arguments.length ? xhr.response((row = _) == null ? response : typedResponse(_)) : row;
-      };
-      return xhr;
-    }
-    function response(request) {
-      return dsv.parse(request.responseText);
-    }
-    function typedResponse(f) {
-      return function(request) {
-        return dsv.parse(request.responseText, f);
-      };
-    }
-    dsv.parse = function(text, f) {
-      var o;
-      return dsv.parseRows(text, function(row, i) {
-        if (o) return o(row, i - 1);
-        var a = new Function("d", "return {" + row.map(function(name, i) {
-          return JSON.stringify(name) + ": d[" + i + "]";
-        }).join(",") + "}");
-        o = f ? function(row, i) {
-          return f(a(row), i);
-        } : a;
-      });
-    };
-    dsv.parseRows = function(text, f) {
-      var EOL = {}, EOF = {}, rows = [], N = text.length, I = 0, n = 0, t, eol;
-      function token() {
-        if (I >= N) return EOF;
-        if (eol) return eol = false, EOL;
-        var j = I;
-        if (text.charCodeAt(j) === 34) {
-          var i = j;
-          while (i++ < N) {
-            if (text.charCodeAt(i) === 34) {
-              if (text.charCodeAt(i + 1) !== 34) break;
-              ++i;
-            }
-          }
-          I = i + 2;
-          var c = text.charCodeAt(i + 1);
-          if (c === 13) {
-            eol = true;
-            if (text.charCodeAt(i + 2) === 10) ++I;
-          } else if (c === 10) {
-            eol = true;
-          }
-          return text.substring(j + 1, i).replace(/""/g, '"');
-        }
-        while (I < N) {
-          var c = text.charCodeAt(I++), k = 1;
-          if (c === 10) eol = true; else if (c === 13) {
-            eol = true;
-            if (text.charCodeAt(I) === 10) ++I, ++k;
-          } else if (c !== delimiterCode) continue;
-          return text.substring(j, I - k);
-        }
-        return text.substring(j);
-      }
-      while ((t = token()) !== EOF) {
-        var a = [];
-        while (t !== EOL && t !== EOF) {
-          a.push(t);
-          t = token();
-        }
-        if (f && !(a = f(a, n++))) continue;
-        rows.push(a);
-      }
-      return rows;
-    };
-    dsv.format = function(rows) {
-      if (Array.isArray(rows[0])) return dsv.formatRows(rows);
-      var fieldSet = new d3_Set(), fields = [];
-      rows.forEach(function(row) {
-        for (var field in row) {
-          if (!fieldSet.has(field)) {
-            fields.push(fieldSet.add(field));
-          }
-        }
-      });
-      return [ fields.map(formatValue).join(delimiter) ].concat(rows.map(function(row) {
-        return fields.map(function(field) {
-          return formatValue(row[field]);
-        }).join(delimiter);
-      })).join("\n");
-    };
-    dsv.formatRows = function(rows) {
-      return rows.map(formatRow).join("\n");
-    };
-    function formatRow(row) {
-      return row.map(formatValue).join(delimiter);
-    }
-    function formatValue(text) {
-      return reFormat.test(text) ? '"' + text.replace(/\"/g, '""') + '"' : text;
-    }
-    return dsv;
-  };
-  d3.csv = d3.dsv(",", "text/csv");
-  d3.tsv = d3.dsv("	", "text/tab-separated-values");
-  d3.touch = function(container, touches, identifier) {
-    if (arguments.length < 3) identifier = touches, touches = d3_eventSource().changedTouches;
-    if (touches) for (var i = 0, n = touches.length, touch; i < n; ++i) {
-      if ((touch = touches[i]).identifier === identifier) {
-        return d3_mousePoint(container, touch);
-      }
-    }
-  };
-  var d3_timer_queueHead, d3_timer_queueTail, d3_timer_interval, d3_timer_timeout, d3_timer_active, d3_timer_frame = d3_window[d3_vendorSymbol(d3_window, "requestAnimationFrame")] || function(callback) {
-    setTimeout(callback, 17);
-  };
-  d3.timer = function(callback, delay, then) {
-    var n = arguments.length;
-    if (n < 2) delay = 0;
-    if (n < 3) then = Date.now();
-    var time = then + delay, timer = {
-      c: callback,
-      t: time,
-      f: false,
-      n: null
-    };
-    if (d3_timer_queueTail) d3_timer_queueTail.n = timer; else d3_timer_queueHead = timer;
-    d3_timer_queueTail = timer;
-    if (!d3_timer_interval) {
-      d3_timer_timeout = clearTimeout(d3_timer_timeout);
-      d3_timer_interval = 1;
-      d3_timer_frame(d3_timer_step);
-    }
-  };
-  function d3_timer_step() {
-    var now = d3_timer_mark(), delay = d3_timer_sweep() - now;
-    if (delay > 24) {
-      if (isFinite(delay)) {
-        clearTimeout(d3_timer_timeout);
-        d3_timer_timeout = setTimeout(d3_timer_step, delay);
-      }
-      d3_timer_interval = 0;
-    } else {
-      d3_timer_interval = 1;
-      d3_timer_frame(d3_timer_step);
-    }
-  }
-  d3.timer.flush = function() {
-    d3_timer_mark();
-    d3_timer_sweep();
-  };
-  function d3_timer_mark() {
-    var now = Date.now();
-    d3_timer_active = d3_timer_queueHead;
-    while (d3_timer_active) {
-      if (now >= d3_timer_active.t) d3_timer_active.f = d3_timer_active.c(now - d3_timer_active.t);
-      d3_timer_active = d3_timer_active.n;
-    }
-    return now;
-  }
-  function d3_timer_sweep() {
-    var t0, t1 = d3_timer_queueHead, time = Infinity;
-    while (t1) {
-      if (t1.f) {
-        t1 = t0 ? t0.n = t1.n : d3_timer_queueHead = t1.n;
-      } else {
-        if (t1.t < time) time = t1.t;
-        t1 = (t0 = t1).n;
-      }
-    }
-    d3_timer_queueTail = t0;
-    return time;
-  }
-  function d3_format_precision(x, p) {
-    return p - (x ? Math.ceil(Math.log(x) / Math.LN10) : 1);
-  }
-  d3.round = function(x, n) {
-    return n ? Math.round(x * (n = Math.pow(10, n))) / n : Math.round(x);
-  };
-  var d3_formatPrefixes = [ "y", "z", "a", "f", "p", "n", "µ", "m", "", "k", "M", "G", "T", "P", "E", "Z", "Y" ].map(d3_formatPrefix);
-  d3.formatPrefix = function(value, precision) {
-    var i = 0;
-    if (value) {
-      if (value < 0) value *= -1;
-      if (precision) value = d3.round(value, d3_format_precision(value, precision));
-      i = 1 + Math.floor(1e-12 + Math.log(value) / Math.LN10);
-      i = Math.max(-24, Math.min(24, Math.floor((i - 1) / 3) * 3));
-    }
-    return d3_formatPrefixes[8 + i / 3];
-  };
-  function d3_formatPrefix(d, i) {
-    var k = Math.pow(10, abs(8 - i) * 3);
-    return {
-      scale: i > 8 ? function(d) {
-        return d / k;
-      } : function(d) {
-        return d * k;
-      },
-      symbol: d
-    };
-  }
-  function d3_locale_numberFormat(locale) {
-    var locale_decimal = locale.decimal, locale_thousands = locale.thousands, locale_grouping = locale.grouping, locale_currency = locale.currency, formatGroup = locale_grouping ? function(value) {
-      var i = value.length, t = [], j = 0, g = locale_grouping[0];
-      while (i > 0 && g > 0) {
-        t.push(value.substring(i -= g, i + g));
-        g = locale_grouping[j = (j + 1) % locale_grouping.length];
-      }
-      return t.reverse().join(locale_thousands);
-    } : d3_identity;
-    return function(specifier) {
-      var match = d3_format_re.exec(specifier), fill = match[1] || " ", align = match[2] || ">", sign = match[3] || "", symbol = match[4] || "", zfill = match[5], width = +match[6], comma = match[7], precision = match[8], type = match[9], scale = 1, prefix = "", suffix = "", integer = false;
-      if (precision) precision = +precision.substring(1);
-      if (zfill || fill === "0" && align === "=") {
-        zfill = fill = "0";
-        align = "=";
-        if (comma) width -= Math.floor((width - 1) / 4);
-      }
-      switch (type) {
-       case "n":
-        comma = true;
-        type = "g";
-        break;
-
-       case "%":
-        scale = 100;
-        suffix = "%";
-        type = "f";
-        break;
-
-       case "p":
-        scale = 100;
-        suffix = "%";
-        type = "r";
-        break;
-
-       case "b":
-       case "o":
-       case "x":
-       case "X":
-        if (symbol === "#") prefix = "0" + type.toLowerCase();
-
-       case "c":
-       case "d":
-        integer = true;
-        precision = 0;
-        break;
-
-       case "s":
-        scale = -1;
-        type = "r";
-        break;
-      }
-      if (symbol === "$") prefix = locale_currency[0], suffix = locale_currency[1];
-      if (type == "r" && !precision) type = "g";
-      if (precision != null) {
-        if (type == "g") precision = Math.max(1, Math.min(21, precision)); else if (type == "e" || type == "f") precision = Math.max(0, Math.min(20, precision));
-      }
-      type = d3_format_types.get(type) || d3_format_typeDefault;
-      var zcomma = zfill && comma;
-      return function(value) {
-        var fullSuffix = suffix;
-        if (integer && value % 1) return "";
-        var negative = value < 0 || value === 0 && 1 / value < 0 ? (value = -value, "-") : sign;
-        if (scale < 0) {
-          var unit = d3.formatPrefix(value, precision);
-          value = unit.scale(value);
-          fullSuffix = unit.symbol + suffix;
-        } else {
-          value *= scale;
-        }
-        value = type(value, precision);
-        var i = value.lastIndexOf("."), before = i < 0 ? value : value.substring(0, i), after = i < 0 ? "" : locale_decimal + value.substring(i + 1);
-        if (!zfill && comma) before = formatGroup(before);
-        var length = prefix.length + before.length + after.length + (zcomma ? 0 : negative.length), padding = length < width ? new Array(length = width - length + 1).join(fill) : "";
-        if (zcomma) before = formatGroup(padding + before);
-        negative += prefix;
-        value = before + after;
-        return (align === "<" ? negative + value + padding : align === ">" ? padding + negative + value : align === "^" ? padding.substring(0, length >>= 1) + negative + value + padding.substring(length) : negative + (zcomma ? value : padding + value)) + fullSuffix;
-      };
-    };
-  }
-  var d3_format_re = /(?:([^{])?([<>=^]))?([+\- ])?([$#])?(0)?(\d+)?(,)?(\.-?\d+)?([a-z%])?/i;
-  var d3_format_types = d3.map({
-    b: function(x) {
-      return x.toString(2);
-    },
-    c: function(x) {
-      return String.fromCharCode(x);
-    },
-    o: function(x) {
-      return x.toString(8);
-    },
-    x: function(x) {
-      return x.toString(16);
-    },
-    X: function(x) {
-      return x.toString(16).toUpperCase();
-    },
-    g: function(x, p) {
-      return x.toPrecision(p);
-    },
-    e: function(x, p) {
-      return x.toExponential(p);
-    },
-    f: function(x, p) {
-      return x.toFixed(p);
-    },
-    r: function(x, p) {
-      return (x = d3.round(x, d3_format_precision(x, p))).toFixed(Math.max(0, Math.min(20, d3_format_precision(x * (1 + 1e-15), p))));
-    }
-  });
-  function d3_format_typeDefault(x) {
-    return x + "";
-  }
-  var d3_time = d3.time = {}, d3_date = Date;
-  function d3_date_utc() {
-    this._ = new Date(arguments.length > 1 ? Date.UTC.apply(this, arguments) : arguments[0]);
-  }
-  d3_date_utc.prototype = {
-    getDate: function() {
-      return this._.getUTCDate();
-    },
-    getDay: function() {
-      return this._.getUTCDay();
-    },
-    getFullYear: function() {
-      return this._.getUTCFullYear();
-    },
-    getHours: function() {
-      return this._.getUTCHours();
-    },
-    getMilliseconds: function() {
-      return this._.getUTCMilliseconds();
-    },
-    getMinutes: function() {
-      return this._.getUTCMinutes();
-    },
-    getMonth: function() {
-      return this._.getUTCMonth();
-    },
-    getSeconds: function() {
-      return this._.getUTCSeconds();
-    },
-    getTime: function() {
-      return this._.getTime();
-    },
-    getTimezoneOffset: function() {
-      return 0;
-    },
-    valueOf: function() {
-      return this._.valueOf();
-    },
-    setDate: function() {
-      d3_time_prototype.setUTCDate.apply(this._, arguments);
-    },
-    setDay: function() {
-      d3_time_prototype.setUTCDay.apply(this._, arguments);
-    },
-    setFullYear: function() {
-      d3_time_prototype.setUTCFullYear.apply(this._, arguments);
-    },
-    setHours: function() {
-      d3_time_prototype.setUTCHours.apply(this._, arguments);
-    },
-    setMilliseconds: function() {
-      d3_time_prototype.setUTCMilliseconds.apply(this._, arguments);
-    },
-    setMinutes: function() {
-      d3_time_prototype.setUTCMinutes.apply(this._, arguments);
-    },
-    setMonth: function() {
-      d3_time_prototype.setUTCMonth.apply(this._, arguments);
-    },
-    setSeconds: function() {
-      d3_time_prototype.setUTCSeconds.apply(this._, arguments);
-    },
-    setTime: function() {
-      d3_time_prototype.setTime.apply(this._, arguments);
-    }
-  };
-  var d3_time_prototype = Date.prototype;
-  function d3_time_interval(local, step, number) {
-    function round(date) {
-      var d0 = local(date), d1 = offset(d0, 1);
-      return date - d0 < d1 - date ? d0 : d1;
-    }
-    function ceil(date) {
-      step(date = local(new d3_date(date - 1)), 1);
-      return date;
-    }
-    function offset(date, k) {
-      step(date = new d3_date(+date), k);
-      return date;
-    }
-    function range(t0, t1, dt) {
-      var time = ceil(t0), times = [];
-      if (dt > 1) {
-        while (time < t1) {
-          if (!(number(time) % dt)) times.push(new Date(+time));
-          step(time, 1);
-        }
-      } else {
-        while (time < t1) times.push(new Date(+time)), step(time, 1);
-      }
-      return times;
-    }
-    function range_utc(t0, t1, dt) {
-      try {
-        d3_date = d3_date_utc;
-        var utc = new d3_date_utc();
-        utc._ = t0;
-        return range(utc, t1, dt);
-      } finally {
-        d3_date = Date;
-      }
-    }
-    local.floor = local;
-    local.round = round;
-    local.ceil = ceil;
-    local.offset = offset;
-    local.range = range;
-    var utc = local.utc = d3_time_interval_utc(local);
-    utc.floor = utc;
-    utc.round = d3_time_interval_utc(round);
-    utc.ceil = d3_time_interval_utc(ceil);
-    utc.offset = d3_time_interval_utc(offset);
-    utc.range = range_utc;
-    return local;
-  }
-  function d3_time_interval_utc(method) {
-    return function(date, k) {
-      try {
-        d3_date = d3_date_utc;
-        var utc = new d3_date_utc();
-        utc._ = date;
-        return method(utc, k)._;
-      } finally {
-        d3_date = Date;
-      }
-    };
-  }
-  d3_time.year = d3_time_interval(function(date) {
-    date = d3_time.day(date);
-    date.setMonth(0, 1);
-    return date;
-  }, function(date, offset) {
-    date.setFullYear(date.getFullYear() + offset);
-  }, function(date) {
-    return date.getFullYear();
-  });
-  d3_time.years = d3_time.year.range;
-  d3_time.years.utc = d3_time.year.utc.range;
-  d3_time.day = d3_time_interval(function(date) {
-    var day = new d3_date(2e3, 0);
-    day.setFullYear(date.getFullYear(), date.getMonth(), date.getDate());
-    return day;
-  }, function(date, offset) {
-    date.setDate(date.getDate() + offset);
-  }, function(date) {
-    return date.getDate() - 1;
-  });
-  d3_time.days = d3_time.day.range;
-  d3_time.days.utc = d3_time.day.utc.range;
-  d3_time.dayOfYear = function(date) {
-    var year = d3_time.year(date);
-    return Math.floor((date - year - (date.getTimezoneOffset() - year.getTimezoneOffset()) * 6e4) / 864e5);
-  };
-  [ "sunday", "monday", "tuesday", "wednesday", "thursday", "friday", "saturday" ].forEach(function(day, i) {
-    i = 7 - i;
-    var interval = d3_time[day] = d3_time_interval(function(date) {
-      (date = d3_time.day(date)).setDate(date.getDate() - (date.getDay() + i) % 7);
-      return date;
-    }, function(date, offset) {
-      date.setDate(date.getDate() + Math.floor(offset) * 7);
-    }, function(date) {
-      var day = d3_time.year(date).getDay();
-      return Math.floor((d3_time.dayOfYear(date) + (day + i) % 7) / 7) - (day !== i);
-    });
-    d3_time[day + "s"] = interval.range;
-    d3_time[day + "s"].utc = interval.utc.range;
-    d3_time[day + "OfYear"] = function(date) {
-      var day = d3_time.year(date).getDay();
-      return Math.floor((d3_time.dayOfYear(date) + (day + i) % 7) / 7);
-    };
-  });
-  d3_time.week = d3_time.sunday;
-  d3_time.weeks = d3_time.sunday.range;
-  d3_time.weeks.utc = d3_time.sunday.utc.range;
-  d3_time.weekOfYear = d3_time.sundayOfYear;
-  function d3_locale_timeFormat(locale) {
-    var locale_dateTime = locale.dateTime, locale_date = locale.date, locale_time = locale.time, locale_periods = locale.periods, locale_days = locale.days, locale_shortDays = locale.shortDays, locale_months = locale.months, locale_shortMonths = locale.shortMonths;
-    function d3_time_format(template) {
-      var n = template.length;
-      function format(date) {
-        var string = [], i = -1, j = 0, c, p, f;
-        while (++i < n) {
-          if (template.charCodeAt(i) === 37) {
-            string.push(template.substring(j, i));
-            if ((p = d3_time_formatPads[c = template.charAt(++i)]) != null) c = template.charAt(++i);
-            if (f = d3_time_formats[c]) c = f(date, p == null ? c === "e" ? " " : "0" : p);
-            string.push(c);
-            j = i + 1;
-          }
-        }
-        string.push(template.substring(j, i));
-        return string.join("");
-      }
-      format.parse = function(string) {
-        var d = {
-          y: 1900,
-          m: 0,
-          d: 1,
-          H: 0,
-          M: 0,
-          S: 0,
-          L: 0,
-          Z: null
-        }, i = d3_time_parse(d, template, string, 0);
-        if (i != string.length) return null;
-        if ("p" in d) d.H = d.H % 12 + d.p * 12;
-        var localZ = d.Z != null && d3_date !== d3_date_utc, date = new (localZ ? d3_date_utc : d3_date)();
-        if ("j" in d) date.setFullYear(d.y, 0, d.j); else if ("w" in d && ("W" in d || "U" in d)) {
-          date.setFullYear(d.y, 0, 1);
-          date.setFullYear(d.y, 0, "W" in d ? (d.w + 6) % 7 + d.W * 7 - (date.getDay() + 5) % 7 : d.w + d.U * 7 - (date.getDay() + 6) % 7);
-        } else date.setFullYear(d.y, d.m, d.d);
-        date.setHours(d.H + Math.floor(d.Z / 100), d.M + d.Z % 100, d.S, d.L);
-        return localZ ? date._ : date;
-      };
-      format.toString = function() {
-        return template;
-      };
-      return format;
-    }
-    function d3_time_parse(date, template, string, j) {
-      var c, p, t, i = 0, n = template.length, m = string.length;
-      while (i < n) {
-        if (j >= m) return -1;
-        c = template.charCodeAt(i++);
-        if (c === 37) {
-          t = template.charAt(i++);
-          p = d3_time_parsers[t in d3_time_formatPads ? template.charAt(i++) : t];
-          if (!p || (j = p(date, string, j)) < 0) return -1;
-        } else if (c != string.charCodeAt(j++)) {
-          return -1;
-        }
-      }
-      return j;
-    }
-    d3_time_format.utc = function(template) {
-      var local = d3_time_format(template);
-      function format(date) {
-        try {
-          d3_date = d3_date_utc;
-          var utc = new d3_date();
-          utc._ = date;
-          return local(utc);
-        } finally {
-          d3_date = Date;
-        }
-      }
-      format.parse = function(string) {
-        try {
-          d3_date = d3_date_utc;
-          var date = local.parse(string);
-          return date && date._;
-        } finally {
-          d3_date = Date;
-        }
-      };
-      format.toString = local.toString;
-      return format;
-    };
-    d3_time_format.multi = d3_time_format.utc.multi = d3_time_formatMulti;
-    var d3_time_periodLookup = d3.map(), d3_time_dayRe = d3_time_formatRe(locale_days), d3_time_dayLookup = d3_time_formatLookup(locale_days), d3_time_dayAbbrevRe = d3_time_formatRe(locale_shortDays), d3_time_dayAbbrevLookup = d3_time_formatLookup(locale_shortDays), d3_time_monthRe = d3_time_formatRe(locale_months), d3_time_monthLookup = d3_time_formatLookup(locale_months), d3_time_monthAbbrevRe = d3_time_formatRe(locale_shortMonths), d3_time_monthAbbrevLookup = d3_time_formatLookup(locale_shortMonths);
-    locale_periods.forEach(function(p, i) {
-      d3_time_periodLookup.set(p.toLowerCase(), i);
-    });
-    var d3_time_formats = {
-      a: function(d) {
-        return locale_shortDays[d.getDay()];
-      },
-      A: function(d) {
-        return locale_days[d.getDay()];
-      },
-      b: function(d) {
-        return locale_shortMonths[d.getMonth()];
-      },
-      B: function(d) {
-        return locale_months[d.getMonth()];
-      },
-      c: d3_time_format(locale_dateTime),
-      d: function(d, p) {
-        return d3_time_formatPad(d.getDate(), p, 2);
-      },
-      e: function(d, p) {
-        return d3_time_formatPad(d.getDate(), p, 2);
-      },
-      H: function(d, p) {
-        return d3_time_formatPad(d.getHours(), p, 2);
-      },
-      I: function(d, p) {
-        return d3_time_formatPad(d.getHours() % 12 || 12, p, 2);
-      },
-      j: function(d, p) {
-        return d3_time_formatPad(1 + d3_time.dayOfYear(d), p, 3);
-      },
-      L: function(d, p) {
-        return d3_time_formatPad(d.getMilliseconds(), p, 3);
-      },
-      m: function(d, p) {
-        return d3_time_formatPad(d.getMonth() + 1, p, 2);
-      },
-      M: function(d, p) {
-        return d3_time_formatPad(d.getMinutes(), p, 2);
-      },
-      p: function(d) {
-        return locale_periods[+(d.getHours() >= 12)];
-      },
-      S: function(d, p) {
-        return d3_time_formatPad(d.getSeconds(), p, 2);
-      },
-      U: function(d, p) {
-        return d3_time_formatPad(d3_time.sundayOfYear(d), p, 2);
-      },
-      w: function(d) {
-        return d.getDay();
-      },
-      W: function(d, p) {
-        return d3_time_formatPad(d3_time.mondayOfYear(d), p, 2);
-      },
-      x: d3_time_format(locale_date),
-      X: d3_time_format(locale_time),
-      y: function(d, p) {
-        return d3_time_formatPad(d.getFullYear() % 100, p, 2);
-      },
-      Y: function(d, p) {
-        return d3_time_formatPad(d.getFullYear() % 1e4, p, 4);
-      },
-      Z: d3_time_zone,
-      "%": function() {
-        return "%";
-      }
-    };
-    var d3_time_parsers = {
-      a: d3_time_parseWeekdayAbbrev,
-      A: d3_time_parseWeekday,
-      b: d3_time_parseMonthAbbrev,
-      B: d3_time_parseMonth,
-      c: d3_time_parseLocaleFull,
-      d: d3_time_parseDay,
-      e: d3_time_parseDay,
-      H: d3_time_parseHour24,
-      I: d3_time_parseHour24,
-      j: d3_time_parseDayOfYear,
-      L: d3_time_parseMilliseconds,
-      m: d3_time_parseMonthNumber,
-      M: d3_time_parseMinutes,
-      p: d3_time_parseAmPm,
-      S: d3_time_parseSeconds,
-      U: d3_time_parseWeekNumberSunday,
-      w: d3_time_parseWeekdayNumber,
-      W: d3_time_parseWeekNumberMonday,
-      x: d3_time_parseLocaleDate,
-      X: d3_time_parseLocaleTime,
-      y: d3_time_parseYear,
-      Y: d3_time_parseFullYear,
-      Z: d3_time_parseZone,
-      "%": d3_time_parseLiteralPercent
-    };
-    function d3_time_parseWeekdayAbbrev(date, string, i) {
-      d3_time_dayAbbrevRe.lastIndex = 0;
-      var n = d3_time_dayAbbrevRe.exec(string.substring(i));
-      return n ? (date.w = d3_time_dayAbbrevLookup.get(n[0].toLowerCase()), i + n[0].length) : -1;
-    }
-    function d3_time_parseWeekday(date, string, i) {
-      d3_time_dayRe.lastIndex = 0;
-      var n = d3_time_dayRe.exec(string.substring(i));
-      return n ? (date.w = d3_time_dayLookup.get(n[0].toLowerCase()), i + n[0].length) : -1;
-    }
-    function d3_time_parseMonthAbbrev(date, string, i) {
-      d3_time_monthAbbrevRe.lastIndex = 0;
-      var n = d3_time_monthAbbrevRe.exec(string.substring(i));
-      return n ? (date.m = d3_time_monthAbbrevLookup.get(n[0].toLowerCase()), i + n[0].length) : -1;
-    }
-    function d3_time_parseMonth(date, string, i) {
-      d3_time_monthRe.lastIndex = 0;
-      var n = d3_time_monthRe.exec(string.substring(i));
-      return n ? (date.m = d3_time_monthLookup.get(n[0].toLowerCase()), i + n[0].length) : -1;
-    }
-    function d3_time_parseLocaleFull(date, string, i) {
-      return d3_time_parse(date, d3_time_formats.c.toString(), string, i);
-    }
-    function d3_time_parseLocaleDate(date, string, i) {
-      return d3_time_parse(date, d3_time_formats.x.toString(), string, i);
-    }
-    function d3_time_parseLocaleTime(date, string, i) {
-      return d3_time_parse(date, d3_time_formats.X.toString(), string, i);
-    }
-    function d3_time_parseAmPm(date, string, i) {
-      var n = d3_time_periodLookup.get(string.substring(i, i += 2).toLowerCase());
-      return n == null ? -1 : (date.p = n, i);
-    }
-    return d3_time_format;
-  }
-  var d3_time_formatPads = {
-    "-": "",
-    _: " ",
-    "0": "0"
-  }, d3_time_numberRe = /^\s*\d+/, d3_time_percentRe = /^%/;
-  function d3_time_formatPad(value, fill, width) {
-    var sign = value < 0 ? "-" : "", string = (sign ? -value : value) + "", length = string.length;
-    return sign + (length < width ? new Array(width - length + 1).join(fill) + string : string);
-  }
-  function d3_time_formatRe(names) {
-    return new RegExp("^(?:" + names.map(d3.requote).join("|") + ")", "i");
-  }
-  function d3_time_formatLookup(names) {
-    var map = new d3_Map(), i = -1, n = names.length;
-    while (++i < n) map.set(names[i].toLowerCase(), i);
-    return map;
-  }
-  function d3_time_parseWeekdayNumber(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 1));
-    return n ? (date.w = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseWeekNumberSunday(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i));
-    return n ? (date.U = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseWeekNumberMonday(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i));
-    return n ? (date.W = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseFullYear(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 4));
-    return n ? (date.y = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseYear(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 2));
-    return n ? (date.y = d3_time_expandYear(+n[0]), i + n[0].length) : -1;
-  }
-  function d3_time_parseZone(date, string, i) {
-    return /^[+-]\d{4}$/.test(string = string.substring(i, i + 5)) ? (date.Z = +string, 
-    i + 5) : -1;
-  }
-  function d3_time_expandYear(d) {
-    return d + (d > 68 ? 1900 : 2e3);
-  }
-  function d3_time_parseMonthNumber(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 2));
-    return n ? (date.m = n[0] - 1, i + n[0].length) : -1;
-  }
-  function d3_time_parseDay(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 2));
-    return n ? (date.d = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseDayOfYear(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 3));
-    return n ? (date.j = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseHour24(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 2));
-    return n ? (date.H = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseMinutes(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 2));
-    return n ? (date.M = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseSeconds(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 2));
-    return n ? (date.S = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_parseMilliseconds(date, string, i) {
-    d3_time_numberRe.lastIndex = 0;
-    var n = d3_time_numberRe.exec(string.substring(i, i + 3));
-    return n ? (date.L = +n[0], i + n[0].length) : -1;
-  }
-  function d3_time_zone(d) {
-    var z = d.getTimezoneOffset(), zs = z > 0 ? "-" : "+", zh = ~~(abs(z) / 60), zm = abs(z) % 60;
-    return zs + d3_time_formatPad(zh, "0", 2) + d3_time_formatPad(zm, "0", 2);
-  }
-  function d3_time_parseLiteralPercent(date, string, i) {
-    d3_time_percentRe.lastIndex = 0;
-    var n = d3_time_percentRe.exec(string.substring(i, i + 1));
-    return n ? i + n[0].length : -1;
-  }
-  function d3_time_formatMulti(formats) {
-    var n = formats.length, i = -1;
-    while (++i < n) formats[i][0] = this(formats[i][0]);
-    return function(date) {
-      var i = 0, f = formats[i];
-      while (!f[1](date)) f = formats[++i];
-      return f[0](date);
-    };
-  }
-  d3.locale = function(locale) {
-    return {
-      numberFormat: d3_locale_numberFormat(locale),
-      timeFormat: d3_locale_timeFormat(locale)
-    };
-  };
-  var d3_locale_enUS = d3.locale({
-    decimal: ".",
-    thousands: ",",
-    grouping: [ 3 ],
-    currency: [ "$", "" ],
-    dateTime: "%a %b %e %X %Y",
-    date: "%m/%d/%Y",
-    time: "%H:%M:%S",
-    periods: [ "AM", "PM" ],
-    days: [ "Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday" ],
-    shortDays: [ "Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat" ],
-    months: [ "January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December" ],
-    shortMonths: [ "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec" ]
-  });
-  d3.format = d3_locale_enUS.numberFormat;
-  d3.geo = {};
-  function d3_adder() {}
-  d3_adder.prototype = {
-    s: 0,
-    t: 0,
-    add: function(y) {
-      d3_adderSum(y, this.t, d3_adderTemp);
-      d3_adderSum(d3_adderTemp.s, this.s, this);
-      if (this.s) this.t += d3_adderTemp.t; else this.s = d3_adderTemp.t;
-    },
-    reset: function() {
-      this.s = this.t = 0;
-    },
-    valueOf: function() {
-      return this.s;
-    }
-  };
-  var d3_adderTemp = new d3_adder();
-  function d3_adderSum(a, b, o) {
-    var x = o.s = a + b, bv = x - a, av = x - bv;
-    o.t = a - av + (b - bv);
-  }
-  d3.geo.stream = function(object, listener) {
-    if (object && d3_geo_streamObjectType.hasOwnProperty(object.type)) {
-      d3_geo_streamObjectType[object.type](object, listener);
-    } else {
-      d3_geo_streamGeometry(object, listener);
-    }
-  };
-  function d3_geo_streamGeometry(geometry, listener) {
-    if (geometry && d3_geo_streamGeometryType.hasOwnProperty(geometry.type)) {
-      d3_geo_streamGeometryType[geometry.type](geometry, listener);
-    }
-  }
-  var d3_geo_streamObjectType = {
-    Feature: function(feature, listener) {
-      d3_geo_streamGeometry(feature.geometry, listener);
-    },
-    FeatureCollection: function(object, listener) {
-      var features = object.features, i = -1, n = features.length;
-      while (++i < n) d3_geo_streamGeometry(features[i].geometry, listener);
-    }
-  };
-  var d3_geo_streamGeometryType = {
-    Sphere: function(object, listener) {
-      listener.sphere();
-    },
-    Point: function(object, listener) {
-      object = object.coordinates;
-      listener.point(object[0], object[1], object[2]);
-    },
-    MultiPoint: function(object, listener) {
-      var coordinates = object.coordinates, i = -1, n = coordinates.length;
-      while (++i < n) object = coordinates[i], listener.point(object[0], object[1], object[2]);
-    },
-    LineString: function(object, listener) {
-      d3_geo_streamLine(object.coordinates, listener, 0);
-    },
-    MultiLineString: function(object, listener) {
-      var coordinates = object.coordinates, i = -1, n = coordinates.length;
-      while (++i < n) d3_geo_streamLine(coordinates[i], listener, 0);
-    },
-    Polygon: function(object, listener) {
-      d3_geo_streamPolygon(object.coordinates, listener);
-    },
-    MultiPolygon: function(object, listener) {
-      var coordinates = object.coordinates, i = -1, n = coordinates.length;
-      while (++i < n) d3_geo_streamPolygon(coordinates[i], listener);
-    },
-    GeometryCollection: function(object, listener) {
-      var geometries = object.geometries, i = -1, n = geometries.length;
-      while (++i < n) d3_geo_streamGeometry(geometries[i], listener);
-    }
-  };
-  function d3_geo_streamLine(coordinates, listener, closed) {
-    var i = -1, n = coordinates.length - closed, coordinate;
-    listener.lineStart();
-    while (++i < n) coordinate = coordinates[i], listener.point(coordinate[0], coordinate[1], coordinate[2]);
-    listener.lineEnd();
-  }
-  function d3_geo_streamPolygon(coordinates, listener) {
-    var i = -1, n = coordinates.length;
-    listener.polygonStart();
-    while (++i < n) d3_geo_streamLine(coordinates[i], listener, 1);
-    listener.polygonEnd();
-  }
-  d3.geo.area = function(object) {
-    d3_geo_areaSum = 0;
-    d3.geo.stream(object, d3_geo_area);
-    return d3_geo_areaSum;
-  };
-  var d3_geo_areaSum, d3_geo_areaRingSum = new d3_adder();
-  var d3_geo_area = {
-    sphere: function() {
-      d3_geo_areaSum += 4 * π;
-    },
-    point: d3_noop,
-    lineStart: d3_noop,
-    lineEnd: d3_noop,
-    polygonStart: function() {
-      d3_geo_areaRingSum.reset();
-      d3_geo_area.lineStart = d3_geo_areaRingStart;
-    },
-    polygonEnd: function() {
-      var area = 2 * d3_geo_areaRingSum;
-      d3_geo_areaSum += area < 0 ? 4 * π + area : area;
-      d3_geo_area.lineStart = d3_geo_area.lineEnd = d3_geo_area.point = d3_noop;
-    }
-  };
-  function d3_geo_areaRingStart() {
-    var λ00, φ00, λ0, cosφ0, sinφ0;
-    d3_geo_area.point = function(λ, φ) {
-      d3_geo_area.point = nextPoint;
-      λ0 = (λ00 = λ) * d3_radians, cosφ0 = Math.cos(φ = (φ00 = φ) * d3_radians / 2 + π / 4), 
-      sinφ0 = Math.sin(φ);
-    };
-    function nextPoint(λ, φ) {
-      λ *= d3_radians;
-      φ = φ * d3_radians / 2 + π / 4;
-      var dλ = λ - λ0, sdλ = dλ >= 0 ? 1 : -1, adλ = sdλ * dλ, cosφ = Math.cos(φ), sinφ = Math.sin(φ), k = sinφ0 * sinφ, u = cosφ0 * cosφ + k * Math.cos(adλ), v = k * sdλ * Math.sin(adλ);
-      d3_geo_areaRingSum.add(Math.atan2(v, u));
-      λ0 = λ, cosφ0 = cosφ, sinφ0 = sinφ;
-    }
-    d3_geo_area.lineEnd = function() {
-      nextPoint(λ00, φ00);
-    };
-  }
-  function d3_geo_cartesian(spherical) {
-    var λ = spherical[0], φ = spherical[1], cosφ = Math.cos(φ);
-    return [ cosφ * Math.cos(λ), cosφ * Math.sin(λ), Math.sin(φ) ];
-  }
-  function d3_geo_cartesianDot(a, b) {
-    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
-  }
-  function d3_geo_cartesianCross(a, b) {
-    return [ a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0] ];
-  }
-  function d3_geo_cartesianAdd(a, b) {
-    a[0] += b[0];
-    a[1] += b[1];
-    a[2] += b[2];
-  }
-  function d3_geo_cartesianScale(vector, k) {
-    return [ vector[0] * k, vector[1] * k, vector[2] * k ];
-  }
-  function d3_geo_cartesianNormalize(d) {
-    var l = Math.sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2]);
-    d[0] /= l;
-    d[1] /= l;
-    d[2] /= l;
-  }
-  function d3_geo_spherical(cartesian) {
-    return [ Math.atan2(cartesian[1], cartesian[0]), d3_asin(cartesian[2]) ];
-  }
-  function d3_geo_sphericalEqual(a, b) {
-    return abs(a[0] - b[0]) < ε && abs(a[1] - b[1]) < ε;
-  }
-  d3.geo.bounds = function() {
-    var λ0, φ0, λ1, φ1, λ_, λ__, φ__, p0, dλSum, ranges, range;
-    var bound = {
-      point: point,
-      lineStart: lineStart,
-      lineEnd: lineEnd,
-      polygonStart: function() {
-        bound.point = ringPoint;
-        bound.lineStart = ringStart;
-        bound.lineEnd = ringEnd;
-        dλSum = 0;
-        d3_geo_area.polygonStart();
-      },
-      polygonEnd: function() {
-        d3_geo_area.polygonEnd();
-        bound.point = point;
-        bound.lineStart = lineStart;
-        bound.lineEnd = lineEnd;
-        if (d3_geo_areaRingSum < 0) λ0 = -(λ1 = 180), φ0 = -(φ1 = 90); else if (dλSum > ε) φ1 = 90; else if (dλSum < -ε) φ0 = -90;
-        range[0] = λ0, range[1] = λ1;
-      }
-    };
-    function point(λ, φ) {
-      ranges.push(range = [ λ0 = λ, λ1 = λ ]);
-      if (φ < φ0) φ0 = φ;
-      if (φ > φ1) φ1 = φ;
-    }
-    function linePoint(λ, φ) {
-      var p = d3_geo_cartesian([ λ * d3_radians, φ * d3_radians ]);
-      if (p0) {
-        var normal = d3_geo_cartesianCross(p0, p), equatorial = [ normal[1], -normal[0], 0 ], inflection = d3_geo_cartesianCross(equatorial, normal);
-        d3_geo_cartesianNormalize(inflection);
-        inflection = d3_geo_spherical(inflection);
-        var dλ = λ - λ_, s = dλ > 0 ? 1 : -1, λi = inflection[0] * d3_degrees * s, antimeridian = abs(dλ) > 180;
-        if (antimeridian ^ (s * λ_ < λi && λi < s * λ)) {
-          var φi = inflection[1] * d3_degrees;
-          if (φi > φ1) φ1 = φi;
-        } else if (λi = (λi + 360) % 360 - 180, antimeridian ^ (s * λ_ < λi && λi < s * λ)) {
-          var φi = -inflection[1] * d3_degrees;
-          if (φi < φ0) φ0 = φi;
-        } else {
-          if (φ < φ0) φ0 = φ;
-          if (φ > φ1) φ1 = φ;
-        }
-        if (antimeridian) {
-          if (λ < λ_) {
-            if (angle(λ0, λ) > angle(λ0, λ1)) λ1 = λ;
-          } else {
-            if (angle(λ, λ1) > angle(λ0, λ1)) λ0 = λ;
-          }
-        } else {
-          if (λ1 >= λ0) {
-            if (λ < λ0) λ0 = λ;
-            if (λ > λ1) λ1 = λ;
-          } else {
-            if (λ > λ_) {
-              if (angle(λ0, λ) > angle(λ0, λ1)) λ1 = λ;
-            } else {
-              if (angle(λ, λ1) > angle(λ0, λ1)) λ0 = λ;
-            }
-          }
-        }
-      } else {
-        point(λ, φ);
-      }
-      p0 = p, λ_ = λ;
-    }
-    function lineStart() {
-      bound.point = linePoint;
-    }
-    function lineEnd() {
-      range[0] = λ0, range[1] = λ1;
-      bound.point = point;
-      p0 = null;
-    }
-    function ringPoint(λ, φ) {
-      if (p0) {
-        var dλ = λ - λ_;
-        dλSum += abs(dλ) > 180 ? dλ + (dλ > 0 ? 360 : -360) : dλ;
-      } else λ__ = λ, φ__ = φ;
-      d3_geo_area.point(λ, φ);
-      linePoint(λ, φ);
-    }
-    function ringStart() {
-      d3_geo_area.lineStart();
-    }
-    function ringEnd() {
-      ringPoint(λ__, φ__);
-      d3_geo_area.lineEnd();
-      if (abs(dλSum) > ε) λ0 = -(λ1 = 180);
-      range[0] = λ0, range[1] = λ1;
-      p0 = null;
-    }
-    function angle(λ0, λ1) {
-      return (λ1 -= λ0) < 0 ? λ1 + 360 : λ1;
-    }
-    function compareRanges(a, b) {
-      return a[0] - b[0];
-    }
-    function withinRange(x, range) {
-      return range[0] <= range[1] ? range[0] <= x && x <= range[1] : x < range[0] || range[1] < x;
-    }
-    return function(feature) {
-      φ1 = λ1 = -(λ0 = φ0 = Infinity);
-      ranges = [];
-      d3.geo.stream(feature, bound);
-      var n = ranges.length;
-      if (n) {
-        ranges.sort(compareRanges);
-        for (var i = 1, a = ranges[0], b, merged = [ a ]; i < n; ++i) {
-          b = ranges[i];
-          if (withinRange(b[0], a) || withinRange(b[1], a)) {
-            if (angle(a[0], b[1]) > angle(a[0], a[1])) a[1] = b[1];
-            if (angle(b[0], a[1]) > angle(a[0], a[1])) a[0] = b[0];
-          } else {
-            merged.push(a = b);
-          }
-        }
-        var best = -Infinity, dλ;
-        for (var n = merged.length - 1, i = 0, a = merged[n], b; i <= n; a = b, ++i) {
-          b = merged[i];
-          if ((dλ = angle(a[1], b[0])) > best) best = dλ, λ0 = b[0], λ1 = a[1];
-        }
-      }
-      ranges = range = null;
-      return λ0 === Infinity || φ0 === Infinity ? [ [ NaN, NaN ], [ NaN, NaN ] ] : [ [ λ0, φ0 ], [ λ1, φ1 ] ];
-    };
-  }();
-  d3.geo.centroid = function(object) {
-    d3_geo_centroidW0 = d3_geo_centroidW1 = d3_geo_centroidX0 = d3_geo_centroidY0 = d3_geo_centroidZ0 = d3_geo_centroidX1 = d3_geo_centroidY1 = d3_geo_centroidZ1 = d3_geo_centroidX2 = d3_geo_centroidY2 = d3_geo_centroidZ2 = 0;
-    d3.geo.stream(object, d3_geo_centroid);
-    var x = d3_geo_centroidX2, y = d3_geo_centroidY2, z = d3_geo_centroidZ2, m = x * x + y * y + z * z;
-    if (m < ε2) {
-      x = d3_geo_centroidX1, y = d3_geo_centroidY1, z = d3_geo_centroidZ1;
-      if (d3_geo_centroidW1 < ε) x = d3_geo_centroidX0, y = d3_geo_centroidY0, z = d3_geo_centroidZ0;
-      m = x * x + y * y + z * z;
-      if (m < ε2) return [ NaN, NaN ];
-    }
-    return [ Math.atan2(y, x) * d3_degrees, d3_asin(z / Math.sqrt(m)) * d3_degrees ];
-  };
-  var d3_geo_centroidW0, d3_geo_centroidW1, d3_geo_centroidX0, d3_geo_centroidY0, d3_geo_centroidZ0, d3_geo_centroidX1, d3_geo_centroidY1, d3_geo_centroidZ1, d3_geo_centroidX2, d3_geo_centroidY2, d3_geo_centroidZ2;
-  var d3_geo_centroid = {
-    sphere: d3_noop,
-    point: d3_geo_centroidPoint,
-    lineStart: d3_geo_centroidLineStart,
-    lineEnd: d3_geo_centroidLineEnd,
-    polygonStart: function() {
-      d3_geo_centroid.lineStart = d3_geo_centroidRingStart;
-    },
-    polygonEnd: function() {
-      d3_geo_centroid.lineStart = d3_geo_centroidLineStart;
-    }
-  };
-  function d3_geo_centroidPoint(λ, φ) {
-    λ *= d3_radians;
-    var cosφ = Math.cos(φ *= d3_radians);
-    d3_geo_centroidPointXYZ(cosφ * Math.cos(λ), cosφ * Math.sin(λ), Math.sin(φ));
-  }
-  function d3_geo_centroidPointXYZ(x, y, z) {
-    ++d3_geo_centroidW0;
-    d3_geo_centroidX0 += (x - d3_geo_centroidX0) / d3_geo_centroidW0;
-    d3_geo_centroidY0 += (y - d3_geo_centroidY0) / d3_geo_centroidW0;
-    d3_geo_centroidZ0 += (z - d3_geo_centroidZ0) / d3_geo_centroidW0;
-  }
-  function d3_geo_centroidLineStart() {
-    var x0, y0, z0;
-    d3_geo_centroid.point = function(λ, φ) {
-      λ *= d3_radians;
-      var cosφ = Math.cos(φ *= d3_radians);
-      x0 = cosφ * Math.cos(λ);
-      y0 = cosφ * Math.sin(λ);
-      z0 = Math.sin(φ);
-      d3_geo_centroid.point = nextPoint;
-      d3_geo_centroidPointXYZ(x0, y0, z0);
-    };
-    function nextPoint(λ, φ) {
-      λ *= d3_radians;
-      var cosφ = Math.cos(φ *= d3_radians), x = cosφ * Math.cos(λ), y = cosφ * Math.sin(λ), z = Math.sin(φ), w = Math.atan2(Math.sqrt((w = y0 * z - z0 * y) * w + (w = z0 * x - x0 * z) * w + (w = x0 * y - y0 * x) * w), x0 * x + y0 * y + z0 * z);
-      d3_geo_centroidW1 += w;
-      d3_geo_centroidX1 += w * (x0 + (x0 = x));
-      d3_geo_centroidY1 += w * (y0 + (y0 = y));
-      d3_geo_centroidZ1 += w * (z0 + (z0 = z));
-      d3_geo_centroidPointXYZ(x0, y0, z0);
-    }
-  }
-  function d3_geo_centroidLineEnd() {
-    d3_geo_centroid.point = d3_geo_centroidPoint;
-  }
-  function d3_geo_centroidRingStart() {
-    var λ00, φ00, x0, y0, z0;
-    d3_geo_centroid.point = function(λ, φ) {
-      λ00 = λ, φ00 = φ;
-      d3_geo_centroid.point = nextPoint;
-      λ *= d3_radians;
-      var cosφ = Math.cos(φ *= d3_radians);
-      x0 = cosφ * Math.cos(λ);
-      y0 = cosφ * Math.sin(λ);
-      z0 = Math.sin(φ);
-      d3_geo_centroidPointXYZ(x0, y0, z0);
-    };
-    d3_geo_centroid.lineEnd = function() {
-      nextPoint(λ00, φ00);
-      d3_geo_centroid.lineEnd = d3_geo_centroidLineEnd;
-      d3_geo_centroid.point = d3_geo_centroidPoint;
-    };
-    function nextPoint(λ, φ) {
-      λ *= d3_radians;
-      var cosφ = Math.cos(φ *= d3_radians), x = cosφ * Math.cos(λ), y = cosφ * Math.sin(λ), z = Math.sin(φ), cx = y0 * z - z0 * y, cy = z0 * x - x0 * z, cz = x0 * y - y0 * x, m = Math.sqrt(cx * cx + cy * cy + cz * cz), u = x0 * x + y0 * y + z0 * z, v = m && -d3_acos(u) / m, w = Math.atan2(m, u);
-      d3_geo_centroidX2 += v * cx;
-      d3_geo_centroidY2 += v * cy;
-      d3_geo_centroidZ2 += v * cz;
-      d3_geo_centroidW1 += w;
-      d3_geo_centroidX1 += w * (x0 + (x0 = x));
-      d3_geo_centroidY1 += w * (y0 + (y0 = y));
-      d3_geo_centroidZ1 += w * (z0 + (z0 = z));
-      d3_geo_centroidPointXYZ(x0, y0, z0);
-    }
-  }
-  function d3_true() {
-    return true;
-  }
-  function d3_geo_clipPolygon(segments, compare, clipStartInside, interpolate, listener) {
-    var subject = [], clip = [];
-    segments.forEach(function(segment) {
-      if ((n = segment.length - 1) <= 0) return;
-      var n, p0 = segment[0], p1 = segment[n];
-      if (d3_geo_sphericalEqual(p0, p1)) {
-        listener.lineStart();
-        for (var i = 0; i < n; ++i) listener.point((p0 = segment[i])[0], p0[1]);
-        listener.lineEnd();
-        return;
-      }
-      var a = new d3_geo_clipPolygonIntersection(p0, segment, null, true), b = new d3_geo_clipPolygonIntersection(p0, null, a, false);
-      a.o = b;
-      subject.push(a);
-      clip.push(b);
-      a = new d3_geo_clipPolygonIntersection(p1, segment, null, false);
-      b = new d3_geo_clipPolygonIntersection(p1, null, a, true);
-      a.o = b;
-      subject.push(a);
-      clip.push(b);
-    });
-    clip.sort(compare);
-    d3_geo_clipPolygonLinkCircular(subject);
-    d3_geo_clipPolygonLinkCircular(clip);
-    if (!subject.length) return;
-    for (var i = 0, entry = clipStartInside, n = clip.length; i < n; ++i) {
-      clip[i].e = entry = !entry;
-    }
-    var start = subject[0], points, point;
-    while (1) {
-      var current = start, isSubject = true;
-      while (current.v) if ((current = current.n) === start) return;
-      points = current.z;
-      listener.lineStart();
-      do {
-        current.v = current.o.v = true;
-        if (current.e) {
-          if (isSubject) {
-            for (var i = 0, n = points.length; i < n; ++i) listener.point((point = points[i])[0], point[1]);
-          } else {
-            interpolate(current.x, current.n.x, 1, listener);
-          }
-          current = current.n;
-        } else {
-          if (isSubject) {
-            points = current.p.z;
-            for (var i = points.length - 1; i >= 0; --i) listener.point((point = points[i])[0], point[1]);
-          } else {
-            interpolate(current.x, current.p.x, -1, listener);
-          }
-          current = current.p;
-        }
-        current = current.o;
-        points = current.z;
-        isSubject = !isSubject;
-      } while (!current.v);
-      listener.lineEnd();
-    }
-  }
-  function d3_geo_clipPolygonLinkCircular(array) {
-    if (!(n = array.length)) return;
-    var n, i = 0, a = array[0], b;
-    while (++i < n) {
-      a.n = b = array[i];
-      b.p = a;
-      a = b;
-    }
-    a.n = b = array[0];
-    b.p = a;
-  }
-  function d3_geo_clipPolygonIntersection(point, points, other, entry) {
-    this.x = point;
-    this.z = points;
-    this.o = other;
-    this.e = entry;
-    this.v = false;
-    this.n = this.p = null;
-  }
-  function d3_geo_clip(pointVisible, clipLine, interpolate, clipStart) {
-    return function(rotate, listener) {
-      var line = clipLine(listener), rotatedClipStart = rotate.invert(clipStart[0], clipStart[1]);
-      var clip = {
-        point: point,
-        lineStart: lineStart,
-        lineEnd: lineEnd,
-        polygonStart: function() {
-          clip.point = pointRing;
-          clip.lineStart = ringStart;
-          clip.lineEnd = ringEnd;
-          segments = [];
-          polygon = [];
-          listener.polygonStart();
-        },
-        polygonEnd: function() {
-          clip.point = point;
-          clip.lineStart = lineStart;
-          clip.lineEnd = lineEnd;
-          segments = d3.merge(segments);
-          var clipStartInside = d3_geo_pointInPolygon(rotatedClipStart, polygon);
-          if (segments.length) {
-            d3_geo_clipPolygon(segments, d3_geo_clipSort, clipStartInside, interpolate, listener);
-          } else if (clipStartInside) {
-            listener.lineStart();
-            interpolate(null, null, 1, listener);
-            listener.lineEnd();
-          }
-          listener.polygonEnd();
-          segments = polygon = null;
-        },
-        sphere: function() {
-          listener.polygonStart();
-          listener.lineStart();
-          interpolate(null, null, 1, listener);
-          listener.lineEnd();
-          listener.polygonEnd();
-        }
-      };
-      function point(λ, φ) {
-        var point = rotate(λ, φ);
-        if (pointVisible(λ = point[0], φ = point[1])) listener.point(λ, φ);
-      }
-      function pointLine(λ, φ) {
-        var point = rotate(λ, φ);
-        line.point(point[0], point[1]);
-      }
-      function lineStart() {
-        clip.point = pointLine;
-        line.lineStart();
-      }
-      function lineEnd() {
-        clip.point = point;
-        line.lineEnd();
-      }
-      var segments;
-      var buffer = d3_geo_clipBufferListener(), ringListener = clipLine(buffer), polygon, ring;
-      function pointRing(λ, φ) {
-        ring.push([ λ, φ ]);
-        var point = rotate(λ, φ);
-        ringListener.point(point[0], point[1]);
-      }
-      function ringStart() {
-        ringListener.lineStart();
-        ring = [];
-      }
-      function ringEnd() {
-        pointRing(ring[0][0], ring[0][1]);
-        ringListener.lineEnd();
-        var clean = ringListener.clean(), ringSegments = buffer.buffer(), segment, n = ringSegments.length;
-        ring.pop();
-        polygon.push(ring);
-        ring = null;
-        if (!n) return;
-        if (clean & 1) {
-          segment = ringSegments[0];
-          var n = segment.length - 1, i = -1, point;
-          listener.lineStart();
-          while (++i < n) listener.point((point = segment[i])[0], point[1]);
-          listener.lineEnd();
-          return;
-        }
-        if (n > 1 && clean & 2) ringSegments.push(ringSegments.pop().concat(ringSegments.shift()));
-        segments.push(ringSegments.filter(d3_geo_clipSegmentLength1));
-      }
-      return clip;
-    };
-  }
-  function d3_geo_clipSegmentLength1(segment) {
-    return segment.length > 1;
-  }
-  function d3_geo_clipBufferListener() {
-    var lines = [], line;
-    return {
-      lineStart: function() {
-        lines.push(line = []);
-      },
-      point: function(λ, φ) {
-        line.push([ λ, φ ]);
-      },
-      lineEnd: d3_noop,
-      buffer: function() {
-        var buffer = lines;
-        lines = [];
-        line = null;
-        return buffer;
-      },
-      rejoin: function() {
-        if (lines.length > 1) lines.push(lines.pop().concat(lines.shift()));
-      }
-    };
-  }
-  function d3_geo_clipSort(a, b) {
-    return ((a = a.x)[0] < 0 ? a[1] - halfπ - ε : halfπ - a[1]) - ((b = b.x)[0] < 0 ? b[1] - halfπ - ε : halfπ - b[1]);
-  }
-  function d3_geo_pointInPolygon(point, polygon) {
-    var meridian = point[0], parallel = point[1], meridianNormal = [ Math.sin(meridian), -Math.cos(meridian), 0 ], polarAngle = 0, winding = 0;
-    d3_geo_areaRingSum.reset();
-    for (var i = 0, n = polygon.length; i < n; ++i) {
-      var ring = polygon[i], m = ring.length;
-      if (!m) continue;
-      var point0 = ring[0], λ0 = point0[0], φ0 = point0[1] / 2 + π / 4, sinφ0 = Math.sin(φ0), cosφ0 = Math.cos(φ0), j = 1;
-      while (true) {
-        if (j === m) j = 0;
-        point = ring[j];
-        var λ = point[0], φ = point[1] / 2 + π / 4, sinφ = Math.sin(φ), cosφ = Math.cos(φ), dλ = λ - λ0, sdλ = dλ >= 0 ? 1 : -1, adλ = sdλ * dλ, antimeridian = adλ > π, k = sinφ0 * sinφ;
-        d3_geo_areaRingSum.add(Math.atan2(k * sdλ * Math.sin(adλ), cosφ0 * cosφ + k * Math.cos(adλ)));
-        polarAngle += antimeridian ? dλ + sdλ * τ : dλ;
-        if (antimeridian ^ λ0 >= meridian ^ λ >= meridian) {
-          var arc = d3_geo_cartesianCross(d3_geo_cartesian(point0), d3_geo_cartesian(point));
-          d3_geo_cartesianNormalize(arc);
-          var intersection = d3_geo_cartesianCross(meridianNormal, arc);
-          d3_geo_cartesianNormalize(intersection);
-          var φarc = (antimeridian ^ dλ >= 0 ? -1 : 1) * d3_asin(intersection[2]);
-          if (parallel > φarc || parallel === φarc && (arc[0] || arc[1])) {
-            winding += antimeridian ^ dλ >= 0 ? 1 : -1;
-          }
-        }
-        if (!j++) break;
-        λ0 = λ, sinφ0 = sinφ, cosφ0 = cosφ, point0 = point;
-      }
-    }
-    return (polarAngle < -ε || polarAngle < ε && d3_geo_areaRingSum < 0) ^ winding & 1;
-  }
-  var d3_geo_clipAntimeridian = d3_geo_clip(d3_true, d3_geo_clipAntimeridianLine, d3_geo_clipAntimeridianInterpolate, [ -π, -π / 2 ]);
-  function d3_geo_clipAntimeridianLine(listener) {
-    var λ0 = NaN, φ0 = NaN, sλ0 = NaN, clean;
-    return {
-      lineStart: function() {
-        listener.lineStart();
-        clean = 1;
-      },
-      point: function(λ1, φ1) {
-        var sλ1 = λ1 > 0 ? π : -π, dλ = abs(λ1 - λ0);
-        if (abs(dλ - π) < ε) {
-          listener.point(λ0, φ0 = (φ0 + φ1) / 2 > 0 ? halfπ : -halfπ);
-          listener.point(sλ0, φ0);
-          listener.lineEnd();
-          listener.lineStart();
-          listener.point(sλ1, φ0);
-          listener.point(λ1, φ0);
-          clean = 0;
-        } else if (sλ0 !== sλ1 && dλ >= π) {
-          if (abs(λ0 - sλ0) < ε) λ0 -= sλ0 * ε;
-          if (abs(λ1 - sλ1) < ε) λ1 -= sλ1 * ε;
-          φ0 = d3_geo_clipAntimeridianIntersect(λ0, φ0, λ1, φ1);
-          listener.point(sλ0, φ0);
-          listener.lineEnd();
-          listener.lineStart();
-          listener.point(sλ1, φ0);
-          clean = 0;
-        }
-        listener.point(λ0 = λ1, φ0 = φ1);
-        sλ0 = sλ1;
-      },
-      lineEnd: function() {
-        listener.lineEnd();
-        λ0 = φ0 = NaN;
-      },
-      clean: function() {
-        return 2 - clean;
-      }
-    };
-  }
-  function d3_geo_clipAntimeridianIntersect(λ0, φ0, λ1, φ1) {
-    var cosφ0, cosφ1, sinλ0_λ1 = Math.sin(λ0 - λ1);
-    return abs(sinλ0_λ1) > ε ? Math.atan((Math.sin(φ0) * (cosφ1 = Math.cos(φ1)) * Math.sin(λ1) - Math.sin(φ1) * (cosφ0 = Math.cos(φ0)) * Math.sin(λ0)) / (cosφ0 * cosφ1 * sinλ0_λ1)) : (φ0 + φ1) / 2;
-  }
-  function d3_geo_clipAntimeridianInterpolate(from, to, direction, listener) {
-    var φ;
-    if (from == null) {
-      φ = direction * halfπ;
-      listener.point(-π, φ);
-      listener.point(0, φ);
-      listener.point(π, φ);
-      listener.point(π, 0);
-      listener.point(π, -φ);
-      listener.point(0, -φ);
-      listener.point(-π, -φ);
-      listener.point(-π, 0);
-      listener.point(-π, φ);
-    } else if (abs(from[0] - to[0]) > ε) {
-      var s = from[0] < to[0] ? π : -π;
-      φ = direction * s / 2;
-      listener.point(-s, φ);
-      listener.point(0, φ);
-      listener.point(s, φ);
-    } else {
-      listener.point(to[0], to[1]);
-    }
-  }
-  function d3_geo_clipCircle(radius) {
-    var cr = Math.cos(radius), smallRadius = cr > 0, notHemisphere = abs(cr) > ε, interpolate = d3_geo_circleInterpolate(radius, 6 * d3_radians);
-    return d3_geo_clip(visible, clipLine, interpolate, smallRadius ? [ 0, -radius ] : [ -π, radius - π ]);
-    function visible(λ, φ) {
-      return Math.cos(λ) * Math.cos(φ) > cr;
-    }
-    function clipLine(listener) {
-      var point0, c0, v0, v00, clean;
-      return {
-        lineStart: function() {
-          v00 = v0 = false;
-          clean = 1;
-        },
-        point: function(λ, φ) {
-          var point1 = [ λ, φ ], point2, v = visible(λ, φ), c = smallRadius ? v ? 0 : code(λ, φ) : v ? code(λ + (λ < 0 ? π : -π), φ) : 0;
-          if (!point0 && (v00 = v0 = v)) listener.lineStart();
-          if (v !== v0) {
-            point2 = intersect(point0, point1);
-            if (d3_geo_sphericalEqual(point0, point2) || d3_geo_sphericalEqual(point1, point2)) {
-              point1[0] += ε;
-              point1[1] += ε;
-              v = visible(point1[0], point1[1]);
-            }
-          }
-          if (v !== v0) {
-            clean = 0;
-            if (v) {
-              listener.lineStart();
-              point2 = intersect(point1, point0);
-              listener.point(point2[0], point2[1]);
-            } else {
-              point2 = intersect(point0, point1);
-              listener.point(point2[0], point2[1]);
-              listener.lineEnd();
-            }
-            point0 = point2;
-          } else if (notHemisphere && point0 && smallRadius ^ v) {
-            var t;
-            if (!(c & c0) && (t = intersect(point1, point0, true))) {
-              clean = 0;
-              if (smallRadius) {
-                listener.lineStart();
-                listener.point(t[0][0], t[0][1]);
-                listener.point(t[1][0], t[1][1]);
-                listener.lineEnd();
-              } else {
-                listener.point(t[1][0], t[1][1]);
-                listener.lineEnd();
-                listener.lineStart();
-                listener.point(t[0][0], t[0][1]);
-              }
-            }
-          }
-          if (v && (!point0 || !d3_geo_sphericalEqual(point0, point1))) {
-            listener.point(point1[0], point1[1]);
-          }
-          point0 = point1, v0 = v, c0 = c;
-        },
-        lineEnd: function() {
-          if (v0) listener.lineEnd();
-          point0 = null;
-        },
-        clean: function() {
-          return clean | (v00 && v0) << 1;
-        }
-      };
-    }
-    function intersect(a, b, two) {
-      var pa = d3_geo_cartesian(a), pb = d3_geo_cartesian(b);
-      var n1 = [ 1, 0, 0 ], n2 = d3_geo_cartesianCross(pa, pb), n2n2 = d3_geo_cartesianDot(n2, n2), n1n2 = n2[0], determinant = n2n2 - n1n2 * n1n2;
-      if (!determinant) return !two && a;
-      var c1 = cr * n2n2 / determinant, c2 = -cr * n1n2 / determinant, n1xn2 = d3_geo_cartesianCross(n1, n2), A = d3_geo_cartesianScale(n1, c1), B = d3_geo_cartesianScale(n2, c2);
-      d3_geo_cartesianAdd(A, B);
-      var u = n1xn2, w = d3_geo_cartesianDot(A, u), uu = d3_geo_cartesianDot(u, u), t2 = w * w - uu * (d3_geo_cartesianDot(A, A) - 1);
-      if (t2 < 0) return;
-      var t = Math.sqrt(t2), q = d3_geo_cartesianScale(u, (-w - t) / uu);
-      d3_geo_cartesianAdd(q, A);
-      q = d3_geo_spherical(q);
-      if (!two) return q;
-      var λ0 = a[0], λ1 = b[0], φ0 = a[1], φ1 = b[1], z;
-      if (λ1 < λ0) z = λ0, λ0 = λ1, λ1 = z;
-      var δλ = λ1 - λ0, polar = abs(δλ - π) < ε, meridian = polar || δλ < ε;
-      if (!polar && φ1 < φ0) z = φ0, φ0 = φ1, φ1 = z;
-      if (meridian ? polar ? φ0 + φ1 > 0 ^ q[1] < (abs(q[0] - λ0) < ε ? φ0 : φ1) : φ0 <= q[1] && q[1] <= φ1 : δλ > π ^ (λ0 <= q[0] && q[0] <= λ1)) {
-        var q1 = d3_geo_cartesianScale(u, (-w + t) / uu);
-        d3_geo_cartesianAdd(q1, A);
-        return [ q, d3_geo_spherical(q1) ];
-      }
-    }
-    function code(λ, φ) {
-      var r = smallRadius ? radius : π - radius, code = 0;
-      if (λ < -r) code |= 1; else if (λ > r) code |= 2;
-      if (φ < -r) code |= 4; else if (φ > r) code |= 8;
-      return code;
-    }
-  }
-  function d3_geom_clipLine(x0, y0, x1, y1) {
-    return function(line) {
-      var a = line.a, b = line.b, ax = a.x, ay = a.y, bx = b.x, by = b.y, t0 = 0, t1 = 1, dx = bx - ax, dy = by - ay, r;
-      r = x0 - ax;
-      if (!dx && r > 0) return;
-      r /= dx;
-      if (dx < 0) {
-        if (r < t0) return;
-        if (r < t1) t1 = r;
-      } else if (dx > 0) {
-        if (r > t1) return;
-        if (r > t0) t0 = r;
-      }
-      r = x1 - ax;
-      if (!dx && r < 0) return;
-      r /= dx;
-      if (dx < 0) {
-        if (r > t1) return;
-        if (r > t0) t0 = r;
-      } else if (dx > 0) {
-        if (r < t0) return;
-        if (r < t1) t1 = r;
-      }
-      r = y0 - ay;
-      if (!dy && r > 0) return;
-      r /= dy;
-      if (dy < 0) {
-        if (r < t0) return;
-        if (r < t1) t1 = r;
-      } else if (dy > 0) {
-        if (r > t1) return;
-        if (r > t0) t0 = r;
-      }
-      r = y1 - ay;
-      if (!dy && r < 0) return;
-      r /= dy;
-      if (dy < 0) {
-        if (r > t1) return;
-        if (r > t0) t0 = r;
-      } else if (dy > 0) {
-        if (r < t0) return;
-        if (r < t1) t1 = r;
-      }
-      if (t0 > 0) line.a = {
-        x: ax + t0 * dx,
-        y: ay + t0 * dy
-      };
-      if (t1 < 1) line.b = {
-        x: ax + t1 * dx,
-        y: ay + t1 * dy
-      };
-      return line;
-    };
-  }
-  var d3_geo_clipExtentMAX = 1e9;
-  d3.geo.clipExtent = function() {
-    var x0, y0, x1, y1, stream, clip, clipExtent = {
-      stream: function(output) {
-        if (stream) stream.valid = false;
-        stream = clip(output);
-        stream.valid = true;
-        return stream;
-      },
-      extent: function(_) {
-        if (!arguments.length) return [ [ x0, y0 ], [ x1, y1 ] ];
-        clip = d3_geo_clipExtent(x0 = +_[0][0], y0 = +_[0][1], x1 = +_[1][0], y1 = +_[1][1]);
-        if (stream) stream.valid = false, stream = null;
-        return clipExtent;
-      }
-    };
-    return clipExtent.extent([ [ 0, 0 ], [ 960, 500 ] ]);
-  };
-  function d3_geo_clipExtent(x0, y0, x1, y1) {
-    return function(listener) {
-      var listener_ = listener, bufferListener = d3_geo_clipBufferListener(), clipLine = d3_geom_clipLine(x0, y0, x1, y1), segments, polygon, ring;
-      var clip = {
-        point: point,
-        lineStart: lineStart,
-        lineEnd: lineEnd,
-        polygonStart: function() {
-          listener = bufferListener;
-          segments = [];
-          polygon = [];
-          clean = true;
-        },
-        polygonEnd: function() {
-          listener = listener_;
-          segments = d3.merge(segments);
-          var clipStartInside = insidePolygon([ x0, y1 ]), inside = clean && clipStartInside, visible = segments.length;
-          if (inside || visible) {
-            listener.polygonStart();
-            if (inside) {
-              listener.lineStart();
-              interpolate(null, null, 1, listener);
-              listener.lineEnd();
-            }
-            if (visible) {
-              d3_geo_clipPolygon(segments, compare, clipStartInside, interpolate, listener);
-            }
-            listener.polygonEnd();
-          }
-          segments = polygon = ring = null;
-        }
-      };
-      function insidePolygon(p) {
-        var wn = 0, n = polygon.length, y = p[1];
-        for (var i = 0; i < n; ++i) {
-          for (var j = 1, v = polygon[i], m = v.length, a = v[0], b; j < m; ++j) {
-            b = v[j];
-            if (a[1] <= y) {
-              if (b[1] > y && d3_cross2d(a, b, p) > 0) ++wn;
-            } else {
-              if (b[1] <= y && d3_cross2d(a, b, p) < 0) --wn;
-            }
-            a = b;
-          }
-        }
-        return wn !== 0;
-      }
-      function interpolate(from, to, direction, listener) {
-        var a = 0, a1 = 0;
-        if (from == null || (a = corner(from, direction)) !== (a1 = corner(to, direction)) || comparePoints(from, to) < 0 ^ direction > 0) {
-          do {
-            listener.point(a === 0 || a === 3 ? x0 : x1, a > 1 ? y1 : y0);
-          } while ((a = (a + direction + 4) % 4) !== a1);
-        } else {
-          listener.point(to[0], to[1]);
-        }
-      }
-      function pointVisible(x, y) {
-        return x0 <= x && x <= x1 && y0 <= y && y <= y1;
-      }
-      function point(x, y) {
-        if (pointVisible(x, y)) listener.point(x, y);
-      }
-      var x__, y__, v__, x_, y_, v_, first, clean;
-      function lineStart() {
-        clip.point = linePoint;
-        if (polygon) polygon.push(ring = []);
-        first = true;
-        v_ = false;
-        x_ = y_ = NaN;
-      }
-      function lineEnd() {
-        if (segments) {
-          linePoint(x__, y__);
-          if (v__ && v_) bufferListener.rejoin();
-          segments.push(bufferListener.buffer());
-        }
-        clip.point = point;
-        if (v_) listener.lineEnd();
-      }
-      function linePoint(x, y) {
-        x = Math.max(-d3_geo_clipExtentMAX, Math.min(d3_geo_clipExtentMAX, x));
-        y = Math.max(-d3_geo_clipExtentMAX, Math.min(d3_geo_clipExtentMAX, y));
-        var v = pointVisible(x, y);
-        if (polygon) ring.push([ x, y ]);
-        if (first) {
-          x__ = x, y__ = y, v__ = v;
-          first = false;
-          if (v) {
-            listener.lineStart();
-            listener.point(x, y);
-          }
-        } else {
-          if (v && v_) listener.point(x, y); else {
-            var l = {
-              a: {
-                x: x_,
-                y: y_
-              },
-              b: {
-                x: x,
-                y: y
-              }
-            };
-            if (clipLine(l)) {
-              if (!v_) {
-                listener.lineStart();
-                listener.point(l.a.x, l.a.y);
-              }
-              listener.point(l.b.x, l.b.y);
-              if (!v) listener.lineEnd();
-              clean = false;
-            } else if (v) {
-              listener.lineStart();
-              listener.point(x, y);
-              clean = false;
-            }
-          }
-        }
-        x_ = x, y_ = y, v_ = v;
-      }
-      return clip;
-    };
-    function corner(p, direction) {
-      return abs(p[0] - x0) < ε ? direction > 0 ? 0 : 3 : abs(p[0] - x1) < ε ? direction > 0 ? 2 : 1 : abs(p[1] - y0) < ε ? direction > 0 ? 1 : 0 : direction > 0 ? 3 : 2;
-    }
-    function compare(a, b) {
-      return comparePoints(a.x, b.x);
-    }
-    function comparePoints(a, b) {
-      var ca = corner(a, 1), cb = corner(b, 1);
-      return ca !== cb ? ca - cb : ca === 0 ? b[1] - a[1] : ca === 1 ? a[0] - b[0] : ca === 2 ? a[1] - b[1] : b[0] - a[0];
-    }
-  }
-  function d3_geo_compose(a, b) {
-    function compose(x, y) {
-      return x = a(x, y), b(x[0], x[1]);
-    }
-    if (a.invert && b.invert) compose.invert = function(x, y) {
-      return x = b.invert(x, y), x && a.invert(x[0], x[1]);
-    };
-    return compose;
-  }
-  function d3_geo_conic(projectAt) {
-    var φ0 = 0, φ1 = π / 3, m = d3_geo_projectionMutator(projectAt), p = m(φ0, φ1);
-    p.parallels = function(_) {
-      if (!arguments.length) return [ φ0 / π * 180, φ1 / π * 180 ];
-      return m(φ0 = _[0] * π / 180, φ1 = _[1] * π / 180);
-    };
-    return p;
-  }
-  function d3_geo_conicEqualArea(φ0, φ1) {
-    var sinφ0 = Math.sin(φ0), n = (sinφ0 + Math.sin(φ1)) / 2, C = 1 + sinφ0 * (2 * n - sinφ0), ρ0 = Math.sqrt(C) / n;
-    function forward(λ, φ) {
-      var ρ = Math.sqrt(C - 2 * n * Math.sin(φ)) / n;
-      return [ ρ * Math.sin(λ *= n), ρ0 - ρ * Math.cos(λ) ];
-    }
-    forward.invert = function(x, y) {
-      var ρ0_y = ρ0 - y;
-      return [ Math.atan2(x, ρ0_y) / n, d3_asin((C - (x * x + ρ0_y * ρ0_y) * n * n) / (2 * n)) ];
-    };
-    return forward;
-  }
-  (d3.geo.conicEqualArea = function() {
-    return d3_geo_conic(d3_geo_conicEqualArea);
-  }).raw = d3_geo_conicEqualArea;
-  d3.geo.albers = function() {
-    return d3.geo.conicEqualArea().rotate([ 96, 0 ]).center([ -.6, 38.7 ]).parallels([ 29.5, 45.5 ]).scale(1070);
-  };
-  d3.geo.albersUsa = function() {
-    var lower48 = d3.geo.albers();
-    var alaska = d3.geo.conicEqualArea().rotate([ 154, 0 ]).center([ -2, 58.5 ]).parallels([ 55, 65 ]);
-    var hawaii = d3.geo.conicEqualArea().rotate([ 157, 0 ]).center([ -3, 19.9 ]).parallels([ 8, 18 ]);
-    var point, pointStream = {
-      point: function(x, y) {
-        point = [ x, y ];
-      }
-    }, lower48Point, alaskaPoint, hawaiiPoint;
-    function albersUsa(coordinates) {
-      var x = coordinates[0], y = coordinates[1];
-      point = null;
-      (lower48Point(x, y), point) || (alaskaPoint(x, y), point) || hawaiiPoint(x, y);
-      return point;
-    }
-    albersUsa.invert = function(coordinates) {
-      var k = lower48.scale(), t = lower48.translate(), x = (coordinates[0] - t[0]) / k, y = (coordinates[1] - t[1]) / k;
-      return (y >= .12 && y < .234 && x >= -.425 && x < -.214 ? alaska : y >= .166 && y < .234 && x >= -.214 && x < -.115 ? hawaii : lower48).invert(coordinates);
-    };
-    albersUsa.stream = function(stream) {
-      var lower48Stream = lower48.stream(stream), alaskaStream = alaska.stream(stream), hawaiiStream = hawaii.stream(stream);
-      return {
-        point: function(x, y) {
-          lower48Stream.point(x, y);
-          alaskaStream.point(x, y);
-          hawaiiStream.point(x, y);
-        },
-        sphere: function() {
-          lower48Stream.sphere();
-          alaskaStream.sphere();
-          hawaiiStream.sphere();
-        },
-        lineStart: function() {
-          lower48Stream.lineStart();
-          alaskaStream.lineStart();
-          hawaiiStream.lineStart();
-        },
-        lineEnd: function() {
-          lower48Stream.lineEnd();
-          alaskaStream.lineEnd();
-          hawaiiStream.lineEnd();
-        },
-        polygonStart: function() {
-          lower48Stream.polygonStart();
-          alaskaStream.polygonStart();
-          hawaiiStream.polygonStart();
-        },
-        polygonEnd: function() {
-          lower48Stream.polygonEnd();
-          alaskaStream.polygonEnd();
-          hawaiiStream.polygonEnd();
-        }
-      };
-    };
-    albersUsa.precision = function(_) {
-      if (!arguments.length) return lower48.precision();
-      lower48.precision(_);
-      alaska.precision(_);
-      hawaii.precision(_);
-      return albersUsa;
-    };
-    albersUsa.scale = function(_) {
-      if (!arguments.length) return lower48.scale();
-      lower48.scale(_);
-      alaska.scale(_ * .35);
-      hawaii.scale(_);
-      return albersUsa.translate(lower48.translate());
-    };
-    albersUsa.translate = function(_) {
-      if (!arguments.length) return lower48.translate();
-      var k = lower48.scale(), x = +_[0], y = +_[1];
-      lower48Point = lower48.translate(_).clipExtent([ [ x - .455 * k, y - .238 * k ], [ x + .455 * k, y + .238 * k ] ]).stream(pointStream).point;
-      alaskaPoint = alaska.translate([ x - .307 * k, y + .201 * k ]).clipExtent([ [ x - .425 * k + ε, y + .12 * k + ε ], [ x - .214 * k - ε, y + .234 * k - ε ] ]).stream(pointStream).point;
-      hawaiiPoint = hawaii.translate([ x - .205 * k, y + .212 * k ]).clipExtent([ [ x - .214 * k + ε, y + .166 * k + ε ], [ x - .115 * k - ε, y + .234 * k - ε ] ]).stream(pointStream).point;
-      return albersUsa;
-    };
-    return albersUsa.scale(1070);
-  };
-  var d3_geo_pathAreaSum, d3_geo_pathAreaPolygon, d3_geo_pathArea = {
-    point: d3_noop,
-    lineStart: d3_noop,
-    lineEnd: d3_noop,
-    polygonStart: function() {
-      d3_geo_pathAreaPolygon = 0;
-      d3_geo_pathArea.lineStart = d3_geo_pathAreaRingStart;
-    },
-    polygonEnd: function() {
-      d3_geo_pathArea.lineStart = d3_geo_pathArea.lineEnd = d3_geo_pathArea.point = d3_noop;
-      d3_geo_pathAreaSum += abs(d3_geo_pathAreaPolygon / 2);
-    }
-  };
-  function d3_geo_pathAreaRingStart() {
-    var x00, y00, x0, y0;
-    d3_geo_pathArea.point = function(x, y) {
-      d3_geo_pathArea.point = nextPoint;
-      x00 = x0 = x, y00 = y0 = y;
-    };
-    function nextPoint(x, y) {
-      d3_geo_pathAreaPolygon += y0 * x - x0 * y;
-      x0 = x, y0 = y;
-    }
-    d3_geo_pathArea.lineEnd = function() {
-      nextPoint(x00, y00);
-    };
-  }
-  var d3_geo_pathBoundsX0, d3_geo_pathBoundsY0, d3_geo_pathBoundsX1, d3_geo_pathBoundsY1;
-  var d3_geo_pathBounds = {
-    point: d3_geo_pathBoundsPoint,
-    lineStart: d3_noop,
-    lineEnd: d3_noop,
-    polygonStart: d3_noop,
-    polygonEnd: d3_noop
-  };
-  function d3_geo_pathBoundsPoint(x, y) {
-    if (x < d3_geo_pathBoundsX0) d3_geo_pathBoundsX0 = x;
-    if (x > d3_geo_pathBoundsX1) d3_geo_pathBoundsX1 = x;
-    if (y < d3_geo_pathBoundsY0) d3_geo_pathBoundsY0 = y;
-    if (y > d3_geo_pathBoundsY1) d3_geo_pathBoundsY1 = y;
-  }
-  function d3_geo_pathBuffer() {
-    var pointCircle = d3_geo_pathBufferCircle(4.5), buffer = [];
-    var stream = {
-      point: point,
-      lineStart: function() {
-        stream.point = pointLineStart;
-      },
-      lineEnd: lineEnd,
-      polygonStart: function() {
-        stream.lineEnd = lineEndPolygon;
-      },
-      polygonEnd: function() {
-        stream.lineEnd = lineEnd;
-        stream.point = point;
-      },
-      pointRadius: function(_) {
-        pointCircle = d3_geo_pathBufferCircle(_);
-        return stream;
-      },
-      result: function() {
-        if (buffer.length) {
-          var result = buffer.join("");
-          buffer = [];
-          return result;
-        }
-      }
-    };
-    function point(x, y) {
-      buffer.push("M", x, ",", y, pointCircle);
-    }
-    function pointLineStart(x, y) {
-      buffer.push("M", x, ",", y);
-      stream.point = pointLine;
-    }
-    function pointLine(x, y) {
-      buffer.push("L", x, ",", y);
-    }
-    function lineEnd() {
-      stream.point = point;
-    }
-    function lineEndPolygon() {
-      buffer.push("Z");
-    }
-    return stream;
-  }
-  function d3_geo_pathBufferCircle(radius) {
-    return "m0," + radius + "a" + radius + "," + radius + " 0 1,1 0," + -2 * radius + "a" + radius + "," + radius + " 0 1,1 0," + 2 * radius + "z";
-  }
-  var d3_geo_pathCentroid = {
-    point: d3_geo_pathCentroidPoint,
-    lineStart: d3_geo_pathCentroidLineStart,
-    lineEnd: d3_geo_pathCentroidLineEnd,
-    polygonStart: function() {
-      d3_geo_pathCentroid.lineStart = d3_geo_pathCentroidRingStart;
-    },
-    polygonEnd: function() {
-      d3_geo_pathCentroid.point = d3_geo_pathCentroidPoint;
-      d3_geo_pathCentroid.lineStart = d3_geo_pathCentroidLineStart;
-      d3_geo_pathCentroid.lineEnd = d3_geo_pathCentroidLineEnd;
-    }
-  };
-  function d3_geo_pathCentroidPoint(x, y) {
-    d3_geo_centroidX0 += x;
-    d3_geo_centroidY0 += y;
-    ++d3_geo_centroidZ0;
-  }
-  function d3_geo_pathCentroidLineStart() {
-    var x0, y0;
-    d3_geo_pathCentroid.point = function(x, y) {
-      d3_geo_pathCentroid.point = nextPoint;
-      d3_geo_pathCentroidPoint(x0 = x, y0 = y);
-    };
-    function nextPoint(x, y) {
-      var dx = x - x0, dy = y - y0, z = Math.sqrt(dx * dx + dy * dy);
-      d3_geo_centroidX1 += z * (x0 + x) / 2;
-      d3_geo_centroidY1 += z * (y0 + y) / 2;
-      d3_geo_centroidZ1 += z;
-      d3_geo_pathCentroidPoint(x0 = x, y0 = y);
-    }
-  }
-  function d3_geo_pathCentroidLineEnd() {
-    d3_geo_pathCentroid.point = d3_geo_pathCentroidPoint;
-  }
-  function d3_geo_pathCentroidRingStart() {
-    var x00, y00, x0, y0;
-    d3_geo_pathCentroid.point = function(x, y) {
-      d3_geo_pathCentroid.point = nextPoint;
-      d3_geo_pathCentroidPoint(x00 = x0 = x, y00 = y0 = y);
-    };
-    function nextPoint(x, y) {
-      var dx = x - x0, dy = y - y0, z = Math.sqrt(dx * dx + dy * dy);
-      d3_geo_centroidX1 += z * (x0 + x) / 2;
-      d3_geo_centroidY1 += z * (y0 + y) / 2;
-      d3_geo_centroidZ1 += z;
-      z = y0 * x - x0 * y;
-      d3_geo_centroidX2 += z * (x0 + x);
-      d3_geo_centroidY2 += z * (y0 + y);
-      d3_geo_centroidZ2 += z * 3;
-      d3_geo_pathCentroidPoint(x0 = x, y0 = y);
-    }
-    d3_geo_pathCentroid.lineEnd = function() {
-      nextPoint(x00, y00);
-    };
-  }
-  function d3_geo_pathContext(context) {
-    var pointRadius = 4.5;
-    var stream = {
-      point: point,
-      lineStart: function() {
-        stream.point = pointLineStart;
-      },
-      lineEnd: lineEnd,
-      polygonStart: function() {
-        stream.lineEnd = lineEndPolygon;
-      },
-      polygonEnd: function() {
-        stream.lineEnd = lineEnd;
-        stream.point = point;
-      },
-      pointRadius: function(_) {
-        pointRadius = _;
-        return stream;
-      },
-      result: d3_noop
-    };
-    function point(x, y) {
-      context.moveTo(x, y);
-      context.arc(x, y, pointRadius, 0, τ);
-    }
-    function pointLineStart(x, y) {
-      context.moveTo(x, y);
-      stream.point = pointLine;
-    }
-    function pointLine(x, y) {
-      context.lineTo(x, y);
-    }
-    function lineEnd() {
-      stream.point = point;
-    }
-    function lineEndPolygon() {
-      context.closePath();
-    }
-    return stream;
-  }
-  function d3_geo_resample(project) {
-    var δ2 = .5, cosMinDistance = Math.cos(30 * d3_radians), maxDepth = 16;
-    function resample(stream) {
-      return (maxDepth ? resampleRecursive : resampleNone)(stream);
-    }
-    function resampleNone(stream) {
-      return d3_geo_transformPoint(stream, function(x, y) {
-        x = project(x, y);
-        stream.point(x[0], x[1]);
-      });
-    }
-    function resampleRecursive(stream) {
-      var λ00, φ00, x00, y00, a00, b00, c00, λ0, x0, y0, a0, b0, c0;
-      var resample = {
-        point: point,
-        lineStart: lineStart,
-        lineEnd: lineEnd,
-        polygonStart: function() {
-          stream.polygonStart();
-          resample.lineStart = ringStart;
-        },
-        polygonEnd: function() {
-          stream.polygonEnd();
-          resample.lineStart = lineStart;
-        }
-      };
-      function point(x, y) {
-        x = project(x, y);
-        stream.point(x[0], x[1]);
-      }
-      function lineStart() {
-        x0 = NaN;
-        resample.point = linePoint;
-        stream.lineStart();
-      }
-      function linePoint(λ, φ) {
-        var c = d3_geo_cartesian([ λ, φ ]), p = project(λ, φ);
-        resampleLineTo(x0, y0, λ0, a0, b0, c0, x0 = p[0], y0 = p[1], λ0 = λ, a0 = c[0], b0 = c[1], c0 = c[2], maxDepth, stream);
-        stream.point(x0, y0);
-      }
-      function lineEnd() {
-        resample.point = point;
-        stream.lineEnd();
-      }
-      function ringStart() {
-        lineStart();
-        resample.point = ringPoint;
-        resample.lineEnd = ringEnd;
-      }
-      function ringPoint(λ, φ) {
-        linePoint(λ00 = λ, φ00 = φ), x00 = x0, y00 = y0, a00 = a0, b00 = b0, c00 = c0;
-        resample.point = linePoint;
-      }
-      function ringEnd() {
-        resampleLineTo(x0, y0, λ0, a0, b0, c0, x00, y00, λ00, a00, b00, c00, maxDepth, stream);
-        resample.lineEnd = lineEnd;
-        lineEnd();
-      }
-      return resample;
-    }
-    function resampleLineTo(x0, y0, λ0, a0, b0, c0, x1, y1, λ1, a1, b1, c1, depth, stream) {
-      var dx = x1 - x0, dy = y1 - y0, d2 = dx * dx + dy * dy;
-      if (d2 > 4 * δ2 && depth--) {
-        var a = a0 + a1, b = b0 + b1, c = c0 + c1, m = Math.sqrt(a * a + b * b + c * c), φ2 = Math.asin(c /= m), λ2 = abs(abs(c) - 1) < ε || abs(λ0 - λ1) < ε ? (λ0 + λ1) / 2 : Math.atan2(b, a), p = project(λ2, φ2), x2 = p[0], y2 = p[1], dx2 = x2 - x0, dy2 = y2 - y0, dz = dy * dx2 - dx * dy2;
-        if (dz * dz / d2 > δ2 || abs((dx * dx2 + dy * dy2) / d2 - .5) > .3 || a0 * a1 + b0 * b1 + c0 * c1 < cosMinDistance) {
-          resampleLineTo(x0, y0, λ0, a0, b0, c0, x2, y2, λ2, a /= m, b /= m, c, depth, stream);
-          stream.point(x2, y2);
-          resampleLineTo(x2, y2, λ2, a, b, c, x1, y1, λ1, a1, b1, c1, depth, stream);
-        }
-      }
-    }
-    resample.precision = function(_) {
-      if (!arguments.length) return Math.sqrt(δ2);
-      maxDepth = (δ2 = _ * _) > 0 && 16;
-      return resample;
-    };
-    return resample;
-  }
-  d3.geo.path = function() {
-    var pointRadius = 4.5, projection, context, projectStream, contextStream, cacheStream;
-    function path(object) {
-      if (object) {
-        if (typeof pointRadius === "function") contextStream.pointRadius(+pointRadius.apply(this, arguments));
-        if (!cacheStream || !cacheStream.valid) cacheStream = projectStream(contextStream);
-        d3.geo.stream(object, cacheStream);
-      }
-      return contextStream.result();
-    }
-    path.area = function(object) {
-      d3_geo_pathAreaSum = 0;
-      d3.geo.stream(object, projectStream(d3_geo_pathArea));
-      return d3_geo_pathAreaSum;
-    };
-    path.centroid = function(object) {
-      d3_geo_centroidX0 = d3_geo_centroidY0 = d3_geo_centroidZ0 = d3_geo_centroidX1 = d3_geo_centroidY1 = d3_geo_centroidZ1 = d3_geo_centroidX2 = d3_geo_centroidY2 = d3_geo_centroidZ2 = 0;
-      d3.geo.stream(object, projectStream(d3_geo_pathCentroid));
-      return d3_geo_centroidZ2 ? [ d3_geo_centroidX2 / d3_geo_centroidZ2, d3_geo_centroidY2 / d3_geo_centroidZ2 ] : d3_geo_centroidZ1 ? [ d3_geo_centroidX1 / d3_geo_centroidZ1, d3_geo_centroidY1 / d3_geo_centroidZ1 ] : d3_geo_centroidZ0 ? [ d3_geo_centroidX0 / d3_geo_centroidZ0, d3_geo_centroidY0 / d3_geo_centroidZ0 ] : [ NaN, NaN ];
-    };
-    path.bounds = function(object) {
-      d3_geo_pathBoundsX1 = d3_geo_pathBoundsY1 = -(d3_geo_pathBoundsX0 = d3_geo_pathBoundsY0 = Infinity);
-      d3.geo.stream(object, projectStream(d3_geo_pathBounds));
-      return [ [ d3_geo_pathBoundsX0, d3_geo_pathBoundsY0 ], [ d3_geo_pathBoundsX1, d3_geo_pathBoundsY1 ] ];
-    };
-    path.projection = function(_) {
-      if (!arguments.length) return projection;
-      projectStream = (projection = _) ? _.stream || d3_geo_pathProjectStream(_) : d3_identity;
-      return reset();
-    };
-    path.context = function(_) {
-      if (!arguments.length) return context;
-      contextStream = (context = _) == null ? new d3_geo_pathBuffer() : new d3_geo_pathContext(_);
-      if (typeof pointRadius !== "function") contextStream.pointRadius(pointRadius);
-      return reset();
-    };
-    path.pointRadius = function(_) {
-      if (!arguments.length) return pointRadius;
-      pointRadius = typeof _ === "function" ? _ : (contextStream.pointRadius(+_), +_);
-      return path;
-    };
-    function reset() {
-      cacheStream = null;
-      return path;
-    }
-    return path.projection(d3.geo.albersUsa()).context(null);
-  };
-  function d3_geo_pathProjectStream(project) {
-    var resample = d3_geo_resample(function(x, y) {
-      return project([ x * d3_degrees, y * d3_degrees ]);
-    });
-    return function(stream) {
-      return d3_geo_projectionRadians(resample(stream));
-    };
-  }
-  d3.geo.transform = function(methods) {
-    return {
-      stream: function(stream) {
-        var transform = new d3_geo_transform(stream);
-        for (var k in methods) transform[k] = methods[k];
-        return transform;
-      }
-    };
-  };
-  function d3_geo_transform(stream) {
-    this.stream = stream;
-  }
-  d3_geo_transform.prototype = {
-    point: function(x, y) {
-      this.stream.point(x, y);
-    },
-    sphere: function() {
-      this.stream.sphere();
-    },
-    lineStart: function() {
-      this.stream.lineStart();
-    },
-    lineEnd: function() {
-      this.stream.lineEnd();
-    },
-    polygonStart: function() {
-      this.stream.polygonStart();
-    },
-    polygonEnd: function() {
-      this.stream.polygonEnd();
-    }
-  };
-  function d3_geo_transformPoint(stream, point) {
-    return {
-      point: point,
-      sphere: function() {
-        stream.sphere();
-      },
-      lineStart: function() {
-        stream.lineStart();
-      },
-      lineEnd: function() {
-        stream.lineEnd();
-      },
-      polygonStart: function() {
-        stream.polygonStart();
-      },
-      polygonEnd: function() {
-        stream.polygonEnd();
-      }
-    };
-  }
-  d3.geo.projection = d3_geo_projection;
-  d3.geo.projectionMutator = d3_geo_projectionMutator;
-  function d3_geo_projection(project) {
-    return d3_geo_projectionMutator(function() {
-      return project;
-    })();
-  }
-  function d3_geo_projectionMutator(projectAt) {
-    var project, rotate, projectRotate, projectResample = d3_geo_resample(function(x, y) {
-      x = project(x, y);
-      return [ x[0] * k + δx, δy - x[1] * k ];
-    }), k = 150, x = 480, y = 250, λ = 0, φ = 0, δλ = 0, δφ = 0, δγ = 0, δx, δy, preclip = d3_geo_clipAntimeridian, postclip = d3_identity, clipAngle = null, clipExtent = null, stream;
-    function projection(point) {
-      point = projectRotate(point[0] * d3_radians, point[1] * d3_radians);
-      return [ point[0] * k + δx, δy - point[1] * k ];
-    }
-    function invert(point) {
-      point = projectRotate.invert((point[0] - δx) / k, (δy - point[1]) / k);
-      return point && [ point[0] * d3_degrees, point[1] * d3_degrees ];
-    }
-    projection.stream = function(output) {
-      if (stream) stream.valid = false;
-      stream = d3_geo_projectionRadians(preclip(rotate, projectResample(postclip(output))));
-      stream.valid = true;
-      return stream;
-    };
-    projection.clipAngle = function(_) {
-      if (!arguments.length) return clipAngle;
-      preclip = _ == null ? (clipAngle = _, d3_geo_clipAntimeridian) : d3_geo_clipCircle((clipAngle = +_) * d3_radians);
-      return invalidate();
-    };
-    projection.clipExtent = function(_) {
-      if (!arguments.length) return clipExtent;
-      clipExtent = _;
-      postclip = _ ? d3_geo_clipExtent(_[0][0], _[0][1], _[1][0], _[1][1]) : d3_identity;
-      return invalidate();
-    };
-    projection.scale = function(_) {
-      if (!arguments.length) return k;
-      k = +_;
-      return reset();
-    };
-    projection.translate = function(_) {
-      if (!arguments.length) return [ x, y ];
-      x = +_[0];
-      y = +_[1];
-      return reset();
-    };
-    projection.center = function(_) {
-      if (!arguments.length) return [ λ * d3_degrees, φ * d3_degrees ];
-      λ = _[0] % 360 * d3_radians;
-      φ = _[1] % 360 * d3_radians;
-      return reset();
-    };
-    projection.rotate = function(_) {
-      if (!arguments.length) return [ δλ * d3_degrees, δφ * d3_degrees, δγ * d3_degrees ];
-      δλ = _[0] % 360 * d3_radians;
-      δφ = _[1] % 360 * d3_radians;
-      δγ = _.length > 2 ? _[2] % 360 * d3_radians : 0;
-      return reset();
-    };
-    d3.rebind(projection, projectResample, "precision");
-    function reset() {
-      projectRotate = d3_geo_compose(rotate = d3_geo_rotation(δλ, δφ, δγ), project);
-      var center = project(λ, φ);
-      δx = x - center[0] * k;
-      δy = y + center[1] * k;
-      return invalidate();
-    }
-    function invalidate() {
-      if (stream) stream.valid = false, stream = null;
-      return projection;
-    }
-    return function() {
-      project = projectAt.apply(this, arguments);
-      projection.invert = project.invert && invert;
-      return reset();
-    };
-  }
-  function d3_geo_projectionRadians(stream) {
-    return d3_geo_transformPoint(stream, function(x, y) {
-      stream.point(x * d3_radians, y * d3_radians);
-    });
-  }
-  function d3_geo_equirectangular(λ, φ) {
-    return [ λ, φ ];
-  }
-  (d3.geo.equirectangular = function() {
-    return d3_geo_projection(d3_geo_equirectangular);
-  }).raw = d3_geo_equirectangular.invert = d3_geo_equirectangular;
-  d3.geo.rotation = function(rotate) {
-    rotate = d3_geo_rotation(rotate[0] % 360 * d3_radians, rotate[1] * d3_radians, rotate.length > 2 ? rotate[2] * d3_radians : 0);
-    function forward(coordinates) {
-      coordinates = rotate(coordinates[0] * d3_radians, coordinates[1] * d3_radians);
-      return coordinates[0] *= d3_degrees, coordinates[1] *= d3_degrees, coordinates;
-    }
-    forward.invert = function(coordinates) {
-      coordinates = rotate.invert(coordinates[0] * d3_radians, coordinates[1] * d3_radians);
-      return coordinates[0] *= d3_degrees, coordinates[1] *= d3_degrees, coordinates;
-    };
-    return forward;
-  };
-  function d3_geo_identityRotation(λ, φ) {
-    return [ λ > π ? λ - τ : λ < -π ? λ + τ : λ, φ ];
-  }
-  d3_geo_identityRotation.invert = d3_geo_equirectangular;
-  function d3_geo_rotation(δλ, δφ, δγ) {
-    return δλ ? δφ || δγ ? d3_geo_compose(d3_geo_rotationλ(δλ), d3_geo_rotationφγ(δφ, δγ)) : d3_geo_rotationλ(δλ) : δφ || δγ ? d3_geo_rotationφγ(δφ, δγ) : d3_geo_identityRotation;
-  }
-  function d3_geo_forwardRotationλ(δλ) {
-    return function(λ, φ) {
-      return λ += δλ, [ λ > π ? λ - τ : λ < -π ? λ + τ : λ, φ ];
-    };
-  }
-  function d3_geo_rotationλ(δλ) {
-    var rotation = d3_geo_forwardRotationλ(δλ);
-    rotation.invert = d3_geo_forwardRotationλ(-δλ);
-    return rotation;
-  }
-  function d3_geo_rotationφγ(δφ, δγ) {
-    var cosδφ = Math.cos(δφ), sinδφ = Math.sin(δφ), cosδγ = Math.cos(δγ), sinδγ = Math.sin(δγ);
-    function rotation(λ, φ) {
-      var cosφ = Math.cos(φ), x = Math.cos(λ) * cosφ, y = Math.sin(λ) * cosφ, z = Math.sin(φ), k = z * cosδφ + x * sinδφ;
-      return [ Math.atan2(y * cosδγ - k * sinδγ, x * cosδφ - z * sinδφ), d3_asin(k * cosδγ + y * sinδγ) ];
-    }
-    rotation.invert = function(λ, φ) {
-      var cosφ = Math.cos(φ), x = Math.cos(λ) * cosφ, y = Math.sin(λ) * cosφ, z = Math.sin(φ), k = z * cosδγ - y * sinδγ;
-      return [ Math.atan2(y * cosδγ + z * sinδγ, x * cosδφ + k * sinδφ), d3_asin(k * cosδφ - x * sinδφ) ];
-    };
-    return rotation;
-  }
-  d3.geo.circle = function() {
-    var origin = [ 0, 0 ], angle, precision = 6, interpolate;
-    function circle() {
-      var center = typeof origin === "function" ? origin.apply(this, arguments) : origin, rotate = d3_geo_rotation(-center[0] * d3_radians, -center[1] * d3_radians, 0).invert, ring = [];
-      interpolate(null, null, 1, {
-        point: function(x, y) {
-          ring.push(x = rotate(x, y));
-          x[0] *= d3_degrees, x[1] *= d3_degrees;
-        }
-      });
-      return {
-        type: "Polygon",
-        coordinates: [ ring ]
-      };
-    }
-    circle.origin = function(x) {
-      if (!arguments.length) return origin;
-      origin = x;
-      return circle;
-    };
-    circle.angle = function(x) {
-      if (!arguments.length) return angle;
-      interpolate = d3_geo_circleInterpolate((angle = +x) * d3_radians, precision * d3_radians);
-      return circle;
-    };
-    circle.precision = function(_) {
-      if (!arguments.length) return precision;
-      interpolate = d3_geo_circleInterpolate(angle * d3_radians, (precision = +_) * d3_radians);
-      return circle;
-    };
-    return circle.angle(90);
-  };
-  function d3_geo_circleInterpolate(radius, precision) {
-    var cr = Math.cos(radius), sr = Math.sin(radius);
-    return function(from, to, direction, listener) {
-      var step = direction * precision;
-      if (from != null) {
-        from = d3_geo_circleAngle(cr, from);
-        to = d3_geo_circleAngle(cr, to);
-        if (direction > 0 ? from < to : from > to) from += direction * τ;
-      } else {
-        from = radius + direction * τ;
-        to = radius - .5 * step;
-      }
-      for (var point, t = from; direction > 0 ? t > to : t < to; t -= step) {
-        listener.point((point = d3_geo_spherical([ cr, -sr * Math.cos(t), -sr * Math.sin(t) ]))[0], point[1]);
-      }
-    };
-  }
-  function d3_geo_circleAngle(cr, point) {
-    var a = d3_geo_cartesian(point);
-    a[0] -= cr;
-    d3_geo_cartesianNormalize(a);
-    var angle = d3_acos(-a[1]);
-    return ((-a[2] < 0 ? -angle : angle) + 2 * Math.PI - ε) % (2 * Math.PI);
-  }
-  d3.geo.distance = function(a, b) {
-    var Δλ = (b[0] - a[0]) * d3_radians, φ0 = a[1] * d3_radians, φ1 = b[1] * d3_radians, sinΔλ = Math.sin(Δλ), cosΔλ = Math.cos(Δλ), sinφ0 = Math.sin(φ0), cosφ0 = Math.cos(φ0), sinφ1 = Math.sin(φ1), cosφ1 = Math.cos(φ1), t;
-    return Math.atan2(Math.sqrt((t = cosφ1 * sinΔλ) * t + (t = cosφ0 * sinφ1 - sinφ0 * cosφ1 * cosΔλ) * t), sinφ0 * sinφ1 + cosφ0 * cosφ1 * cosΔλ);
-  };
-  d3.geo.graticule = function() {
-    var x1, x0, X1, X0, y1, y0, Y1, Y0, dx = 10, dy = dx, DX = 90, DY = 360, x, y, X, Y, precision = 2.5;
-    function graticule() {
-      return {
-        type: "MultiLineString",
-        coordinates: lines()
-      };
-    }
-    function lines() {
-      return d3.range(Math.ceil(X0 / DX) * DX, X1, DX).map(X).concat(d3.range(Math.ceil(Y0 / DY) * DY, Y1, DY).map(Y)).concat(d3.range(Math.ceil(x0 / dx) * dx, x1, dx).filter(function(x) {
-        return abs(x % DX) > ε;
-      }).map(x)).concat(d3.range(Math.ceil(y0 / dy) * dy, y1, dy).filter(function(y) {
-        return abs(y % DY) > ε;
-      }).map(y));
-    }
-    graticule.lines = function() {
-      return lines().map(function(coordinates) {
-        return {
-          type: "LineString",
-          coordinates: coordinates
-        };
-      });
-    };
-    graticule.outline = function() {
-      return {
-        type: "Polygon",
-        coordinates: [ X(X0).concat(Y(Y1).slice(1), X(X1).reverse().slice(1), Y(Y0).reverse().slice(1)) ]
-      };
-    };
-    graticule.extent = function(_) {
-      if (!arguments.length) return graticule.minorExtent();
-      return graticule.majorExtent(_).minorExtent(_);
-    };
-    graticule.majorExtent = function(_) {
-      if (!arguments.length) return [ [ X0, Y0 ], [ X1, Y1 ] ];
-      X0 = +_[0][0], X1 = +_[1][0];
-      Y0 = +_[0][1], Y1 = +_[1][1];
-      if (X0 > X1) _ = X0, X0 = X1, X1 = _;
-      if (Y0 > Y1) _ = Y0, Y0 = Y1, Y1 = _;
-      return graticule.precision(precision);
-    };
-    graticule.minorExtent = function(_) {
-      if (!arguments.length) return [ [ x0, y0 ], [ x1, y1 ] ];
-      x0 = +_[0][0], x1 = +_[1][0];
-      y0 = +_[0][1], y1 = +_[1][1];
-      if (x0 > x1) _ = x0, x0 = x1, x1 = _;
-      if (y0 > y1) _ = y0, y0 = y1, y1 = _;
-      return graticule.precision(precision);
-    };
-    graticule.step = function(_) {
-      if (!arguments.length) return graticule.minorStep();
-      return graticule.majorStep(_).minorStep(_);
-    };
-    graticule.majorStep = function(_) {
-      if (!arguments.length) return [ DX, DY ];
-      DX = +_[0], DY = +_[1];
-      return graticule;
-    };
-    graticule.minorStep = function(_) {
-      if (!arguments.length) return [ dx, dy ];
-      dx = +_[0], dy = +_[1];
-      return graticule;
-    };
-    graticule.precision = function(_) {
-      if (!arguments.length) return precision;
-      precision = +_;
-      x = d3_geo_graticuleX(y0, y1, 90);
-      y = d3_geo_graticuleY(x0, x1, precision);
-      X = d3_geo_graticuleX(Y0, Y1, 90);
-      Y = d3_geo_graticuleY(X0, X1, precision);
-      return graticule;
-    };
-    return graticule.majorExtent([ [ -180, -90 + ε ], [ 180, 90 - ε ] ]).minorExtent([ [ -180, -80 - ε ], [ 180, 80 + ε ] ]);
-  };
-  function d3_geo_graticuleX(y0, y1, dy) {
-    var y = d3.range(y0, y1 - ε, dy).concat(y1);
-    return function(x) {
-      return y.map(function(y) {
-        return [ x, y ];
-      });
-    };
-  }
-  function d3_geo_graticuleY(x0, x1, dx) {
-    var x = d3.range(x0, x1 - ε, dx).concat(x1);
-    return function(y) {
-      return x.map(function(x) {
-        return [ x, y ];
-      });
-    };
-  }
-  function d3_source(d) {
-    return d.source;
-  }
-  function d3_target(d) {
-    return d.target;
-  }
-  d3.geo.greatArc = function() {
-    var source = d3_source, source_, target = d3_target, target_;
-    function greatArc() {
-      return {
-        type: "LineString",
-        coordinates: [ source_ || source.apply(this, arguments), target_ || target.apply(this, arguments) ]
-      };
-    }
-    greatArc.distance = function() {
-      return d3.geo.distance(source_ || source.apply(this, arguments), target_ || target.apply(this, arguments));
-    };
-    greatArc.source = function(_) {
-      if (!arguments.length) return source;
-      source = _, source_ = typeof _ === "function" ? null : _;
-      return greatArc;
-    };
-    greatArc.target = function(_) {
-      if (!arguments.length) return target;
-      target = _, target_ = typeof _ === "function" ? null : _;
-      return greatArc;
-    };
-    greatArc.precision = function() {
-      return arguments.length ? greatArc : 0;
-    };
-    return greatArc;
-  };
-  d3.geo.interpolate = function(source, target) {
-    return d3_geo_interpolate(source[0] * d3_radians, source[1] * d3_radians, target[0] * d3_radians, target[1] * d3_radians);
-  };
-  function d3_geo_interpolate(x0, y0, x1, y1) {
-    var cy0 = Math.cos(y0), sy0 = Math.sin(y0), cy1 = Math.cos(y1), sy1 = Math.sin(y1), kx0 = cy0 * Math.cos(x0), ky0 = cy0 * Math.sin(x0), kx1 = cy1 * Math.cos(x1), ky1 = cy1 * Math.sin(x1), d = 2 * Math.asin(Math.sqrt(d3_haversin(y1 - y0) + cy0 * cy1 * d3_haversin(x1 - x0))), k = 1 / Math.sin(d);
-    var interpolate = d ? function(t) {
-      var B = Math.sin(t *= d) * k, A = Math.sin(d - t) * k, x = A * kx0 + B * kx1, y = A * ky0 + B * ky1, z = A * sy0 + B * sy1;
-      return [ Math.atan2(y, x) * d3_degrees, Math.atan2(z, Math.sqrt(x * x + y * y)) * d3_degrees ];
-    } : function() {
-      return [ x0 * d3_degrees, y0 * d3_degrees ];
-    };
-    interpolate.distance = d;
-    return interpolate;
-  }
-  d3.geo.length = function(object) {
-    d3_geo_lengthSum = 0;
-    d3.geo.stream(object, d3_geo_length);
-    return d3_geo_lengthSum;
-  };
-  var d3_geo_lengthSum;
-  var d3_geo_length = {
-    sphere: d3_noop,
-    point: d3_noop,
-    lineStart: d3_geo_lengthLineStart,
-    lineEnd: d3_noop,
-    polygonStart: d3_noop,
-    polygonEnd: d3_noop
-  };
-  function d3_geo_lengthLineStart() {
-    var λ0, sinφ0, cosφ0;
-    d3_geo_length.point = function(λ, φ) {
-      λ0 = λ * d3_radians, sinφ0 = Math.sin(φ *= d3_radians), cosφ0 = Math.cos(φ);
-      d3_geo_length.point = nextPoint;
-    };
-    d3_geo_length.lineEnd = function() {
-      d3_geo_length.point = d3_geo_length.lineEnd = d3_noop;
-    };
-    function nextPoint(λ, φ) {
-      var sinφ = Math.sin(φ *= d3_radians), cosφ = Math.cos(φ), t = abs((λ *= d3_radians) - λ0), cosΔλ = Math.cos(t);
-      d3_geo_lengthSum += Math.atan2(Math.sqrt((t = cosφ * Math.sin(t)) * t + (t = cosφ0 * sinφ - sinφ0 * cosφ * cosΔλ) * t), sinφ0 * sinφ + cosφ0 * cosφ * cosΔλ);
-      λ0 = λ, sinφ0 = sinφ, cosφ0 = cosφ;
-    }
-  }
-  function d3_geo_azimuthal(scale, angle) {
-    function azimuthal(λ, φ) {
-      var cosλ = Math.cos(λ), cosφ = Math.cos(φ), k = scale(cosλ * cosφ);
-      return [ k * cosφ * Math.sin(λ), k * Math.sin(φ) ];
-    }
-    azimuthal.invert = function(x, y) {
-      var ρ = Math.sqrt(x * x + y * y), c = angle(ρ), sinc = Math.sin(c), cosc = Math.cos(c);
-      return [ Math.atan2(x * sinc, ρ * cosc), Math.asin(ρ && y * sinc / ρ) ];
-    };
-    return azimuthal;
-  }
-  var d3_geo_azimuthalEqualArea = d3_geo_azimuthal(function(cosλcosφ) {
-    return Math.sqrt(2 / (1 + cosλcosφ));
-  }, function(ρ) {
-    return 2 * Math.asin(ρ / 2);
-  });
-  (d3.geo.azimuthalEqualArea = function() {
-    return d3_geo_projection(d3_geo_azimuthalEqualArea);
-  }).raw = d3_geo_azimuthalEqualArea;
-  var d3_geo_azimuthalEquidistant = d3_geo_azimuthal(function(cosλcosφ) {
-    var c = Math.acos(cosλcosφ);
-    return c && c / Math.sin(c);
-  }, d3_identity);
-  (d3.geo.azimuthalEquidistant = function() {
-    return d3_geo_projection(d3_geo_azimuthalEquidistant);
-  }).raw = d3_geo_azimuthalEquidistant;
-  function d3_geo_conicConformal(φ0, φ1) {
-    var cosφ0 = Math.cos(φ0), t = function(φ) {
-      return Math.tan(π / 4 + φ / 2);
-    }, n = φ0 === φ1 ? Math.sin(φ0) : Math.log(cosφ0 / Math.cos(φ1)) / Math.log(t(φ1) / t(φ0)), F = cosφ0 * Math.pow(t(φ0), n) / n;
-    if (!n) return d3_geo_mercator;
-    function forward(λ, φ) {
-      if (F > 0) {
-        if (φ < -halfπ + ε) φ = -halfπ + ε;
-      } else {
-        if (φ > halfπ - ε) φ = halfπ - ε;
-      }
-      var ρ = F / Math.pow(t(φ), n);
-      return [ ρ * Math.sin(n * λ), F - ρ * Math.cos(n * λ) ];
-    }
-    forward.invert = function(x, y) {
-      var ρ0_y = F - y, ρ = d3_sgn(n) * Math.sqrt(x * x + ρ0_y * ρ0_y);
-      return [ Math.atan2(x, ρ0_y) / n, 2 * Math.atan(Math.pow(F / ρ, 1 / n)) - halfπ ];
-    };
-    return forward;
-  }
-  (d3.geo.conicConformal = function() {
-    return d3_geo_conic(d3_geo_conicConformal);
-  }).raw = d3_geo_conicConformal;
-  function d3_geo_conicEquidistant(φ0, φ1) {
-    var cosφ0 = Math.cos(φ0), n = φ0 === φ1 ? Math.sin(φ0) : (cosφ0 - Math.cos(φ1)) / (φ1 - φ0), G = cosφ0 / n + φ0;
-    if (abs(n) < ε) return d3_geo_equirectangular;
-    function forward(λ, φ) {
-      var ρ = G - φ;
-      return [ ρ * Math.sin(n * λ), G - ρ * Math.cos(n * λ) ];
-    }
-    forward.invert = function(x, y) {
-      var ρ0_y = G - y;
-      return [ Math.atan2(x, ρ0_y) / n, G - d3_sgn(n) * Math.sqrt(x * x + ρ0_y * ρ0_y) ];
-    };
-    return forward;
-  }
-  (d3.geo.conicEquidistant = function() {
-    return d3_geo_conic(d3_geo_conicEquidistant);
-  }).raw = d3_geo_conicEquidistant;
-  var d3_geo_gnomonic = d3_geo_azimuthal(function(cosλcosφ) {
-    return 1 / cosλcosφ;
-  }, Math.atan);
-  (d3.geo.gnomonic = function() {
-    return d3_geo_projection(d3_geo_gnomonic);
-  }).raw = d3_geo_gnomonic;
-  function d3_geo_mercator(λ, φ) {
-    return [ λ, Math.log(Math.tan(π / 4 + φ / 2)) ];
-  }
-  d3_geo_mercator.invert = function(x, y) {
-    return [ x, 2 * Math.atan(Math.exp(y)) - halfπ ];
-  };
-  function d3_geo_mercatorProjection(project) {
-    var m = d3_geo_projection(project), scale = m.scale, translate = m.translate, clipExtent = m.clipExtent, clipAuto;
-    m.scale = function() {
-      var v = scale.apply(m, arguments);
-      return v === m ? clipAuto ? m.clipExtent(null) : m : v;
-    };
-    m.translate = function() {
-      var v = translate.apply(m, arguments);
-      return v === m ? clipAuto ? m.clipExtent(null) : m : v;
-    };
-    m.clipExtent = function(_) {
-      var v = clipExtent.apply(m, arguments);
-      if (v === m) {
-        if (clipAuto = _ == null) {
-          var k = π * scale(), t = translate();
-          clipExtent([ [ t[0] - k, t[1] - k ], [ t[0] + k, t[1] + k ] ]);
-        }
-      } else if (clipAuto) {
-        v = null;
-      }
-      return v;
-    };
-    return m.clipExtent(null);
-  }
-  (d3.geo.mercator = function() {
-    return d3_geo_mercatorProjection(d3_geo_mercator);
-  }).raw = d3_geo_mercator;
-  var d3_geo_orthographic = d3_geo_azimuthal(function() {
-    return 1;
-  }, Math.asin);
-  (d3.geo.orthographic = function() {
-    return d3_geo_projection(d3_geo_orthographic);
-  }).raw = d3_geo_orthographic;
-  var d3_geo_stereographic = d3_geo_azimuthal(function(cosλcosφ) {
-    return 1 / (1 + cosλcosφ);
-  }, function(ρ) {
-    return 2 * Math.atan(ρ);
-  });
-  (d3.geo.stereographic = function() {
-    return d3_geo_projection(d3_geo_stereographic);
-  }).raw = d3_geo_stereographic;
-  function d3_geo_transverseMercator(λ, φ) {
-    return [ Math.log(Math.tan(π / 4 + φ / 2)), -λ ];
-  }
-  d3_geo_transverseMercator.invert = function(x, y) {
-    return [ -y, 2 * Math.atan(Math.exp(x)) - halfπ ];
-  };
-  (d3.geo.transverseMercator = function() {
-    var projection = d3_geo_mercatorProjection(d3_geo_transverseMercator), center = projection.center, rotate = projection.rotate;
-    projection.center = function(_) {
-      return _ ? center([ -_[1], _[0] ]) : (_ = center(), [ -_[1], _[0] ]);
-    };
-    projection.rotate = function(_) {
-      return _ ? rotate([ _[0], _[1], _.length > 2 ? _[2] + 90 : 90 ]) : (_ = rotate(), 
-      [ _[0], _[1], _[2] - 90 ]);
-    };
-    return projection.rotate([ 0, 0 ]);
-  }).raw = d3_geo_transverseMercator;
-  d3.geom = {};
-  function d3_geom_pointX(d) {
-    return d[0];
-  }
-  function d3_geom_pointY(d) {
-    return d[1];
-  }
-  d3.geom.hull = function(vertices) {
-    var x = d3_geom_pointX, y = d3_geom_pointY;
-    if (arguments.length) return hull(vertices);
-    function hull(data) {
-      if (data.length < 3) return [];
-      var fx = d3_functor(x), fy = d3_functor(y), i, n = data.length, points = [], flippedPoints = [];
-      for (i = 0; i < n; i++) {
-        points.push([ +fx.call(this, data[i], i), +fy.call(this, data[i], i), i ]);
-      }
-      points.sort(d3_geom_hullOrder);
-      for (i = 0; i < n; i++) flippedPoints.push([ points[i][0], -points[i][1] ]);
-      var upper = d3_geom_hullUpper(points), lower = d3_geom_hullUpper(flippedPoints);
-      var skipLeft = lower[0] === upper[0], skipRight = lower[lower.length - 1] === upper[upper.length - 1], polygon = [];
-      for (i = upper.length - 1; i >= 0; --i) polygon.push(data[points[upper[i]][2]]);
-      for (i = +skipLeft; i < lower.length - skipRight; ++i) polygon.push(data[points[lower[i]][2]]);
-      return polygon;
-    }
-    hull.x = function(_) {
-      return arguments.length ? (x = _, hull) : x;
-    };
-    hull.y = function(_) {
-      return arguments.length ? (y = _, hull) : y;
-    };
-    return hull;
-  };
-  function d3_geom_hullUpper(points) {
-    var n = points.length, hull = [ 0, 1 ], hs = 2;
-    for (var i = 2; i < n; i++) {
-      while (hs > 1 && d3_cross2d(points[hull[hs - 2]], points[hull[hs - 1]], points[i]) <= 0) --hs;
-      hull[hs++] = i;
-    }
-    return hull.slice(0, hs);
-  }
-  function d3_geom_hullOrder(a, b) {
-    return a[0] - b[0] || a[1] - b[1];
-  }
-  d3.geom.polygon = function(coordinates) {
-    d3_subclass(coordinates, d3_geom_polygonPrototype);
-    return coordinates;
-  };
-  var d3_geom_polygonPrototype = d3.geom.polygon.prototype = [];
-  d3_geom_polygonPrototype.area = function() {
-    var i = -1, n = this.length, a, b = this[n - 1], area = 0;
-    while (++i < n) {
-      a = b;
-      b = this[i];
-      area += a[1] * b[0] - a[0] * b[1];
-    }
-    return area * .5;
-  };
-  d3_geom_polygonPrototype.centroid = function(k) {
-    var i = -1, n = this.length, x = 0, y = 0, a, b = this[n - 1], c;
-    if (!arguments.length) k = -1 / (6 * this.area());
-    while (++i < n) {
-      a = b;
-      b = this[i];
-      c = a[0] * b[1] - b[0] * a[1];
-      x += (a[0] + b[0]) * c;
-      y += (a[1] + b[1]) * c;
-    }
-    return [ x * k, y * k ];
-  };
-  d3_geom_polygonPrototype.clip = function(subject) {
-    var input, closed = d3_geom_polygonClosed(subject), i = -1, n = this.length - d3_geom_polygonClosed(this), j, m, a = this[n - 1], b, c, d;
-    while (++i < n) {
-      input = subject.slice();
-      subject.length = 0;
-      b = this[i];
-      c = input[(m = input.length - closed) - 1];
-      j = -1;
-      while (++j < m) {
-        d = input[j];
-        if (d3_geom_polygonInside(d, a, b)) {
-          if (!d3_geom_polygonInside(c, a, b)) {
-            subject.push(d3_geom_polygonIntersect(c, d, a, b));
-          }
-          subject.push(d);
-        } else if (d3_geom_polygonInside(c, a, b)) {
-          subject.push(d3_geom_polygonIntersect(c, d, a, b));
-        }
-        c = d;
-      }
-      if (closed) subject.push(subject[0]);
-      a = b;
-    }
-    return subject;
-  };
-  function d3_geom_polygonInside(p, a, b) {
-    return (b[0] - a[0]) * (p[1] - a[1]) < (b[1] - a[1]) * (p[0] - a[0]);
-  }
-  function d3_geom_polygonIntersect(c, d, a, b) {
-    var x1 = c[0], x3 = a[0], x21 = d[0] - x1, x43 = b[0] - x3, y1 = c[1], y3 = a[1], y21 = d[1] - y1, y43 = b[1] - y3, ua = (x43 * (y1 - y3) - y43 * (x1 - x3)) / (y43 * x21 - x43 * y21);
-    return [ x1 + ua * x21, y1 + ua * y21 ];
-  }
-  function d3_geom_polygonClosed(coordinates) {
-    var a = coordinates[0], b = coordinates[coordinates.length - 1];
-    return !(a[0] - b[0] || a[1] - b[1]);
-  }
-  var d3_geom_voronoiEdges, d3_geom_voronoiCells, d3_geom_voronoiBeaches, d3_geom_voronoiBeachPool = [], d3_geom_voronoiFirstCircle, d3_geom_voronoiCircles, d3_geom_voronoiCirclePool = [];
-  function d3_geom_voronoiBeach() {
-    d3_geom_voronoiRedBlackNode(this);
-    this.edge = this.site = this.circle = null;
-  }
-  function d3_geom_voronoiCreateBeach(site) {
-    var beach = d3_geom_voronoiBeachPool.pop() || new d3_geom_voronoiBeach();
-    beach.site = site;
-    return beach;
-  }
-  function d3_geom_voronoiDetachBeach(beach) {
-    d3_geom_voronoiDetachCircle(beach);
-    d3_geom_voronoiBeaches.remove(beach);
-    d3_geom_voronoiBeachPool.push(beach);
-    d3_geom_voronoiRedBlackNode(beach);
-  }
-  function d3_geom_voronoiRemoveBeach(beach) {
-    var circle = beach.circle, x = circle.x, y = circle.cy, vertex = {
-      x: x,
-      y: y
-    }, previous = beach.P, next = beach.N, disappearing = [ beach ];
-    d3_geom_voronoiDetachBeach(beach);
-    var lArc = previous;
-    while (lArc.circle && abs(x - lArc.circle.x) < ε && abs(y - lArc.circle.cy) < ε) {
-      previous = lArc.P;
-      disappearing.unshift(lArc);
-      d3_geom_voronoiDetachBeach(lArc);
-      lArc = previous;
-    }
-    disappearing.unshift(lArc);
-    d3_geom_voronoiDetachCircle(lArc);
-    var rArc = next;
-    while (rArc.circle && abs(x - rArc.circle.x) < ε && abs(y - rArc.circle.cy) < ε) {
-      next = rArc.N;
-      disappearing.push(rArc);
-      d3_geom_voronoiDetachBeach(rArc);
-      rArc = next;
-    }
-    disappearing.push(rArc);
-    d3_geom_voronoiDetachCircle(rArc);
-    var nArcs = disappearing.length, iArc;
-    for (iArc = 1; iArc < nArcs; ++iArc) {
-      rArc = disappearing[iArc];
-      lArc = disappearing[iArc - 1];
-      d3_geom_voronoiSetEdgeEnd(rArc.edge, lArc.site, rArc.site, vertex);
-    }
-    lArc = disappearing[0];
-    rArc = disappearing[nArcs - 1];
-    rArc.edge = d3_geom_voronoiCreateEdge(lArc.site, rArc.site, null, vertex);
-    d3_geom_voronoiAttachCircle(lArc);
-    d3_geom_voronoiAttachCircle(rArc);
-  }
-  function d3_geom_voronoiAddBeach(site) {
-    var x = site.x, directrix = site.y, lArc, rArc, dxl, dxr, node = d3_geom_voronoiBeaches._;
-    while (node) {
-      dxl = d3_geom_voronoiLeftBreakPoint(node, directrix) - x;
-      if (dxl > ε) node = node.L; else {
-        dxr = x - d3_geom_voronoiRightBreakPoint(node, directrix);
-        if (dxr > ε) {
-          if (!node.R) {
-            lArc = node;
-            break;
-          }
-          node = node.R;
-        } else {
-          if (dxl > -ε) {
-            lArc = node.P;
-            rArc = node;
-          } else if (dxr > -ε) {
-            lArc = node;
-            rArc = node.N;
-          } else {
-            lArc = rArc = node;
-          }
-          break;
-        }
-      }
-    }
-    var newArc = d3_geom_voronoiCreateBeach(site);
-    d3_geom_voronoiBeaches.insert(lArc, newArc);
-    if (!lArc && !rArc) return;
-    if (lArc === rArc) {
-      d3_geom_voronoiDetachCircle(lArc);
-      rArc = d3_geom_voronoiCreateBeach(lArc.site);
-      d3_geom_voronoiBeaches.insert(newArc, rArc);
-      newArc.edge = rArc.edge = d3_geom_voronoiCreateEdge(lArc.site, newArc.site);
-      d3_geom_voronoiAttachCircle(lArc);
-      d3_geom_voronoiAttachCircle(rArc);
-      return;
-    }
-    if (!rArc) {
-      newArc.edge = d3_geom_voronoiCreateEdge(lArc.site, newArc.site);
-      return;
-    }
-    d3_geom_voronoiDetachCircle(lArc);
-    d3_geom_voronoiDetachCircle(rArc);
-    var lSite = lArc.site, ax = lSite.x, ay = lSite.y, bx = site.x - ax, by = site.y - ay, rSite = rArc.site, cx = rSite.x - ax, cy = rSite.y - ay, d = 2 * (bx * cy - by * cx), hb = bx * bx + by * by, hc = cx * cx + cy * cy, vertex = {
-      x: (cy * hb - by * hc) / d + ax,
-      y: (bx * hc - cx * hb) / d + ay
-    };
-    d3_geom_voronoiSetEdgeEnd(rArc.edge, lSite, rSite, vertex);
-    newArc.edge = d3_geom_voronoiCreateEdge(lSite, site, null, vertex);
-    rArc.edge = d3_geom_voronoiCreateEdge(site, rSite, null, vertex);
-    d3_geom_voronoiAttachCircle(lArc);
-    d3_geom_voronoiAttachCircle(rArc);
-  }
-  function d3_geom_voronoiLeftBreakPoint(arc, directrix) {
-    var site = arc.site, rfocx = site.x, rfocy = site.y, pby2 = rfocy - directrix;
-    if (!pby2) return rfocx;
-    var lArc = arc.P;
-    if (!lArc) return -Infinity;
-    site = lArc.site;
-    var lfocx = site.x, lfocy = site.y, plby2 = lfocy - directrix;
-    if (!plby2) return lfocx;
-    var hl = lfocx - rfocx, aby2 = 1 / pby2 - 1 / plby2, b = hl / plby2;
-    if (aby2) return (-b + Math.sqrt(b * b - 2 * aby2 * (hl * hl / (-2 * plby2) - lfocy + plby2 / 2 + rfocy - pby2 / 2))) / aby2 + rfocx;
-    return (rfocx + lfocx) / 2;
-  }
-  function d3_geom_voronoiRightBreakPoint(arc, directrix) {
-    var rArc = arc.N;
-    if (rArc) return d3_geom_voronoiLeftBreakPoint(rArc, directrix);
-    var site = arc.site;
-    return site.y === directrix ? site.x : Infinity;
-  }
-  function d3_geom_voronoiCell(site) {
-    this.site = site;
-    this.edges = [];
-  }
-  d3_geom_voronoiCell.prototype.prepare = function() {
-    var halfEdges = this.edges, iHalfEdge = halfEdges.length, edge;
-    while (iHalfEdge--) {
-      edge = halfEdges[iHalfEdge].edge;
-      if (!edge.b || !edge.a) halfEdges.splice(iHalfEdge, 1);
-    }
-    halfEdges.sort(d3_geom_voronoiHalfEdgeOrder);
-    return halfEdges.length;
-  };
-  function d3_geom_voronoiCloseCells(extent) {
-    var x0 = extent[0][0], x1 = extent[1][0], y0 = extent[0][1], y1 = extent[1][1], x2, y2, x3, y3, cells = d3_geom_voronoiCells, iCell = cells.length, cell, iHalfEdge, halfEdges, nHalfEdges, start, end;
-    while (iCell--) {
-      cell = cells[iCell];
-      if (!cell || !cell.prepare()) continue;
-      halfEdges = cell.edges;
-      nHalfEdges = halfEdges.length;
-      iHalfEdge = 0;
-      while (iHalfEdge < nHalfEdges) {
-        end = halfEdges[iHalfEdge].end(), x3 = end.x, y3 = end.y;
-        start = halfEdges[++iHalfEdge % nHalfEdges].start(), x2 = start.x, y2 = start.y;
-        if (abs(x3 - x2) > ε || abs(y3 - y2) > ε) {
-          halfEdges.splice(iHalfEdge, 0, new d3_geom_voronoiHalfEdge(d3_geom_voronoiCreateBorderEdge(cell.site, end, abs(x3 - x0) < ε && y1 - y3 > ε ? {
-            x: x0,
-            y: abs(x2 - x0) < ε ? y2 : y1
-          } : abs(y3 - y1) < ε && x1 - x3 > ε ? {
-            x: abs(y2 - y1) < ε ? x2 : x1,
-            y: y1
-          } : abs(x3 - x1) < ε && y3 - y0 > ε ? {
-            x: x1,
-            y: abs(x2 - x1) < ε ? y2 : y0
-          } : abs(y3 - y0) < ε && x3 - x0 > ε ? {
-            x: abs(y2 - y0) < ε ? x2 : x0,
-            y: y0
-          } : null), cell.site, null));
-          ++nHalfEdges;
-        }
-      }
-    }
-  }
-  function d3_geom_voronoiHalfEdgeOrder(a, b) {
-    return b.angle - a.angle;
-  }
-  function d3_geom_voronoiCircle() {
-    d3_geom_voronoiRedBlackNode(this);
-    this.x = this.y = this.arc = this.site = this.cy = null;
-  }
-  function d3_geom_voronoiAttachCircle(arc) {
-    var lArc = arc.P, rArc = arc.N;
-    if (!lArc || !rArc) return;
-    var lSite = lArc.site, cSite = arc.site, rSite = rArc.site;
-    if (lSite === rSite) return;
-    var bx = cSite.x, by = cSite.y, ax = lSite.x - bx, ay = lSite.y - by, cx = rSite.x - bx, cy = rSite.y - by;
-    var d = 2 * (ax * cy - ay * cx);
-    if (d >= -ε2) return;
-    var ha = ax * ax + ay * ay, hc = cx * cx + cy * cy, x = (cy * ha - ay * hc) / d, y = (ax * hc - cx * ha) / d, cy = y + by;
-    var circle = d3_geom_voronoiCirclePool.pop() || new d3_geom_voronoiCircle();
-    circle.arc = arc;
-    circle.site = cSite;
-    circle.x = x + bx;
-    circle.y = cy + Math.sqrt(x * x + y * y);
-    circle.cy = cy;
-    arc.circle = circle;
-    var before = null, node = d3_geom_voronoiCircles._;
-    while (node) {
-      if (circle.y < node.y || circle.y === node.y && circle.x <= node.x) {
-        if (node.L) node = node.L; else {
-          before = node.P;
-          break;
-        }
-      } else {
-        if (node.R) node = node.R; else {
-          before = node;
-          break;
-        }
-      }
-    }
-    d3_geom_voronoiCircles.insert(before, circle);
-    if (!before) d3_geom_voronoiFirstCircle = circle;
-  }
-  function d3_geom_voronoiDetachCircle(arc) {
-    var circle = arc.circle;
-    if (circle) {
-      if (!circle.P) d3_geom_voronoiFirstCircle = circle.N;
-      d3_geom_voronoiCircles.remove(circle);
-      d3_geom_voronoiCirclePool.push(circle);
-      d3_geom_voronoiRedBlackNode(circle);
-      arc.circle = null;
-    }
-  }
-  function d3_geom_voronoiClipEdges(extent) {
-    var edges = d3_geom_voronoiEdges, clip = d3_geom_clipLine(extent[0][0], extent[0][1], extent[1][0], extent[1][1]), i = edges.length, e;
-    while (i--) {
-      e = edges[i];
-      if (!d3_geom_voronoiConnectEdge(e, extent) || !clip(e) || abs(e.a.x - e.b.x) < ε && abs(e.a.y - e.b.y) < ε) {
-        e.a = e.b = null;
-        edges.splice(i, 1);
-      }
-    }
-  }
-  function d3_geom_voronoiConnectEdge(edge, extent) {
-    var vb = edge.b;
-    if (vb) return true;
-    var va = edge.a, x0 = extent[0][0], x1 = extent[1][0], y0 = extent[0][1], y1 = extent[1][1], lSite = edge.l, rSite = edge.r, lx = lSite.x, ly = lSite.y, rx = rSite.x, ry = rSite.y, fx = (lx + rx) / 2, fy = (ly + ry) / 2, fm, fb;
-    if (ry === ly) {
-      if (fx < x0 || fx >= x1) return;
-      if (lx > rx) {
-        if (!va) va = {
-          x: fx,
-          y: y0
-        }; else if (va.y >= y1) return;
-        vb = {
-          x: fx,
-          y: y1
-        };
-      } else {
-        if (!va) va = {
-          x: fx,
-          y: y1
-        }; else if (va.y < y0) return;
-        vb = {
-          x: fx,
-          y: y0
-        };
-      }
-    } else {
-      fm = (lx - rx) / (ry - ly);
-      fb = fy - fm * fx;
-      if (fm < -1 || fm > 1) {
-        if (lx > rx) {
-          if (!va) va = {
-            x: (y0 - fb) / fm,
-            y: y0
-          }; else if (va.y >= y1) return;
-          vb = {
-            x: (y1 - fb) / fm,
-            y: y1
-          };
-        } else {
-          if (!va) va = {
-            x: (y1 - fb) / fm,
-            y: y1
-          }; else if (va.y < y0) return;
-          vb = {
-            x: (y0 - fb) / fm,
-            y: y0
-          };
-        }
-      } else {
-        if (ly < ry) {
-          if (!va) va = {
-            x: x0,
-            y: fm * x0 + fb
-          }; else if (va.x >= x1) return;
-          vb = {
-            x: x1,
-            y: fm * x1 + fb
-          };
-        } else {
-          if (!va) va = {
-            x: x1,
-            y: fm * x1 + fb
-          }; else if (va.x < x0) return;
-          vb = {
-            x: x0,
-            y: fm * x0 + fb
-          };
-        }
-      }
-    }
-    edge.a = va;
-    edge.b = vb;
-    return true;
-  }
-  function d3_geom_voronoiEdge(lSite, rSite) {
-    this.l = lSite;
-    this.r = rSite;
-    this.a = this.b = null;
-  }
-  function d3_geom_voronoiCreateEdge(lSite, rSite, va, vb) {
-    var edge = new d3_geom_voronoiEdge(lSite, rSite);
-    d3_geom_voronoiEdges.push(edge);
-    if (va) d3_geom_voronoiSetEdgeEnd(edge, lSite, rSite, va);
-    if (vb) d3_geom_voronoiSetEdgeEnd(edge, rSite, lSite, vb);
-    d3_geom_voronoiCells[lSite.i].edges.push(new d3_geom_voronoiHalfEdge(edge, lSite, rSite));
-    d3_geom_voronoiCells[rSite.i].edges.push(new d3_geom_voronoiHalfEdge(edge, rSite, lSite));
-    return edge;
-  }
-  function d3_geom_voronoiCreateBorderEdge(lSite, va, vb) {
-    var edge = new d3_geom_voronoiEdge(lSite, null);
-    edge.a = va;
-    edge.b = vb;
-    d3_geom_voronoiEdges.push(edge);
-    return edge;
-  }
-  function d3_geom_voronoiSetEdgeEnd(edge, lSite, rSite, vertex) {
-    if (!edge.a && !edge.b) {
-      edge.a = vertex;
-      edge.l = lSite;
-      edge.r = rSite;
-    } else if (edge.l === rSite) {
-      edge.b = vertex;
-    } else {
-      edge.a = vertex;
-    }
-  }
-  function d3_geom_voronoiHalfEdge(edge, lSite, rSite) {
-    var va = edge.a, vb = edge.b;
-    this.edge = edge;
-    this.site = lSite;
-    this.angle = rSite ? Math.atan2(rSite.y - lSite.y, rSite.x - lSite.x) : edge.l === lSite ? Math.atan2(vb.x - va.x, va.y - vb.y) : Math.atan2(va.x - vb.x, vb.y - va.y);
-  }
-  d3_geom_voronoiHalfEdge.prototype = {
-    start: function() {
-      return this.edge.l === this.site ? this.edge.a : this.edge.b;
-    },
-    end: function() {
-      return this.edge.l === this.site ? this.edge.b : this.edge.a;
-    }
-  };
-  function d3_geom_voronoiRedBlackTree() {
-    this._ = null;
-  }
-  function d3_geom_voronoiRedBlackNode(node) {
-    node.U = node.C = node.L = node.R = node.P = node.N = null;
-  }
-  d3_geom_voronoiRedBlackTree.prototype = {
-    insert: function(after, node) {
-      var parent, grandpa, uncle;
-      if (after) {
-        node.P = after;
-        node.N = after.N;
-        if (after.N) after.N.P = node;
-        after.N = node;
-        if (after.R) {
-          after = after.R;
-          while (after.L) after = after.L;
-          after.L = node;
-        } else {
-          after.R = node;
-        }
-        parent = after;
-      } else if (this._) {
-        after = d3_geom_voronoiRedBlackFirst(this._);
-        node.P = null;
-        node.N = after;
-        after.P = after.L = node;
-        parent = after;
-      } else {
-        node.P = node.N = null;
-        this._ = node;
-        parent = null;
-      }
-      node.L = node.R = null;
-      node.U = parent;
-      node.C = true;
-      after = node;
-      while (parent && parent.C) {
-        grandpa = parent.U;
-        if (parent === grandpa.L) {
-          uncle = grandpa.R;
-          if (uncle && uncle.C) {
-            parent.C = uncle.C = false;
-            grandpa.C = true;
-            after = grandpa;
-          } else {
-            if (after === parent.R) {
-              d3_geom_voronoiRedBlackRotateLeft(this, parent);
-              after = parent;
-              parent = after.U;
-            }
-            parent.C = false;
-            grandpa.C = true;
-            d3_geom_voronoiRedBlackRotateRight(this, grandpa);
-          }
-        } else {
-          uncle = grandpa.L;
-          if (uncle && uncle.C) {
-            parent.C = uncle.C = false;
-            grandpa.C = true;
-            after = grandpa;
-          } else {
-            if (after === parent.L) {
-              d3_geom_voronoiRedBlackRotateRight(this, parent);
-              after = parent;
-              parent = after.U;
-            }
-            parent.C = false;
-            grandpa.C = true;
-            d3_geom_voronoiRedBlackRotateLeft(this, grandpa);
-          }
-        }
-        parent = after.U;
-      }
-      this._.C = false;
-    },
-    remove: function(node) {
-      if (node.N) node.N.P = node.P;
-      if (node.P) node.P.N = node.N;
-      node.N = node.P = null;
-      var parent = node.U, sibling, left = node.L, right = node.R, next, red;
-      if (!left) next = right; else if (!right) next = left; else next = d3_geom_voronoiRedBlackFirst(right);
-      if (parent) {
-        if (parent.L === node) parent.L = next; else parent.R = next;
-      } else {
-        this._ = next;
-      }
-      if (left && right) {
-        red = next.C;
-        next.C = node.C;
-        next.L = left;
-        left.U = next;
-        if (next !== right) {
-          parent = next.U;
-          next.U = node.U;
-          node = next.R;
-          parent.L = node;
-          next.R = right;
-          right.U = next;
-        } else {
-          next.U = parent;
-          parent = next;
-          node = next.R;
-        }
-      } else {
-        red = node.C;
-        node = next;
-      }
-      if (node) node.U = parent;
-      if (red) return;
-      if (node && node.C) {
-        node.C = false;
-        return;
-      }
-      do {
-        if (node === this._) break;
-        if (node === parent.L) {
-          sibling = parent.R;
-          if (sibling.C) {
-            sibling.C = false;
-            parent.C = true;
-            d3_geom_voronoiRedBlackRotateLeft(this, parent);
-            sibling = parent.R;
-          }
-          if (sibling.L && sibling.L.C || sibling.R && sibling.R.C) {
-            if (!sibling.R || !sibling.R.C) {
-              sibling.L.C = false;
-              sibling.C = true;
-              d3_geom_voronoiRedBlackRotateRight(this, sibling);
-              sibling = parent.R;
-            }
-            sibling.C = parent.C;
-            parent.C = sibling.R.C = false;
-            d3_geom_voronoiRedBlackRotateLeft(this, parent);
-            node = this._;
-            break;
-          }
-        } else {
-          sibling = parent.L;
-          if (sibling.C) {
-            sibling.C = false;
-            parent.C = true;
-            d3_geom_voronoiRedBlackRotateRight(this, parent);
-            sibling = parent.L;
-          }
-          if (sibling.L && sibling.L.C || sibling.R && sibling.R.C) {
-            if (!sibling.L || !sibling.L.C) {
-              sibling.R.C = false;
-              sibling.C = true;
-              d3_geom_voronoiRedBlackRotateLeft(this, sibling);
-              sibling = parent.L;
-            }
-            sibling.C = parent.C;
-            parent.C = sibling.L.C = false;
-            d3_geom_voronoiRedBlackRotateRight(this, parent);
-            node = this._;
-            break;
-          }
-        }
-        sibling.C = true;
-        node = parent;
-        parent = parent.U;
-      } while (!node.C);
-      if (node) node.C = false;
-    }
-  };
-  function d3_geom_voronoiRedBlackRotateLeft(tree, node) {
-    var p = node, q = node.R, parent = p.U;
-    if (parent) {
-      if (parent.L === p) parent.L = q; else parent.R = q;
-    } else {
-      tree._ = q;
-    }
-    q.U = parent;
-    p.U = q;
-    p.R = q.L;
-    if (p.R) p.R.U = p;
-    q.L = p;
-  }
-  function d3_geom_voronoiRedBlackRotateRight(tree, node) {
-    var p = node, q = node.L, parent = p.U;
-    if (parent) {
-      if (parent.L === p) parent.L = q; else parent.R = q;
-    } else {
-      tree._ = q;
-    }
-    q.U = parent;
-    p.U = q;
-    p.L = q.R;
-    if (p.L) p.L.U = p;
-    q.R = p;
-  }
-  function d3_geom_voronoiRedBlackFirst(node) {
-    while (node.L) node = node.L;
-    return node;
-  }
-  function d3_geom_voronoi(sites, bbox) {
-    var site = sites.sort(d3_geom_voronoiVertexOrder).pop(), x0, y0, circle;
-    d3_geom_voronoiEdges = [];
-    d3_geom_voronoiCells = new Array(sites.length);
-    d3_geom_voronoiBeaches = new d3_geom_voronoiRedBlackTree();
-    d3_geom_voronoiCircles = new d3_geom_voronoiRedBlackTree();
-    while (true) {
-      circle = d3_geom_voronoiFirstCircle;
-      if (site && (!circle || site.y < circle.y || site.y === circle.y && site.x < circle.x)) {
-        if (site.x !== x0 || site.y !== y0) {
-          d3_geom_voronoiCells[site.i] = new d3_geom_voronoiCell(site);
-          d3_geom_voronoiAddBeach(site);
-          x0 = site.x, y0 = site.y;
-        }
-        site = sites.pop();
-      } else if (circle) {
-        d3_geom_voronoiRemoveBeach(circle.arc);
-      } else {
-        break;
-      }
-    }
-    if (bbox) d3_geom_voronoiClipEdges(bbox), d3_geom_voronoiCloseCells(bbox);
-    var diagram = {
-      cells: d3_geom_voronoiCells,
-      edges: d3_geom_voronoiEdges
-    };
-    d3_geom_voronoiBeaches = d3_geom_voronoiCircles = d3_geom_voronoiEdges = d3_geom_voronoiCells = null;
-    return diagram;
-  }
-  function d3_geom_voronoiVertexOrder(a, b) {
-    return b.y - a.y || b.x - a.x;
-  }
-  d3.geom.voronoi = function(points) {
-    var x = d3_geom_pointX, y = d3_geom_pointY, fx = x, fy = y, clipExtent = d3_geom_voronoiClipExtent;
-    if (points) return voronoi(points);
-    function voronoi(data) {
-      var polygons = new Array(data.length), x0 = clipExtent[0][0], y0 = clipExtent[0][1], x1 = clipExtent[1][0], y1 = clipExtent[1][1];
-      d3_geom_voronoi(sites(data), clipExtent).cells.forEach(function(cell, i) {
-        var edges = cell.edges, site = cell.site, polygon = polygons[i] = edges.length ? edges.map(function(e) {
-          var s = e.start();
-          return [ s.x, s.y ];
-        }) : site.x >= x0 && site.x <= x1 && site.y >= y0 && site.y <= y1 ? [ [ x0, y1 ], [ x1, y1 ], [ x1, y0 ], [ x0, y0 ] ] : [];
-        polygon.point = data[i];
-      });
-      return polygons;
-    }
-    function sites(data) {
-      return data.map(function(d, i) {
-        return {
-          x: Math.round(fx(d, i) / ε) * ε,
-          y: Math.round(fy(d, i) / ε) * ε,
-          i: i
-        };
-      });
-    }
-    voronoi.links = function(data) {
-      return d3_geom_voronoi(sites(data)).edges.filter(function(edge) {
-        return edge.l && edge.r;
-      }).map(function(edge) {
-        return {
-          source: data[edge.l.i],
-          target: data[edge.r.i]
-        };
-      });
-    };
-    voronoi.triangles = function(data) {
-      var triangles = [];
-      d3_geom_voronoi(sites(data)).cells.forEach(function(cell, i) {
-        var site = cell.site, edges = cell.edges.sort(d3_geom_voronoiHalfEdgeOrder), j = -1, m = edges.length, e0, s0, e1 = edges[m - 1].edge, s1 = e1.l === site ? e1.r : e1.l;
-        while (++j < m) {
-          e0 = e1;
-          s0 = s1;
-          e1 = edges[j].edge;
-          s1 = e1.l === site ? e1.r : e1.l;
-          if (i < s0.i && i < s1.i && d3_geom_voronoiTriangleArea(site, s0, s1) < 0) {
-            triangles.push([ data[i], data[s0.i], data[s1.i] ]);
-          }
-        }
-      });
-      return triangles;
-    };
-    voronoi.x = function(_) {
-      return arguments.length ? (fx = d3_functor(x = _), voronoi) : x;
-    };
-    voronoi.y = function(_) {
-      return arguments.length ? (fy = d3_functor(y = _), voronoi) : y;
-    };
-    voronoi.clipExtent = function(_) {
-      if (!arguments.length) return clipExtent === d3_geom_voronoiClipExtent ? null : clipExtent;
-      clipExtent = _ == null ? d3_geom_voronoiClipExtent : _;
-      return voronoi;
-    };
-    voronoi.size = function(_) {
-      if (!arguments.length) return clipExtent === d3_geom_voronoiClipExtent ? null : clipExtent && clipExtent[1];
-      return voronoi.clipExtent(_ && [ [ 0, 0 ], _ ]);
-    };
-    return voronoi;
-  };
-  var d3_geom_voronoiClipExtent = [ [ -1e6, -1e6 ], [ 1e6, 1e6 ] ];
-  function d3_geom_voronoiTriangleArea(a, b, c) {
-    return (a.x - c.x) * (b.y - a.y) - (a.x - b.x) * (c.y - a.y);
-  }
-  d3.geom.delaunay = function(vertices) {
-    return d3.geom.voronoi().triangles(vertices);
-  };
-  d3.geom.quadtree = function(points, x1, y1, x2, y2) {
-    var x = d3_geom_pointX, y = d3_geom_pointY, compat;
-    if (compat = arguments.length) {
-      x = d3_geom_quadtreeCompatX;
-      y = d3_geom_quadtreeCompatY;
-      if (compat === 3) {
-        y2 = y1;
-        x2 = x1;
-        y1 = x1 = 0;
-      }
-      return quadtree(points);
-    }
-    function quadtree(data) {
-      var d, fx = d3_functor(x), fy = d3_functor(y), xs, ys, i, n, x1_, y1_, x2_, y2_;
-      if (x1 != null) {
-        x1_ = x1, y1_ = y1, x2_ = x2, y2_ = y2;
-      } else {
-        x2_ = y2_ = -(x1_ = y1_ = Infinity);
-        xs = [], ys = [];
-        n = data.length;
-        if (compat) for (i = 0; i < n; ++i) {
-          d = data[i];
-          if (d.x < x1_) x1_ = d.x;
-          if (d.y < y1_) y1_ = d.y;
-          if (d.x > x2_) x2_ = d.x;
-          if (d.y > y2_) y2_ = d.y;
-          xs.push(d.x);
-          ys.push(d.y);
-        } else for (i = 0; i < n; ++i) {
-          var x_ = +fx(d = data[i], i), y_ = +fy(d, i);
-          if (x_ < x1_) x1_ = x_;
-          if (y_ < y1_) y1_ = y_;
-          if (x_ > x2_) x2_ = x_;
-          if (y_ > y2_) y2_ = y_;
-          xs.push(x_);
-          ys.push(y_);
-        }
-      }
-      var dx = x2_ - x1_, dy = y2_ - y1_;
-      if (dx > dy) y2_ = y1_ + dx; else x2_ = x1_ + dy;
-      function insert(n, d, x, y, x1, y1, x2, y2) {
-        if (isNaN(x) || isNaN(y)) return;
-        if (n.leaf) {
-          var nx = n.x, ny = n.y;
-          if (nx != null) {
-            if (abs(nx - x) + abs(ny - y) < .01) {
-              insertChild(n, d, x, y, x1, y1, x2, y2);
-            } else {
-              var nPoint = n.point;
-              n.x = n.y = n.point = null;
-              insertChild(n, nPoint, nx, ny, x1, y1, x2, y2);
-              insertChild(n, d, x, y, x1, y1, x2, y2);
-            }
-          } else {
-            n.x = x, n.y = y, n.point = d;
-          }
-        } else {
-          insertChild(n, d, x, y, x1, y1, x2, y2);
-        }
-      }
-      function insertChild(n, d, x, y, x1, y1, x2, y2) {
-        var sx = (x1 + x2) * .5, sy = (y1 + y2) * .5, right = x >= sx, bottom = y >= sy, i = (bottom << 1) + right;
-        n.leaf = false;
-        n = n.nodes[i] || (n.nodes[i] = d3_geom_quadtreeNode());
-        if (right) x1 = sx; else x2 = sx;
-        if (bottom) y1 = sy; else y2 = sy;
-        insert(n, d, x, y, x1, y1, x2, y2);
-      }
-      var root = d3_geom_quadtreeNode();
-      root.add = function(d) {
-        insert(root, d, +fx(d, ++i), +fy(d, i), x1_, y1_, x2_, y2_);
-      };
-      root.visit = function(f) {
-        d3_geom_quadtreeVisit(f, root, x1_, y1_, x2_, y2_);
-      };
-      i = -1;
-      if (x1 == null) {
-        while (++i < n) {
-          insert(root, data[i], xs[i], ys[i], x1_, y1_, x2_, y2_);
-        }
-        --i;
-      } else data.forEach(root.add);
-      xs = ys = data = d = null;
-      return root;
-    }
-    quadtree.x = function(_) {
-      return arguments.length ? (x = _, quadtree) : x;
-    };
-    quadtree.y = function(_) {
-      return arguments.length ? (y = _, quadtree) : y;
-    };
-    quadtree.extent = function(_) {
-      if (!arguments.length) return x1 == null ? null : [ [ x1, y1 ], [ x2, y2 ] ];
-      if (_ == null) x1 = y1 = x2 = y2 = null; else x1 = +_[0][0], y1 = +_[0][1], x2 = +_[1][0], 
-      y2 = +_[1][1];
-      return quadtree;
-    };
-    quadtree.size = function(_) {
-      if (!arguments.length) return x1 == null ? null : [ x2 - x1, y2 - y1 ];
-      if (_ == null) x1 = y1 = x2 = y2 = null; else x1 = y1 = 0, x2 = +_[0], y2 = +_[1];
-      return quadtree;
-    };
-    return quadtree;
-  };
-  function d3_geom_quadtreeCompatX(d) {
-    return d.x;
-  }
-  function d3_geom_quadtreeCompatY(d) {
-    return d.y;
-  }
-  function d3_geom_quadtreeNode() {
-    return {
-      leaf: true,
-      nodes: [],
-      point: null,
-      x: null,
-      y: null
-    };
-  }
-  function d3_geom_quadtreeVisit(f, node, x1, y1, x2, y2) {
-    if (!f(node, x1, y1, x2, y2)) {
-      var sx = (x1 + x2) * .5, sy = (y1 + y2) * .5, children = node.nodes;
-      if (children[0]) d3_geom_quadtreeVisit(f, children[0], x1, y1, sx, sy);
-      if (children[1]) d3_geom_quadtreeVisit(f, children[1], sx, y1, x2, sy);
-      if (children[2]) d3_geom_quadtreeVisit(f, children[2], x1, sy, sx, y2);
-      if (children[3]) d3_geom_quadtreeVisit(f, children[3], sx, sy, x2, y2);
-    }
-  }
-  d3.interpolateRgb = d3_interpolateRgb;
-  function d3_interpolateRgb(a, b) {
-    a = d3.rgb(a);
-    b = d3.rgb(b);
-    var ar = a.r, ag = a.g, ab = a.b, br = b.r - ar, bg = b.g - ag, bb = b.b - ab;
-    return function(t) {
-      return "#" + d3_rgb_hex(Math.round(ar + br * t)) + d3_rgb_hex(Math.round(ag + bg * t)) + d3_rgb_hex(Math.round(ab + bb * t));
-    };
-  }
-  d3.interpolateObject = d3_interpolateObject;
-  function d3_interpolateObject(a, b) {
-    var i = {}, c = {}, k;
-    for (k in a) {
-      if (k in b) {
-        i[k] = d3_interpolate(a[k], b[k]);
-      } else {
-        c[k] = a[k];
-      }
-    }
-    for (k in b) {
-      if (!(k in a)) {
-        c[k] = b[k];
-      }
-    }
-    return function(t) {
-      for (k in i) c[k] = i[k](t);
-      return c;
-    };
-  }
-  d3.interpolateNumber = d3_interpolateNumber;
-  function d3_interpolateNumber(a, b) {
-    b -= a = +a;
-    return function(t) {
-      return a + b * t;
-    };
-  }
-  d3.interpolateString = d3_interpolateString;
-  function d3_interpolateString(a, b) {
-    var m, i, j, s0 = 0, s1 = 0, s = [], q = [], n, o;
-    a = a + "", b = b + "";
-    d3_interpolate_number.lastIndex = 0;
-    for (i = 0; m = d3_interpolate_number.exec(b); ++i) {
-      if (m.index) s.push(b.substring(s0, s1 = m.index));
-      q.push({
-        i: s.length,
-        x: m[0]
-      });
-      s.push(null);
-      s0 = d3_interpolate_number.lastIndex;
-    }
-    if (s0 < b.length) s.push(b.substring(s0));
-    for (i = 0, n = q.length; (m = d3_interpolate_number.exec(a)) && i < n; ++i) {
-      o = q[i];
-      if (o.x == m[0]) {
-        if (o.i) {
-          if (s[o.i + 1] == null) {
-            s[o.i - 1] += o.x;
-            s.splice(o.i, 1);
-            for (j = i + 1; j < n; ++j) q[j].i--;
-          } else {
-            s[o.i - 1] += o.x + s[o.i + 1];
-            s.splice(o.i, 2);
-            for (j = i + 1; j < n; ++j) q[j].i -= 2;
-          }
-        } else {
-          if (s[o.i + 1] == null) {
-            s[o.i] = o.x;
-          } else {
-            s[o.i] = o.x + s[o.i + 1];
-            s.splice(o.i + 1, 1);
-            for (j = i + 1; j < n; ++j) q[j].i--;
-          }
-        }
-        q.splice(i, 1);
-        n--;
-        i--;
-      } else {
-        o.x = d3_interpolateNumber(parseFloat(m[0]), parseFloat(o.x));
-      }
-    }
-    while (i < n) {
-      o = q.pop();
-      if (s[o.i + 1] == null) {
-        s[o.i] = o.x;
-      } else {
-        s[o.i] = o.x + s[o.i + 1];
-        s.splice(o.i + 1, 1);
-      }
-      n--;
-    }
-    if (s.length === 1) {
-      return s[0] == null ? (o = q[0].x, function(t) {
-        return o(t) + "";
-      }) : function() {
-        return b;
-      };
-    }
-    return function(t) {
-      for (i = 0; i < n; ++i) s[(o = q[i]).i] = o.x(t);
-      return s.join("");
-    };
-  }
-  var d3_interpolate_number = /[-+]?(?:\d+\.?\d*|\.?\d+)(?:[eE][-+]?\d+)?/g;
-  d3.interpolate = d3_interpolate;
-  function d3_interpolate(a, b) {
-    var i = d3.interpolators.length, f;
-    while (--i >= 0 && !(f = d3.interpolators[i](a, b))) ;
-    return f;
-  }
-  d3.interpolators = [ function(a, b) {
-    var t = typeof b;
-    return (t === "string" ? d3_rgb_names.has(b) || /^(#|rgb\(|hsl\()/.test(b) ? d3_interpolateRgb : d3_interpolateString : b instanceof d3_Color ? d3_interpolateRgb : Array.isArray(b) ? d3_interpolateArray : t === "object" && isNaN(b) ? d3_interpolateObject : d3_interpolateNumber)(a, b);
-  } ];
-  d3.interpolateArray = d3_interpolateArray;
-  function d3_interpolateArray(a, b) {
-    var x = [], c = [], na = a.length, nb = b.length, n0 = Math.min(a.length, b.length), i;
-    for (i = 0; i < n0; ++i) x.push(d3_interpolate(a[i], b[i]));
-    for (;i < na; ++i) c[i] = a[i];
-    for (;i < nb; ++i) c[i] = b[i];
-    return function(t) {
-      for (i = 0; i < n0; ++i) c[i] = x[i](t);
-      return c;
-    };
-  }
-  var d3_ease_default = function() {
-    return d3_identity;
-  };
-  var d3_ease = d3.map({
-    linear: d3_ease_default,
-    poly: d3_ease_poly,
-    quad: function() {
-      return d3_ease_quad;
-    },
-    cubic: function() {
-      return d3_ease_cubic;
-    },
-    sin: function() {
-      return d3_ease_sin;
-    },
-    exp: function() {
-      return d3_ease_exp;
-    },
-    circle: function() {
-      return d3_ease_circle;
-    },
-    elastic: d3_ease_elastic,
-    back: d3_ease_back,
-    bounce: function() {
-      return d3_ease_bounce;
-    }
-  });
-  var d3_ease_mode = d3.map({
-    "in": d3_identity,
-    out: d3_ease_reverse,
-    "in-out": d3_ease_reflect,
-    "out-in": function(f) {
-      return d3_ease_reflect(d3_ease_reverse(f));
-    }
-  });
-  d3.ease = function(name) {
-    var i = name.indexOf("-"), t = i >= 0 ? name.substring(0, i) : name, m = i >= 0 ? name.substring(i + 1) : "in";
-    t = d3_ease.get(t) || d3_ease_default;
-    m = d3_ease_mode.get(m) || d3_identity;
-    return d3_ease_clamp(m(t.apply(null, d3_arraySlice.call(arguments, 1))));
-  };
-  function d3_ease_clamp(f) {
-    return function(t) {
-      return t <= 0 ? 0 : t >= 1 ? 1 : f(t);
-    };
-  }
-  function d3_ease_reverse(f) {
-    return function(t) {
-      return 1 - f(1 - t);
-    };
-  }
-  function d3_ease_reflect(f) {
-    return function(t) {
-      return .5 * (t < .5 ? f(2 * t) : 2 - f(2 - 2 * t));
-    };
-  }
-  function d3_ease_quad(t) {
-    return t * t;
-  }
-  function d3_ease_cubic(t) {
-    return t * t * t;
-  }
-  function d3_ease_cubicInOut(t) {
-    if (t <= 0) return 0;
-    if (t >= 1) return 1;
-    var t2 = t * t, t3 = t2 * t;
-    return 4 * (t < .5 ? t3 : 3 * (t - t2) + t3 - .75);
-  }
-  function d3_ease_poly(e) {
-    return function(t) {
-      return Math.pow(t, e);
-    };
-  }
-  function d3_ease_sin(t) {
-    return 1 - Math.cos(t * halfπ);
-  }
-  function d3_ease_exp(t) {
-    return Math.pow(2, 10 * (t - 1));
-  }
-  function d3_ease_circle(t) {
-    return 1 - Math.sqrt(1 - t * t);
-  }
-  function d3_ease_elastic(a, p) {
-    var s;
-    if (arguments.length < 2) p = .45;
-    if (arguments.length) s = p / τ * Math.asin(1 / a); else a = 1, s = p / 4;
-    return function(t) {
-      return 1 + a * Math.pow(2, -10 * t) * Math.sin((t - s) * τ / p);
-    };
-  }
-  function d3_ease_back(s) {
-    if (!s) s = 1.70158;
-    return function(t) {
-      return t * t * ((s + 1) * t - s);
-    };
-  }
-  function d3_ease_bounce(t) {
-    return t < 1 / 2.75 ? 7.5625 * t * t : t < 2 / 2.75 ? 7.5625 * (t -= 1.5 / 2.75) * t + .75 : t < 2.5 / 2.75 ? 7.5625 * (t -= 2.25 / 2.75) * t + .9375 : 7.5625 * (t -= 2.625 / 2.75) * t + .984375;
-  }
-  d3.interpolateHcl = d3_interpolateHcl;
-  function d3_interpolateHcl(a, b) {
-    a = d3.hcl(a);
-    b = d3.hcl(b);
-    var ah = a.h, ac = a.c, al = a.l, bh = b.h - ah, bc = b.c - ac, bl = b.l - al;
-    if (isNaN(bc)) bc = 0, ac = isNaN(ac) ? b.c : ac;
-    if (isNaN(bh)) bh = 0, ah = isNaN(ah) ? b.h : ah; else if (bh > 180) bh -= 360; else if (bh < -180) bh += 360;
-    return function(t) {
-      return d3_hcl_lab(ah + bh * t, ac + bc * t, al + bl * t) + "";
-    };
-  }
-  d3.interpolateHsl = d3_interpolateHsl;
-  function d3_interpolateHsl(a, b) {
-    a = d3.hsl(a);
-    b = d3.hsl(b);
-    var ah = a.h, as = a.s, al = a.l, bh = b.h - ah, bs = b.s - as, bl = b.l - al;
-    if (isNaN(bs)) bs = 0, as = isNaN(as) ? b.s : as;
-    if (isNaN(bh)) bh = 0, ah = isNaN(ah) ? b.h : ah; else if (bh > 180) bh -= 360; else if (bh < -180) bh += 360;
-    return function(t) {
-      return d3_hsl_rgb(ah + bh * t, as + bs * t, al + bl * t) + "";
-    };
-  }
-  d3.interpolateLab = d3_interpolateLab;
-  function d3_interpolateLab(a, b) {
-    a = d3.lab(a);
-    b = d3.lab(b);
-    var al = a.l, aa = a.a, ab = a.b, bl = b.l - al, ba = b.a - aa, bb = b.b - ab;
-    return function(t) {
-      return d3_lab_rgb(al + bl * t, aa + ba * t, ab + bb * t) + "";
-    };
-  }
-  d3.interpolateRound = d3_interpolateRound;
-  function d3_interpolateRound(a, b) {
-    b -= a;
-    return function(t) {
-      return Math.round(a + b * t);
-    };
-  }
-  d3.transform = function(string) {
-    var g = d3_document.createElementNS(d3.ns.prefix.svg, "g");
-    return (d3.transform = function(string) {
-      if (string != null) {
-        g.setAttribute("transform", string);
-        var t = g.transform.baseVal.consolidate();
-      }
-      return new d3_transform(t ? t.matrix : d3_transformIdentity);
-    })(string);
-  };
-  function d3_transform(m) {
-    var r0 = [ m.a, m.b ], r1 = [ m.c, m.d ], kx = d3_transformNormalize(r0), kz = d3_transformDot(r0, r1), ky = d3_transformNormalize(d3_transformCombine(r1, r0, -kz)) || 0;
-    if (r0[0] * r1[1] < r1[0] * r0[1]) {
-      r0[0] *= -1;
-      r0[1] *= -1;
-      kx *= -1;
-      kz *= -1;
-    }
-    this.rotate = (kx ? Math.atan2(r0[1], r0[0]) : Math.atan2(-r1[0], r1[1])) * d3_degrees;
-    this.translate = [ m.e, m.f ];
-    this.scale = [ kx, ky ];
-    this.skew = ky ? Math.atan2(kz, ky) * d3_degrees : 0;
-  }
-  d3_transform.prototype.toString = function() {
-    return "translate(" + this.translate + ")rotate(" + this.rotate + ")skewX(" + this.skew + ")scale(" + this.scale + ")";
-  };
-  function d3_transformDot(a, b) {
-    return a[0] * b[0] + a[1] * b[1];
-  }
-  function d3_transformNormalize(a) {
-    var k = Math.sqrt(d3_transformDot(a, a));
-    if (k) {
-      a[0] /= k;
-      a[1] /= k;
-    }
-    return k;
-  }
-  function d3_transformCombine(a, b, k) {
-    a[0] += k * b[0];
-    a[1] += k * b[1];
-    return a;
-  }
-  var d3_transformIdentity = {
-    a: 1,
-    b: 0,
-    c: 0,
-    d: 1,
-    e: 0,
-    f: 0
-  };
-  d3.interpolateTransform = d3_interpolateTransform;
-  function d3_interpolateTransform(a, b) {
-    var s = [], q = [], n, A = d3.transform(a), B = d3.transform(b), ta = A.translate, tb = B.translate, ra = A.rotate, rb = B.rotate, wa = A.skew, wb = B.skew, ka = A.scale, kb = B.scale;
-    if (ta[0] != tb[0] || ta[1] != tb[1]) {
-      s.push("translate(", null, ",", null, ")");
-      q.push({
-        i: 1,
-        x: d3_interpolateNumber(ta[0], tb[0])
-      }, {
-        i: 3,
-        x: d3_interpolateNumber(ta[1], tb[1])
-      });
-    } else if (tb[0] || tb[1]) {
-      s.push("translate(" + tb + ")");
-    } else {
-      s.push("");
-    }
-    if (ra != rb) {
-      if (ra - rb > 180) rb += 360; else if (rb - ra > 180) ra += 360;
-      q.push({
-        i: s.push(s.pop() + "rotate(", null, ")") - 2,
-        x: d3_interpolateNumber(ra, rb)
-      });
-    } else if (rb) {
-      s.push(s.pop() + "rotate(" + rb + ")");
-    }
-    if (wa != wb) {
-      q.push({
-        i: s.push(s.pop() + "skewX(", null, ")") - 2,
-        x: d3_interpolateNumber(wa, wb)
-      });
-    } else if (wb) {
-      s.push(s.pop() + "skewX(" + wb + ")");
-    }
-    if (ka[0] != kb[0] || ka[1] != kb[1]) {
-      n = s.push(s.pop() + "scale(", null, ",", null, ")");
-      q.push({
-        i: n - 4,
-        x: d3_interpolateNumber(ka[0], kb[0])
-      }, {
-        i: n - 2,
-        x: d3_interpolateNumber(ka[1], kb[1])
-      });
-    } else if (kb[0] != 1 || kb[1] != 1) {
-      s.push(s.pop() + "scale(" + kb + ")");
-    }
-    n = q.length;
-    return function(t) {
-      var i = -1, o;
-      while (++i < n) s[(o = q[i]).i] = o.x(t);
-      return s.join("");
-    };
-  }
-  function d3_uninterpolateNumber(a, b) {
-    b = b - (a = +a) ? 1 / (b - a) : 0;
-    return function(x) {
-      return (x - a) * b;
-    };
-  }
-  function d3_uninterpolateClamp(a, b) {
-    b = b - (a = +a) ? 1 / (b - a) : 0;
-    return function(x) {
-      return Math.max(0, Math.min(1, (x - a) * b));
-    };
-  }
-  d3.layout = {};
-  d3.layout.bundle = function() {
-    return function(links) {
-      var paths = [], i = -1, n = links.length;
-      while (++i < n) paths.push(d3_layout_bundlePath(links[i]));
-      return paths;
-    };
-  };
-  function d3_layout_bundlePath(link) {
-    var start = link.source, end = link.target, lca = d3_layout_bundleLeastCommonAncestor(start, end), points = [ start ];
-    while (start !== lca) {
-      start = start.parent;
-      points.push(start);
-    }
-    var k = points.length;
-    while (end !== lca) {
-      points.splice(k, 0, end);
-      end = end.parent;
-    }
-    return points;
-  }
-  function d3_layout_bundleAncestors(node) {
-    var ancestors = [], parent = node.parent;
-    while (parent != null) {
-      ancestors.push(node);
-      node = parent;
-      parent = parent.parent;
-    }
-    ancestors.push(node);
-    return ancestors;
-  }
-  function d3_layout_bundleLeastCommonAncestor(a, b) {
-    if (a === b) return a;
-    var aNodes = d3_layout_bundleAncestors(a), bNodes = d3_layout_bundleAncestors(b), aNode = aNodes.pop(), bNode = bNodes.pop(), sharedNode = null;
-    while (aNode === bNode) {
-      sharedNode = aNode;
-      aNode = aNodes.pop();
-      bNode = bNodes.pop();
-    }
-    return sharedNode;
-  }
-  d3.layout.chord = function() {
-    var chord = {}, chords, groups, matrix, n, padding = 0, sortGroups, sortSubgroups, sortChords;
-    function relayout() {
-      var subgroups = {}, groupSums = [], groupIndex = d3.range(n), subgroupIndex = [], k, x, x0, i, j;
-      chords = [];
-      groups = [];
-      k = 0, i = -1;
-      while (++i < n) {
-        x = 0, j = -1;
-        while (++j < n) {
-          x += matrix[i][j];
-        }
-        groupSums.push(x);
-        subgroupIndex.push(d3.range(n));
-        k += x;
-      }
-      if (sortGroups) {
-        groupIndex.sort(function(a, b) {
-          return sortGroups(groupSums[a], groupSums[b]);
-        });
-      }
-      if (sortSubgroups) {
-        subgroupIndex.forEach(function(d, i) {
-          d.sort(function(a, b) {
-            return sortSubgroups(matrix[i][a], matrix[i][b]);
-          });
-        });
-      }
-      k = (τ - padding * n) / k;
-      x = 0, i = -1;
-      while (++i < n) {
-        x0 = x, j = -1;
-        while (++j < n) {
-          var di = groupIndex[i], dj = subgroupIndex[di][j], v = matrix[di][dj], a0 = x, a1 = x += v * k;
-          subgroups[di + "-" + dj] = {
-            index: di,
-            subindex: dj,
-            startAngle: a0,
-            endAngle: a1,
-            value: v
-          };
-        }
-        groups[di] = {
-          index: di,
-          startAngle: x0,
-          endAngle: x,
-          value: (x - x0) / k
-        };
-        x += padding;
-      }
-      i = -1;
-      while (++i < n) {
-        j = i - 1;
-        while (++j < n) {
-          var source = subgroups[i + "-" + j], target = subgroups[j + "-" + i];
-          if (source.value || target.value) {
-            chords.push(source.value < target.value ? {
-              source: target,
-              target: source
-            } : {
-              source: source,
-              target: target
-            });
-          }
-        }
-      }
-      if (sortChords) resort();
-    }
-    function resort() {
-      chords.sort(function(a, b) {
-        return sortChords((a.source.value + a.target.value) / 2, (b.source.value + b.target.value) / 2);
-      });
-    }
-    chord.matrix = function(x) {
-      if (!arguments.length) return matrix;
-      n = (matrix = x) && matrix.length;
-      chords = groups = null;
-      return chord;
-    };
-    chord.padding = function(x) {
-      if (!arguments.length) return padding;
-      padding = x;
-      chords = groups = null;
-      return chord;
-    };
-    chord.sortGroups = function(x) {
-      if (!arguments.length) return sortGroups;
-      sortGroups = x;
-      chords = groups = null;
-      return chord;
-    };
-    chord.sortSubgroups = function(x) {
-      if (!arguments.length) return sortSubgroups;
-      sortSubgroups = x;
-      chords = null;
-      return chord;
-    };
-    chord.sortChords = function(x) {
-      if (!arguments.length) return sortChords;
-      sortChords = x;
-      if (chords) resort();
-      return chord;
-    };
-    chord.chords = function() {
-      if (!chords) relayout();
-      return chords;
-    };
-    chord.groups = function() {
-      if (!groups) relayout();
-      return groups;
-    };
-    return chord;
-  };
-  d3.layout.force = function() {
-    var force = {}, event = d3.dispatch("start", "tick", "end"), size = [ 1, 1 ], drag, alpha, friction = .9, linkDistance = d3_layout_forceLinkDistance, linkStrength = d3_layout_forceLinkStrength, charge = -30, chargeDistance2 = d3_layout_forceChargeDistance2, gravity = .1, theta2 = .64, nodes = [], links = [], distances, strengths, charges;
-    function repulse(node) {
-      return function(quad, x1, _, x2) {
-        if (quad.point !== node) {
-          var dx = quad.cx - node.x, dy = quad.cy - node.y, dw = x2 - x1, dn = dx * dx + dy * dy;
-          if (dw * dw / theta2 < dn) {
-            if (dn < chargeDistance2) {
-              var k = quad.charge / dn;
-              node.px -= dx * k;
-              node.py -= dy * k;
-            }
-            return true;
-          }
-          if (quad.point && dn && dn < chargeDistance2) {
-            var k = quad.pointCharge / dn;
-            node.px -= dx * k;
-            node.py -= dy * k;
-          }
-        }
-        return !quad.charge;
-      };
-    }
-    force.tick = function() {
-      if ((alpha *= .99) < .005) {
-        event.end({
-          type: "end",
-          alpha: alpha = 0
-        });
-        return true;
-      }
-      var n = nodes.length, m = links.length, q, i, o, s, t, l, k, x, y;
-      for (i = 0; i < m; ++i) {
-        o = links[i];
-        s = o.source;
-        t = o.target;
-        x = t.x - s.x;
-        y = t.y - s.y;
-        if (l = x * x + y * y) {
-          l = alpha * strengths[i] * ((l = Math.sqrt(l)) - distances[i]) / l;
-          x *= l;
-          y *= l;
-          t.x -= x * (k = s.weight / (t.weight + s.weight));
-          t.y -= y * k;
-          s.x += x * (k = 1 - k);
-          s.y += y * k;
-        }
-      }
-      if (k = alpha * gravity) {
-        x = size[0] / 2;
-        y = size[1] / 2;
-        i = -1;
-        if (k) while (++i < n) {
-          o = nodes[i];
-          o.x += (x - o.x) * k;
-          o.y += (y - o.y) * k;
-        }
-      }
-      if (charge) {
-        d3_layout_forceAccumulate(q = d3.geom.quadtree(nodes), alpha, charges);
-        i = -1;
-        while (++i < n) {
-          if (!(o = nodes[i]).fixed) {
-            q.visit(repulse(o));
-          }
-        }
-      }
-      i = -1;
-      while (++i < n) {
-        o = nodes[i];
-        if (o.fixed) {
-          o.x = o.px;
-          o.y = o.py;
-        } else {
-          o.x -= (o.px - (o.px = o.x)) * friction;
-          o.y -= (o.py - (o.py = o.y)) * friction;
-        }
-      }
-      event.tick({
-        type: "tick",
-        alpha: alpha
-      });
-    };
-    force.nodes = function(x) {
-      if (!arguments.length) return nodes;
-      nodes = x;
-      return force;
-    };
-    force.links = function(x) {
-      if (!arguments.length) return links;
-      links = x;
-      return force;
-    };
-    force.size = function(x) {
-      if (!arguments.length) return size;
-      size = x;
-      return force;
-    };
-    force.linkDistance = function(x) {
-      if (!arguments.length) return linkDistance;
-      linkDistance = typeof x === "function" ? x : +x;
-      return force;
-    };
-    force.distance = force.linkDistance;
-    force.linkStrength = function(x) {
-      if (!arguments.length) return linkStrength;
-      linkStrength = typeof x === "function" ? x : +x;
-      return force;
-    };
-    force.friction = function(x) {
-      if (!arguments.length) return friction;
-      friction = +x;
-      return force;
-    };
-    force.charge = function(x) {
-      if (!arguments.length) return charge;
-      charge = typeof x === "function" ? x : +x;
-      return force;
-    };
-    force.chargeDistance = function(x) {
-      if (!arguments.length) return Math.sqrt(chargeDistance2);
-      chargeDistance2 = x * x;
-      return force;
-    };
-    force.gravity = function(x) {
-      if (!arguments.length) return gravity;
-      gravity = +x;
-      return force;
-    };
-    force.theta = function(x) {
-      if (!arguments.length) return Math.sqrt(theta2);
-      theta2 = x * x;
-      return force;
-    };
-    force.alpha = function(x) {
-      if (!arguments.length) return alpha;
-      x = +x;
-      if (alpha) {
-        if (x > 0) alpha = x; else alpha = 0;
-      } else if (x > 0) {
-        event.start({
-          type: "start",
-          alpha: alpha = x
-        });
-        d3.timer(force.tick);
-      }
-      return force;
-    };
-    force.start = function() {
-      var i, n = nodes.length, m = links.length, w = size[0], h = size[1], neighbors, o;
-      for (i = 0; i < n; ++i) {
-        (o = nodes[i]).index = i;
-        o.weight = 0;
-      }
-      for (i = 0; i < m; ++i) {
-        o = links[i];
-        if (typeof o.source == "number") o.source = nodes[o.source];
-        if (typeof o.target == "number") o.target = nodes[o.target];
-        ++o.source.weight;
-        ++o.target.weight;
-      }
-      for (i = 0; i < n; ++i) {
-        o = nodes[i];
-        if (isNaN(o.x)) o.x = position("x", w);
-        if (isNaN(o.y)) o.y = position("y", h);
-        if (isNaN(o.px)) o.px = o.x;
-        if (isNaN(o.py)) o.py = o.y;
-      }
-      distances = [];
-      if (typeof linkDistance === "function") for (i = 0; i < m; ++i) distances[i] = +linkDistance.call(this, links[i], i); else for (i = 0; i < m; ++i) distances[i] = linkDistance;
-      strengths = [];
-      if (typeof linkStrength === "function") for (i = 0; i < m; ++i) strengths[i] = +linkStrength.call(this, links[i], i); else for (i = 0; i < m; ++i) strengths[i] = linkStrength;
-      charges = [];
-      if (typeof charge === "function") for (i = 0; i < n; ++i) charges[i] = +charge.call(this, nodes[i], i); else for (i = 0; i < n; ++i) charges[i] = charge;
-      function position(dimension, size) {
-        if (!neighbors) {
-          neighbors = new Array(n);
-          for (j = 0; j < n; ++j) {
-            neighbors[j] = [];
-          }
-          for (j = 0; j < m; ++j) {
-            var o = links[j];
-            neighbors[o.source.index].push(o.target);
-            neighbors[o.target.index].push(o.source);
-          }
-        }
-        var candidates = neighbors[i], j = -1, m = candidates.length, x;
-        while (++j < m) if (!isNaN(x = candidates[j][dimension])) return x;
-        return Math.random() * size;
-      }
-      return force.resume();
-    };
-    force.resume = function() {
-      return force.alpha(.1);
-    };
-    force.stop = function() {
-      return force.alpha(0);
-    };
-    force.drag = function() {
-      if (!drag) drag = d3.behavior.drag().origin(d3_identity).on("dragstart.force", d3_layout_forceDragstart).on("drag.force", dragmove).on("dragend.force", d3_layout_forceDragend);
-      if (!arguments.length) return drag;
-      this.on("mouseover.force", d3_layout_forceMouseover).on("mouseout.force", d3_layout_forceMouseout).call(drag);
-    };
-    function dragmove(d) {
-      d.px = d3.event.x, d.py = d3.event.y;
-      force.resume();
-    }
-    return d3.rebind(force, event, "on");
-  };
-  function d3_layout_forceDragstart(d) {
-    d.fixed |= 2;
-  }
-  function d3_layout_forceDragend(d) {
-    d.fixed &= ~6;
-  }
-  function d3_layout_forceMouseover(d) {
-    d.fixed |= 4;
-    d.px = d.x, d.py = d.y;
-  }
-  function d3_layout_forceMouseout(d) {
-    d.fixed &= ~4;
-  }
-  function d3_layout_forceAccumulate(quad, alpha, charges) {
-    var cx = 0, cy = 0;
-    quad.charge = 0;
-    if (!quad.leaf) {
-      var nodes = quad.nodes, n = nodes.length, i = -1, c;
-      while (++i < n) {
-        c = nodes[i];
-        if (c == null) continue;
-        d3_layout_forceAccumulate(c, alpha, charges);
-        quad.charge += c.charge;
-        cx += c.charge * c.cx;
-        cy += c.charge * c.cy;
-      }
-    }
-    if (quad.point) {
-      if (!quad.leaf) {
-        quad.point.x += Math.random() - .5;
-        quad.point.y += Math.random() - .5;
-      }
-      var k = alpha * charges[quad.point.index];
-      quad.charge += quad.pointCharge = k;
-      cx += k * quad.point.x;
-      cy += k * quad.point.y;
-    }
-    quad.cx = cx / quad.charge;
-    quad.cy = cy / quad.charge;
-  }
-  var d3_layout_forceLinkDistance = 20, d3_layout_forceLinkStrength = 1, d3_layout_forceChargeDistance2 = Infinity;
-  d3.layout.hierarchy = function() {
-    var sort = d3_layout_hierarchySort, children = d3_layout_hierarchyChildren, value = d3_layout_hierarchyValue;
-    function recurse(node, depth, nodes) {
-      var childs = children.call(hierarchy, node, depth);
-      node.depth = depth;
-      nodes.push(node);
-      if (childs && (n = childs.length)) {
-        var i = -1, n, c = node.children = new Array(n), v = 0, j = depth + 1, d;
-        while (++i < n) {
-          d = c[i] = recurse(childs[i], j, nodes);
-          d.parent = node;
-          v += d.value;
-        }
-        if (sort) c.sort(sort);
-        if (value) node.value = v;
-      } else {
-        delete node.children;
-        if (value) {
-          node.value = +value.call(hierarchy, node, depth) || 0;
-        }
-      }
-      return node;
-    }
-    function revalue(node, depth) {
-      var children = node.children, v = 0;
-      if (children && (n = children.length)) {
-        var i = -1, n, j = depth + 1;
-        while (++i < n) v += revalue(children[i], j);
-      } else if (value) {
-        v = +value.call(hierarchy, node, depth) || 0;
-      }
-      if (value) node.value = v;
-      return v;
-    }
-    function hierarchy(d) {
-      var nodes = [];
-      recurse(d, 0, nodes);
-      return nodes;
-    }
-    hierarchy.sort = function(x) {
-      if (!arguments.length) return sort;
-      sort = x;
-      return hierarchy;
-    };
-    hierarchy.children = function(x) {
-      if (!arguments.length) return children;
-      children = x;
-      return hierarchy;
-    };
-    hierarchy.value = function(x) {
-      if (!arguments.length) return value;
-      value = x;
-      return hierarchy;
-    };
-    hierarchy.revalue = function(root) {
-      revalue(root, 0);
-      return root;
-    };
-    return hierarchy;
-  };
-  function d3_layout_hierarchyRebind(object, hierarchy) {
-    d3.rebind(object, hierarchy, "sort", "children", "value");
-    object.nodes = object;
-    object.links = d3_layout_hierarchyLinks;
-    return object;
-  }
-  function d3_layout_hierarchyChildren(d) {
-    return d.children;
-  }
-  function d3_layout_hierarchyValue(d) {
-    return d.value;
-  }
-  function d3_layout_hierarchySort(a, b) {
-    return b.value - a.value;
-  }
-  function d3_layout_hierarchyLinks(nodes) {
-    return d3.merge(nodes.map(function(parent) {
-      return (parent.children || []).map(function(child) {
-        return {
-          source: parent,
-          target: child
-        };
-      });
-    }));
-  }
-  d3.layout.partition = function() {
-    var hierarchy = d3.layout.hierarchy(), size = [ 1, 1 ];
-    function position(node, x, dx, dy) {
-      var children = node.children;
-      node.x = x;
-      node.y = node.depth * dy;
-      node.dx = dx;
-      node.dy = dy;
-      if (children && (n = children.length)) {
-        var i = -1, n, c, d;
-        dx = node.value ? dx / node.value : 0;
-        while (++i < n) {
-          position(c = children[i], x, d = c.value * dx, dy);
-          x += d;
-        }
-      }
-    }
-    function depth(node) {
-      var children = node.children, d = 0;
-      if (children && (n = children.length)) {
-        var i = -1, n;
-        while (++i < n) d = Math.max(d, depth(children[i]));
-      }
-      return 1 + d;
-    }
-    function partition(d, i) {
-      var nodes = hierarchy.call(this, d, i);
-      position(nodes[0], 0, size[0], size[1] / depth(nodes[0]));
-      return nodes;
-    }
-    partition.size = function(x) {
-      if (!arguments.length) return size;
-      size = x;
-      return partition;
-    };
-    return d3_layout_hierarchyRebind(partition, hierarchy);
-  };
-  d3.layout.pie = function() {
-    var value = Number, sort = d3_layout_pieSortByValue, startAngle = 0, endAngle = τ;
-    function pie(data) {
-      var values = data.map(function(d, i) {
-        return +value.call(pie, d, i);
-      });
-      var a = +(typeof startAngle === "function" ? startAngle.apply(this, arguments) : startAngle);
-      var k = ((typeof endAngle === "function" ? endAngle.apply(this, arguments) : endAngle) - a) / d3.sum(values);
-      var index = d3.range(data.length);
-      if (sort != null) index.sort(sort === d3_layout_pieSortByValue ? function(i, j) {
-        return values[j] - values[i];
-      } : function(i, j) {
-        return sort(data[i], data[j]);
-      });
-      var arcs = [];
-      index.forEach(function(i) {
-        var d;
-        arcs[i] = {
-          data: data[i],
-          value: d = values[i],
-          startAngle: a,
-          endAngle: a += d * k
-        };
-      });
-      return arcs;
-    }
-    pie.value = function(x) {
-      if (!arguments.length) return value;
-      value = x;
-      return pie;
-    };
-    pie.sort = function(x) {
-      if (!arguments.length) return sort;
-      sort = x;
-      return pie;
-    };
-    pie.startAngle = function(x) {
-      if (!arguments.length) return startAngle;
-      startAngle = x;
-      return pie;
-    };
-    pie.endAngle = function(x) {
-      if (!arguments.length) return endAngle;
-      endAngle = x;
-      return pie;
-    };
-    return pie;
-  };
-  var d3_layout_pieSortByValue = {};
-  d3.layout.stack = function() {
-    var values = d3_identity, order = d3_layout_stackOrderDefault, offset = d3_layout_stackOffsetZero, out = d3_layout_stackOut, x = d3_layout_stackX, y = d3_layout_stackY;
-    function stack(data, index) {
-      var series = data.map(function(d, i) {
-        return values.call(stack, d, i);
-      });
-      var points = series.map(function(d) {
-        return d.map(function(v, i) {
-          return [ x.call(stack, v, i), y.call(stack, v, i) ];
-        });
-      });
-      var orders = order.call(stack, points, index);
-      series = d3.permute(series, orders);
-      points = d3.permute(points, orders);
-      var offsets = offset.call(stack, points, index);
-      var n = series.length, m = series[0].length, i, j, o;
-      for (j = 0; j < m; ++j) {
-        out.call(stack, series[0][j], o = offsets[j], points[0][j][1]);
-        for (i = 1; i < n; ++i) {
-          out.call(stack, series[i][j], o += points[i - 1][j][1], points[i][j][1]);
-        }
-      }
-      return data;
-    }
-    stack.values = function(x) {
-      if (!arguments.length) return values;
-      values = x;
-      return stack;
-    };
-    stack.order = function(x) {
-      if (!arguments.length) return order;
-      order = typeof x === "function" ? x : d3_layout_stackOrders.get(x) || d3_layout_stackOrderDefault;
-      return stack;
-    };
-    stack.offset = function(x) {
-      if (!arguments.length) return offset;
-      offset = typeof x === "function" ? x : d3_layout_stackOffsets.get(x) || d3_layout_stackOffsetZero;
-      return stack;
-    };
-    stack.x = function(z) {
-      if (!arguments.length) return x;
-      x = z;
-      return stack;
-    };
-    stack.y = function(z) {
-      if (!arguments.length) return y;
-      y = z;
-      return stack;
-    };
-    stack.out = function(z) {
-      if (!arguments.length) return out;
-      out = z;
-      return stack;
-    };
-    return stack;
-  };
-  function d3_layout_stackX(d) {
-    return d.x;
-  }
-  function d3_layout_stackY(d) {
-    return d.y;
-  }
-  function d3_layout_stackOut(d, y0, y) {
-    d.y0 = y0;
-    d.y = y;
-  }
-  var d3_layout_stackOrders = d3.map({
-    "inside-out": function(data) {
-      var n = data.length, i, j, max = data.map(d3_layout_stackMaxIndex), sums = data.map(d3_layout_stackReduceSum), index = d3.range(n).sort(function(a, b) {
-        return max[a] - max[b];
-      }), top = 0, bottom = 0, tops = [], bottoms = [];
-      for (i = 0; i < n; ++i) {
-        j = index[i];
-        if (top < bottom) {
-          top += sums[j];
-          tops.push(j);
-        } else {
-          bottom += sums[j];
-          bottoms.push(j);
-        }
-      }
-      return bottoms.reverse().concat(tops);
-    },
-    reverse: function(data) {
-      return d3.range(data.length).reverse();
-    },
-    "default": d3_layout_stackOrderDefault
-  });
-  var d3_layout_stackOffsets = d3.map({
-    silhouette: function(data) {
-      var n = data.length, m = data[0].length, sums = [], max = 0, i, j, o, y0 = [];
-      for (j = 0; j < m; ++j) {
-        for (i = 0, o = 0; i < n; i++) o += data[i][j][1];
-        if (o > max) max = o;
-        sums.push(o);
-      }
-      for (j = 0; j < m; ++j) {
-        y0[j] = (max - sums[j]) / 2;
-      }
-      return y0;
-    },
-    wiggle: function(data) {
-      var n = data.length, x = data[0], m = x.length, i, j, k, s1, s2, s3, dx, o, o0, y0 = [];
-      y0[0] = o = o0 = 0;
-      for (j = 1; j < m; ++j) {
-        for (i = 0, s1 = 0; i < n; ++i) s1 += data[i][j][1];
-        for (i = 0, s2 = 0, dx = x[j][0] - x[j - 1][0]; i < n; ++i) {
-          for (k = 0, s3 = (data[i][j][1] - data[i][j - 1][1]) / (2 * dx); k < i; ++k) {
-            s3 += (data[k][j][1] - data[k][j - 1][1]) / dx;
-          }
-          s2 += s3 * data[i][j][1];
-        }
-        y0[j] = o -= s1 ? s2 / s1 * dx : 0;
-        if (o < o0) o0 = o;
-      }
-      for (j = 0; j < m; ++j) y0[j] -= o0;
-      return y0;
-    },
-    expand: function(data) {
-      var n = data.length, m = data[0].length, k = 1 / n, i, j, o, y0 = [];
-      for (j = 0; j < m; ++j) {
-        for (i = 0, o = 0; i < n; i++) o += data[i][j][1];
-        if (o) for (i = 0; i < n; i++) data[i][j][1] /= o; else for (i = 0; i < n; i++) data[i][j][1] = k;
-      }
-      for (j = 0; j < m; ++j) y0[j] = 0;
-      return y0;
-    },
-    zero: d3_layout_stackOffsetZero
-  });
-  function d3_layout_stackOrderDefault(data) {
-    return d3.range(data.length);
-  }
-  function d3_layout_stackOffsetZero(data) {
-    var j = -1, m = data[0].length, y0 = [];
-    while (++j < m) y0[j] = 0;
-    return y0;
-  }
-  function d3_layout_stackMaxIndex(array) {
-    var i = 1, j = 0, v = array[0][1], k, n = array.length;
-    for (;i < n; ++i) {
-      if ((k = array[i][1]) > v) {
-        j = i;
-        v = k;
-      }
-    }
-    return j;
-  }
-  function d3_layout_stackReduceSum(d) {
-    return d.reduce(d3_layout_stackSum, 0);
-  }
-  function d3_layout_stackSum(p, d) {
-    return p + d[1];
-  }
-  d3.layout.histogram = function() {
-    var frequency = true, valuer = Number, ranger = d3_layout_histogramRange, binner = d3_layout_histogramBinSturges;
-    function histogram(data, i) {
-      var bins = [], values = data.map(valuer, this), range = ranger.call(this, values, i), thresholds = binner.call(this, range, values, i), bin, i = -1, n = values.length, m = thresholds.length - 1, k = frequency ? 1 : 1 / n, x;
-      while (++i < m) {
-        bin = bins[i] = [];
-        bin.dx = thresholds[i + 1] - (bin.x = thresholds[i]);
-        bin.y = 0;
-      }
-      if (m > 0) {
-        i = -1;
-        while (++i < n) {
-          x = values[i];
-          if (x >= range[0] && x <= range[1]) {
-            bin = bins[d3.bisect(thresholds, x, 1, m) - 1];
-            bin.y += k;
-            bin.push(data[i]);
-          }
-        }
-      }
-      return bins;
-    }
-    histogram.value = function(x) {
-      if (!arguments.length) return valuer;
-      valuer = x;
-      return histogram;
-    };
-    histogram.range = function(x) {
-      if (!arguments.length) return ranger;
-      ranger = d3_functor(x);
-      return histogram;
-    };
-    histogram.bins = function(x) {
-      if (!arguments.length) return binner;
-      binner = typeof x === "number" ? function(range) {
-        return d3_layout_histogramBinFixed(range, x);
-      } : d3_functor(x);
-      return histogram;
-    };
-    histogram.frequency = function(x) {
-      if (!arguments.length) return frequency;
-      frequency = !!x;
-      return histogram;
-    };
-    return histogram;
-  };
-  function d3_layout_histogramBinSturges(range, values) {
-    return d3_layout_histogramBinFixed(range, Math.ceil(Math.log(values.length) / Math.LN2 + 1));
-  }
-  function d3_layout_histogramBinFixed(range, n) {
-    var x = -1, b = +range[0], m = (range[1] - b) / n, f = [];
-    while (++x <= n) f[x] = m * x + b;
-    return f;
-  }
-  function d3_layout_histogramRange(values) {
-    return [ d3.min(values), d3.max(values) ];
-  }
-  d3.layout.tree = function() {
-    var hierarchy = d3.layout.hierarchy().sort(null).value(null), separation = d3_layout_treeSeparation, size = [ 1, 1 ], nodeSize = false;
-    function tree(d, i) {
-      var nodes = hierarchy.call(this, d, i), root = nodes[0];
-      function firstWalk(node, previousSibling) {
-        var children = node.children, layout = node._tree;
-        if (children && (n = children.length)) {
-          var n, firstChild = children[0], previousChild, ancestor = firstChild, child, i = -1;
-          while (++i < n) {
-            child = children[i];
-            firstWalk(child, previousChild);
-            ancestor = apportion(child, previousChild, ancestor);
-            previousChild = child;
-          }
-          d3_layout_treeShift(node);
-          var midpoint = .5 * (firstChild._tree.prelim + child._tree.prelim);
-          if (previousSibling) {
-            layout.prelim = previousSibling._tree.prelim + separation(node, previousSibling);
-            layout.mod = layout.prelim - midpoint;
-          } else {
-            layout.prelim = midpoint;
-          }
-        } else {
-          if (previousSibling) {
-            layout.prelim = previousSibling._tree.prelim + separation(node, previousSibling);
-          }
-        }
-      }
-      function secondWalk(node, x) {
-        node.x = node._tree.prelim + x;
-        var children = node.children;
-        if (children && (n = children.length)) {
-          var i = -1, n;
-          x += node._tree.mod;
-          while (++i < n) {
-            secondWalk(children[i], x);
-          }
-        }
-      }
-      function apportion(node, previousSibling, ancestor) {
-        if (previousSibling) {
-          var vip = node, vop = node, vim = previousSibling, vom = node.parent.children[0], sip = vip._tree.mod, sop = vop._tree.mod, sim = vim._tree.mod, som = vom._tree.mod, shift;
-          while (vim = d3_layout_treeRight(vim), vip = d3_layout_treeLeft(vip), vim && vip) {
-            vom = d3_layout_treeLeft(vom);
-            vop = d3_layout_treeRight(vop);
-            vop._tree.ancestor = node;
-            shift = vim._tree.prelim + sim - vip._tree.prelim - sip + separation(vim, vip);
-            if (shift > 0) {
-              d3_layout_treeMove(d3_layout_treeAncestor(vim, node, ancestor), node, shift);
-              sip += shift;
-              sop += shift;
-            }
-            sim += vim._tree.mod;
-            sip += vip._tree.mod;
-            som += vom._tree.mod;
-            sop += vop._tree.mod;
-          }
-          if (vim && !d3_layout_treeRight(vop)) {
-            vop._tree.thread = vim;
-            vop._tree.mod += sim - sop;
-          }
-          if (vip && !d3_layout_treeLeft(vom)) {
-            vom._tree.thread = vip;
-            vom._tree.mod += sip - som;
-            ancestor = node;
-          }
-        }
-        return ancestor;
-      }
-      d3_layout_treeVisitAfter(root, function(node, previousSibling) {
-        node._tree = {
-          ancestor: node,
-          prelim: 0,
-          mod: 0,
-          change: 0,
-          shift: 0,
-          number: previousSibling ? previousSibling._tree.number + 1 : 0
-        };
-      });
-      firstWalk(root);
-      secondWalk(root, -root._tree.prelim);
-      var left = d3_layout_treeSearch(root, d3_layout_treeLeftmost), right = d3_layout_treeSearch(root, d3_layout_treeRightmost), deep = d3_layout_treeSearch(root, d3_layout_treeDeepest), x0 = left.x - separation(left, right) / 2, x1 = right.x + separation(right, left) / 2, y1 = deep.depth || 1;
-      d3_layout_treeVisitAfter(root, nodeSize ? function(node) {
-        node.x *= size[0];
-        node.y = node.depth * size[1];
-        delete node._tree;
-      } : function(node) {
-        node.x = (node.x - x0) / (x1 - x0) * size[0];
-        node.y = node.depth / y1 * size[1];
-        delete node._tree;
-      });
-      return nodes;
-    }
-    tree.separation = function(x) {
-      if (!arguments.length) return separation;
-      separation = x;
-      return tree;
-    };
-    tree.size = function(x) {
-      if (!arguments.length) return nodeSize ? null : size;
-      nodeSize = (size = x) == null;
-      return tree;
-    };
-    tree.nodeSize = function(x) {
-      if (!arguments.length) return nodeSize ? size : null;
-      nodeSize = (size = x) != null;
-      return tree;
-    };
-    return d3_layout_hierarchyRebind(tree, hierarchy);
-  };
-  function d3_layout_treeSeparation(a, b) {
-    return a.parent == b.parent ? 1 : 2;
-  }
-  function d3_layout_treeLeft(node) {
-    var children = node.children;
-    return children && children.length ? children[0] : node._tree.thread;
-  }
-  function d3_layout_treeRight(node) {
-    var children = node.children, n;
-    return children && (n = children.length) ? children[n - 1] : node._tree.thread;
-  }
-  function d3_layout_treeSearch(node, compare) {
-    var children = node.children;
-    if (children && (n = children.length)) {
-      var child, n, i = -1;
-      while (++i < n) {
-        if (compare(child = d3_layout_treeSearch(children[i], compare), node) > 0) {
-          node = child;
-        }
-      }
-    }
-    return node;
-  }
-  function d3_layout_treeRightmost(a, b) {
-    return a.x - b.x;
-  }
-  function d3_layout_treeLeftmost(a, b) {
-    return b.x - a.x;
-  }
-  function d3_layout_treeDeepest(a, b) {
-    return a.depth - b.depth;
-  }
-  function d3_layout_treeVisitAfter(node, callback) {
-    function visit(node, previousSibling) {
-      var children = node.children;
-      if (children && (n = children.length)) {
-        var child, previousChild = null, i = -1, n;
-        while (++i < n) {
-          child = children[i];
-          visit(child, previousChild);
-          previousChild = child;
-        }
-      }
-      callback(node, previousSibling);
-    }
-    visit(node, null);
-  }
-  function d3_layout_treeShift(node) {
-    var shift = 0, change = 0, children = node.children, i = children.length, child;
-    while (--i >= 0) {
-      child = children[i]._tree;
-      child.prelim += shift;
-      child.mod += shift;
-      shift += child.shift + (change += child.change);
-    }
-  }
-  function d3_layout_treeMove(ancestor, node, shift) {
-    ancestor = ancestor._tree;
-    node = node._tree;
-    var change = shift / (node.number - ancestor.number);
-    ancestor.change += change;
-    node.change -= change;
-    node.shift += shift;
-    node.prelim += shift;
-    node.mod += shift;
-  }
-  function d3_layout_treeAncestor(vim, node, ancestor) {
-    return vim._tree.ancestor.parent == node.parent ? vim._tree.ancestor : ancestor;
-  }
-  d3.layout.pack = function() {
-    var hierarchy = d3.layout.hierarchy().sort(d3_layout_packSort), padding = 0, size = [ 1, 1 ], radius;
-    function pack(d, i) {
-      var nodes = hierarchy.call(this, d, i), root = nodes[0], w = size[0], h = size[1], r = radius == null ? Math.sqrt : typeof radius === "function" ? radius : function() {
-        return radius;
-      };
-      root.x = root.y = 0;
-      d3_layout_treeVisitAfter(root, function(d) {
-        d.r = +r(d.value);
-      });
-      d3_layout_treeVisitAfter(root, d3_layout_packSiblings);
-      if (padding) {
-        var dr = padding * (radius ? 1 : Math.max(2 * root.r / w, 2 * root.r / h)) / 2;
-        d3_layout_treeVisitAfter(root, function(d) {
-          d.r += dr;
-        });
-        d3_layout_treeVisitAfter(root, d3_layout_packSiblings);
-        d3_layout_treeVisitAfter(root, function(d) {
-          d.r -= dr;
-        });
-      }
-      d3_layout_packTransform(root, w / 2, h / 2, radius ? 1 : 1 / Math.max(2 * root.r / w, 2 * root.r / h));
-      return nodes;
-    }
-    pack.size = function(_) {
-      if (!arguments.length) return size;
-      size = _;
-      return pack;
-    };
-    pack.radius = function(_) {
-      if (!arguments.length) return radius;
-      radius = _ == null || typeof _ === "function" ? _ : +_;
-      return pack;
-    };
-    pack.padding = function(_) {
-      if (!arguments.length) return padding;
-      padding = +_;
-      return pack;
-    };
-    return d3_layout_hierarchyRebind(pack, hierarchy);
-  };
-  function d3_layout_packSort(a, b) {
-    return a.value - b.value;
-  }
-  function d3_layout_packInsert(a, b) {
-    var c = a._pack_next;
-    a._pack_next = b;
-    b._pack_prev = a;
-    b._pack_next = c;
-    c._pack_prev = b;
-  }
-  function d3_layout_packSplice(a, b) {
-    a._pack_next = b;
-    b._pack_prev = a;
-  }
-  function d3_layout_packIntersects(a, b) {
-    var dx = b.x - a.x, dy = b.y - a.y, dr = a.r + b.r;
-    return .999 * dr * dr > dx * dx + dy * dy;
-  }
-  function d3_layout_packSiblings(node) {
-    if (!(nodes = node.children) || !(n = nodes.length)) return;
-    var nodes, xMin = Infinity, xMax = -Infinity, yMin = Infinity, yMax = -Infinity, a, b, c, i, j, k, n;
-    function bound(node) {
-      xMin = Math.min(node.x - node.r, xMin);
-      xMax = Math.max(node.x + node.r, xMax);
-      yMin = Math.min(node.y - node.r, yMin);
-      yMax = Math.max(node.y + node.r, yMax);
-    }
-    nodes.forEach(d3_layout_packLink);
-    a = nodes[0];
-    a.x = -a.r;
-    a.y = 0;
-    bound(a);
-    if (n > 1) {
-      b = nodes[1];
-      b.x = b.r;
-      b.y = 0;
-      bound(b);
-      if (n > 2) {
-        c = nodes[2];
-        d3_layout_packPlace(a, b, c);
-        bound(c);
-        d3_layout_packInsert(a, c);
-        a._pack_prev = c;
-        d3_layout_packInsert(c, b);
-        b = a._pack_next;
-        for (i = 3; i < n; i++) {
-          d3_layout_packPlace(a, b, c = nodes[i]);
-          var isect = 0, s1 = 1, s2 = 1;
-          for (j = b._pack_next; j !== b; j = j._pack_next, s1++) {
-            if (d3_layout_packIntersects(j, c)) {
-              isect = 1;
-              break;
-            }
-          }
-          if (isect == 1) {
-            for (k = a._pack_prev; k !== j._pack_prev; k = k._pack_prev, s2++) {
-              if (d3_layout_packIntersects(k, c)) {
-                break;
-              }
-            }
-          }
-          if (isect) {
-            if (s1 < s2 || s1 == s2 && b.r < a.r) d3_layout_packSplice(a, b = j); else d3_layout_packSplice(a = k, b);
-            i--;
-          } else {
-            d3_layout_packInsert(a, c);
-            b = c;
-            bound(c);
-          }
-        }
-      }
-    }
-    var cx = (xMin + xMax) / 2, cy = (yMin + yMax) / 2, cr = 0;
-    for (i = 0; i < n; i++) {
-      c = nodes[i];
-      c.x -= cx;
-      c.y -= cy;
-      cr = Math.max(cr, c.r + Math.sqrt(c.x * c.x + c.y * c.y));
-    }
-    node.r = cr;
-    nodes.forEach(d3_layout_packUnlink);
-  }
-  function d3_layout_packLink(node) {
-    node._pack_next = node._pack_prev = node;
-  }
-  function d3_layout_packUnlink(node) {
-    delete node._pack_next;
-    delete node._pack_prev;
-  }
-  function d3_layout_packTransform(node, x, y, k) {
-    var children = node.children;
-    node.x = x += k * node.x;
-    node.y = y += k * node.y;
-    node.r *= k;
-    if (children) {
-      var i = -1, n = children.length;
-      while (++i < n) d3_layout_packTransform(children[i], x, y, k);
-    }
-  }
-  function d3_layout_packPlace(a, b, c) {
-    var db = a.r + c.r, dx = b.x - a.x, dy = b.y - a.y;
-    if (db && (dx || dy)) {
-      var da = b.r + c.r, dc = dx * dx + dy * dy;
-      da *= da;
-      db *= db;
-      var x = .5 + (db - da) / (2 * dc), y = Math.sqrt(Math.max(0, 2 * da * (db + dc) - (db -= dc) * db - da * da)) / (2 * dc);
-      c.x = a.x + x * dx + y * dy;
-      c.y = a.y + x * dy - y * dx;
-    } else {
-      c.x = a.x + db;
-      c.y = a.y;
-    }
-  }
-  d3.layout.cluster = function() {
-    var hierarchy = d3.layout.hierarchy().sort(null).value(null), separation = d3_layout_treeSeparation, size = [ 1, 1 ], nodeSize = false;
-    function cluster(d, i) {
-      var nodes = hierarchy.call(this, d, i), root = nodes[0], previousNode, x = 0;
-      d3_layout_treeVisitAfter(root, function(node) {
-        var children = node.children;
-        if (children && children.length) {
-          node.x = d3_layout_clusterX(children);
-          node.y = d3_layout_clusterY(children);
-        } else {
-          node.x = previousNode ? x += separation(node, previousNode) : 0;
-          node.y = 0;
-          previousNode = node;
-        }
-      });
-      var left = d3_layout_clusterLeft(root), right = d3_layout_clusterRight(root), x0 = left.x - separation(left, right) / 2, x1 = right.x + separation(right, left) / 2;
-      d3_layout_treeVisitAfter(root, nodeSize ? function(node) {
-        node.x = (node.x - root.x) * size[0];
-        node.y = (root.y - node.y) * size[1];
-      } : function(node) {
-        node.x = (node.x - x0) / (x1 - x0) * size[0];
-        node.y = (1 - (root.y ? node.y / root.y : 1)) * size[1];
-      });
-      return nodes;
-    }
-    cluster.separation = function(x) {
-      if (!arguments.length) return separation;
-      separation = x;
-      return cluster;
-    };
-    cluster.size = function(x) {
-      if (!arguments.length) return nodeSize ? null : size;
-      nodeSize = (size = x) == null;
-      return cluster;
-    };
-    cluster.nodeSize = function(x) {
-      if (!arguments.length) return nodeSize ? size : null;
-      nodeSize = (size = x) != null;
-      return cluster;
-    };
-    return d3_layout_hierarchyRebind(cluster, hierarchy);
-  };
-  function d3_layout_clusterY(children) {
-    return 1 + d3.max(children, function(child) {
-      return child.y;
-    });
-  }
-  function d3_layout_clusterX(children) {
-    return children.reduce(function(x, child) {
-      return x + child.x;
-    }, 0) / children.length;
-  }
-  function d3_layout_clusterLeft(node) {
-    var children = node.children;
-    return children && children.length ? d3_layout_clusterLeft(children[0]) : node;
-  }
-  function d3_layout_clusterRight(node) {
-    var children = node.children, n;
-    return children && (n = children.length) ? d3_layout_clusterRight(children[n - 1]) : node;
-  }
-  d3.layout.treemap = function() {
-    var hierarchy = d3.layout.hierarchy(), round = Math.round, size = [ 1, 1 ], padding = null, pad = d3_layout_treemapPadNull, sticky = false, stickies, mode = "squarify", ratio = .5 * (1 + Math.sqrt(5));
-    function scale(children, k) {
-      var i = -1, n = children.length, child, area;
-      while (++i < n) {
-        area = (child = children[i]).value * (k < 0 ? 0 : k);
-        child.area = isNaN(area) || area <= 0 ? 0 : area;
-      }
-    }
-    function squarify(node) {
-      var children = node.children;
-      if (children && children.length) {
-        var rect = pad(node), row = [], remaining = children.slice(), child, best = Infinity, score, u = mode === "slice" ? rect.dx : mode === "dice" ? rect.dy : mode === "slice-dice" ? node.depth & 1 ? rect.dy : rect.dx : Math.min(rect.dx, rect.dy), n;
-        scale(remaining, rect.dx * rect.dy / node.value);
-        row.area = 0;
-        while ((n = remaining.length) > 0) {
-          row.push(child = remaining[n - 1]);
-          row.area += child.area;
-          if (mode !== "squarify" || (score = worst(row, u)) <= best) {
-            remaining.pop();
-            best = score;
-          } else {
-            row.area -= row.pop().area;
-            position(row, u, rect, false);
-            u = Math.min(rect.dx, rect.dy);
-            row.length = row.area = 0;
-            best = Infinity;
-          }
-        }
-        if (row.length) {
-          position(row, u, rect, true);
-          row.length = row.area = 0;
-        }
-        children.forEach(squarify);
-      }
-    }
-    function stickify(node) {
-      var children = node.children;
-      if (children && children.length) {
-        var rect = pad(node), remaining = children.slice(), child, row = [];
-        scale(remaining, rect.dx * rect.dy / node.value);
-        row.area = 0;
-        while (child = remaining.pop()) {
-          row.push(child);
-          row.area += child.area;
-          if (child.z != null) {
-            position(row, child.z ? rect.dx : rect.dy, rect, !remaining.length);
-            row.length = row.area = 0;
-          }
-        }
-        children.forEach(stickify);
-      }
-    }
-    function worst(row, u) {
-      var s = row.area, r, rmax = 0, rmin = Infinity, i = -1, n = row.length;
-      while (++i < n) {
-        if (!(r = row[i].area)) continue;
-        if (r < rmin) rmin = r;
-        if (r > rmax) rmax = r;
-      }
-      s *= s;
-      u *= u;
-      return s ? Math.max(u * rmax * ratio / s, s / (u * rmin * ratio)) : Infinity;
-    }
-    function position(row, u, rect, flush) {
-      var i = -1, n = row.length, x = rect.x, y = rect.y, v = u ? round(row.area / u) : 0, o;
-      if (u == rect.dx) {
-        if (flush || v > rect.dy) v = rect.dy;
-        while (++i < n) {
-          o = row[i];
-          o.x = x;
-          o.y = y;
-          o.dy = v;
-          x += o.dx = Math.min(rect.x + rect.dx - x, v ? round(o.area / v) : 0);
-        }
-        o.z = true;
-        o.dx += rect.x + rect.dx - x;
-        rect.y += v;
-        rect.dy -= v;
-      } else {
-        if (flush || v > rect.dx) v = rect.dx;
-        while (++i < n) {
-          o = row[i];
-          o.x = x;
-          o.y = y;
-          o.dx = v;
-          y += o.dy = Math.min(rect.y + rect.dy - y, v ? round(o.area / v) : 0);
-        }
-        o.z = false;
-        o.dy += rect.y + rect.dy - y;
-        rect.x += v;
-        rect.dx -= v;
-      }
-    }
-    function treemap(d) {
-      var nodes = stickies || hierarchy(d), root = nodes[0];
-      root.x = 0;
-      root.y = 0;
-      root.dx = size[0];
-      root.dy = size[1];
-      if (stickies) hierarchy.revalue(root);
-      scale([ root ], root.dx * root.dy / root.value);
-      (stickies ? stickify : squarify)(root);
-      if (sticky) stickies = nodes;
-      return nodes;
-    }
-    treemap.size = function(x) {
-      if (!arguments.length) return size;
-      size = x;
-      return treemap;
-    };
-    treemap.padding = function(x) {
-      if (!arguments.length) return padding;
-      function padFunction(node) {
-        var p = x.call(treemap, node, node.depth);
-        return p == null ? d3_layout_treemapPadNull(node) : d3_layout_treemapPad(node, typeof p === "number" ? [ p, p, p, p ] : p);
-      }
-      function padConstant(node) {
-        return d3_layout_treemapPad(node, x);
-      }
-      var type;
-      pad = (padding = x) == null ? d3_layout_treemapPadNull : (type = typeof x) === "function" ? padFunction : type === "number" ? (x = [ x, x, x, x ], 
-      padConstant) : padConstant;
-      return treemap;
-    };
-    treemap.round = function(x) {
-      if (!arguments.length) return round != Number;
-      round = x ? Math.round : Number;
-      return treemap;
-    };
-    treemap.sticky = function(x) {
-      if (!arguments.length) return sticky;
-      sticky = x;
-      stickies = null;
-      return treemap;
-    };
-    treemap.ratio = function(x) {
-      if (!arguments.length) return ratio;
-      ratio = x;
-      return treemap;
-    };
-    treemap.mode = function(x) {
-      if (!arguments.length) return mode;
-      mode = x + "";
-      return treemap;
-    };
-    return d3_layout_hierarchyRebind(treemap, hierarchy);
-  };
-  function d3_layout_treemapPadNull(node) {
-    return {
-      x: node.x,
-      y: node.y,
-      dx: node.dx,
-      dy: node.dy
-    };
-  }
-  function d3_layout_treemapPad(node, padding) {
-    var x = node.x + padding[3], y = node.y + padding[0], dx = node.dx - padding[1] - padding[3], dy = node.dy - padding[0] - padding[2];
-    if (dx < 0) {
-      x += dx / 2;
-      dx = 0;
-    }
-    if (dy < 0) {
-      y += dy / 2;
-      dy = 0;
-    }
-    return {
-      x: x,
-      y: y,
-      dx: dx,
-      dy: dy
-    };
-  }
-  d3.random = {
-    normal: function(µ, σ) {
-      var n = arguments.length;
-      if (n < 2) σ = 1;
-      if (n < 1) µ = 0;
-      return function() {
-        var x, y, r;
-        do {
-          x = Math.random() * 2 - 1;
-          y = Math.random() * 2 - 1;
-          r = x * x + y * y;
-        } while (!r || r > 1);
-        return µ + σ * x * Math.sqrt(-2 * Math.log(r) / r);
-      };
-    },
-    logNormal: function() {
-      var random = d3.random.normal.apply(d3, arguments);
-      return function() {
-        return Math.exp(random());
-      };
-    },
-    bates: function(m) {
-      var random = d3.random.irwinHall(m);
-      return function() {
-        return random() / m;
-      };
-    },
-    irwinHall: function(m) {
-      return function() {
-        for (var s = 0, j = 0; j < m; j++) s += Math.random();
-        return s;
-      };
-    }
-  };
-  d3.scale = {};
-  function d3_scaleExtent(domain) {
-    var start = domain[0], stop = domain[domain.length - 1];
-    return start < stop ? [ start, stop ] : [ stop, start ];
-  }
-  function d3_scaleRange(scale) {
-    return scale.rangeExtent ? scale.rangeExtent() : d3_scaleExtent(scale.range());
-  }
-  function d3_scale_bilinear(domain, range, uninterpolate, interpolate) {
-    var u = uninterpolate(domain[0], domain[1]), i = interpolate(range[0], range[1]);
-    return function(x) {
-      return i(u(x));
-    };
-  }
-  function d3_scale_nice(domain, nice) {
-    var i0 = 0, i1 = domain.length - 1, x0 = domain[i0], x1 = domain[i1], dx;
-    if (x1 < x0) {
-      dx = i0, i0 = i1, i1 = dx;
-      dx = x0, x0 = x1, x1 = dx;
-    }
-    domain[i0] = nice.floor(x0);
-    domain[i1] = nice.ceil(x1);
-    return domain;
-  }
-  function d3_scale_niceStep(step) {
-    return step ? {
-      floor: function(x) {
-        return Math.floor(x / step) * step;
-      },
-      ceil: function(x) {
-        return Math.ceil(x / step) * step;
-      }
-    } : d3_scale_niceIdentity;
-  }
-  var d3_scale_niceIdentity = {
-    floor: d3_identity,
-    ceil: d3_identity
-  };
-  function d3_scale_polylinear(domain, range, uninterpolate, interpolate) {
-    var u = [], i = [], j = 0, k = Math.min(domain.length, range.length) - 1;
-    if (domain[k] < domain[0]) {
-      domain = domain.slice().reverse();
-      range = range.slice().reverse();
-    }
-    while (++j <= k) {
-      u.push(uninterpolate(domain[j - 1], domain[j]));
-      i.push(interpolate(range[j - 1], range[j]));
-    }
-    return function(x) {
-      var j = d3.bisect(domain, x, 1, k) - 1;
-      return i[j](u[j](x));
-    };
-  }
-  d3.scale.linear = function() {
-    return d3_scale_linear([ 0, 1 ], [ 0, 1 ], d3_interpolate, false);
-  };
-  function d3_scale_linear(domain, range, interpolate, clamp) {
-    var output, input;
-    function rescale() {
-      var linear = Math.min(domain.length, range.length) > 2 ? d3_scale_polylinear : d3_scale_bilinear, uninterpolate = clamp ? d3_uninterpolateClamp : d3_uninterpolateNumber;
-      output = linear(domain, range, uninterpolate, interpolate);
-      input = linear(range, domain, uninterpolate, d3_interpolate);
-      return scale;
-    }
-    function scale(x) {
-      return output(x);
-    }
-    scale.invert = function(y) {
-      return input(y);
-    };
-    scale.domain = function(x) {
-      if (!arguments.length) return domain;
-      domain = x.map(Number);
-      return rescale();
-    };
-    scale.range = function(x) {
-      if (!arguments.length) return range;
-      range = x;
-      return rescale();
-    };
-    scale.rangeRound = function(x) {
-      return scale.range(x).interpolate(d3_interpolateRound);
-    };
-    scale.clamp = function(x) {
-      if (!arguments.length) return clamp;
-      clamp = x;
-      return rescale();
-    };
-    scale.interpolate = function(x) {
-      if (!arguments.length) return interpolate;
-      interpolate = x;
-      return rescale();
-    };
-    scale.ticks = function(m) {
-      return d3_scale_linearTicks(domain, m);
-    };
-    scale.tickFormat = function(m, format) {
-      return d3_scale_linearTickFormat(domain, m, format);
-    };
-    scale.nice = function(m) {
-      d3_scale_linearNice(domain, m);
-      return rescale();
-    };
-    scale.copy = function() {
-      return d3_scale_linear(domain, range, interpolate, clamp);
-    };
-    return rescale();
-  }
-  function d3_scale_linearRebind(scale, linear) {
-    return d3.rebind(scale, linear, "range", "rangeRound", "interpolate", "clamp");
-  }
-  function d3_scale_linearNice(domain, m) {
-    return d3_scale_nice(domain, d3_scale_niceStep(d3_scale_linearTickRange(domain, m)[2]));
-  }
-  function d3_scale_linearTickRange(domain, m) {
-    if (m == null) m = 10;
-    var extent = d3_scaleExtent(domain), span = extent[1] - extent[0], step = Math.pow(10, Math.floor(Math.log(span / m) / Math.LN10)), err = m / span * step;
-    if (err <= .15) step *= 10; else if (err <= .35) step *= 5; else if (err <= .75) step *= 2;
-    extent[0] = Math.ceil(extent[0] / step) * step;
-    extent[1] = Math.floor(extent[1] / step) * step + step * .5;
-    extent[2] = step;
-    return extent;
-  }
-  function d3_scale_linearTicks(domain, m) {
-    return d3.range.apply(d3, d3_scale_linearTickRange(domain, m));
-  }
-  function d3_scale_linearTickFormat(domain, m, format) {
-    var range = d3_scale_linearTickRange(domain, m);
-    if (format) {
-      var match = d3_format_re.exec(format);
-      match.shift();
-      if (match[8] === "s") {
-        var prefix = d3.formatPrefix(Math.max(abs(range[0]), abs(range[1])));
-        if (!match[7]) match[7] = "." + d3_scale_linearPrecision(prefix.scale(range[2]));
-        match[8] = "f";
-        format = d3.format(match.join(""));
-        return function(d) {
-          return format(prefix.scale(d)) + prefix.symbol;
-        };
-      }
-      if (!match[7]) match[7] = "." + d3_scale_linearFormatPrecision(match[8], range);
-      format = match.join("");
-    } else {
-      format = ",." + d3_scale_linearPrecision(range[2]) + "f";
-    }
-    return d3.format(format);
-  }
-  var d3_scale_linearFormatSignificant = {
-    s: 1,
-    g: 1,
-    p: 1,
-    r: 1,
-    e: 1
-  };
-  function d3_scale_linearPrecision(value) {
-    return -Math.floor(Math.log(value) / Math.LN10 + .01);
-  }
-  function d3_scale_linearFormatPrecision(type, range) {
-    var p = d3_scale_linearPrecision(range[2]);
-    return type in d3_scale_linearFormatSignificant ? Math.abs(p - d3_scale_linearPrecision(Math.max(abs(range[0]), abs(range[1])))) + +(type !== "e") : p - (type === "%") * 2;
-  }
-  d3.scale.log = function() {
-    return d3_scale_log(d3.scale.linear().domain([ 0, 1 ]), 10, true, [ 1, 10 ]);
-  };
-  function d3_scale_log(linear, base, positive, domain) {
-    function log(x) {
-      return (positive ? Math.log(x < 0 ? 0 : x) : -Math.log(x > 0 ? 0 : -x)) / Math.log(base);
-    }
-    function pow(x) {
-      return positive ? Math.pow(base, x) : -Math.pow(base, -x);
-    }
-    function scale(x) {
-      return linear(log(x));
-    }
-    scale.invert = function(x) {
-      return pow(linear.invert(x));
-    };
-    scale.domain = function(x) {
-      if (!arguments.length) return domain;
-      positive = x[0] >= 0;
-      linear.domain((domain = x.map(Number)).map(log));
-      return scale;
-    };
-    scale.base = function(_) {
-      if (!arguments.length) return base;
-      base = +_;
-      linear.domain(domain.map(log));
-      return scale;
-    };
-    scale.nice = function() {
-      var niced = d3_scale_nice(domain.map(log), positive ? Math : d3_scale_logNiceNegative);
-      linear.domain(niced);
-      domain = niced.map(pow);
-      return scale;
-    };
-    scale.ticks = function() {
-      var extent = d3_scaleExtent(domain), ticks = [], u = extent[0], v = extent[1], i = Math.floor(log(u)), j = Math.ceil(log(v)), n = base % 1 ? 2 : base;
-      if (isFinite(j - i)) {
-        if (positive) {
-          for (;i < j; i++) for (var k = 1; k < n; k++) ticks.push(pow(i) * k);
-          ticks.push(pow(i));
-        } else {
-          ticks.push(pow(i));
-          for (;i++ < j; ) for (var k = n - 1; k > 0; k--) ticks.push(pow(i) * k);
-        }
-        for (i = 0; ticks[i] < u; i++) {}
-        for (j = ticks.length; ticks[j - 1] > v; j--) {}
-        ticks = ticks.slice(i, j);
-      }
-      return ticks;
-    };
-    scale.tickFormat = function(n, format) {
-      if (!arguments.length) return d3_scale_logFormat;
-      if (arguments.length < 2) format = d3_scale_logFormat; else if (typeof format !== "function") format = d3.format(format);
-      var k = Math.max(.1, n / scale.ticks().length), f = positive ? (e = 1e-12, Math.ceil) : (e = -1e-12, 
-      Math.floor), e;
-      return function(d) {
-        return d / pow(f(log(d) + e)) <= k ? format(d) : "";
-      };
-    };
-    scale.copy = function() {
-      return d3_scale_log(linear.copy(), base, positive, domain);
-    };
-    return d3_scale_linearRebind(scale, linear);
-  }
-  var d3_scale_logFormat = d3.format(".0e"), d3_scale_logNiceNegative = {
-    floor: function(x) {
-      return -Math.ceil(-x);
-    },
-    ceil: function(x) {
-      return -Math.floor(-x);
-    }
-  };
-  d3.scale.pow = function() {
-    return d3_scale_pow(d3.scale.linear(), 1, [ 0, 1 ]);
-  };
-  function d3_scale_pow(linear, exponent, domain) {
-    var powp = d3_scale_powPow(exponent), powb = d3_scale_powPow(1 / exponent);
-    function scale(x) {
-      return linear(powp(x));
-    }
-    scale.invert = function(x) {
-      return powb(linear.invert(x));
-    };
-    scale.domain = function(x) {
-      if (!arguments.length) return domain;
-      linear.domain((domain = x.map(Number)).map(powp));
-      return scale;
-    };
-    scale.ticks = function(m) {
-      return d3_scale_linearTicks(domain, m);
-    };
-    scale.tickFormat = function(m, format) {
-      return d3_scale_linearTickFormat(domain, m, format);
-    };
-    scale.nice = function(m) {
-      return scale.domain(d3_scale_linearNice(domain, m));
-    };
-    scale.exponent = function(x) {
-      if (!arguments.length) return exponent;
-      powp = d3_scale_powPow(exponent = x);
-      powb = d3_scale_powPow(1 / exponent);
-      linear.domain(domain.map(powp));
-      return scale;
-    };
-    scale.copy = function() {
-      return d3_scale_pow(linear.copy(), exponent, domain);
-    };
-    return d3_scale_linearRebind(scale, linear);
-  }
-  function d3_scale_powPow(e) {
-    return function(x) {
-      return x < 0 ? -Math.pow(-x, e) : Math.pow(x, e);
-    };
-  }
-  d3.scale.sqrt = function() {
-    return d3.scale.pow().exponent(.5);
-  };
-  d3.scale.ordinal = function() {
-    return d3_scale_ordinal([], {
-      t: "range",
-      a: [ [] ]
-    });
-  };
-  function d3_scale_ordinal(domain, ranger) {
-    var index, range, rangeBand;
-    function scale(x) {
-      return range[((index.get(x) || (ranger.t === "range" ? index.set(x, domain.push(x)) : NaN)) - 1) % range.length];
-    }
-    function steps(start, step) {
-      return d3.range(domain.length).map(function(i) {
-        return start + step * i;
-      });
-    }
-    scale.domain = function(x) {
-      if (!arguments.length) return domain;
-      domain = [];
-      index = new d3_Map();
-      var i = -1, n = x.length, xi;
-      while (++i < n) if (!index.has(xi = x[i])) index.set(xi, domain.push(xi));
-      return scale[ranger.t].apply(scale, ranger.a);
-    };
-    scale.range = function(x) {
-      if (!arguments.length) return range;
-      range = x;
-      rangeBand = 0;
-      ranger = {
-        t: "range",
-        a: arguments
-      };
-      return scale;
-    };
-    scale.rangePoints = function(x, padding) {
-      if (arguments.length < 2) padding = 0;
-      var start = x[0], stop = x[1], step = (stop - start) / (Math.max(1, domain.length - 1) + padding);
-      range = steps(domain.length < 2 ? (start + stop) / 2 : start + step * padding / 2, step);
-      rangeBand = 0;
-      ranger = {
-        t: "rangePoints",
-        a: arguments
-      };
-      return scale;
-    };
-    scale.rangeBands = function(x, padding, outerPadding) {
-      if (arguments.length < 2) padding = 0;
-      if (arguments.length < 3) outerPadding = padding;
-      var reverse = x[1] < x[0], start = x[reverse - 0], stop = x[1 - reverse], step = (stop - start) / (domain.length - padding + 2 * outerPadding);
-      range = steps(start + step * outerPadding, step);
-      if (reverse) range.reverse();
-      rangeBand = step * (1 - padding);
-      ranger = {
-        t: "rangeBands",
-        a: arguments
-      };
-      return scale;
-    };
-    scale.rangeRoundBands = function(x, padding, outerPadding) {
-      if (arguments.length < 2) padding = 0;
-      if (arguments.length < 3) outerPadding = padding;
-      var reverse = x[1] < x[0], start = x[reverse - 0], stop = x[1 - reverse], step = Math.floor((stop - start) / (domain.length - padding + 2 * outerPadding)), error = stop - start - (domain.length - padding) * step;
-      range = steps(start + Math.round(error / 2), step);
-      if (reverse) range.reverse();
-      rangeBand = Math.round(step * (1 - padding));
-      ranger = {
-        t: "rangeRoundBands",
-        a: arguments
-      };
-      return scale;
-    };
-    scale.rangeBand = function() {
-      return rangeBand;
-    };
-    scale.rangeExtent = function() {
-      return d3_scaleExtent(ranger.a[0]);
-    };
-    scale.copy = function() {
-      return d3_scale_ordinal(domain, ranger);
-    };
-    return scale.domain(domain);
-  }
-  d3.scale.category10 = function() {
-    return d3.scale.ordinal().range(d3_category10);
-  };
-  d3.scale.category20 = function() {
-    return d3.scale.ordinal().range(d3_category20);
-  };
-  d3.scale.category20b = function() {
-    return d3.scale.ordinal().range(d3_category20b);
-  };
-  d3.scale.category20c = function() {
-    return d3.scale.ordinal().range(d3_category20c);
-  };
-  var d3_category10 = [ 2062260, 16744206, 2924588, 14034728, 9725885, 9197131, 14907330, 8355711, 12369186, 1556175 ].map(d3_rgbString);
-  var d3_category20 = [ 2062260, 11454440, 16744206, 16759672, 2924588, 10018698, 14034728, 16750742, 9725885, 12955861, 9197131, 12885140, 14907330, 16234194, 8355711, 13092807, 12369186, 14408589, 1556175, 10410725 ].map(d3_rgbString);
-  var d3_category20b = [ 3750777, 5395619, 7040719, 10264286, 6519097, 9216594, 11915115, 13556636, 9202993, 12426809, 15186514, 15190932, 8666169, 11356490, 14049643, 15177372, 8077683, 10834324, 13528509, 14589654 ].map(d3_rgbString);
-  var d3_category20c = [ 3244733, 7057110, 10406625, 13032431, 15095053, 16616764, 16625259, 16634018, 3253076, 7652470, 10607003, 13101504, 7695281, 10394312, 12369372, 14342891, 6513507, 9868950, 12434877, 14277081 ].map(d3_rgbString);
-  d3.scale.quantile = function() {
-    return d3_scale_quantile([], []);
-  };
-  function d3_scale_quantile(domain, range) {
-    var thresholds;
-    function rescale() {
-      var k = 0, q = range.length;
-      thresholds = [];
-      while (++k < q) thresholds[k - 1] = d3.quantile(domain, k / q);
-      return scale;
-    }
-    function scale(x) {
-      if (!isNaN(x = +x)) return range[d3.bisect(thresholds, x)];
-    }
-    scale.domain = function(x) {
-      if (!arguments.length) return domain;
-      domain = x.filter(function(d) {
-        return !isNaN(d);
-      }).sort(d3_ascending);
-      return rescale();
-    };
-    scale.range = function(x) {
-      if (!arguments.length) return range;
-      range = x;
-      return rescale();
-    };
-    scale.quantiles = function() {
-      return thresholds;
-    };
-    scale.invertExtent = function(y) {
-      y = range.indexOf(y);
-      return y < 0 ? [ NaN, NaN ] : [ y > 0 ? thresholds[y - 1] : domain[0], y < thresholds.length ? thresholds[y] : domain[domain.length - 1] ];
-    };
-    scale.copy = function() {
-      return d3_scale_quantile(domain, range);
-    };
-    return rescale();
-  }
-  d3.scale.quantize = function() {
-    return d3_scale_quantize(0, 1, [ 0, 1 ]);
-  };
-  function d3_scale_quantize(x0, x1, range) {
-    var kx, i;
-    function scale(x) {
-      return range[Math.max(0, Math.min(i, Math.floor(kx * (x - x0))))];
-    }
-    function rescale() {
-      kx = range.length / (x1 - x0);
-      i = range.length - 1;
-      return scale;
-    }
-    scale.domain = function(x) {
-      if (!arguments.length) return [ x0, x1 ];
-      x0 = +x[0];
-      x1 = +x[x.length - 1];
-      return rescale();
-    };
-    scale.range = function(x) {
-      if (!arguments.length) return range;
-      range = x;
-      return rescale();
-    };
-    scale.invertExtent = function(y) {
-      y = range.indexOf(y);
-      y = y < 0 ? NaN : y / kx + x0;
-      return [ y, y + 1 / kx ];
-    };
-    scale.copy = function() {
-      return d3_scale_quantize(x0, x1, range);
-    };
-    return rescale();
-  }
-  d3.scale.threshold = function() {
-    return d3_scale_threshold([ .5 ], [ 0, 1 ]);
-  };
-  function d3_scale_threshold(domain, range) {
-    function scale(x) {
-      if (x <= x) return range[d3.bisect(domain, x)];
-    }
-    scale.domain = function(_) {
-      if (!arguments.length) return domain;
-      domain = _;
-      return scale;
-    };
-    scale.range = function(_) {
-      if (!arguments.length) return range;
-      range = _;
-      return scale;
-    };
-    scale.invertExtent = function(y) {
-      y = range.indexOf(y);
-      return [ domain[y - 1], domain[y] ];
-    };
-    scale.copy = function() {
-      return d3_scale_threshold(domain, range);
-    };
-    return scale;
-  }
-  d3.scale.identity = function() {
-    return d3_scale_identity([ 0, 1 ]);
-  };
-  function d3_scale_identity(domain) {
-    function identity(x) {
-      return +x;
-    }
-    identity.invert = identity;
-    identity.domain = identity.range = function(x) {
-      if (!arguments.length) return domain;
-      domain = x.map(identity);
-      return identity;
-    };
-    identity.ticks = function(m) {
-      return d3_scale_linearTicks(domain, m);
-    };
-    identity.tickFormat = function(m, format) {
-      return d3_scale_linearTickFormat(domain, m, format);
-    };
-    identity.copy = function() {
-      return d3_scale_identity(domain);
-    };
-    return identity;
-  }
-  d3.svg = {};
-  d3.svg.arc = function() {
-    var innerRadius = d3_svg_arcInnerRadius, outerRadius = d3_svg_arcOuterRadius, startAngle = d3_svg_arcStartAngle, endAngle = d3_svg_arcEndAngle;
-    function arc() {
-      var r0 = innerRadius.apply(this, arguments), r1 = outerRadius.apply(this, arguments), a0 = startAngle.apply(this, arguments) + d3_svg_arcOffset, a1 = endAngle.apply(this, arguments) + d3_svg_arcOffset, da = (a1 < a0 && (da = a0, 
-      a0 = a1, a1 = da), a1 - a0), df = da < π ? "0" : "1", c0 = Math.cos(a0), s0 = Math.sin(a0), c1 = Math.cos(a1), s1 = Math.sin(a1);
-      return da >= d3_svg_arcMax ? r0 ? "M0," + r1 + "A" + r1 + "," + r1 + " 0 1,1 0," + -r1 + "A" + r1 + "," + r1 + " 0 1,1 0," + r1 + "M0," + r0 + "A" + r0 + "," + r0 + " 0 1,0 0," + -r0 + "A" + r0 + "," + r0 + " 0 1,0 0," + r0 + "Z" : "M0," + r1 + "A" + r1 + "," + r1 + " 0 1,1 0," + -r1 + "A" + r1 + "," + r1 + " 0 1,1 0," + r1 + "Z" : r0 ? "M" + r1 * c0 + "," + r1 * s0 + "A" + r1 + "," + r1 + " 0 " + df + ",1 " + r1 * c1 + "," + r1 * s1 + "L" + r0 * c1 + "," + r0 * s1 + "A" + r0 + "," + r0 + " 0 " + df + ",0 " + r0 * c0 + "," + r0 * s0 + "Z" : "M" + r1 * c0 + "," + r1 * s0 + "A" + r1 + "," + r1 + " 0 " + df + ",1 " + r1 * c1 + "," + r1 * s1 + "L0,0" + "Z";
-    }
-    arc.innerRadius = function(v) {
-      if (!arguments.length) return innerRadius;
-      innerRadius = d3_functor(v);
-      return arc;
-    };
-    arc.outerRadius = function(v) {
-      if (!arguments.length) return outerRadius;
-      outerRadius = d3_functor(v);
-      return arc;
-    };
-    arc.startAngle = function(v) {
-      if (!arguments.length) return startAngle;
-      startAngle = d3_functor(v);
-      return arc;
-    };
-    arc.endAngle = function(v) {
-      if (!arguments.length) return endAngle;
-      endAngle = d3_functor(v);
-      return arc;
-    };
-    arc.centroid = function() {
-      var r = (innerRadius.apply(this, arguments) + outerRadius.apply(this, arguments)) / 2, a = (startAngle.apply(this, arguments) + endAngle.apply(this, arguments)) / 2 + d3_svg_arcOffset;
-      return [ Math.cos(a) * r, Math.sin(a) * r ];
-    };
-    return arc;
-  };
-  var d3_svg_arcOffset = -halfπ, d3_svg_arcMax = τ - ε;
-  function d3_svg_arcInnerRadius(d) {
-    return d.innerRadius;
-  }
-  function d3_svg_arcOuterRadius(d) {
-    return d.outerRadius;
-  }
-  function d3_svg_arcStartAngle(d) {
-    return d.startAngle;
-  }
-  function d3_svg_arcEndAngle(d) {
-    return d.endAngle;
-  }
-  function d3_svg_line(projection) {
-    var x = d3_geom_pointX, y = d3_geom_pointY, defined = d3_true, interpolate = d3_svg_lineLinear, interpolateKey = interpolate.key, tension = .7;
-    function line(data) {
-      var segments = [], points = [], i = -1, n = data.length, d, fx = d3_functor(x), fy = d3_functor(y);
-      function segment() {
-        segments.push("M", interpolate(projection(points), tension));
-      }
-      while (++i < n) {
-        if (defined.call(this, d = data[i], i)) {
-          points.push([ +fx.call(this, d, i), +fy.call(this, d, i) ]);
-        } else if (points.length) {
-          segment();
-          points = [];
-        }
-      }
-      if (points.length) segment();
-      return segments.length ? segments.join("") : null;
-    }
-    line.x = function(_) {
-      if (!arguments.length) return x;
-      x = _;
-      return line;
-    };
-    line.y = function(_) {
-      if (!arguments.length) return y;
-      y = _;
-      return line;
-    };
-    line.defined = function(_) {
-      if (!arguments.length) return defined;
-      defined = _;
-      return line;
-    };
-    line.interpolate = function(_) {
-      if (!arguments.length) return interpolateKey;
-      if (typeof _ === "function") interpolateKey = interpolate = _; else interpolateKey = (interpolate = d3_svg_lineInterpolators.get(_) || d3_svg_lineLinear).key;
-      return line;
-    };
-    line.tension = function(_) {
-      if (!arguments.length) return tension;
-      tension = _;
-      return line;
-    };
-    return line;
-  }
-  d3.svg.line = function() {
-    return d3_svg_line(d3_identity);
-  };
-  var d3_svg_lineInterpolators = d3.map({
-    linear: d3_svg_lineLinear,
-    "linear-closed": d3_svg_lineLinearClosed,
-    step: d3_svg_lineStep,
-    "step-before": d3_svg_lineStepBefore,
-    "step-after": d3_svg_lineStepAfter,
-    basis: d3_svg_lineBasis,
-    "basis-open": d3_svg_lineBasisOpen,
-    "basis-closed": d3_svg_lineBasisClosed,
-    bundle: d3_svg_lineBundle,
-    cardinal: d3_svg_lineCardinal,
-    "cardinal-open": d3_svg_lineCardinalOpen,
-    "cardinal-closed": d3_svg_lineCardinalClosed,
-    monotone: d3_svg_lineMonotone
-  });
-  d3_svg_lineInterpolators.forEach(function(key, value) {
-    value.key = key;
-    value.closed = /-closed$/.test(key);
-  });
-  function d3_svg_lineLinear(points) {
-    return points.join("L");
-  }
-  function d3_svg_lineLinearClosed(points) {
-    return d3_svg_lineLinear(points) + "Z";
-  }
-  function d3_svg_lineStep(points) {
-    var i = 0, n = points.length, p = points[0], path = [ p[0], ",", p[1] ];
-    while (++i < n) path.push("H", (p[0] + (p = points[i])[0]) / 2, "V", p[1]);
-    if (n > 1) path.push("H", p[0]);
-    return path.join("");
-  }
-  function d3_svg_lineStepBefore(points) {
-    var i = 0, n = points.length, p = points[0], path = [ p[0], ",", p[1] ];
-    while (++i < n) path.push("V", (p = points[i])[1], "H", p[0]);
-    return path.join("");
-  }
-  function d3_svg_lineStepAfter(points) {
-    var i = 0, n = points.length, p = points[0], path = [ p[0], ",", p[1] ];
-    while (++i < n) path.push("H", (p = points[i])[0], "V", p[1]);
-    return path.join("");
-  }
-  function d3_svg_lineCardinalOpen(points, tension) {
-    return points.length < 4 ? d3_svg_lineLinear(points) : points[1] + d3_svg_lineHermite(points.slice(1, points.length - 1), d3_svg_lineCardinalTangents(points, tension));
-  }
-  function d3_svg_lineCardinalClosed(points, tension) {
-    return points.length < 3 ? d3_svg_lineLinear(points) : points[0] + d3_svg_lineHermite((points.push(points[0]), 
-    points), d3_svg_lineCardinalTangents([ points[points.length - 2] ].concat(points, [ points[1] ]), tension));
-  }
-  function d3_svg_lineCardinal(points, tension) {
-    return points.length < 3 ? d3_svg_lineLinear(points) : points[0] + d3_svg_lineHermite(points, d3_svg_lineCardinalTangents(points, tension));
-  }
-  function d3_svg_lineHermite(points, tangents) {
-    if (tangents.length < 1 || points.length != tangents.length && points.length != tangents.length + 2) {
-      return d3_svg_lineLinear(points);
-    }
-    var quad = points.length != tangents.length, path = "", p0 = points[0], p = points[1], t0 = tangents[0], t = t0, pi = 1;
-    if (quad) {
-      path += "Q" + (p[0] - t0[0] * 2 / 3) + "," + (p[1] - t0[1] * 2 / 3) + "," + p[0] + "," + p[1];
-      p0 = points[1];
-      pi = 2;
-    }
-    if (tangents.length > 1) {
-      t = tangents[1];
-      p = points[pi];
-      pi++;
-      path += "C" + (p0[0] + t0[0]) + "," + (p0[1] + t0[1]) + "," + (p[0] - t[0]) + "," + (p[1] - t[1]) + "," + p[0] + "," + p[1];
-      for (var i = 2; i < tangents.length; i++, pi++) {
-        p = points[pi];
-        t = tangents[i];
-        path += "S" + (p[0] - t[0]) + "," + (p[1] - t[1]) + "," + p[0] + "," + p[1];
-      }
-    }
-    if (quad) {
-      var lp = points[pi];
-      path += "Q" + (p[0] + t[0] * 2 / 3) + "," + (p[1] + t[1] * 2 / 3) + "," + lp[0] + "," + lp[1];
-    }
-    return path;
-  }
-  function d3_svg_lineCardinalTangents(points, tension) {
-    var tangents = [], a = (1 - tension) / 2, p0, p1 = points[0], p2 = points[1], i = 1, n = points.length;
-    while (++i < n) {
-      p0 = p1;
-      p1 = p2;
-      p2 = points[i];
-      tangents.push([ a * (p2[0] - p0[0]), a * (p2[1] - p0[1]) ]);
-    }
-    return tangents;
-  }
-  function d3_svg_lineBasis(points) {
-    if (points.length < 3) return d3_svg_lineLinear(points);
-    var i = 1, n = points.length, pi = points[0], x0 = pi[0], y0 = pi[1], px = [ x0, x0, x0, (pi = points[1])[0] ], py = [ y0, y0, y0, pi[1] ], path = [ x0, ",", y0, "L", d3_svg_lineDot4(d3_svg_lineBasisBezier3, px), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, py) ];
-    points.push(points[n - 1]);
-    while (++i <= n) {
-      pi = points[i];
-      px.shift();
-      px.push(pi[0]);
-      py.shift();
-      py.push(pi[1]);
-      d3_svg_lineBasisBezier(path, px, py);
-    }
-    points.pop();
-    path.push("L", pi);
-    return path.join("");
-  }
-  function d3_svg_lineBasisOpen(points) {
-    if (points.length < 4) return d3_svg_lineLinear(points);
-    var path = [], i = -1, n = points.length, pi, px = [ 0 ], py = [ 0 ];
-    while (++i < 3) {
-      pi = points[i];
-      px.push(pi[0]);
-      py.push(pi[1]);
-    }
-    path.push(d3_svg_lineDot4(d3_svg_lineBasisBezier3, px) + "," + d3_svg_lineDot4(d3_svg_lineBasisBezier3, py));
-    --i;
-    while (++i < n) {
-      pi = points[i];
-      px.shift();
-      px.push(pi[0]);
-      py.shift();
-      py.push(pi[1]);
-      d3_svg_lineBasisBezier(path, px, py);
-    }
-    return path.join("");
-  }
-  function d3_svg_lineBasisClosed(points) {
-    var path, i = -1, n = points.length, m = n + 4, pi, px = [], py = [];
-    while (++i < 4) {
-      pi = points[i % n];
-      px.push(pi[0]);
-      py.push(pi[1]);
-    }
-    path = [ d3_svg_lineDot4(d3_svg_lineBasisBezier3, px), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, py) ];
-    --i;
-    while (++i < m) {
-      pi = points[i % n];
-      px.shift();
-      px.push(pi[0]);
-      py.shift();
-      py.push(pi[1]);
-      d3_svg_lineBasisBezier(path, px, py);
-    }
-    return path.join("");
-  }
-  function d3_svg_lineBundle(points, tension) {
-    var n = points.length - 1;
-    if (n) {
-      var x0 = points[0][0], y0 = points[0][1], dx = points[n][0] - x0, dy = points[n][1] - y0, i = -1, p, t;
-      while (++i <= n) {
-        p = points[i];
-        t = i / n;
-        p[0] = tension * p[0] + (1 - tension) * (x0 + t * dx);
-        p[1] = tension * p[1] + (1 - tension) * (y0 + t * dy);
-      }
-    }
-    return d3_svg_lineBasis(points);
-  }
-  function d3_svg_lineDot4(a, b) {
-    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3];
-  }
-  var d3_svg_lineBasisBezier1 = [ 0, 2 / 3, 1 / 3, 0 ], d3_svg_lineBasisBezier2 = [ 0, 1 / 3, 2 / 3, 0 ], d3_svg_lineBasisBezier3 = [ 0, 1 / 6, 2 / 3, 1 / 6 ];
-  function d3_svg_lineBasisBezier(path, x, y) {
-    path.push("C", d3_svg_lineDot4(d3_svg_lineBasisBezier1, x), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier1, y), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier2, x), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier2, y), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, x), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, y));
-  }
-  function d3_svg_lineSlope(p0, p1) {
-    return (p1[1] - p0[1]) / (p1[0] - p0[0]);
-  }
-  function d3_svg_lineFiniteDifferences(points) {
-    var i = 0, j = points.length - 1, m = [], p0 = points[0], p1 = points[1], d = m[0] = d3_svg_lineSlope(p0, p1);
-    while (++i < j) {
-      m[i] = (d + (d = d3_svg_lineSlope(p0 = p1, p1 = points[i + 1]))) / 2;
-    }
-    m[i] = d;
-    return m;
-  }
-  function d3_svg_lineMonotoneTangents(points) {
-    var tangents = [], d, a, b, s, m = d3_svg_lineFiniteDifferences(points), i = -1, j = points.length - 1;
-    while (++i < j) {
-      d = d3_svg_lineSlope(points[i], points[i + 1]);
-      if (abs(d) < ε) {
-        m[i] = m[i + 1] = 0;
-      } else {
-        a = m[i] / d;
-        b = m[i + 1] / d;
-        s = a * a + b * b;
-        if (s > 9) {
-          s = d * 3 / Math.sqrt(s);
-          m[i] = s * a;
-          m[i + 1] = s * b;
-        }
-      }
-    }
-    i = -1;
-    while (++i <= j) {
-      s = (points[Math.min(j, i + 1)][0] - points[Math.max(0, i - 1)][0]) / (6 * (1 + m[i] * m[i]));
-      tangents.push([ s || 0, m[i] * s || 0 ]);
-    }
-    return tangents;
-  }
-  function d3_svg_lineMonotone(points) {
-    return points.length < 3 ? d3_svg_lineLinear(points) : points[0] + d3_svg_lineHermite(points, d3_svg_lineMonotoneTangents(points));
-  }
-  d3.svg.line.radial = function() {
-    var line = d3_svg_line(d3_svg_lineRadial);
-    line.radius = line.x, delete line.x;
-    line.angle = line.y, delete line.y;
-    return line;
-  };
-  function d3_svg_lineRadial(points) {
-    var point, i = -1, n = points.length, r, a;
-    while (++i < n) {
-      point = points[i];
-      r = point[0];
-      a = point[1] + d3_svg_arcOffset;
-      point[0] = r * Math.cos(a);
-      point[1] = r * Math.sin(a);
-    }
-    return points;
-  }
-  function d3_svg_area(projection) {
-    var x0 = d3_geom_pointX, x1 = d3_geom_pointX, y0 = 0, y1 = d3_geom_pointY, defined = d3_true, interpolate = d3_svg_lineLinear, interpolateKey = interpolate.key, interpolateReverse = interpolate, L = "L", tension = .7;
-    function area(data) {
-      var segments = [], points0 = [], points1 = [], i = -1, n = data.length, d, fx0 = d3_functor(x0), fy0 = d3_functor(y0), fx1 = x0 === x1 ? function() {
-        return x;
-      } : d3_functor(x1), fy1 = y0 === y1 ? function() {
-        return y;
-      } : d3_functor(y1), x, y;
-      function segment() {
-        segments.push("M", interpolate(projection(points1), tension), L, interpolateReverse(projection(points0.reverse()), tension), "Z");
-      }
-      while (++i < n) {
-        if (defined.call(this, d = data[i], i)) {
-          points0.push([ x = +fx0.call(this, d, i), y = +fy0.call(this, d, i) ]);
-          points1.push([ +fx1.call(this, d, i), +fy1.call(this, d, i) ]);
-        } else if (points0.length) {
-          segment();
-          points0 = [];
-          points1 = [];
-        }
-      }
-      if (points0.length) segment();
-      return segments.length ? segments.join("") : null;
-    }
-    area.x = function(_) {
-      if (!arguments.length) return x1;
-      x0 = x1 = _;
-      return area;
-    };
-    area.x0 = function(_) {
-      if (!arguments.length) return x0;
-      x0 = _;
-      return area;
-    };
-    area.x1 = function(_) {
-      if (!arguments.length) return x1;
-      x1 = _;
-      return area;
-    };
-    area.y = function(_) {
-      if (!arguments.length) return y1;
-      y0 = y1 = _;
-      return area;
-    };
-    area.y0 = function(_) {
-      if (!arguments.length) return y0;
-      y0 = _;
-      return area;
-    };
-    area.y1 = function(_) {
-      if (!arguments.length) return y1;
-      y1 = _;
-      return area;
-    };
-    area.defined = function(_) {
-      if (!arguments.length) return defined;
-      defined = _;
-      return area;
-    };
-    area.interpolate = function(_) {
-      if (!arguments.length) return interpolateKey;
-      if (typeof _ === "function") interpolateKey = interpolate = _; else interpolateKey = (interpolate = d3_svg_lineInterpolators.get(_) || d3_svg_lineLinear).key;
-      interpolateReverse = interpolate.reverse || interpolate;
-      L = interpolate.closed ? "M" : "L";
-      return area;
-    };
-    area.tension = function(_) {
-      if (!arguments.length) return tension;
-      tension = _;
-      return area;
-    };
-    return area;
-  }
-  d3_svg_lineStepBefore.reverse = d3_svg_lineStepAfter;
-  d3_svg_lineStepAfter.reverse = d3_svg_lineStepBefore;
-  d3.svg.area = function() {
-    return d3_svg_area(d3_identity);
-  };
-  d3.svg.area.radial = function() {
-    var area = d3_svg_area(d3_svg_lineRadial);
-    area.radius = area.x, delete area.x;
-    area.innerRadius = area.x0, delete area.x0;
-    area.outerRadius = area.x1, delete area.x1;
-    area.angle = area.y, delete area.y;
-    area.startAngle = area.y0, delete area.y0;
-    area.endAngle = area.y1, delete area.y1;
-    return area;
-  };
-  d3.svg.chord = function() {
-    var source = d3_source, target = d3_target, radius = d3_svg_chordRadius, startAngle = d3_svg_arcStartAngle, endAngle = d3_svg_arcEndAngle;
-    function chord(d, i) {
-      var s = subgroup(this, source, d, i), t = subgroup(this, target, d, i);
-      return "M" + s.p0 + arc(s.r, s.p1, s.a1 - s.a0) + (equals(s, t) ? curve(s.r, s.p1, s.r, s.p0) : curve(s.r, s.p1, t.r, t.p0) + arc(t.r, t.p1, t.a1 - t.a0) + curve(t.r, t.p1, s.r, s.p0)) + "Z";
-    }
-    function subgroup(self, f, d, i) {
-      var subgroup = f.call(self, d, i), r = radius.call(self, subgroup, i), a0 = startAngle.call(self, subgroup, i) + d3_svg_arcOffset, a1 = endAngle.call(self, subgroup, i) + d3_svg_arcOffset;
-      return {
-        r: r,
-        a0: a0,
-        a1: a1,
-        p0: [ r * Math.cos(a0), r * Math.sin(a0) ],
-        p1: [ r * Math.cos(a1), r * Math.sin(a1) ]
-      };
-    }
-    function equals(a, b) {
-      return a.a0 == b.a0 && a.a1 == b.a1;
-    }
-    function arc(r, p, a) {
-      return "A" + r + "," + r + " 0 " + +(a > π) + ",1 " + p;
-    }
-    function curve(r0, p0, r1, p1) {
-      return "Q 0,0 " + p1;
-    }
-    chord.radius = function(v) {
-      if (!arguments.length) return radius;
-      radius = d3_functor(v);
-      return chord;
-    };
-    chord.source = function(v) {
-      if (!arguments.length) return source;
-      source = d3_functor(v);
-      return chord;
-    };
-    chord.target = function(v) {
-      if (!arguments.length) return target;
-      target = d3_functor(v);
-      return chord;
-    };
-    chord.startAngle = function(v) {
-      if (!arguments.length) return startAngle;
-      startAngle = d3_functor(v);
-      return chord;
-    };
-    chord.endAngle = function(v) {
-      if (!arguments.length) return endAngle;
-      endAngle = d3_functor(v);
-      return chord;
-    };
-    return chord;
-  };
-  function d3_svg_chordRadius(d) {
-    return d.radius;
-  }
-  d3.svg.diagonal = function() {
-    var source = d3_source, target = d3_target, projection = d3_svg_diagonalProjection;
-    function diagonal(d, i) {
-      var p0 = source.call(this, d, i), p3 = target.call(this, d, i), m = (p0.y + p3.y) / 2, p = [ p0, {
-        x: p0.x,
-        y: m
-      }, {
-        x: p3.x,
-        y: m
-      }, p3 ];
-      p = p.map(projection);
-      return "M" + p[0] + "C" + p[1] + " " + p[2] + " " + p[3];
-    }
-    diagonal.source = function(x) {
-      if (!arguments.length) return source;
-      source = d3_functor(x);
-      return diagonal;
-    };
-    diagonal.target = function(x) {
-      if (!arguments.length) return target;
-      target = d3_functor(x);
-      return diagonal;
-    };
-    diagonal.projection = function(x) {
-      if (!arguments.length) return projection;
-      projection = x;
-      return diagonal;
-    };
-    return diagonal;
-  };
-  function d3_svg_diagonalProjection(d) {
-    return [ d.x, d.y ];
-  }
-  d3.svg.diagonal.radial = function() {
-    var diagonal = d3.svg.diagonal(), projection = d3_svg_diagonalProjection, projection_ = diagonal.projection;
-    diagonal.projection = function(x) {
-      return arguments.length ? projection_(d3_svg_diagonalRadialProjection(projection = x)) : projection;
-    };
-    return diagonal;
-  };
-  function d3_svg_diagonalRadialProjection(projection) {
-    return function() {
-      var d = projection.apply(this, arguments), r = d[0], a = d[1] + d3_svg_arcOffset;
-      return [ r * Math.cos(a), r * Math.sin(a) ];
-    };
-  }
-  d3.svg.symbol = function() {
-    var type = d3_svg_symbolType, size = d3_svg_symbolSize;
-    function symbol(d, i) {
-      return (d3_svg_symbols.get(type.call(this, d, i)) || d3_svg_symbolCircle)(size.call(this, d, i));
-    }
-    symbol.type = function(x) {
-      if (!arguments.length) return type;
-      type = d3_functor(x);
-      return symbol;
-    };
-    symbol.size = function(x) {
-      if (!arguments.length) return size;
-      size = d3_functor(x);
-      return symbol;
-    };
-    return symbol;
-  };
-  function d3_svg_symbolSize() {
-    return 64;
-  }
-  function d3_svg_symbolType() {
-    return "circle";
-  }
-  function d3_svg_symbolCircle(size) {
-    var r = Math.sqrt(size / π);
-    return "M0," + r + "A" + r + "," + r + " 0 1,1 0," + -r + "A" + r + "," + r + " 0 1,1 0," + r + "Z";
-  }
-  var d3_svg_symbols = d3.map({
-    circle: d3_svg_symbolCircle,
-    cross: function(size) {
-      var r = Math.sqrt(size / 5) / 2;
-      return "M" + -3 * r + "," + -r + "H" + -r + "V" + -3 * r + "H" + r + "V" + -r + "H" + 3 * r + "V" + r + "H" + r + "V" + 3 * r + "H" + -r + "V" + r + "H" + -3 * r + "Z";
-    },
-    diamond: function(size) {
-      var ry = Math.sqrt(size / (2 * d3_svg_symbolTan30)), rx = ry * d3_svg_symbolTan30;
-      return "M0," + -ry + "L" + rx + ",0" + " 0," + ry + " " + -rx + ",0" + "Z";
-    },
-    square: function(size) {
-      var r = Math.sqrt(size) / 2;
-      return "M" + -r + "," + -r + "L" + r + "," + -r + " " + r + "," + r + " " + -r + "," + r + "Z";
-    },
-    "triangle-down": function(size) {
-      var rx = Math.sqrt(size / d3_svg_symbolSqrt3), ry = rx * d3_svg_symbolSqrt3 / 2;
-      return "M0," + ry + "L" + rx + "," + -ry + " " + -rx + "," + -ry + "Z";
-    },
-    "triangle-up": function(size) {
-      var rx = Math.sqrt(size / d3_svg_symbolSqrt3), ry = rx * d3_svg_symbolSqrt3 / 2;
-      return "M0," + -ry + "L" + rx + "," + ry + " " + -rx + "," + ry + "Z";
-    }
-  });
-  d3.svg.symbolTypes = d3_svg_symbols.keys();
-  var d3_svg_symbolSqrt3 = Math.sqrt(3), d3_svg_symbolTan30 = Math.tan(30 * d3_radians);
-  function d3_transition(groups, id) {
-    d3_subclass(groups, d3_transitionPrototype);
-    groups.id = id;
-    return groups;
-  }
-  var d3_transitionPrototype = [], d3_transitionId = 0, d3_transitionInheritId, d3_transitionInherit;
-  d3_transitionPrototype.call = d3_selectionPrototype.call;
-  d3_transitionPrototype.empty = d3_selectionPrototype.empty;
-  d3_transitionPrototype.node = d3_selectionPrototype.node;
-  d3_transitionPrototype.size = d3_selectionPrototype.size;
-  d3.transition = function(selection) {
-    return arguments.length ? d3_transitionInheritId ? selection.transition() : selection : d3_selectionRoot.transition();
-  };
-  d3.transition.prototype = d3_transitionPrototype;
-  d3_transitionPrototype.select = function(selector) {
-    var id = this.id, subgroups = [], subgroup, subnode, node;
-    selector = d3_selection_selector(selector);
-    for (var j = -1, m = this.length; ++j < m; ) {
-      subgroups.push(subgroup = []);
-      for (var group = this[j], i = -1, n = group.length; ++i < n; ) {
-        if ((node = group[i]) && (subnode = selector.call(node, node.__data__, i, j))) {
-          if ("__data__" in node) subnode.__data__ = node.__data__;
-          d3_transitionNode(subnode, i, id, node.__transition__[id]);
-          subgroup.push(subnode);
-        } else {
-          subgroup.push(null);
-        }
-      }
-    }
-    return d3_transition(subgroups, id);
-  };
-  d3_transitionPrototype.selectAll = function(selector) {
-    var id = this.id, subgroups = [], subgroup, subnodes, node, subnode, transition;
-    selector = d3_selection_selectorAll(selector);
-    for (var j = -1, m = this.length; ++j < m; ) {
-      for (var group = this[j], i = -1, n = group.length; ++i < n; ) {
-        if (node = group[i]) {
-          transition = node.__transition__[id];
-          subnodes = selector.call(node, node.__data__, i, j);
-          subgroups.push(subgroup = []);
-          for (var k = -1, o = subnodes.length; ++k < o; ) {
-            if (subnode = subnodes[k]) d3_transitionNode(subnode, k, id, transition);
-            subgroup.push(subnode);
-          }
-        }
-      }
-    }
-    return d3_transition(subgroups, id);
-  };
-  d3_transitionPrototype.filter = function(filter) {
-    var subgroups = [], subgroup, group, node;
-    if (typeof filter !== "function") filter = d3_selection_filter(filter);
-    for (var j = 0, m = this.length; j < m; j++) {
-      subgroups.push(subgroup = []);
-      for (var group = this[j], i = 0, n = group.length; i < n; i++) {
-        if ((node = group[i]) && filter.call(node, node.__data__, i, j)) {
-          subgroup.push(node);
-        }
-      }
-    }
-    return d3_transition(subgroups, this.id);
-  };
-  d3_transitionPrototype.tween = function(name, tween) {
-    var id = this.id;
-    if (arguments.length < 2) return this.node().__transition__[id].tween.get(name);
-    return d3_selection_each(this, tween == null ? function(node) {
-      node.__transition__[id].tween.remove(name);
-    } : function(node) {
-      node.__transition__[id].tween.set(name, tween);
-    });
-  };
-  function d3_transition_tween(groups, name, value, tween) {
-    var id = groups.id;
-    return d3_selection_each(groups, typeof value === "function" ? function(node, i, j) {
-      node.__transition__[id].tween.set(name, tween(value.call(node, node.__data__, i, j)));
-    } : (value = tween(value), function(node) {
-      node.__transition__[id].tween.set(name, value);
-    }));
-  }
-  d3_transitionPrototype.attr = function(nameNS, value) {
-    if (arguments.length < 2) {
-      for (value in nameNS) this.attr(value, nameNS[value]);
-      return this;
-    }
-    var interpolate = nameNS == "transform" ? d3_interpolateTransform : d3_interpolate, name = d3.ns.qualify(nameNS);
-    function attrNull() {
-      this.removeAttribute(name);
-    }
-    function attrNullNS() {
-      this.removeAttributeNS(name.space, name.local);
-    }
-    function attrTween(b) {
-      return b == null ? attrNull : (b += "", function() {
-        var a = this.getAttribute(name), i;
-        return a !== b && (i = interpolate(a, b), function(t) {
-          this.setAttribute(name, i(t));
-        });
-      });
-    }
-    function attrTweenNS(b) {
-      return b == null ? attrNullNS : (b += "", function() {
-        var a = this.getAttributeNS(name.space, name.local), i;
-        return a !== b && (i = interpolate(a, b), function(t) {
-          this.setAttributeNS(name.space, name.local, i(t));
-        });
-      });
-    }
-    return d3_transition_tween(this, "attr." + nameNS, value, name.local ? attrTweenNS : attrTween);
-  };
-  d3_transitionPrototype.attrTween = function(nameNS, tween) {
-    var name = d3.ns.qualify(nameNS);
-    function attrTween(d, i) {
-      var f = tween.call(this, d, i, this.getAttribute(name));
-      return f && function(t) {
-        this.setAttribute(name, f(t));
-      };
-    }
-    function attrTweenNS(d, i) {
-      var f = tween.call(this, d, i, this.getAttributeNS(name.space, name.local));
-      return f && function(t) {
-        this.setAttributeNS(name.space, name.local, f(t));
-      };
-    }
-    return this.tween("attr." + nameNS, name.local ? attrTweenNS : attrTween);
-  };
-  d3_transitionPrototype.style = function(name, value, priority) {
-    var n = arguments.length;
-    if (n < 3) {
-      if (typeof name !== "string") {
-        if (n < 2) value = "";
-        for (priority in name) this.style(priority, name[priority], value);
-        return this;
-      }
-      priority = "";
-    }
-    function styleNull() {
-      this.style.removeProperty(name);
-    }
-    function styleString(b) {
-      return b == null ? styleNull : (b += "", function() {
-        var a = d3_window.getComputedStyle(this, null).getPropertyValue(name), i;
-        return a !== b && (i = d3_interpolate(a, b), function(t) {
-          this.style.setProperty(name, i(t), priority);
-        });
-      });
-    }
-    return d3_transition_tween(this, "style." + name, value, styleString);
-  };
-  d3_transitionPrototype.styleTween = function(name, tween, priority) {
-    if (arguments.length < 3) priority = "";
-    function styleTween(d, i) {
-      var f = tween.call(this, d, i, d3_window.getComputedStyle(this, null).getPropertyValue(name));
-      return f && function(t) {
-        this.style.setProperty(name, f(t), priority);
-      };
-    }
-    return this.tween("style." + name, styleTween);
-  };
-  d3_transitionPrototype.text = function(value) {
-    return d3_transition_tween(this, "text", value, d3_transition_text);
-  };
-  function d3_transition_text(b) {
-    if (b == null) b = "";
-    return function() {
-      this.textContent = b;
-    };
-  }
-  d3_transitionPrototype.remove = function() {
-    return this.each("end.transition", function() {
-      var p;
-      if (this.__transition__.count < 2 && (p = this.parentNode)) p.removeChild(this);
-    });
-  };
-  d3_transitionPrototype.ease = function(value) {
-    var id = this.id;
-    if (arguments.length < 1) return this.node().__transition__[id].ease;
-    if (typeof value !== "function") value = d3.ease.apply(d3, arguments);
-    return d3_selection_each(this, function(node) {
-      node.__transition__[id].ease = value;
-    });
-  };
-  d3_transitionPrototype.delay = function(value) {
-    var id = this.id;
-    if (arguments.length < 1) return this.node().__transition__[id].delay;
-    return d3_selection_each(this, typeof value === "function" ? function(node, i, j) {
-      node.__transition__[id].delay = +value.call(node, node.__data__, i, j);
-    } : (value = +value, function(node) {
-      node.__transition__[id].delay = value;
-    }));
-  };
-  d3_transitionPrototype.duration = function(value) {
-    var id = this.id;
-    if (arguments.length < 1) return this.node().__transition__[id].duration;
-    return d3_selection_each(this, typeof value === "function" ? function(node, i, j) {
-      node.__transition__[id].duration = Math.max(1, value.call(node, node.__data__, i, j));
-    } : (value = Math.max(1, value), function(node) {
-      node.__transition__[id].duration = value;
-    }));
-  };
-  d3_transitionPrototype.each = function(type, listener) {
-    var id = this.id;
-    if (arguments.length < 2) {
-      var inherit = d3_transitionInherit, inheritId = d3_transitionInheritId;
-      d3_transitionInheritId = id;
-      d3_selection_each(this, function(node, i, j) {
-        d3_transitionInherit = node.__transition__[id];
-        type.call(node, node.__data__, i, j);
-      });
-      d3_transitionInherit = inherit;
-      d3_transitionInheritId = inheritId;
-    } else {
-      d3_selection_each(this, function(node) {
-        var transition = node.__transition__[id];
-        (transition.event || (transition.event = d3.dispatch("start", "end"))).on(type, listener);
-      });
-    }
-    return this;
-  };
-  d3_transitionPrototype.transition = function() {
-    var id0 = this.id, id1 = ++d3_transitionId, subgroups = [], subgroup, group, node, transition;
-    for (var j = 0, m = this.length; j < m; j++) {
-      subgroups.push(subgroup = []);
-      for (var group = this[j], i = 0, n = group.length; i < n; i++) {
-        if (node = group[i]) {
-          transition = Object.create(node.__transition__[id0]);
-          transition.delay += transition.duration;
-          d3_transitionNode(node, i, id1, transition);
-        }
-        subgroup.push(node);
-      }
-    }
-    return d3_transition(subgroups, id1);
-  };
-  function d3_transitionNode(node, i, id, inherit) {
-    var lock = node.__transition__ || (node.__transition__ = {
-      active: 0,
-      count: 0
-    }), transition = lock[id];
-    if (!transition) {
-      var time = inherit.time;
-      transition = lock[id] = {
-        tween: new d3_Map(),
-        time: time,
-        ease: inherit.ease,
-        delay: inherit.delay,
-        duration: inherit.duration
-      };
-      ++lock.count;
-      d3.timer(function(elapsed) {
-        var d = node.__data__, ease = transition.ease, delay = transition.delay, duration = transition.duration, timer = d3_timer_active, tweened = [];
-        timer.t = delay + time;
-        if (delay <= elapsed) return start(elapsed - delay);
-        timer.c = start;
-        function start(elapsed) {
-          if (lock.active > id) return stop();
-          lock.active = id;
-          transition.event && transition.event.start.call(node, d, i);
-          transition.tween.forEach(function(key, value) {
-            if (value = value.call(node, d, i)) {
-              tweened.push(value);
-            }
-          });
-          d3.timer(function() {
-            timer.c = tick(elapsed || 1) ? d3_true : tick;
-            return 1;
-          }, 0, time);
-        }
-        function tick(elapsed) {
-          if (lock.active !== id) return stop();
-          var t = elapsed / duration, e = ease(t), n = tweened.length;
-          while (n > 0) {
-            tweened[--n].call(node, e);
-          }
-          if (t >= 1) {
-            transition.event && transition.event.end.call(node, d, i);
-            return stop();
-          }
-        }
-        function stop() {
-          if (--lock.count) delete lock[id]; else delete node.__transition__;
-          return 1;
-        }
-      }, 0, time);
-    }
-  }
-  d3.svg.axis = function() {
-    var scale = d3.scale.linear(), orient = d3_svg_axisDefaultOrient, innerTickSize = 6, outerTickSize = 6, tickPadding = 3, tickArguments_ = [ 10 ], tickValues = null, tickFormat_;
-    function axis(g) {
-      g.each(function() {
-        var g = d3.select(this);
-        var scale0 = this.__chart__ || scale, scale1 = this.__chart__ = scale.copy();
-        var ticks = tickValues == null ? scale1.ticks ? scale1.ticks.apply(scale1, tickArguments_) : scale1.domain() : tickValues, tickFormat = tickFormat_ == null ? scale1.tickFormat ? scale1.tickFormat.apply(scale1, tickArguments_) : d3_identity : tickFormat_, tick = g.selectAll(".tick").data(ticks, scale1), tickEnter = tick.enter().insert("g", ".domain").attr("class", "tick").style("opacity", ε), tickExit = d3.transition(tick.exit()).style("opacity", ε).remove(), tickUpdate = d3.transition(tick.order()).style("opacity", 1), tickTransform;
-        var range = d3_scaleRange(scale1), path = g.selectAll(".domain").data([ 0 ]), pathUpdate = (path.enter().append("path").attr("class", "domain"), 
-        d3.transition(path));
-        tickEnter.append("line");
-        tickEnter.append("text");
-        var lineEnter = tickEnter.select("line"), lineUpdate = tickUpdate.select("line"), text = tick.select("text").text(tickFormat), textEnter = tickEnter.select("text"), textUpdate = tickUpdate.select("text");
-        switch (orient) {
-         case "bottom":
-          {
-            tickTransform = d3_svg_axisX;
-            lineEnter.attr("y2", innerTickSize);
-            textEnter.attr("y", Math.max(innerTickSize, 0) + tickPadding);
-            lineUpdate.attr("x2", 0).attr("y2", innerTickSize);
-            textUpdate.attr("x", 0).attr("y", Math.max(innerTickSize, 0) + tickPadding);
-            text.attr("dy", ".71em").style("text-anchor", "middle");
-            pathUpdate.attr("d", "M" + range[0] + "," + outerTickSize + "V0H" + range[1] + "V" + outerTickSize);
-            break;
-          }
-
-         case "top":
-          {
-            tickTransform = d3_svg_axisX;
-            lineEnter.attr("y2", -innerTickSize);
-            textEnter.attr("y", -(Math.max(innerTickSize, 0) + tickPadding));
-            lineUpdate.attr("x2", 0).attr("y2", -innerTickSize);
-            textUpdate.attr("x", 0).attr("y", -(Math.max(innerTickSize, 0) + tickPadding));
-            text.attr("dy", "0em").style("text-anchor", "middle");
-            pathUpdate.attr("d", "M" + range[0] + "," + -outerTickSize + "V0H" + range[1] + "V" + -outerTickSize);
-            break;
-          }
-
-         case "left":
-          {
-            tickTransform = d3_svg_axisY;
-            lineEnter.attr("x2", -innerTickSize);
-            textEnter.attr("x", -(Math.max(innerTickSize, 0) + tickPadding));
-            lineUpdate.attr("x2", -innerTickSize).attr("y2", 0);
-            textUpdate.attr("x", -(Math.max(innerTickSize, 0) + tickPadding)).attr("y", 0);
-            text.attr("dy", ".32em").style("text-anchor", "end");
-            pathUpdate.attr("d", "M" + -outerTickSize + "," + range[0] + "H0V" + range[1] + "H" + -outerTickSize);
-            break;
-          }
-
-         case "right":
-          {
-            tickTransform = d3_svg_axisY;
-            lineEnter.attr("x2", innerTickSize);
-            textEnter.attr("x", Math.max(innerTickSize, 0) + tickPadding);
-            lineUpdate.attr("x2", innerTickSize).attr("y2", 0);
-            textUpdate.attr("x", Math.max(innerTickSize, 0) + tickPadding).attr("y", 0);
-            text.attr("dy", ".32em").style("text-anchor", "start");
-            pathUpdate.attr("d", "M" + outerTickSize + "," + range[0] + "H0V" + range[1] + "H" + outerTickSize);
-            break;
-          }
-        }
-        if (scale1.rangeBand) {
-          var x = scale1, dx = x.rangeBand() / 2;
-          scale0 = scale1 = function(d) {
-            return x(d) + dx;
-          };
-        } else if (scale0.rangeBand) {
-          scale0 = scale1;
-        } else {
-          tickExit.call(tickTransform, scale1);
-        }
-        tickEnter.call(tickTransform, scale0);
-        tickUpdate.call(tickTransform, scale1);
-      });
-    }
-    axis.scale = function(x) {
-      if (!arguments.length) return scale;
-      scale = x;
-      return axis;
-    };
-    axis.orient = function(x) {
-      if (!arguments.length) return orient;
-      orient = x in d3_svg_axisOrients ? x + "" : d3_svg_axisDefaultOrient;
-      return axis;
-    };
-    axis.ticks = function() {
-      if (!arguments.length) return tickArguments_;
-      tickArguments_ = arguments;
-      return axis;
-    };
-    axis.tickValues = function(x) {
-      if (!arguments.length) return tickValues;
-      tickValues = x;
-      return axis;
-    };
-    axis.tickFormat = function(x) {
-      if (!arguments.length) return tickFormat_;
-      tickFormat_ = x;
-      return axis;
-    };
-    axis.tickSize = function(x) {
-      var n = arguments.length;
-      if (!n) return innerTickSize;
-      innerTickSize = +x;
-      outerTickSize = +arguments[n - 1];
-      return axis;
-    };
-    axis.innerTickSize = function(x) {
-      if (!arguments.length) return innerTickSize;
-      innerTickSize = +x;
-      return axis;
-    };
-    axis.outerTickSize = function(x) {
-      if (!arguments.length) return outerTickSize;
-      outerTickSize = +x;
-      return axis;
-    };
-    axis.tickPadding = function(x) {
-      if (!arguments.length) return tickPadding;
-      tickPadding = +x;
-      return axis;
-    };
-    axis.tickSubdivide = function() {
-      return arguments.length && axis;
-    };
-    return axis;
-  };
-  var d3_svg_axisDefaultOrient = "bottom", d3_svg_axisOrients = {
-    top: 1,
-    right: 1,
-    bottom: 1,
-    left: 1
-  };
-  function d3_svg_axisX(selection, x) {
-    selection.attr("transform", function(d) {
-      return "translate(" + x(d) + ",0)";
-    });
-  }
-  function d3_svg_axisY(selection, y) {
-    selection.attr("transform", function(d) {
-      return "translate(0," + y(d) + ")";
-    });
-  }
-  d3.svg.brush = function() {
-    var event = d3_eventDispatch(brush, "brushstart", "brush", "brushend"), x = null, y = null, xExtent = [ 0, 0 ], yExtent = [ 0, 0 ], xExtentDomain, yExtentDomain, xClamp = true, yClamp = true, resizes = d3_svg_brushResizes[0];
-    function brush(g) {
-      g.each(function() {
-        var g = d3.select(this).style("pointer-events", "all").style("-webkit-tap-highlight-color", "rgba(0,0,0,0)").on("mousedown.brush", brushstart).on("touchstart.brush", brushstart);
-        var background = g.selectAll(".background").data([ 0 ]);
-        background.enter().append("rect").attr("class", "background").style("visibility", "hidden").style("cursor", "crosshair");
-        g.selectAll(".extent").data([ 0 ]).enter().append("rect").attr("class", "extent").style("cursor", "move");
-        var resize = g.selectAll(".resize").data(resizes, d3_identity);
-        resize.exit().remove();
-        resize.enter().append("g").attr("class", function(d) {
-          return "resize " + d;
-        }).style("cursor", function(d) {
-          return d3_svg_brushCursor[d];
-        }).append("rect").attr("x", function(d) {
-          return /[ew]$/.test(d) ? -3 : null;
-        }).attr("y", function(d) {
-          return /^[ns]/.test(d) ? -3 : null;
-        }).attr("width", 6).attr("height", 6).style("visibility", "hidden");
-        resize.style("display", brush.empty() ? "none" : null);
-        var gUpdate = d3.transition(g), backgroundUpdate = d3.transition(background), range;
-        if (x) {
-          range = d3_scaleRange(x);
-          backgroundUpdate.attr("x", range[0]).attr("width", range[1] - range[0]);
-          redrawX(gUpdate);
-        }
-        if (y) {
-          range = d3_scaleRange(y);
-          backgroundUpdate.attr("y", range[0]).attr("height", range[1] - range[0]);
-          redrawY(gUpdate);
-        }
-        redraw(gUpdate);
-      });
-    }
-    brush.event = function(g) {
-      g.each(function() {
-        var event_ = event.of(this, arguments), extent1 = {
-          x: xExtent,
-          y: yExtent,
-          i: xExtentDomain,
-          j: yExtentDomain
-        }, extent0 = this.__chart__ || extent1;
-        this.__chart__ = extent1;
-        if (d3_transitionInheritId) {
-          d3.select(this).transition().each("start.brush", function() {
-            xExtentDomain = extent0.i;
-            yExtentDomain = extent0.j;
-            xExtent = extent0.x;
-            yExtent = extent0.y;
-            event_({
-              type: "brushstart"
-            });
-          }).tween("brush:brush", function() {
-            var xi = d3_interpolateArray(xExtent, extent1.x), yi = d3_interpolateArray(yExtent, extent1.y);
-            xExtentDomain = yExtentDomain = null;
-            return function(t) {
-              xExtent = extent1.x = xi(t);
-              yExtent = extent1.y = yi(t);
-              event_({
-                type: "brush",
-                mode: "resize"
-              });
-            };
-          }).each("end.brush", function() {
-            xExtentDomain = extent1.i;
-            yExtentDomain = extent1.j;
-            event_({
-              type: "brush",
-              mode: "resize"
-            });
-            event_({
-              type: "brushend"
-            });
-          });
-        } else {
-          event_({
-            type: "brushstart"
-          });
-          event_({
-            type: "brush",
-            mode: "resize"
-          });
-          event_({
-            type: "brushend"
-          });
-        }
-      });
-    };
-    function redraw(g) {
-      g.selectAll(".resize").attr("transform", function(d) {
-        return "translate(" + xExtent[+/e$/.test(d)] + "," + yExtent[+/^s/.test(d)] + ")";
-      });
-    }
-    function redrawX(g) {
-      g.select(".extent").attr("x", xExtent[0]);
-      g.selectAll(".extent,.n>rect,.s>rect").attr("width", xExtent[1] - xExtent[0]);
-    }
-    function redrawY(g) {
-      g.select(".extent").attr("y", yExtent[0]);
-      g.selectAll(".extent,.e>rect,.w>rect").attr("height", yExtent[1] - yExtent[0]);
-    }
-    function brushstart() {
-      var target = this, eventTarget = d3.select(d3.event.target), event_ = event.of(target, arguments), g = d3.select(target), resizing = eventTarget.datum(), resizingX = !/^(n|s)$/.test(resizing) && x, resizingY = !/^(e|w)$/.test(resizing) && y, dragging = eventTarget.classed("extent"), dragRestore = d3_event_dragSuppress(), center, origin = d3.mouse(target), offset;
-      var w = d3.select(d3_window).on("keydown.brush", keydown).on("keyup.brush", keyup);
-      if (d3.event.changedTouches) {
-        w.on("touchmove.brush", brushmove).on("touchend.brush", brushend);
-      } else {
-        w.on("mousemove.brush", brushmove).on("mouseup.brush", brushend);
-      }
-      g.interrupt().selectAll("*").interrupt();
-      if (dragging) {
-        origin[0] = xExtent[0] - origin[0];
-        origin[1] = yExtent[0] - origin[1];
-      } else if (resizing) {
-        var ex = +/w$/.test(resizing), ey = +/^n/.test(resizing);
-        offset = [ xExtent[1 - ex] - origin[0], yExtent[1 - ey] - origin[1] ];
-        origin[0] = xExtent[ex];
-        origin[1] = yExtent[ey];
-      } else if (d3.event.altKey) center = origin.slice();
-      g.style("pointer-events", "none").selectAll(".resize").style("display", null);
-      d3.select("body").style("cursor", eventTarget.style("cursor"));
-      event_({
-        type: "brushstart"
-      });
-      brushmove();
-      function keydown() {
-        if (d3.event.keyCode == 32) {
-          if (!dragging) {
-            center = null;
-            origin[0] -= xExtent[1];
-            origin[1] -= yExtent[1];
-            dragging = 2;
-          }
-          d3_eventPreventDefault();
-        }
-      }
-      function keyup() {
-        if (d3.event.keyCode == 32 && dragging == 2) {
-          origin[0] += xExtent[1];
-          origin[1] += yExtent[1];
-          dragging = 0;
-          d3_eventPreventDefault();
-        }
-      }
-      function brushmove() {
-        var point = d3.mouse(target), moved = false;
-        if (offset) {
-          point[0] += offset[0];
-          point[1] += offset[1];
-        }
-        if (!dragging) {
-          if (d3.event.altKey) {
-            if (!center) center = [ (xExtent[0] + xExtent[1]) / 2, (yExtent[0] + yExtent[1]) / 2 ];
-            origin[0] = xExtent[+(point[0] < center[0])];
-            origin[1] = yExtent[+(point[1] < center[1])];
-          } else center = null;
-        }
-        if (resizingX && move1(point, x, 0)) {
-          redrawX(g);
-          moved = true;
-        }
-        if (resizingY && move1(point, y, 1)) {
-          redrawY(g);
-          moved = true;
-        }
-        if (moved) {
-          redraw(g);
-          event_({
-            type: "brush",
-            mode: dragging ? "move" : "resize"
-          });
-        }
-      }
-      function move1(point, scale, i) {
-        var range = d3_scaleRange(scale), r0 = range[0], r1 = range[1], position = origin[i], extent = i ? yExtent : xExtent, size = extent[1] - extent[0], min, max;
-        if (dragging) {
-          r0 -= position;
-          r1 -= size + position;
-        }
-        min = (i ? yClamp : xClamp) ? Math.max(r0, Math.min(r1, point[i])) : point[i];
-        if (dragging) {
-          max = (min += position) + size;
-        } else {
-          if (center) position = Math.max(r0, Math.min(r1, 2 * center[i] - min));
-          if (position < min) {
-            max = min;
-            min = position;
-          } else {
-            max = position;
-          }
-        }
-        if (extent[0] != min || extent[1] != max) {
-          if (i) yExtentDomain = null; else xExtentDomain = null;
-          extent[0] = min;
-          extent[1] = max;
-          return true;
-        }
-      }
-      function brushend() {
-        brushmove();
-        g.style("pointer-events", "all").selectAll(".resize").style("display", brush.empty() ? "none" : null);
-        d3.select("body").style("cursor", null);
-        w.on("mousemove.brush", null).on("mouseup.brush", null).on("touchmove.brush", null).on("touchend.brush", null).on("keydown.brush", null).on("keyup.brush", null);
-        dragRestore();
-        event_({
-          type: "brushend"
-        });
-      }
-    }
-    brush.x = function(z) {
-      if (!arguments.length) return x;
-      x = z;
-      resizes = d3_svg_brushResizes[!x << 1 | !y];
-      return brush;
-    };
-    brush.y = function(z) {
-      if (!arguments.length) return y;
-      y = z;
-      resizes = d3_svg_brushResizes[!x << 1 | !y];
-      return brush;
-    };
-    brush.clamp = function(z) {
-      if (!arguments.length) return x && y ? [ xClamp, yClamp ] : x ? xClamp : y ? yClamp : null;
-      if (x && y) xClamp = !!z[0], yClamp = !!z[1]; else if (x) xClamp = !!z; else if (y) yClamp = !!z;
-      return brush;
-    };
-    brush.extent = function(z) {
-      var x0, x1, y0, y1, t;
-      if (!arguments.length) {
-        if (x) {
-          if (xExtentDomain) {
-            x0 = xExtentDomain[0], x1 = xExtentDomain[1];
-          } else {
-            x0 = xExtent[0], x1 = xExtent[1];
-            if (x.invert) x0 = x.invert(x0), x1 = x.invert(x1);
-            if (x1 < x0) t = x0, x0 = x1, x1 = t;
-          }
-        }
-        if (y) {
-          if (yExtentDomain) {
-            y0 = yExtentDomain[0], y1 = yExtentDomain[1];
-          } else {
-            y0 = yExtent[0], y1 = yExtent[1];
-            if (y.invert) y0 = y.invert(y0), y1 = y.invert(y1);
-            if (y1 < y0) t = y0, y0 = y1, y1 = t;
-          }
-        }
-        return x && y ? [ [ x0, y0 ], [ x1, y1 ] ] : x ? [ x0, x1 ] : y && [ y0, y1 ];
-      }
-      if (x) {
-        x0 = z[0], x1 = z[1];
-        if (y) x0 = x0[0], x1 = x1[0];
-        xExtentDomain = [ x0, x1 ];
-        if (x.invert) x0 = x(x0), x1 = x(x1);
-        if (x1 < x0) t = x0, x0 = x1, x1 = t;
-        if (x0 != xExtent[0] || x1 != xExtent[1]) xExtent = [ x0, x1 ];
-      }
-      if (y) {
-        y0 = z[0], y1 = z[1];
-        if (x) y0 = y0[1], y1 = y1[1];
-        yExtentDomain = [ y0, y1 ];
-        if (y.invert) y0 = y(y0), y1 = y(y1);
-        if (y1 < y0) t = y0, y0 = y1, y1 = t;
-        if (y0 != yExtent[0] || y1 != yExtent[1]) yExtent = [ y0, y1 ];
-      }
-      return brush;
-    };
-    brush.clear = function() {
-      if (!brush.empty()) {
-        xExtent = [ 0, 0 ], yExtent = [ 0, 0 ];
-        xExtentDomain = yExtentDomain = null;
-      }
-      return brush;
-    };
-    brush.empty = function() {
-      return !!x && xExtent[0] == xExtent[1] || !!y && yExtent[0] == yExtent[1];
-    };
-    return d3.rebind(brush, event, "on");
-  };
-  var d3_svg_brushCursor = {
-    n: "ns-resize",
-    e: "ew-resize",
-    s: "ns-resize",
-    w: "ew-resize",
-    nw: "nwse-resize",
-    ne: "nesw-resize",
-    se: "nwse-resize",
-    sw: "nesw-resize"
-  };
-  var d3_svg_brushResizes = [ [ "n", "e", "s", "w", "nw", "ne", "se", "sw" ], [ "e", "w" ], [ "n", "s" ], [] ];
-  var d3_time_format = d3_time.format = d3_locale_enUS.timeFormat;
-  var d3_time_formatUtc = d3_time_format.utc;
-  var d3_time_formatIso = d3_time_formatUtc("%Y-%m-%dT%H:%M:%S.%LZ");
-  d3_time_format.iso = Date.prototype.toISOString && +new Date("2000-01-01T00:00:00.000Z") ? d3_time_formatIsoNative : d3_time_formatIso;
-  function d3_time_formatIsoNative(date) {
-    return date.toISOString();
-  }
-  d3_time_formatIsoNative.parse = function(string) {
-    var date = new Date(string);
-    return isNaN(date) ? null : date;
-  };
-  d3_time_formatIsoNative.toString = d3_time_formatIso.toString;
-  d3_time.second = d3_time_interval(function(date) {
-    return new d3_date(Math.floor(date / 1e3) * 1e3);
-  }, function(date, offset) {
-    date.setTime(date.getTime() + Math.floor(offset) * 1e3);
-  }, function(date) {
-    return date.getSeconds();
-  });
-  d3_time.seconds = d3_time.second.range;
-  d3_time.seconds.utc = d3_time.second.utc.range;
-  d3_time.minute = d3_time_interval(function(date) {
-    return new d3_date(Math.floor(date / 6e4) * 6e4);
-  }, function(date, offset) {
-    date.setTime(date.getTime() + Math.floor(offset) * 6e4);
-  }, function(date) {
-    return date.getMinutes();
-  });
-  d3_time.minutes = d3_time.minute.range;
-  d3_time.minutes.utc = d3_time.minute.utc.range;
-  d3_time.hour = d3_time_interval(function(date) {
-    var timezone = date.getTimezoneOffset() / 60;
-    return new d3_date((Math.floor(date / 36e5 - timezone) + timezone) * 36e5);
-  }, function(date, offset) {
-    date.setTime(date.getTime() + Math.floor(offset) * 36e5);
-  }, function(date) {
-    return date.getHours();
-  });
-  d3_time.hours = d3_time.hour.range;
-  d3_time.hours.utc = d3_time.hour.utc.range;
-  d3_time.month = d3_time_interval(function(date) {
-    date = d3_time.day(date);
-    date.setDate(1);
-    return date;
-  }, function(date, offset) {
-    date.setMonth(date.getMonth() + offset);
-  }, function(date) {
-    return date.getMonth();
-  });
-  d3_time.months = d3_time.month.range;
-  d3_time.months.utc = d3_time.month.utc.range;
-  function d3_time_scale(linear, methods, format) {
-    function scale(x) {
-      return linear(x);
-    }
-    scale.invert = function(x) {
-      return d3_time_scaleDate(linear.invert(x));
-    };
-    scale.domain = function(x) {
-      if (!arguments.length) return linear.domain().map(d3_time_scaleDate);
-      linear.domain(x);
-      return scale;
-    };
-    function tickMethod(extent, count) {
-      var span = extent[1] - extent[0], target = span / count, i = d3.bisect(d3_time_scaleSteps, target);
-      return i == d3_time_scaleSteps.length ? [ methods.year, d3_scale_linearTickRange(extent.map(function(d) {
-        return d / 31536e6;
-      }), count)[2] ] : !i ? [ d3_time_scaleMilliseconds, d3_scale_linearTickRange(extent, count)[2] ] : methods[target / d3_time_scaleSteps[i - 1] < d3_time_scaleSteps[i] / target ? i - 1 : i];
-    }
-    scale.nice = function(interval, skip) {
-      var domain = scale.domain(), extent = d3_scaleExtent(domain), method = interval == null ? tickMethod(extent, 10) : typeof interval === "number" && tickMethod(extent, interval);
-      if (method) interval = method[0], skip = method[1];
-      function skipped(date) {
-        return !isNaN(date) && !interval.range(date, d3_time_scaleDate(+date + 1), skip).length;
-      }
-      return scale.domain(d3_scale_nice(domain, skip > 1 ? {
-        floor: function(date) {
-          while (skipped(date = interval.floor(date))) date = d3_time_scaleDate(date - 1);
-          return date;
-        },
-        ceil: function(date) {
-          while (skipped(date = interval.ceil(date))) date = d3_time_scaleDate(+date + 1);
-          return date;
-        }
-      } : interval));
-    };
-    scale.ticks = function(interval, skip) {
-      var extent = d3_scaleExtent(scale.domain()), method = interval == null ? tickMethod(extent, 10) : typeof interval === "number" ? tickMethod(extent, interval) : !interval.range && [ {
-        range: interval
-      }, skip ];
-      if (method) interval = method[0], skip = method[1];
-      return interval.range(extent[0], d3_time_scaleDate(+extent[1] + 1), skip < 1 ? 1 : skip);
-    };
-    scale.tickFormat = function() {
-      return format;
-    };
-    scale.copy = function() {
-      return d3_time_scale(linear.copy(), methods, format);
-    };
-    return d3_scale_linearRebind(scale, linear);
-  }
-  function d3_time_scaleDate(t) {
-    return new Date(t);
-  }
-  var d3_time_scaleSteps = [ 1e3, 5e3, 15e3, 3e4, 6e4, 3e5, 9e5, 18e5, 36e5, 108e5, 216e5, 432e5, 864e5, 1728e5, 6048e5, 2592e6, 7776e6, 31536e6 ];
-  var d3_time_scaleLocalMethods = [ [ d3_time.second, 1 ], [ d3_time.second, 5 ], [ d3_time.second, 15 ], [ d3_time.second, 30 ], [ d3_time.minute, 1 ], [ d3_time.minute, 5 ], [ d3_time.minute, 15 ], [ d3_time.minute, 30 ], [ d3_time.hour, 1 ], [ d3_time.hour, 3 ], [ d3_time.hour, 6 ], [ d3_time.hour, 12 ], [ d3_time.day, 1 ], [ d3_time.day, 2 ], [ d3_time.week, 1 ], [ d3_time.month, 1 ], [ d3_time.month, 3 ], [ d3_time.year, 1 ] ];
-  var d3_time_scaleLocalFormat = d3_time_format.multi([ [ ".%L", function(d) {
-    return d.getMilliseconds();
-  } ], [ ":%S", function(d) {
-    return d.getSeconds();
-  } ], [ "%I:%M", function(d) {
-    return d.getMinutes();
-  } ], [ "%I %p", function(d) {
-    return d.getHours();
-  } ], [ "%a %d", function(d) {
-    return d.getDay() && d.getDate() != 1;
-  } ], [ "%b %d", function(d) {
-    return d.getDate() != 1;
-  } ], [ "%B", function(d) {
-    return d.getMonth();
-  } ], [ "%Y", d3_true ] ]);
-  var d3_time_scaleMilliseconds = {
-    range: function(start, stop, step) {
-      return d3.range(Math.ceil(start / step) * step, +stop, step).map(d3_time_scaleDate);
-    },
-    floor: d3_identity,
-    ceil: d3_identity
-  };
-  d3_time_scaleLocalMethods.year = d3_time.year;
-  d3_time.scale = function() {
-    return d3_time_scale(d3.scale.linear(), d3_time_scaleLocalMethods, d3_time_scaleLocalFormat);
-  };
-  var d3_time_scaleUtcMethods = d3_time_scaleLocalMethods.map(function(m) {
-    return [ m[0].utc, m[1] ];
-  });
-  var d3_time_scaleUtcFormat = d3_time_formatUtc.multi([ [ ".%L", function(d) {
-    return d.getUTCMilliseconds();
-  } ], [ ":%S", function(d) {
-    return d.getUTCSeconds();
-  } ], [ "%I:%M", function(d) {
-    return d.getUTCMinutes();
-  } ], [ "%I %p", function(d) {
-    return d.getUTCHours();
-  } ], [ "%a %d", function(d) {
-    return d.getUTCDay() && d.getUTCDate() != 1;
-  } ], [ "%b %d", function(d) {
-    return d.getUTCDate() != 1;
-  } ], [ "%B", function(d) {
-    return d.getUTCMonth();
-  } ], [ "%Y", d3_true ] ]);
-  d3_time_scaleUtcMethods.year = d3_time.year.utc;
-  d3_time.scale.utc = function() {
-    return d3_time_scale(d3.scale.linear(), d3_time_scaleUtcMethods, d3_time_scaleUtcFormat);
-  };
-  d3.text = d3_xhrType(function(request) {
-    return request.responseText;
-  });
-  d3.json = function(url, callback) {
-    return d3_xhr(url, "application/json", d3_json, callback);
-  };
-  function d3_json(request) {
-    return JSON.parse(request.responseText);
-  }
-  d3.html = function(url, callback) {
-    throw "disallowed by chromium security";
-    return d3_xhr(url, "text/html", d3_html, callback);
-  };
-  function d3_html(request) {
-    throw "disallowed by chromium security";
-    var range = d3_document.createRange();
-    range.selectNode(d3_document.body);
-    return range.createContextualFragment(request.responseText);
-  }
-  d3.xml = d3_xhrType(function(request) {
-    return request.responseXML;
-  });
-  if (typeof define === "function" && define.amd) {
-    define(d3);
-  } else if (typeof module === "object" && module.exports) {
-    module.exports = d3;
-  } else {
-    this.d3 = d3;
-  }
-}();
diff --git a/third_party/webtreemap/LICENSE b/third_party/webtreemap/LICENSE
deleted file mode 100644
index 261eeb9..0000000
--- a/third_party/webtreemap/LICENSE
+++ /dev/null
@@ -1,201 +0,0 @@
-                                 Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   APPENDIX: How to apply the Apache License to your work.
-
-      To apply the Apache License to your work, attach the following
-      boilerplate notice, with the fields enclosed by brackets "[]"
-      replaced with your own identifying information. (Don't include
-      the brackets!)  The text should be enclosed in the appropriate
-      comment syntax for the file format. We also recommend that a
-      file or class name and description of purpose be included on the
-      same "printed page" as the copyright notice for easier
-      identification within third-party archives.
-
-   Copyright [yyyy] [name of copyright owner]
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
diff --git a/third_party/webtreemap/README.fuchsia b/third_party/webtreemap/README.fuchsia
deleted file mode 100644
index 5a84af6..0000000
--- a/third_party/webtreemap/README.fuchsia
+++ /dev/null
@@ -1,4 +0,0 @@
-Name: webtreemap
-URL: https://github.com/evmar/webtreemap/tree/v1
-LICENSE: Apache2
-Upstream Git: https://github.com/evmar/webtreemap/tree/v1
diff --git a/third_party/webtreemap/webtreemap.css b/third_party/webtreemap/webtreemap.css
deleted file mode 100644
index 702df5e..0000000
--- a/third_party/webtreemap/webtreemap.css
+++ /dev/null
@@ -1,84 +0,0 @@
-.webtreemap-node {
-  /* Required attributes. */
-  position: absolute;
-  overflow: hidden;   /* To hide overlong captions. */
-  background: white;  /* Nodes must be opaque for zIndex layering. */
-  border: solid 1px black;  /* Calculations assume 1px border. */
-
-  /* Optional: CSS animation. */
-  transition: top    0.3s,
-              left   0.3s,
-              width  0.3s,
-              height 0.3s;
-}
-
-/* Optional: highlight nodes on mouseover. */
-.webtreemap-node:hover {
-  background: #eee;
-}
-
-/* Optional: Different background colors depending on symbol. */
-.webtreemap-symbol-bss {
-  background: #66C2A5;
-}
-.webtreemap-symbol-data {
-  background: #FC8D62;
-}
-.webtreemap-symbol-read-only_data {
-  background: #8DA0CB;
-}
-.webtreemap-symbol-code {
-  background: #E78AC3;
-}
-.webtreemap-symbol-weak_symbol {
-  background: #A6D854;
-}
-.webtreemap-symbol-bss.webtreemap-aggregate {
-  background: #B3E2CD;
-}
-.webtreemap-symbol-data.webtreemap-aggregate {
-  background: #FDCDAC;
-}
-.webtreemap-symbol-read-only_data.webtreemap-aggregate {
-  background: #CBD5E8;
-}
-.webtreemap-symbol-code.webtreemap-aggregate {
-  background: #F4CAE4;
-}
-.webtreemap-symbol-weak_symbol.webtreemap-aggregate {
-  background: #E6F5C9;
-}
-
-#legend > * {
-  border: solid 1px #444;
-}
-
-/* Optional: Different borders depending on level. */
-.webtreemap-level0 {
-  border: solid 1px #444;
-}
-.webtreemap-level1 {
-  border: solid 1px #666;
-}
-.webtreemap-level2 {
-  border: solid 1px #888;
-}
-.webtreemap-level3 {
-  border: solid 1px #aaa;
-}
-.webtreemap-level4 {
-  border: solid 1px #ccc;
-}
-
-/* Optional: styling on node captions. */
-.webtreemap-caption {
-  font-family: sans-serif;
-  font-size: 11px;
-  padding: 2px;
-  text-align: center;
-}
-
-/* Optional: styling on captions on mouse hover. */
-/*.webtreemap-node:hover > .webtreemap-caption {
-  text-decoration: underline;
-}*/
diff --git a/third_party/webtreemap/webtreemap.js b/third_party/webtreemap/webtreemap.js
deleted file mode 100644
index 2d25667..0000000
--- a/third_party/webtreemap/webtreemap.js
+++ /dev/null
@@ -1,269 +0,0 @@
-/* @license
- *  Copyright 2013 Google Inc. All Rights Reserved.
- *
- *  Licensed under the Apache License, Version 2.0 (the "License");
- *  you may not use this file except in compliance with the License.
- *  You may obtain a copy of the License at
- *
- *      http:www.apache.org/licenses/LICENSE-2.0
- *
- *  Unless required by applicable law or agreed to in writing, software
- *  distributed under the License is distributed on an "AS IS" BASIS,
- *  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- *  See the License for the specific language governing permissions and
- *  limitations under the License.
- */
-
-;(function(root, factory) {
-  if (typeof define === 'function' && define.amd)
-    define([], factory);
-  else if (typeof module === 'object' && module.exports)
-    module.exports = factory();
-  else
-    root.appendTreemap = factory();
-}(this, function() {
-// Size of border around nodes.
-// We could support arbitrary borders using getComputedStyle(), but I am
-// skeptical the extra complexity (and performance hit) is worth it.
-var kBorderWidth = 1;
-
-// Padding around contents.
-// TODO: do this with a nested div to allow it to be CSS-styleable.
-var kPadding = 4;
-
-// x/y ratio to aim for -- wider rectangles are better for text display
-var kAspectRatio = 1.2;
-
-var focused = null;
-
-function focus(tree) {
-  focused = tree;
-
-  // Hide all visible siblings of all our ancestors by lowering them.
-  var level = 0;
-  var root = tree;
-  while (root.parent) {
-    root = root.parent;
-    level += 1;
-    for (var i = 0, sibling; sibling = root.children[i]; ++i) {
-      if (sibling.dom)
-        sibling.dom.style.zIndex = 0;
-    }
-  }
-  var width = root.dom.offsetWidth;
-  var height = root.dom.offsetHeight;
-  // Unhide (raise) and maximize us and our ancestors.
-  for (var t = tree; t.parent; t = t.parent) {
-    // Shift off by border so we don't get nested borders.
-    // TODO: actually make nested borders work (need to adjust width/height).
-    position(t.dom, -kBorderWidth, -kBorderWidth, width, height);
-    t.dom.style.zIndex = 1;
-  }
-  // And layout into the topmost box.
-  layout(tree, level, width, height);
-}
-
-function makeDom(tree, level) {
-  var dom = document.createElement('div');
-  dom.style.zIndex = 1;
-  dom.className = 'webtreemap-node webtreemap-level' + Math.min(level, 4);
-  if (tree.data['$symbol']) {
-    dom.className += (' webtreemap-symbol-' +
-	tree.data['$symbol'].replace(' ', '_'));
-  }
-  if (tree.data['$dominant_symbol']) {
-    dom.className += (' webtreemap-symbol-' +
-	tree.data['$dominant_symbol'].replace(' ', '_'));
-    dom.className += (' webtreemap-aggregate');
-  }
-
-  for (var key in tree.data) {
-    if (key != '$area') {
-      dom.setAttribute('data-' + key, tree.data[key]);
-    }
-  }
-
-  dom.onmousedown = function(e) {
-    if (e.button == 0) {
-      if (focused && tree == focused && focused.parent) {
-        focus(focused.parent);
-      } else {
-        focus(tree);
-      }
-    }
-    e.stopPropagation();
-    return true;
-  };
-
-  var caption = document.createElement('div');
-  caption.className = 'webtreemap-caption';
-  caption.innerHTML = tree.name;
-  dom.appendChild(caption);
-  dom.title = tree.name;
-
-  tree.dom = dom;
-  return dom;
-}
-
-function position(dom, x, y, width, height) {
-  // CSS width/height does not include border.
-  width -= kBorderWidth*2;
-  height -= kBorderWidth*2;
-
-  dom.style.left   = x + 'px';
-  dom.style.top    = y + 'px';
-  dom.style.width  = Math.max(width, 0) + 'px';
-  dom.style.height = Math.max(height, 0) + 'px';
-}
-
-// Given a list of rectangles |nodes|, the 1-d space available
-// |space|, and a starting rectangle index |start|, compute a span of
-// rectangles that optimizes a pleasant aspect ratio.
-//
-// Returns [end, sum], where end is one past the last rectangle and sum is the
-// 2-d sum of the rectangles' areas.
-function selectSpan(nodes, space, start) {
-  // Add rectangle one by one, stopping when aspect ratios begin to go
-  // bad.  Result is [start,end) covering the best run for this span.
-  // http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.36.6685
-  var node = nodes[start];
-  var rmin = node.data['$area'];  // Smallest seen child so far.
-  var rmax = rmin;                // Largest child.
-  var rsum = 0;                   // Sum of children in this span.
-  var last_score = 0;             // Best score yet found.
-  for (var end = start; node = nodes[end]; ++end) {
-    var size = node.data['$area'];
-    if (size < rmin)
-      rmin = size;
-    if (size > rmax)
-      rmax = size;
-    rsum += size;
-
-    // This formula is from the paper, but you can easily prove to
-    // yourself it's taking the larger of the x/y aspect ratio or the
-    // y/x aspect ratio.  The additional magic fudge constant of kAspectRatio
-    // lets us prefer wider rectangles to taller ones.
-    var score = Math.max(space*space*rmax / (rsum*rsum),
-                         kAspectRatio*rsum*rsum / (space*space*rmin));
-    if (last_score && score > last_score) {
-      rsum -= size;  // Undo size addition from just above.
-      break;
-    }
-    last_score = score;
-  }
-  return [end, rsum];
-}
-
-function layout(tree, level, width, height) {
-  if (!('children' in tree))
-    return;
-
-  var total = tree.data['$area'];
-
-  // XXX why do I need an extra -1/-2 here for width/height to look right?
-  var x1 = 0, y1 = 0, x2 = width - 1, y2 = height - 2;
-  x1 += kPadding; y1 += kPadding;
-  x2 -= kPadding; y2 -= kPadding;
-  y1 += 14;  // XXX get first child height for caption spacing
-
-  var pixels_to_units = Math.sqrt(total / ((x2 - x1) * (y2 - y1)));
-
-  for (var start = 0, child; child = tree.children[start]; ++start) {
-    if (x2 - x1 < 60 || y2 - y1 < 40) {
-      if (child.dom) {
-        child.dom.style.zIndex = 0;
-        position(child.dom, -2, -2, 0, 0);
-      }
-      continue;
-    }
-
-    // Dynamically decide whether to split in x or y based on aspect ratio.
-    var ysplit = ((y2 - y1) / (x2 - x1)) > kAspectRatio;
-
-    var space;  // Space available along layout axis.
-    if (ysplit)
-      space = (y2 - y1) * pixels_to_units;
-    else
-      space = (x2 - x1) * pixels_to_units;
-
-    var span = selectSpan(tree.children, space, start);
-    var end = span[0], rsum = span[1];
-
-    // Now that we've selected a span, lay out rectangles [start,end) in our
-    // available space.
-    var x = x1, y = y1;
-    for (var i = start; i < end; ++i) {
-      child = tree.children[i];
-      if (!child.dom) {
-        child.parent = tree;
-        child.dom = makeDom(child, level + 1);
-        tree.dom.appendChild(child.dom);
-      } else {
-        child.dom.style.zIndex = 1;
-      }
-      var size = child.data['$area'];
-      var frac = size / rsum;
-      if (ysplit) {
-        width = rsum / space;
-        height = size / width;
-      } else {
-        height = rsum / space;
-        width = size / height;
-      }
-      width /= pixels_to_units;
-      height /= pixels_to_units;
-      width = Math.round(width);
-      height = Math.round(height);
-      position(child.dom, x, y, width, height);
-      if ('children' in child) {
-        layout(child, level + 1, width, height);
-      }
-      if (ysplit)
-        y += height;
-      else
-        x += width;
-    }
-
-    // Shrink our available space based on the amount we used.
-    if (ysplit)
-      x1 += Math.round((rsum / space) / pixels_to_units);
-    else
-      y1 += Math.round((rsum / space) / pixels_to_units);
-
-    // end points one past where we ended, which is where we want to
-    // begin the next iteration, but subtract one to balance the ++ in
-    // the loop.
-    start = end - 1;
-  }
-}
-
-// The algorithm does best at laying out items from largest to smallest.
-// Recursively sort the tree to ensure this.
-function treeSort(tree) {
-  tree.children.sort(function (a, b) {
-    return b.data['$area'] - a.data['$area'];
-  });
-  for (var i = 0; i < tree.children.length; ++i) {
-    var child = tree.children[i];
-    if ('children' in child) {
-      treeSort(child);
-    }
-  }
-}
-
-function appendTreemap(dom, data, options) {
-  var style = getComputedStyle(dom, null);
-  var width = parseInt(style.width);
-  var height = parseInt(style.height);
-  if (options === undefined || options.sort !== false) {
-    treeSort(data);
-  }
-  if (!data.dom)
-    makeDom(data, 0);
-  dom.appendChild(data.dom);
-  position(data.dom, 0, 0, width, height);
-  layout(data, 0, width, height);
-}
-
-return appendTreemap;
-}));
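The span-selection heuristic in `selectSpan` above (from the squarified-treemap paper) can be sketched outside the browser. Below is a standalone Python port for illustration; `select_span` is a name introduced here, areas are plain floats rather than `$area` tree entries, and the fudge constant mirrors `kAspectRatio = 1.2` from the JS.

```python
# Sketch of webtreemap's selectSpan: greedily extend a span of rectangles
# along a 1-d space of length `space`, stopping once the worst aspect
# ratio in the span stops improving.
K_ASPECT_RATIO = 1.2  # prefer wider rectangles, as in the JS original


def select_span(areas, space, start):
    """Return (end, rsum): one past the last rectangle in the span, and
    the sum of the areas in [start, end)."""
    rmin = rmax = areas[start]  # smallest / largest child seen so far
    rsum = 0.0                  # sum of children in this span
    last_score = 0.0            # best score yet found
    end = start
    while end < len(areas):
        size = areas[end]
        rmin = min(rmin, size)
        rmax = max(rmax, size)
        rsum += size
        # The larger of the x/y and y/x aspect ratios, with K_ASPECT_RATIO
        # biasing the result toward wider rectangles.
        score = max(space * space * rmax / (rsum * rsum),
                    K_ASPECT_RATIO * rsum * rsum / (space * space * rmin))
        if last_score and score > last_score:
            rsum -= size  # undo the addition; this item starts the next span
            break
        last_score = score
        end += 1
    return end, rsum
```

With the classic squarify example (areas 6, 6, 4, 3, 2, 2, 1 laid into a side of length 4), the first span selected is the two 6-unit rectangles.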
diff --git a/update-manifest.go b/update-manifest.go
deleted file mode 100644
index 6d45472..0000000
--- a/update-manifest.go
+++ /dev/null
@@ -1,140 +0,0 @@
-// Copyright 2017 The Fuchsia Authors. All rights reserved.
-// Use of this source code is governed by a BSD-style license that can be
-// found in the LICENSE file.
-
-// Update repository revisions in the manifest.
-package main
-
-import (
-	"bufio"
-	"encoding/xml"
-	"flag"
-	"fmt"
-	"io/ioutil"
-	"log"
-	"os"
-	"os/exec"
-	"path/filepath"
-	"strings"
-)
-
-type stringsValue []string
-
-func (i *stringsValue) String() string {
-	return strings.Join(*i, ",")
-}
-
-func (i *stringsValue) Set(value string) error {
-	*i = strings.Split(value, ",")
-	return nil
-}
-
-var (
-	manifestVar string
-	projectsVar = stringsValue{}
-)
-
-func init() {
-	flag.StringVar(&manifestVar, "manifest", "", "Name of the manifest file")
-	flag.Var(&projectsVar, "projects", "List of projects to update")
-	flag.Usage = func() {
-		fmt.Fprintf(os.Stderr, "usage: update-manifest\n")
-		flag.PrintDefaults()
-	}
-}
-
-type Manifest struct {
-	Projects []Project `xml:"projects>project"`
-	XMLName  struct{}  `xml:"manifest"`
-}
-
-type Project struct {
-	Name         string   `xml:"name,attr,omitempty"`
-	Remote       string   `xml:"remote,attr,omitempty"`
-	RemoteBranch string   `xml:"remotebranch,attr,omitempty"`
-	Revision     string   `xml:"revision,attr,omitempty"`
-	XMLName      struct{} `xml:"project"`
-}
-
-func manifestFromBytes(data []byte) (*Manifest, error) {
-	m := new(Manifest)
-	if err := xml.Unmarshal(data, m); err != nil {
-		return nil, err
-	}
-	return m, nil
-}
-
-func getLatestRevision(manifest, remote, branch string) (string, error) {
-	cmd := exec.Command("git", "ls-remote", remote, fmt.Sprintf("refs/heads/%s", branch))
-	cmd.Dir = filepath.Dir(manifest)
-	stdout, err := cmd.StdoutPipe()
-	if err != nil {
-		return "", err
-	}
-	if err := cmd.Start(); err != nil {
-		return "", err
-	}
-	r := bufio.NewReader(stdout)
-	out, _, err := r.ReadLine()
-	if err != nil {
-		return "", err
-	}
-	if err := cmd.Wait(); err != nil {
-		return "", err
-	}
-	return strings.Fields(string(out))[0], nil
-}
-
-func updateManifest(manifest string, projects map[string]bool) error {
-	content, err := ioutil.ReadFile(filepath.Join(manifest))
-	if err != nil {
-		return fmt.Errorf("Could not read from %s: %s", manifest, err)
-	}
-
-	m, err := manifestFromBytes(content)
-	if err != nil {
-		return fmt.Errorf("Cannot parse manifest %s: %s", manifest, err)
-	}
-
-	str := string(content)
-	for _, p := range m.Projects {
-		if len(projects) > 0 {
-			if _, ok := projects[p.Name]; !ok {
-				continue
-			}
-		}
-		if p.Revision != "" {
-			branch := "master"
-			if p.RemoteBranch != "" {
-				branch = p.RemoteBranch
-			}
-			revision, err := getLatestRevision(manifest, p.Remote, branch)
-			if err != nil {
-				return err
-			}
-			str = strings.Replace(str, p.Revision, string(revision), 1)
-		}
-	}
-
-	if err := ioutil.WriteFile(manifest, []byte(str), os.ModePerm); err != nil {
-		return fmt.Errorf("Could not write to %s: %s", manifest, err)
-	}
-	return nil
-}
-
-func main() {
-	flag.Parse()
-
-	if _, err := os.Stat(manifestVar); os.IsNotExist(err) {
-		log.Fatalf("Manifest %s does not exist", manifestVar)
-	}
-
-	projects := map[string]bool{}
-	for _, p := range projectsVar {
-		projects[p] = true
-	}
-
-	if err := updateManifest(manifestVar, projects); err != nil {
-		log.Fatal(err)
-	}
-}
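The revision lookup in `getLatestRevision` above reduces to taking the first whitespace-separated field of the first line that `git ls-remote <remote> refs/heads/<branch>` prints. A minimal Python rendering of that parsing step, shown for illustration (the sample output string below is fabricated):

```python
def latest_revision(ls_remote_output):
    """Extract the commit hash from `git ls-remote` output, which has the
    form '<sha>\t<refname>\n' (one line per matching ref)."""
    first_line = ls_remote_output.splitlines()[0]
    return first_line.split()[0]
```

The Go code then substitutes this hash for the old `revision` attribute with a plain string replace, which keeps the rest of the manifest byte-for-byte intact.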
diff --git a/vim/README.md b/vim/README.md
deleted file mode 100644
index bae9a01..0000000
--- a/vim/README.md
+++ /dev/null
@@ -1,67 +0,0 @@
-# Helpful Vim tools for Fuchsia development
-
-## Features
-
-* Configure YouCompleteMe to provide error checking, completion and source
-  navigation within the Fuchsia tree.
-* Set path so that `:find` and `gf` know how to find files.
-* Fidl syntax highlighting (using /lib/fidl/tools/vim/).
-
-## Installation
-
-1. Update your login script
-
-   Steps #2 and #3 depend on configuration set by the `fx set` command. Add
-   these lines to your startup script (typically `~/.bashrc`).
-
-   ```shell
-   export FUCHSIA_DIR=/path/to/fuchsia-dir
-   fx set x64
-   ```
-
-1. Update your vim startup file
-
-   If this line exists in your ~/.vimrc file, remove it:
-
-   ```
-   filetype plugin indent on
-   ```
-
-   Then add these lines to your `~/.vimrc`.
-
-   ```
-   if $FUCHSIA_DIR != ""
-     source $FUCHSIA_DIR/scripts/vim/fuchsia.vim
-   endif
-   filetype plugin indent on
-   ```
-
-1. Install YouCompleteMe (ycm)
-
-   Optionally [install YouCompleteMe](
-   https://fuchsia.googlesource.com/scripts/+/master/youcompleteme/README.md)
-   for fancy completion, source navigation and inline errors.
-
-   If it's installed, `fuchsia.vim` will configure it properly.
-
-   If everything is working properly, you can place the cursor on an
-   identifier in a .cc or .h file, hit Ctrl-], and YCM will take you
-   to the definition of the identifier.
-
-   If you build a compilation database, YCM will use it, which may be more
-   reliable and efficient than the default `ycm_extra_conf.py` configuration.
-   Use `fx compdb` to build a compilation database.
-
-## See also
-
-[Zircon editor integration](
-https://fuchsia.googlesource.com/zircon/+/master/docs/editors.md)
-
-## TODO
-
-In the future it would be nice to support:
-* Fidl indentation
-* GN indentation
-* Dart, Go and Rust support
-* Build system integration
-* Navigate between generated files and fidl source
diff --git a/vim/ftdetect/gnfiletype.vim b/vim/ftdetect/gnfiletype.vim
deleted file mode 100644
index d7a760b..0000000
--- a/vim/ftdetect/gnfiletype.vim
+++ /dev/null
@@ -1,27 +0,0 @@
-" Copyright (c) 2014 The Fuchsia Authors. All rights reserved.
-" Use of this source code is governed by a BSD-style license that can be
-" found in the LICENSE file.
-
-" We take care to preserve the user's fileencodings and fileformats,
-" because those settings are global (not buffer local), yet we want
-" to override them for loading GN files, which should be UTF-8.
-let s:current_fileformats = ''
-let s:current_fileencodings = ''
-
-" Force fileencodings so GN files open as UTF-8 even when they are plain ASCII.
-function! s:gnfiletype_pre()
-  let s:current_fileformats = &g:fileformats
-  let s:current_fileencodings = &g:fileencodings
-  set fileencodings=utf-8 fileformats=unix
-  setlocal filetype=gn
-endfunction
-
-" Restore the previous fileencodings and fileformats.
-function! s:gnfiletype_post()
-  let &g:fileformats = s:current_fileformats
-  let &g:fileencodings = s:current_fileencodings
-endfunction
-
-au BufNewFile *.gn,*.gni setlocal filetype=gn fileencoding=utf-8 fileformat=unix
-au BufRead *.gn,*.gni call s:gnfiletype_pre()
-au BufReadPost *.gn,*.gni call s:gnfiletype_post()
diff --git a/vim/fuchsia.vim b/vim/fuchsia.vim
deleted file mode 100644
index dcf9132..0000000
--- a/vim/fuchsia.vim
+++ /dev/null
@@ -1,97 +0,0 @@
-" Copyright (c) 2017 The Fuchsia Authors. All rights reserved.
-" Use of this source code is governed by a BSD-style license that can be
-" found in the LICENSE file.
-
-" Look for the fuchsia root containing the current directory by looking for a
-" .jiri_manifest file
-let jiri_manifest = findfile(".jiri_manifest", ".;")
-if jiri_manifest != ""
-  let g:fuchsia_dir = fnamemodify(jiri_manifest, ":h")
-  " Get the current build dir from fx
-  let g:fuchsia_build_dir = systemlist(g:fuchsia_dir . "/scripts/fx get-build-dir")[0]
-  " Get the current buildtools dir from paths.py
-  let g:fuchsia_buildtools_dir = systemlist(g:fuchsia_dir . "/scripts/youcompleteme/paths.py BUILDTOOLS_PATH")[0]
-  " Tell YCM where to find its configuration script
-  let g:ycm_global_ycm_extra_conf = g:fuchsia_dir . '/scripts/youcompleteme/ycm_extra_conf.py'
-  " Do not load fuchsia/.ycm_extra_conf in case the user created a symlink for
-  " other editors.
-  let g:ycm_extra_conf_globlist = [ '!' . g:fuchsia_dir . '/*']
-  " Google-internal options - use clangd completer if the user has a compilation
-  " database (built with `fx compdb`).
-  if filereadable(g:fuchsia_dir . '/compile_commands.json')
-    let g:ycm_use_clangd = 1
-    let g:ycm_clangd_binary_path = g:fuchsia_buildtools_dir . "/clang/bin/clangd"
-  else
-    let g:ycm_use_clangd = 0
-  endif
-
-
-  let &runtimepath .= "," .
-        \ g:fuchsia_dir . "/scripts/vim/," .
-        \ g:fuchsia_dir . "/garnet/public/lib/fidl/tools/vim/"
-
-  " The "filetype plugin" line must come AFTER the changes to runtimepath
-  " above (so the proper directories are searched), but must come BEFORE the
-  " FuchsiaBuffer function below (to work around a bug on MacOS where
-  " Ctrl-] does not work because filetype is undefined instead of being
-  " equal to "cpp".)
-  filetype plugin indent on
-
-  function! FuchsiaBuffer()
-    let full_path = expand("%:p")
-    let extension = expand("%:e")
-
-    " Only run if the buffer is inside the Fuchsia dir
-    if full_path !~ "^" . g:fuchsia_dir
-      return
-    endif
-
-    let b:is_fuchsia = 1
-
-    " Set up path so that 'gf' and :find do what we want.
-    " This includes the directory of the file, cwd, all layers, layer public
-    " directories, the build directory, the gen directory and the zircon
-    " sysroot include directory.
-    let &l:path = ".,," .
-          \ $PWD . "/**/," .
-          \ g:fuchsia_dir . "," .
-          \ g:fuchsia_dir . "/*/," .
-          \ g:fuchsia_dir . "/*/public/," .
-          \ g:fuchsia_build_dir . "," .
-          \ g:fuchsia_build_dir . "/gen," .
-          \ g:fuchsia_dir . "/out/build-zircon/*/sysroot/include"
-
-    " Make sure Dart files are recognized as such.
-    if extension == "dart"
-      set filetype=dart
-    endif
-
-    " Treat files in a packages or products directory (or subdirectory) without
-    " a filetype that don't have an extension as JSON files.
-    if &filetype == "" && full_path =~ "/\\(packages\\|products\\)/" && extension == ""
-      set filetype=json sw=4
-    endif
-
-    " The Buf* autocmds sometimes run before and sometimes after FileType.
-    if &filetype == "cpp"
-      call FuchsiaCppBuffer()
-    endif
-  endfunction
-
-  " This may be called twice because autocmds arrive in different orders on
-  " different platforms.
-  function! FuchsiaCppBuffer()
-    if exists('g:loaded_youcompleteme')
-      " Replace the normal go to tag key with YCM when editing C/CPP.
-      nnoremap <C-]> :YcmCompleter GoTo<cr>
-    endif
-  endfunction
-
-  augroup fuchsia
-    au!
-    autocmd BufRead,BufNewFile * call FuchsiaBuffer()
-    autocmd FileType cpp call FuchsiaCppBuffer()
-    autocmd BufNewFile,BufRead *.cmx set syntax=json
-  augroup END
-
-endif
diff --git a/vim/syntax/gn.vim b/vim/syntax/gn.vim
deleted file mode 100644
index 4e25d30..0000000
--- a/vim/syntax/gn.vim
+++ /dev/null
@@ -1,81 +0,0 @@
-" Copyright (c) 2014 The Fuchsia Authors. All rights reserved.
-" Use of this source code is governed by a BSD-style license that can be
-" found in the LICENSE file.
-"
-" gn.vim: Vim syntax file for GN.
-"
-" Quit when a (custom) syntax file was already loaded
-"if exists("b:current_syntax")
-  "finish
-"endif
-
-syn case match
-
-" Keywords within functions
-syn keyword     gnConditional       if else
-hi def link     gnConditional       Conditional
-
-" Predefined variables
-syn keyword     gnPredefVar current_cpu current_os current_toolchain
-syn keyword     gnPredefVar default_toolchain host_cpu host_os
-syn keyword     gnPredefVar root_build_dir root_gen_dir root_out_dir
-syn keyword     gnPredefVar target_cpu target_gen_dir target_out_dir
-syn keyword     gnPredefVar target_os
-syn keyword     gnPredefVar true false
-hi def link     gnPredefVar         Constant
-
-" Target declarations
-syn keyword     gnTarget action action_foreach copy executable group
-syn keyword     gnTarget shared_library source_set static_library
-hi def link     gnTarget            Type
-
-" Buildfile functions
-syn keyword     gnFunctions assert config declare_args defined exec_script
-syn keyword     gnFunctions foreach get_label_info get_path_info
-syn keyword     gnFunctions get_target_outputs getenv import print
-syn keyword     gnFunctions process_file_template read_file rebase_path
-syn keyword     gnFunctions set_default_toolchain set_defaults
-syn keyword     gnFunctions set_sources_assignment_filter template tool
-syn keyword     gnFunctions toolchain toolchain_args write_file
-hi def link     gnFunctions         Macro
-
-" Variables
-syn keyword     gnVariable all_dependent_configs allow_circular_includes_from
-syn keyword     gnVariable args cflags cflags_c cflags_cc cflags_objc
-syn keyword     gnVariable cflags_objcc check_includes complete_static_lib
-syn keyword     gnVariable configs data data_deps defines depfile deps
-syn keyword     gnVariable forward_dependent_configs_from include_dirs inputs
-syn keyword     gnVariable ldflags lib_dirs libs output_extension output_name
-syn keyword     gnVariable outputs public public_configs public_deps script
-syn keyword     gnVariable sources testonly visibility
-hi def link     gnVariable          Keyword
-
-" Strings
-syn region	    gnString start=+L\="+ skip=+\\\\\|\\"+ end=+"+ contains=@Spell
-hi def link     gnString            String
-
-" Comments
-syn keyword     gnTodo              contained TODO FIXME XXX BUG NOTE
-syn cluster     gnCommentGroup      contains=gnTodo
-syn region      gnComment           start="#" end="$" contains=@gnCommentGroup,@Spell
-
-hi def link     gnComment           Comment
-hi def link     gnTodo              Todo
-
-" Operators; I think this is a bit too colourful.
-"syn match gnOperator /=/
-"syn match gnOperator /!=/
-"syn match gnOperator />=/
-"syn match gnOperator /<=/
-"syn match gnOperator /==/
-"syn match gnOperator /+=/
-"syn match gnOperator /-=/
-"syn match gnOperator /\s>\s/
-"syn match gnOperator /\s<\s/
-"syn match gnOperator /\s+\s/
-"syn match gnOperator /\s-\s/
-"hi def link     gnOperator          Operator
-
-syn sync minlines=500
-
-let b:current_syntax = "gn"
diff --git a/youcompleteme/README.md b/youcompleteme/README.md
deleted file mode 100644
index a96f748..0000000
--- a/youcompleteme/README.md
+++ /dev/null
@@ -1,60 +0,0 @@
-# YouCompleteMe for Fuchsia Developers
-
-You can use [YouCompleteMe](https://github.com/Valloric/YouCompleteMe) to
-provide error checking, completion and source navigation within the Fuchsia
-tree.
-
-YouCompleteMe works natively with Vim but it can also be integrated
-with other editors through [ycmd](https://github.com/Valloric/ycmd).
-
-## Install
-
-See the [installation guide](
-https://github.com/Valloric/YouCompleteMe#installation).
-
-**Note**: Installing YCM on MacOS with Homebrew is not recommended because
-of library compatibility errors. Use the official installation guide instead.
-
-### gLinux (Googlers only)
-
-If you are compiling on gLinux (even if editing over SSHFS on MacOS), ignore the
-above. Search the Google intranet for "YouCompleteMe" for installation
-instructions.
-
-## Configure
-
-### Vim
-
-The general [Vim Fuchsia instructions](
-https://fuchsia.googlesource.com/scripts/+/master/vim/README.md) will do this
-automatically.
-
-The setup will use a compilation database (and the clangd backend if you are a
-Googler) provided one is detected, and fallback on a `ycm_extra_conf.py`
-configuration otherwise. You can build a compilation database with `fx compdb`,
-or `fx -i compdb` if you want it rebuilt automatically as you edit files.
-
-### Other editors (ycmd)
-
-You'll need to set the ycmd config option `global_ycm_extra_conf` to point to
-`${FUCHSIA_DIR}/scripts/youcompleteme/ycm_extra_conf.py`.
-Note you may need to manually replace `${FUCHSIA_DIR}` with the correct path.
-
-Alternatively, you can create a `.ycm_extra_conf.py` symbolic link to let YCM
-automatically find the config for any fuchsia repository:
-
-```
-ln -s $FUCHSIA_DIR/scripts/youcompleteme/ycm_extra_conf.py $FUCHSIA_DIR/.ycm_extra_conf.py
-```
-
-**Googlers only**: you'll also need to set up
-`${FUCHSIA_DIR}/scripts/youcompleteme/default_settings.json` as the default
-settings path in your editor, in order to disable the internal `use_clangd`
-flag. If you want to use clangd, you can additionally edit that file to set
-`use_clangd` to 1, and `clang_binary_path` to
-`${FUCHSIA_BUILDTOOLS_DIR}/clang/bin/clangd`. Remember that in that case, you'll
-need to build a compilation database with `fx compdb`.
-
-## See also
-
-[Zircon editor integration](
-https://fuchsia.googlesource.com/zircon/+/master/docs/editors.md)
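The `ln -s` symlink step described in the README above can also be scripted. A small sketch, for illustration only: `link_ycm_conf` is a name introduced here, and it simply mirrors the one-line shell command while skipping work if the link already exists.

```python
import os


def link_ycm_conf(fuchsia_dir):
    """Create fuchsia_dir/.ycm_extra_conf.py pointing at the shared config,
    mirroring `ln -s $FUCHSIA_DIR/scripts/youcompleteme/ycm_extra_conf.py
    $FUCHSIA_DIR/.ycm_extra_conf.py` from the README."""
    src = os.path.join(fuchsia_dir, "scripts", "youcompleteme",
                       "ycm_extra_conf.py")
    dst = os.path.join(fuchsia_dir, ".ycm_extra_conf.py")
    # lexists (not exists) so a dangling link from a previous run still
    # counts as present and is not re-created.
    if not os.path.lexists(dst):
        os.symlink(src, dst)
    return dst
```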
diff --git a/youcompleteme/default_settings.json b/youcompleteme/default_settings.json
deleted file mode 100644
index 9e7f11e..0000000
--- a/youcompleteme/default_settings.json
+++ /dev/null
@@ -1,51 +0,0 @@
-{
-  "filepath_completion_use_working_dir": 0,
-  "auto_trigger": 1,
-  "min_num_of_chars_for_completion": 2,
-  "min_num_identifier_candidate_chars": 0,
-  "semantic_triggers": {},
-  "filetype_specific_completion_to_disable": {
-    "gitcommit": 1
-  },
-  "seed_identifiers_with_syntax": 0,
-  "collect_identifiers_from_comments_and_strings": 0,
-  "collect_identifiers_from_tags_files": 0,
-  "max_num_identifier_candidates": 10,
-  "max_num_candidates": 50,
-  "extra_conf_globlist": [],
-  "global_ycm_extra_conf": "",
-  "confirm_extra_conf": 1,
-  "complete_in_comments": 0,
-  "complete_in_strings": 1,
-  "max_diagnostics_to_display": 30,
-  "filetype_whitelist": {
-    "*": 1
-  },
-  "filetype_blacklist": {
-    "tagbar": 1,
-    "qf": 1,
-    "notes": 1,
-    "markdown": 1,
-    "netrw": 1,
-    "unite": 1,
-    "text": 1,
-    "vimwiki": 1,
-    "pandoc": 1,
-    "infolog": 1,
-    "mail": 1
-  },
-  "auto_start_csharp_server": 1,
-  "auto_stop_csharp_server": 1,
-  "use_ultisnips_completer": 1,
-  "csharp_server_port": 0,
-  "hmac_secret": "",
-  "server_keep_logfiles": 0,
-  "gocode_binary_path": "/usr/bin/gocode",
-  "godef_binary_path": "",
-  "rust_src_path": "",
-  "racerd_binary_path": "",
-  "python_binary_path": "",
-  "java_jdtls_use_clean_workspace": 1,
-  "use_clangd": 0,
-  "clangd_binary_path": ""
-}
diff --git a/youcompleteme/paths.py b/youcompleteme/paths.py
deleted file mode 100755
index 84c306f..0000000
--- a/youcompleteme/paths.py
+++ /dev/null
@@ -1,120 +0,0 @@
-#!/usr/bin/env python
-# vim: set expandtab:ts=2:sw=2
-# Copyright 2016 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import platform
-import re
-import sys
-
-SCRIPT_DIR = os.path.abspath(os.path.dirname(__file__))
-FUCHSIA_ROOT = os.path.abspath(os.path.join(SCRIPT_DIR, os.pardir, os.pardir))
-GN_PATH = os.path.join(FUCHSIA_ROOT, 'buildtools', 'gn')
-MKBOOTFS_PATH = os.path.join(FUCHSIA_ROOT, 'out', 'build-zircon', 'tools', 'mkbootfs')
-BUILDTOOLS_PATH = os.path.join(FUCHSIA_ROOT, 'buildtools', '%s-%s' % (
-    platform.system().lower().replace('darwin', 'mac'),
-    {
-        'x86_64': 'x64',
-        'aarch64': 'arm64',
-    }[platform.machine()],
-))
-DEBUG_OUT_DIR = os.path.join(FUCHSIA_ROOT, 'out', 'debug-x64')
-RELEASE_OUT_DIR = os.path.join(FUCHSIA_ROOT, 'out', 'release-x64')
-
-def recursive_search(root, pattern):
-  """Looks for a particular directory pattern within a directory tree.
-
-  Ignores files and git directories.
-
-  Returns:
-    the containing directory of the match or None if not found.
-  """
-
-  search_queue = [root]
-  while search_queue:
-    # List the children.
-    current_path = search_queue.pop(0)
-    for child in os.listdir(current_path):
-      full_path = os.path.join(current_path, child)
-      # Ignore files.
-      if not os.path.isdir(full_path):
-        continue
-      # Ignore git.
-      if child == '.git':
-        continue
-      # See if we found it.
-      if pattern in full_path:
-        return full_path
-      # Otherwise, enqueue the path for searching.
-      search_queue.append(full_path)
-  return None
-
-def search_clang_path(root):
-  """clang can change location, so we search where it landed.
-
-  For now there is only one clang integration, so this should always find the
-  correct one.
-  This could potentially look over a number of directories, but this will only
-  be run once at YCM server startup, so it should not affect overall
-  performance.
-
-  Returns:
-    clang path or None.
-  """
-
-  # This is the root where we should search for the clang installation.
-  clang_lib_path = recursive_search(root, 'clang/lib/clang')
-  if not clang_lib_path:
-    print('Could not find clang installation')
-    return None
-  # Now that we have the clang lib location, we need to find where the
-  # actual include files are.
-  installation_path = recursive_search(clang_lib_path, 'include')
-  # recursive_search returns the include path, so we need to remove it.
-  return os.path.dirname(installation_path)
-
-# We start searching from the correct buildtools.
-CLANG_PATH = search_clang_path(BUILDTOOLS_PATH)
-
-_BUILD_TOOLS = {}
-
-def build_tool(package, tool):
-  """Return the full path of TOOL binary in PACKAGE.
-
-  This function memoizes its results, so there's not much need to
-  cache its results in calling code.
-
-  Raises:
-    AssertionError: if the binary doesn't exist.
-  """
-
-  path = _BUILD_TOOLS.get((package, tool))
-  if path is None:
-    path = os.path.join(BUILDTOOLS_PATH, package, 'bin', tool)
-    assert os.path.exists(path), 'No "%s" tool in "%s"' % (tool, package)
-    _BUILD_TOOLS[package, tool] = path
-  return path
-
-def main():
-  variable_re = re.compile('^[A-Z][A-Z_]*$')
-  def usage():
-    print('Usage: path.py VARIABLE')
-    print('Available variables:')
-    print('\n'.join(filter(variable_re.match, globals().keys())))
-  if len(sys.argv) != 2:
-    usage()
-    return
-  variable = sys.argv[1]
-  if not variable_re.match(variable) or variable not in globals().keys():
-    usage()
-    return
-  print(globals()[variable])
-
-if __name__ == '__main__':
-  main()
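The `main()` dispatch in `paths.py` above only exposes module-level ALL_CAPS names, gated by a regex before the `globals()` lookup. A condensed sketch of that pattern (the `FUCHSIA_ROOT`/`GN_PATH` values here are illustrative placeholders, not real paths):

```python
import re

# Illustrative exported constants, standing in for the real paths.py values.
FUCHSIA_ROOT = "/work/fuchsia"
GN_PATH = FUCHSIA_ROOT + "/buildtools/gn"

_VARIABLE_RE = re.compile(r'^[A-Z][A-Z_]*$')


def lookup(name):
    """Return the value of an exported ALL_CAPS variable, or None.

    The regex gate keeps callers from reaching arbitrary globals such as
    functions or dunder names."""
    if _VARIABLE_RE.match(name) and name in globals():
        return globals()[name]
    return None
```

This is why `path.py VARIABLE` can print any uppercase constant without maintaining an explicit allowlist.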
diff --git a/youcompleteme/ycm_extra_conf.py b/youcompleteme/ycm_extra_conf.py
deleted file mode 100644
index 4c8ab23..0000000
--- a/youcompleteme/ycm_extra_conf.py
+++ /dev/null
@@ -1,168 +0,0 @@
-# vim: set expandtab:ts=2:sw=2
-# Copyright (c) 2017 The Fuchsia Authors. All rights reserved.
-# Use of this source code is governed by a BSD-style license that can be
-# found in the LICENSE file.
-
-# Autocompletion config for YouCompleteMe in Fuchsia
-
-import os
-import re
-import stat
-import subprocess
-import ycm_core
-
-# NOTE: paths.py is a direct copy from //build/gn/paths.py
-# If there is an issue with the paths not being valid, just pull a new copy.
-import sys
-sys.path.append(os.path.dirname(os.path.realpath(__file__)))
-import paths as fuchsia_paths
-
-fuchsia_root = os.path.realpath(fuchsia_paths.FUCHSIA_ROOT)
-zircon_database = None
-zircon_dir = os.path.join(fuchsia_root, 'zircon')
-# This doc explains how to generate compile_commands.json for Zircon:
-# https://fuchsia.googlesource.com/zircon/+/HEAD/docs/editors.md
-if os.path.exists(os.path.join(zircon_dir, 'compile_commands.json')):
-  zircon_database = ycm_core.CompilationDatabase(zircon_dir)
-
-os.chdir(fuchsia_root)
-fuchsia_build = subprocess.check_output(
-    [os.path.join(fuchsia_paths.FUCHSIA_ROOT, 'scripts/fx'),
-     'get-build-dir']
-    ).strip().decode('utf-8')
-
-fuchsia_clang = os.path.realpath(fuchsia_paths.CLANG_PATH)
-fuchsia_sysroot = os.path.join(fuchsia_paths.BUILDTOOLS_PATH, 'sysroot')
-ninja_path = os.path.join(fuchsia_root, 'buildtools', 'ninja')
-
-# Get the name of the zircon project from GN args.
-# Reading the args.gn is significantly faster than running `gn args` so we do
-# that.
-target_cpu = None
-args = open(os.path.join(fuchsia_build, 'args.gn')).read()
-match = re.search(r'target_cpu\s*=\s*"([^"]+)"', args)
-if match:
-  target_cpu = match.groups()[0]
-
-common_flags = [
-    '-std=c++14',
-    '-xc++',
-    '-I', fuchsia_root,
-    '-I', os.path.join(fuchsia_build, 'gen'),
-    '-isystem', os.path.join(fuchsia_sysroot, 'usr', 'include'),
-    '-isystem', os.path.join(fuchsia_clang, 'include'),
-    '-isystem', os.path.join(fuchsia_clang, 'include', 'c++', 'v1'),
-]
-
-# Add the sysroot include if we found the zircon project
-if target_cpu:
-  arch_flags = ['-I' + os.path.join(fuchsia_root,
-                                    'out/build-zircon',
-                                    'build-' + target_cpu,
-                                    'sysroot/include')]
-
-def GetClangCommandFromNinjaForFilename(filename):
-  """Returns the command line to build |filename|.
-
-  Asks ninja how it would build the source file. If the specified file is a
-  header, tries to find its companion source file first.
-
-  Args:
-    filename: (String) Path to source file being edited.
-
-  Returns:
-    (List of Strings) Command line arguments for clang.
-  """
-
-  # Start with the flags common to every file. Copy the list so the appends
-  # below don't mutate common_flags across calls.
-  fuchsia_flags = list(common_flags)
-
-  # Header files can't be built. Instead, try to match a header file to its
-  # corresponding source file.
-  if filename.endswith('.h'):
-    alternates = ['.cc', '.cpp', '_unittest.cc']
-    for alt_extension in alternates:
-      alt_name = filename[:-2] + alt_extension
-      if os.path.exists(alt_name):
-        filename = alt_name
-        break
-    else:
-      # If this is a standalone .h file with no source, the best we can do is
-      # try to use the default flags.
-      return fuchsia_flags
-
-  # Ninja needs the path to the source file from the output build directory.
-  # Cut off the common part and /. Also ensure that paths are real and don't
-  # contain symlinks that throw the len() calculation off.
-  filename = os.path.realpath(filename)
-  subdir_filename = filename[len(fuchsia_root) + 1:]
-  rel_filename = os.path.join('..', '..', subdir_filename)
-
-  # Ask ninja how it would build our source file.
-  ninja_command = [
-      ninja_path, '-v', '-C', fuchsia_build, '-t', 'commands',
-      rel_filename + '^'
-  ]
-  p = subprocess.Popen(ninja_command, stdout=subprocess.PIPE)
-  stdout, stderr = p.communicate()
-  if p.returncode:
-    return fuchsia_flags
-  stdout = stdout.decode('utf-8')
-
-  # Ninja might execute several commands to build something. We want the last
-  # clang command.
-  clang_line = None
-  for line in reversed(stdout.split('\n')):
-    if 'clang' in line:
-      clang_line = line
-      break
-  else:
-    return fuchsia_flags
-
-  # Parse out the -I and -D flags. These seem to be the only ones that are
-  # important for YCM's purposes.
-  for flag in clang_line.split(' '):
-    if flag.startswith('-I'):
-      # Relative paths need to be resolved, because they're relative to the
-      # output dir, not the source.
-      if len(flag) > 2 and flag[2] == '/':
-        fuchsia_flags.append(flag)
-      else:
-        abs_path = os.path.normpath(os.path.join(fuchsia_build, flag[2:]))
-        fuchsia_flags.append('-I' + abs_path)
-    elif ((flag.startswith('-') and flag[1] in 'DWFfmO') or
-          flag.startswith('-std=') or flag.startswith('--target=') or
-          flag.startswith('--sysroot=')):
-      fuchsia_flags.append(flag)
-    else:
-      print('Ignoring flag: %s' % flag)
-
-  return fuchsia_flags
-
-
-def FlagsForFile(filename):
-  """This is the main entry point for YCM. Its interface is fixed.
-
-  Args:
-    filename: (String) Path to source file being edited.
-
-  Returns:
-    (Dictionary)
-      'flags': (List of Strings) Command line flags.
-      'do_cache': (Boolean) True if the result should be cached.
-  """
-  if zircon_database and ('zircon/' in filename):
-    zircon_compilation_info = zircon_database.GetCompilationInfoForFile(
-      filename)
-    if zircon_compilation_info.compiler_flags_:
-      return {
-        'flags': zircon_compilation_info.compiler_flags_,
-        'include_paths_relative_to_dir':
-            zircon_compilation_info.compiler_working_dir_,
-        'do_cache': True
-      }
-  file_flags = GetClangCommandFromNinjaForFilename(filename)
-  # Add the arch-specific flags.
-  final_flags = file_flags + arch_flags
-
-  return {'flags': final_flags, 'do_cache': True}
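
The args.gn scrape above (a plain regex read of `target_cpu`, chosen because it is much faster than running `gn args`) can be reproduced from the shell. A minimal sketch, with made-up file contents:

```shell
# Extract target_cpu from args.gn the same way the Python regex does:
# match   target_cpu = "<value>"   with optional whitespace around '='.
args='import("//boards/x64.gni")
target_cpu = "x64"'
printf '%s\n' "$args" | sed -n 's/.*target_cpu *= *"\([^"]*\)".*/\1/p'
# → x64
```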
diff --git a/zsh-completion/README.md b/zsh-completion/README.md
deleted file mode 100644
index fec34ee..0000000
--- a/zsh-completion/README.md
+++ /dev/null
@@ -1,20 +0,0 @@
-# Zsh Completion
-
-Partial Zsh completion support for the `fx` tool.
-
-## Use
-
-Add `//scripts/zsh-completion/` to your `fpath` before running `compinit`. For
-example:
-```
-fpath+=( ~/fuchsia/scripts/zsh-completion )
-```
-
-## Improve
-
-Subcommands are completed by looking in `//scripts/devshell/`, but there is no
-completion for most subcommand arguments. To add completion for `fx foo`, write
-a new autoload function in `//scripts/zsh-completion/_fx_foo`. It will be called
-by the `_fx` completion function when needed. The `${fuchsia_dir}` and
-`${fuchsia_build_dir}` local variables are available to the subcommand
-completion function.
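
Following that recipe, a completion file for a hypothetical `fx foo` subcommand could look like this (the option and the use of `__fx_nodename` are illustrative, mirroring the existing `_fx_reboot` file):

```
# //scripts/zsh-completion/_fx_foo — autoloaded by _fx when completing `fx foo`.
# ${fuchsia_dir} and ${fuchsia_build_dir} are locals provided by _fx.
_arguments \
  '-v[verbose output]' \
  '::nodename:__fx_nodename'
```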
diff --git a/zsh-completion/_fx b/zsh-completion/_fx
deleted file mode 100644
index 4fe6db3..0000000
--- a/zsh-completion/_fx
+++ /dev/null
@@ -1,86 +0,0 @@
-#compdef fx
-
-__fx_nodename() {
-  # TODO: allow configuration of node names with zstyle
-  local -a nodenames=( $(${fuchsia_dir}/out/build-zircon/tools/netls | awk '/device/ { print $2; }') )
-  _describe 'nodename' nodenames
-}
-
-__fx_amber_package() {
-  # packages are directories in the build dir under amber-files/repository/targets
-  _values $(cd ${fuchsia_build_dir}/amber-files/repository/targets >/dev/null 2>&1 && echo *(/))
-}
-
-__fx_build_dir() {
-  # build dirs are directories under out/ with an args.gn
-  compadd $(cd ${fuchsia_dir} >/dev/null 2>&1; echo out/*/args.gn(N:h))
-}
-
-__fx_gn_target() {
-  # use a cache of "gn ls" that's updated when build.ninja changes.
-  local -r absolute_build_dir="${fuchsia_dir}/${fuchsia_build_dir}"
-  local -r targets_file="${absolute_build_dir}/.gn_ls"
-  local -r ninja_file="${absolute_build_dir}/build.ninja"
-  if [ ! -e "${targets_file}" ] || [ "${ninja_file}" -nt "${targets_file}" ]; then
-    local -r tmp_targets_file="$(mktemp -p "${absolute_build_dir}")"
-    "${fuchsia_dir}/buildtools/gn" ls "${absolute_build_dir}" > "${tmp_targets_file}"
-    mv "${tmp_targets_file}" "${targets_file}"
-  fi
-  _values $(cat "${targets_file}")
-}
-
-_fx() {
-  typeset -A opt_args
-
-  local fuchsia_dir="${FUCHSIA_DIR}"
-  if [[ -z "${fuchsia_dir}" ]]; then
-    fuchsia_dir="$(pwd)"
-    while [[ ! -d "${fuchsia_dir}/.jiri_root" ]]; do
-      fuchsia_dir="$(dirname "${fuchsia_dir}")"
-      if [[ "${fuchsia_dir}" == "/" ]]; then
-        _message -r "Cannot find Fuchsia source tree containing $(pwd)"
-        return
-      fi
-    done
-  fi
-
-  # list of commands based on //scripts/devshell/
-  # each file is read to find the description line (starts with "### ").
-  local -a commands lines
-  local desc command
-  for command in ${fuchsia_dir}/scripts/devshell/*(.); do
-    lines=("${(f)$(<${command})}")
-    desc=${${lines[${lines[(i)\#\#\# *]}]}#????}
-    commands+=("${command#*devshell/}:${desc}")
-  done
-  commands+=("gn:invoke the gn command")
-  commands+=("ninja:invoke the ninja command")
-
-  _arguments \
-    "--config[config file]:filename:_files" \
-    "--dir[build directory]:directory:_files -/" \
-    "-x[print commands]" \
-    "1: :{_describe 'command' commands}" \
-    "*:: :->args"
-
-  if [[ $state != "args" ]]; then
-    return
-  fi
-
-  # get the config file location from --config, $FUCHSIA_CONFIG or ${fuchsia_dir}/.config
-  local fuchsia_config="${opt_args[--config]:-${FUCHSIA_CONFIG:-${fuchsia_dir}/.config}}"
-  # if a config file is found read the build dir into a local variable
-  local fuchsia_build_dir=
-  if [[ -e ${fuchsia_config} ]]; then
-    fuchsia_build_dir="$(source ${fuchsia_config};echo ${FUCHSIA_BUILD_DIR})"
-  fi
-
-  # look for a completion function
-  local f
-  f=_fx_$words[1]
-  if [[ -e ${fuchsia_dir}/scripts/zsh-completion/$f ]]; then
-    autoload $f; $f
-  fi
-}
-
-_fx
diff --git a/zsh-completion/_fx_boot b/zsh-completion/_fx_boot
deleted file mode 100644
index ea5ac67..0000000
--- a/zsh-completion/_fx_boot
+++ /dev/null
@@ -1,14 +0,0 @@
-_arguments -S \
-  '-1[only boot once, then exit]' \
-  '-a[only boot device with this IPv6 address]:address:' \
-  '-b[tftp block size (default=1024, ignored with --netboot)]:size:' \
-  '-i[number of microseconds between packets (default=20, ignored with --tftp)]:usecs:' \
-  '-n[only boot device with this nodename]:nodename:__fx_nodename' \
-  '-w[tftp window size (default=1024, ignored with --netboot)]:size:' \
-  '--netboot[use the netboot protocol]' \
-  '--tftp[use the tftp protocol (default)]' \
-  '--nocolor[disable ANSI color (false)]' \
-  '1::kernel:_files' \
-  '2::ramdisk:_files'
-
-# TODO: complete kernel args?
diff --git a/zsh-completion/_fx_build b/zsh-completion/_fx_build
deleted file mode 100644
index 0ebee12..0000000
--- a/zsh-completion/_fx_build
+++ /dev/null
@@ -1,2 +0,0 @@
-# complete build by asking ninja for a list of available targets and trimming the target deps off
-_values `"${fuchsia_dir}/buildtools/ninja" -C ${fuchsia_dir}/${fuchsia_build_dir} -t targets | sed -e 's/: .*//' | sed -e 's/:/\\\\:/'`
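
The sed pipeline above can be exercised on sample `ninja -t targets` output (the target names are made up; `_fx_build` doubles the backslashes because its sed command runs inside backquotes):

```shell
# Each `ninja -t targets` line is "name: rule". Drop the rule, then escape
# any colon left in the name so zsh's _values does not treat it as a
# value/description separator.
printf 'host_x64/foo: phony\ngarnet/bin:bar: phony\n' \
  | sed -e 's/: .*//' -e 's/:/\\:/'
# → host_x64/foo
# → garnet/bin\:bar
```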
diff --git a/zsh-completion/_fx_build-push b/zsh-completion/_fx_build-push
deleted file mode 100644
index 71c8556..0000000
--- a/zsh-completion/_fx_build-push
+++ /dev/null
@@ -1,4 +0,0 @@
-_arguments \
-  {-d,--device}'[target device]:nodename:__fx_nodename' \
-  "--no-push[don't push, just build]" \
-  '*:package:__fx_amber_package'
diff --git a/zsh-completion/_fx_cp b/zsh-completion/_fx_cp
deleted file mode 100644
index 1bc1e5e..0000000
--- a/zsh-completion/_fx_cp
+++ /dev/null
@@ -1,46 +0,0 @@
-__fx_cp_remote_files() {
-  local -a dirs files
-  local -a remote_matches
-  # ask dash on the device to glob the prefix
-  remote_matches=($(fx shell echo "$PREFIX*/" "$PREFIX*"))
-  # remove the unmatched globs
-  remote_matches=(${remote_matches:#*\**})
-  # files don't have a trailing /
-  files=(${remote_matches:#*/})
-  # directories have a trailing slash filter those and remove the slash
-  dirs=(${${(M)remote_matches:#*/}%/})
-  # remove directory names from the files list
-  files=(${files:|dirs})
-
-  # we want to complete the next path component not the whole path
-  compset -P '*/'
-  compset -S '/*'
-
-  # add directories to completion
-  compadd -S/ -d dirs -- ${dirs##*/}
-  # add files to completion
-  compadd -- ${files##*/}
-}
-
-__fx_cp_src() {
-  if [ -n "${words[(r)--to-host]}" ]; then
-    __fx_cp_remote_files
-  else
-    _files
-  fi
-}
-
-__fx_cp_dest() {
-  if [ -n "${words[(r)--to-host]}" ]; then
-    _files
-  else
-    __fx_cp_remote_files
-  fi
-}
-
-_arguments \
-  '(--to-host)--to-target[Copy a file from the host to the target.]' \
-  '(--to-target)--to-host[Copy a file from the target to the host.]' \
-  '1:src:__fx_cp_src' \
-  '2:dest:__fx_cp_dest'
-
diff --git a/zsh-completion/_fx_exec b/zsh-completion/_fx_exec
deleted file mode 100644
index 7195407..0000000
--- a/zsh-completion/_fx_exec
+++ /dev/null
@@ -1 +0,0 @@
-_files
diff --git a/zsh-completion/_fx_format-code b/zsh-completion/_fx_format-code
deleted file mode 100644
index a0efd8a..0000000
--- a/zsh-completion/_fx_format-code
+++ /dev/null
@@ -1,7 +0,0 @@
-_arguments \
-  '--dry-run[Stops the program short of running the formatters]' \
-  '--all[Formats all code in the git repo under the current working directory]' \
-  '--git[The default; it uses `git diff-index` against the newest parent commit in the upstream branch (or against HEAD if no such commit is found).  Files that are locally modified, staged or touched by any commits introduced on the local branch are formatted.]' \
-  '--files=-[Allows the user to specify files.  Files are comma separated. Globs are dealt with by bash; fx format-code "--files=foo/*" will work as expected.]:files:_sequence _files' \
-  '--target=-[Allows the user to specify a gn target]:gn_target:__fx_gn_target'
-
diff --git a/zsh-completion/_fx_log b/zsh-completion/_fx_log
deleted file mode 100644
index 429a905..0000000
--- a/zsh-completion/_fx_log
+++ /dev/null
@@ -1 +0,0 @@
-__fx_nodename
diff --git a/zsh-completion/_fx_mkzedboot b/zsh-completion/_fx_mkzedboot
deleted file mode 100644
index ad40984..0000000
--- a/zsh-completion/_fx_mkzedboot
+++ /dev/null
@@ -1,2 +0,0 @@
-local -a disks=( ${(f)$(fx list-usb-disks |sed -e 's/:/\\:/g' -e 's/ - /:/')} )
-_describe "USB Disk" disks
diff --git a/zsh-completion/_fx_netaddr b/zsh-completion/_fx_netaddr
deleted file mode 100644
index bd80072..0000000
--- a/zsh-completion/_fx_netaddr
+++ /dev/null
@@ -1,6 +0,0 @@
-_arguments \
-    '--help[Print help message.]' \
-    '--timeout=[Set discovery timeout to <msec>]:msec' \
-    '--nowait[Do not wait for first packet before timing out.]' \
-    '--fuchsia[Use fuchsia link local addresses.]' \
-    '::nodename:'
diff --git a/zsh-completion/_fx_push-package b/zsh-completion/_fx_push-package
deleted file mode 100644
index 1551822..0000000
--- a/zsh-completion/_fx_push-package
+++ /dev/null
@@ -1,3 +0,0 @@
-_arguments \
-  {-d,--device}'[target device]:nodename:__fx_nodename' \
-  '*:package:__fx_amber_package'
diff --git a/zsh-completion/_fx_push-package-no-publish b/zsh-completion/_fx_push-package-no-publish
deleted file mode 100644
index 0b85d40..0000000
--- a/zsh-completion/_fx_push-package-no-publish
+++ /dev/null
@@ -1,4 +0,0 @@
-_arguments \
-  {-d,--device}'[target device]:nodename:__fx_nodename' \
-  '*:package:__fx_amber_package'
-
diff --git a/zsh-completion/_fx_reboot b/zsh-completion/_fx_reboot
deleted file mode 100644
index 8c7a9fc..0000000
--- a/zsh-completion/_fx_reboot
+++ /dev/null
@@ -1,4 +0,0 @@
-_arguments \
-  {-r,--recovery}'[Reboot into recovery image]' \
-  {-b,--bootloader}'[Reboot into bootloader]' \
-  '::nodename:__fx_nodename'
diff --git a/zsh-completion/_fx_set b/zsh-completion/_fx_set
deleted file mode 100644
index ac0ea1b..0000000
--- a/zsh-completion/_fx_set
+++ /dev/null
@@ -1,83 +0,0 @@
-# _fx_set__board completes a board name
-_fx_set__board() {
-  compadd ${(u)$(cd ${fuchsia_dir} >/dev/null 2>&1;echo */boards/*.gni(N:t:r) vendor/*/boards/*.gni(N:t:r))}
-}
-
-# _fx_set__product completes a product name
-_fx_set__product() {
-  compadd ${(u)$(cd ${fuchsia_dir} >/dev/null 2>&1;echo */products/*.gni(N:t:r) vendor/*/products/*.gni(N:t:r))}
-}
-
-# _fx_set__package completes a package name
-_fx_set__package() {
-  compadd $(cd ${fuchsia_dir} >/dev/null 2>&1 && echo */packages/**/^*.*(.N) vendor/*/packages/**/^*.*(.N))
-}
-
-# _packages completes a comma separated list of packages
-_packages() {
-  # packages are files without extensions in */packages/ and vendor/*/packages/
-  _values -s , $(cd ${fuchsia_dir} >/dev/null 2>&1 && echo */packages/**/^*.*(.N) vendor/*/packages/**/^*.*(.N))
-}
-
-# _products completes a comma separated list of products
-_products() {
-  # products are .gni files in */products/ and vendor/*/products/
-  _values -s , $(cd ${fuchsia_dir} >/dev/null 2>&1 && echo */products/**/*.gni(.N) vendor/*/products/**/*.gni(.N))
-}
-
-_gn_args_caching_policy() {
-  test ${ninja_file} -nt $1
-}
-
-_gn_args() {
-  if [ -z "${fuchsia_build_dir}" ]; then
-    return
-  fi
-
-  # apply a default caching policy if one isn't configured
-  local cache_policy
-  zstyle -s ":completion:${curcontext}:" cache-policy cache_policy
-  zstyle ":completion:${curcontext}:" cache-policy \
-      ${cache_policy:-_gn_args_caching_policy}
-
-  # if this file is newer than the cache file then the cache is stale
-  local ninja_file=${fuchsia_dir}/${fuchsia_build_dir}/build.ninja
-
-  local -a gn_args
-  if ! _retrieve_cache gn_args ; then
-    gn_args=( $(${fuchsia_dir}/buildtools/gn args ${fuchsia_dir}/${fuchsia_build_dir} --list --short|sed -e 's/ .*//') )
-    _store_cache gn_args gn_args
-  fi
-
-  # complete the list of gn args with an = suffix
-  compadd -S = ${gn_args}
-}
-
-# list of supported fuchsia architectures
-local -a archs
-archs=(
-  'x64:64 bit Intel'
-  'arm64:64 bit ARM'
-)
-
-# TODO: --help-args --zircon-arg
-
-# arguments to fx set
-_arguments '1:arch:{_describe "arch" archs}' \
-  '2::build_dir:{__fx_build_dir}' \
-  '(:)--board[Use the listed board configuration]:board:_fx_set__board' \
-  '(:)--product[Include the listed product in the build]:product:_fx_set__product' \
-  '(:)*--available[Package to be available for pushing]:package:_fx_set__package' \
-  '(:)*--preinstall[Package to be included in the system image]:package:_fx_set__package' \
-  '(:)*--monolith[Package to be included in the monolithic system image]:package:_fx_set__package' \
-  '(:)*--variant[Pass a select_variant GN arg]: ' \
-  '(:)*--fuzz-with[A sanitizer name to fuzz with]: ' \
-  '(:)--args[Arguments for GN gen]:args:{_gn_args}' \
-  '(:)--goma[Use GOMA]' \
-  "(:)--no-goma[Don't use GOMA]" \
-  "(:)--no-ensure-goma[Don't make sure GOMA is running]" \
-  '(:)--goma-dir[GOMA directory to use]:directory:_files -/' \
-  "(:)--ccache[Use ccache]" \
-  "(:)--no-ccache[Don't use ccache]" \
-  '(:)--ide[Generate files for an IDE]:ide:(eclipse vs vs2013 vs2015 vs2017 xcode qcreator json)' \
-  '(:)--release[Release build]'
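
The cache policy in `_gn_args` boils down to an mtime comparison: regenerate the cached list whenever `build.ninja` is newer than it. The same check in plain shell (the function name and the stand-in command are illustrative):

```shell
# Rebuild the cache file only when it is missing or older than src —
# the invalidation rule _gn_args_caching_policy encodes.
refresh_cache() {
  src=$1; cache=$2
  if [ ! -e "$cache" ] || [ "$src" -nt "$cache" ]; then
    tr ' ' '\n' < "$src" > "$cache"   # stand-in for `gn args --list --short`
  fi
}
```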
diff --git a/zsh-completion/_fx_set-device b/zsh-completion/_fx_set-device
deleted file mode 100644
index 429a905..0000000
--- a/zsh-completion/_fx_set-device
+++ /dev/null
@@ -1 +0,0 @@
-__fx_nodename
diff --git a/zsh-completion/_fx_symbolize b/zsh-completion/_fx_symbolize
deleted file mode 100644
index 7195407..0000000
--- a/zsh-completion/_fx_symbolize
+++ /dev/null
@@ -1 +0,0 @@
-_files
diff --git a/zsh-completion/_fx_use b/zsh-completion/_fx_use
deleted file mode 100644
index febdd9c..0000000
--- a/zsh-completion/_fx_use
+++ /dev/null
@@ -1,3 +0,0 @@
-if (( $CURRENT == 2 )); then
-  __fx_build_dir
-fi
diff --git a/zsh-completion/_fx_verify-build-packages b/zsh-completion/_fx_verify-build-packages
deleted file mode 100644
index fb61e2f..0000000
--- a/zsh-completion/_fx_verify-build-packages
+++ /dev/null
@@ -1,5 +0,0 @@
-if (( $CURRENT == 2 )); then
-  local vendors=$(cd ${FUCHSIA_DIR} >/dev/null 2>&1; ls -1d vendor/*(/N) | grep -v third_party)
-  compadd garnet peridot topaz $vendors
-fi
-