tensorflow/compiler/mlir/README.md

MLIR dialects and utilities for TensorFlow, TensorFlow Lite and XLA.

This module contains the MLIR (Multi-Level Intermediate Representation) dialects and utilities for

  1. TensorFlow
  2. XLA
  3. TF Lite

See MLIR's website for complete documentation.

Getting started

Building the dialects and utilities here follows the standard Bazel-based approach used by the rest of TensorFlow.
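
As a sketch, a typical invocation builds one of the tools in this directory and runs it on an input file (the input filename here is a placeholder, and exact targets and flags may change over time):

```shell
# Build the TensorFlow MLIR optimization driver with optimizations enabled.
bazel build -c opt tensorflow/compiler/mlir:tf-opt

# Run the resulting binary on an MLIR file (input.mlir is a placeholder).
bazel-bin/tensorflow/compiler/mlir/tf-opt input.mlir
```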

Using local LLVM repo

To develop across MLIR core and TensorFlow, it is useful to override the LLVM repository to use a local checkout instead of fetching from head. This can be achieved by setting up your local checkout as a Bazel repository. For this you will need to create empty Bazel workspace and build files in it:

LLVM_SRC=... # this is the path to the local LLVM source directory you intend to use.
touch ${LLVM_SRC}/BUILD.bazel ${LLVM_SRC}/WORKSPACE

You can then use this overlay to build TensorFlow:

bazel build --override_repository="llvm-raw=${LLVM_SRC}" \
  -c opt tensorflow/compiler/mlir:tf-opt
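
If you rebuild frequently against the same local checkout, the override flag can instead be recorded in a bazelrc file so it applies to every build command automatically (a sketch; the LLVM checkout path shown is an assumption, substitute your own):

```shell
# In ~/.bazelrc (or a project-local user.bazelrc):
build --override_repository=llvm-raw=/path/to/llvm-project
```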