I am trying the XLA tutorial described here: https://www.tensorflow.org/performance/xla/jit
I am running mnist_softmax_xla.py from https://raw.githubusercontent.com/tensorflow/tensorflow/r1.1/tensorflow/examples/tutorials/mnist/mnist_softmax_xla.py with the following options:
TF_CPP_MIN_VLOG_LEVEL=2 TF_XLA_FLAGS='--xla_generate_hlo_graph=.*' python mnist_softmax_xla.py
Unfortunately, the output contains a pile of "Custom creator error: Invalid argument: No _XlaCompile for Const" errors (and the same for every other kind of operator). Also, no hlo_graph_xx.dot files are created (as the tutorial says they should be).
My Python installation is Anaconda 4.3.1 (Anaconda3-4.3.1-Linux-x86_64.sh) on Ubuntu 16.04 LTS.
TensorFlow is version 1.1.0, compiled from source with the following commands:
$ echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
$ curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
$ sudo apt-get update && sudo apt-get install bazel
$ git clone https://github.com/tensorflow/tensorflow
$ cd tensorflow/
$ git checkout v1.1.0
$ ./configure
Please specify the location of python. [Default is /home/ubuntu/anaconda3/bin/python]:
Please specify optimization flags to use during compilation when bazel option "--config=opt" is specified [Default is -march=native]:
Do you wish to use jemalloc as the malloc implementation? [Y/n]
jemalloc enabled
Do you wish to build TensorFlow with Google Cloud Platform support? [y/N]
No Google Cloud Platform support will be enabled for TensorFlow
Do you wish to build TensorFlow with Hadoop File System support? [y/N]
No Hadoop File System support will be enabled for TensorFlow
Do you wish to build TensorFlow with the XLA just-in-time compiler (experimental)? [y/N] y
XLA JIT support will be enabled for TensorFlow
Found possible Python library paths:
/home/ubuntu/anaconda3/lib/python3.6/site-packages
Please input the desired Python library path to use. Default is [/home/ubuntu/anaconda3/lib/python3.6/site-packages]
Using python library path: /home/ubuntu/anaconda3/lib/python3.6/site-packages
Do you wish to build TensorFlow with OpenCL support? [y/N]
No OpenCL support will be enabled for TensorFlow
Do you wish to build TensorFlow with CUDA support? [y/N]
No CUDA support will be enabled for TensorFlow
Configuration finished
............
INFO: Starting clean (this may take a while). Consider using --expunge_async if the clean takes more than several minutes.
...........
INFO: All external dependencies fetched successfully.
$ bazel build --config=opt //tensorflow/tools/pip_package:build_pip_package
$ bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
$ pip install /tmp/tensorflow_pkg/tensorflow-1.1.0-cp36-cp36m-linux_x86_64.whl
Why doesn't XLA work with this setup?
How do I install TensorFlow with XLA enabled?
Answer 0 (score: 0)
Solved!
The reason is somewhat obscured in the XLA tutorial:
Note: Turning on JIT at the session level will not result in operations being compiled for the CPU. JIT compilation for CPU operations must be done via the manual method documented below. This decision was made due to the CPU backend being single-threaded.
So this does not work for the CPU:
config = tf.ConfigProto()
# session-level JIT: silently ignored by the single-threaded CPU backend
config.graph_options.optimizer_options.global_jit_level = tf.OptimizerOptions.ON_1
sess = tf.Session(config=config)
...
But this does work for the CPU (as well as the GPU):
jit_scope = tf.contrib.compiler.jit.experimental_jit_scope

with jit_scope():
    # ops built inside this scope are marked for XLA compilation, even on CPU
    ...