Cannot find model in TensorFlow Serving with a sentencepiece custom op

Posted: 2019-11-15 07:33:40

Tags: tensorflow bazel tensorflow-serving

I followed a suggested solution for issue #325 on sentencepiece and changed a few parts to make it work on RHEL. However, I ran into an error at the last step. Here is my script:

CWD=$(pwd)
GPROTOBUF_URL=https://github.com/protocolbuffers/protobuf/releases/download/v3.7.0/protobuf-cpp-3.7.0.tar.gz

compile_gprotobuf () {
    sudo yum install autoconf automake libtool curl make gcc-c++ unzip && # gcc-c++ instead of (Ubuntu) g++
    wget $GPROTOBUF_URL &&
    fname=protobuf-cpp-3.7.0.tar.gz &&
    tar xvzf $fname && rm $fname &&
    cd protobuf-3.7.0 &&
    ./configure CXXFLAGS="-D_GLIBCXX_USE_CXX11_ABI=0" &&
    make &&
    make check &&
    sudo make install &&
    sudo ldconfig # refresh shared library cache. 
    cd $CWD
}

build_tensorflow_serving_and_sentencepiece () {
    git clone -b 'r1.14' --single-branch --depth 1 https://github.com/tensorflow/serving.git &&
    mkdir -p serving/tensorflow_serving/custom_ops/sentencepiece_processor &&
    git clone https://github.com/google/sentencepiece.git serving/tensorflow_serving/custom_ops/sentencepiece_processor/sentencepiece &&
    sudo yum install gcc gcc-c++ kernel-devel && # instead of (Ubuntu) sudo apt-get install build-essential
    sudo yum install cmake && 
    sudo yum provides */pkg-config &&
    sudo yum install google-perftools google-perftools-devel && # (Ubuntu) sudo apt-get install libgoogle-perftools-dev &&
    cd serving/tensorflow_serving/custom_ops/sentencepiece_processor/sentencepiece &&
    mkdir build &&
    cd build &&

    cmake -DSPM_USE_BUILTIN_PROTOBUF=OFF -DSPM_ENABLE_TENSORFLOW_SHARED=ON .. &&
    make -j $(nproc) &&
    sudo make install &&
    sudo ldconfig -v &&

    cd $CWD &&
    # cp ./build ./serving/tensorflow_serving/custom_ops/sentencepiece_processor/build && # I can't find this BUILD directory
    sed -i.bak '/@org_tensorflow\/\/tensorflow\/contrib:contrib_ops_op_lib/a\    "\/\/tensorflow_serving\/custom_ops\/sentencepiece_processor:sentencepiece_processor_ops",' ./serving/tensorflow_serving/model_servers/BUILD &&
    sed -i '/name = "tensorflow_model_server",/a\    linkopts = ["-Wl,--allow-multiple-definition", "-Wl,-rpath,/usr/lib"],' ./serving/tensorflow_serving/model_servers/BUILD

    wget https://copr.fedorainfracloud.org/coprs/vbatts/bazel/repo/epel-7/vbatts-bazel-epel-7.repo
    sudo cp vbatts-bazel-epel-7.repo /etc/yum.repos.d/vbatts-bazel-epel-7.repo
    sudo yum install bazel
    # https://docs.bazel.build/versions/master/install-redhat.html

    cd serving && 
    tools/run_in_docker.sh bazel build tensorflow_serving/model_servers:tensorflow_model_server
}

main () {
    echo "Workdir: ${CWD}"
    compile_gprotobuf
    build_tensorflow_serving_and_sentencepiece
}

main

I removed --max_idle_secs because it gave me the error ERROR: Unrecognized option: --max_idle_secs=60. But the script still fails while Fetching @org_tensorflow, with:

ERROR: error loading package '': in /my_path/serving/tensorflow_serving/workspace.bzl: Encountered error while reading extension file 'tensorflow/workspace.bzl': no such package '@org_tensorflow//tensorflow': java.io.IOException: Error downloading [https://mirror.bazel.build/github.com/tensorflow/tensorflow/archive/87989f69597d6b2d60de8f112e1e3cea23be7298.tar.gz, https://github.com/tensorflow/tensorflow/archive/87989f69597d6b2d60de8f112e1e3cea23be7298.tar.gz] to /my_path/serving/.cache/_bazel_opc/3fa7ae51721c1323b37adf18f7a53821/external/org_tensorflow/87989f69597d6b2d60de8f112e1e3cea23be7298.tar.gz: All mirrors are down: []
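
As an aside on the flag I removed: --max_idle_secs appears to be a Bazel startup option, so it would presumably have to be placed before the build subcommand rather than after it. A hedged sketch of that placement (not something I have verified on this setup):

# Startup options such as --max_idle_secs go before the subcommand.
bazel --max_idle_secs=60 build tensorflow_serving/model_servers:tensorflow_model_server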

I tried several ways of setting http_proxy (exporting HTTP_PROXY, downloading the tar.gz file and putting it in the directory myself, passing --action_env=HTTP_PROXY=$HTTP_PROXY, ...), but none of them worked.
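
One thing I have not confirmed is whether the proxy variables ever reach the build container, since run_in_docker.sh runs the build inside Docker. A minimal sketch of forwarding them by running the devel image directly (the mount paths and the nightly-devel tag are my own assumptions, not part of issue #325):

# Forward the host proxy settings into the container so Bazel's repository
# downloads (e.g. @org_tensorflow) can go through the proxy.
docker run --rm -it \
    -e HTTP_PROXY="$HTTP_PROXY" -e HTTPS_PROXY="$HTTPS_PROXY" \
    -e http_proxy="$http_proxy" -e https_proxy="$https_proxy" \
    -v "$PWD/serving:/serving" -w /serving \
    tensorflow/serving:nightly-devel \
    bazel build tensorflow_serving/model_servers:tensorflow_model_server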

Finally, I changed the image in run_in_docker.sh from IMAGE="tensorflow/serving:1.14.0" to IMAGE="tensorflow/serving:nightly-devel", and then it no longer needed to fetch bazel.
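
For reference, this is the one-line edit I mean (the exact wording of the IMAGE line inside tools/run_in_docker.sh is from memory, so treat this as a sketch):

# Switch run_in_docker.sh to the devel image, which ships with bazel preinstalled.
sed -i 's|tensorflow/serving:1.14.0|tensorflow/serving:nightly-devel|' serving/tools/run_in_docker.sh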

However, it now shows the error unknown argument: bash, and also FileSystemStoragePathSource encountered a filesystem access error: Could not find base path /models/model for servable model. Does anyone know how to point it at the correct model base path? Thank you.
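
For context, my understanding is that the model server is normally pointed at a model directory via --model_name and --model_base_path, where the base path contains numeric version subdirectories with SavedModels. A minimal sketch with placeholder names and paths (my_model and /my_path/saved_models are hypothetical, not my actual layout):

# Expects SavedModels under /my_path/saved_models/my_model/<version>/,
# e.g. /my_path/saved_models/my_model/1/saved_model.pb plus variables/.
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --port=8500 \
    --rest_api_port=8501 \
    --model_name=my_model \
    --model_base_path=/my_path/saved_models/my_model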

0 Answers:

There are no answers yet.