How do I serve a tensorflow model with the docker image tensorflow/serving when it uses custom ops?

Time: 2018-10-12 05:06:59

Tags: docker tensorflow tensorflow-serving

I am trying to use the tf-sentencepiece ops from https://github.com/google/sentencepiece/tree/master/tensorflow in a model.

Building the model and getting a saved_model.pb file with variables and assets works without problems. However, when I try to use the docker image for tensorflow/serving, it says:

Loading servable: {name: model version: 1} failed: 
Not found: Op type not registered 'SentencepieceEncodeSparse' in binary running on 0ccbcd3998d1. 
Make sure the Op and Kernel are registered in the binary running in this process. 
Note that if you are loading a saved graph which used ops from tf.contrib, accessing 
(e.g.) `tf.contrib.resampler` should be done before importing the graph, 
as contrib ops are lazily registered when the module is first accessed.

I am not familiar with how to build anything manually, and was hoping I could get this working without many changes.

1 Answer:

Answer 0: (score: 0)

One way to do it:

  1. Pull the docker devel image

    $ docker pull tensorflow/serving:latest-devel

  2. In the container, change the code

    $ docker run -it tensorflow/serving:latest-devel

Modify the code to add the op dependency here; a sketch of the kind of edit involved follows.
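As an assumption (not taken from the linked instructions): the usual pattern is to add the op's Bazel target to the SUPPORTED_TENSORFLOW_OPS list in tensorflow_serving/model_servers/BUILD so that it gets linked into tensorflow_model_server. The source path inside the devel image and the Bazel label shown below are placeholders.

    # Inside the devel container; /tensorflow-serving is assumed to hold the sources.
    $ cd /tensorflow-serving
    # Locate the list that controls which extra op libraries are linked in,
    # then append the custom op target to it, e.g. (placeholder label):
    #   "//tensorflow_serving/custom_ops/sentencepiece_processor:sentencepiece_processor_ops",
    $ grep -n "SUPPORTED_TENSORFLOW_OPS" tensorflow_serving/model_servers/BUILD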

  3. In the container, build TensorFlow Serving

    container:$ bazel build -c opt tensorflow_serving/model_servers:tensorflow_model_server && cp bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/
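Optionally, you can sanity-check the rebuilt binary inside the container before committing it by pointing it at a copy of your exported model (the model path below is a placeholder):

    # Try loading the SavedModel with the freshly built server; the
    # 'Op type not registered' error should be gone if the dependency was linked in.
    $ tensorflow_model_server --port=8500 --model_name=model --model_base_path=/path/to/model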

  4. Exit the container with the exit command

  5. Find the container ID:

    $ docker ps

  6. Commit the devel image using that container ID:

    $ docker commit <container id> $USER/tf-serving-devel-custom-op

  7. Now build the serving container using the devel container as the source

    $ mkdir /tmp/tfserving

    $ cd /tmp/tfserving

    $ git clone https://github.com/tensorflow/serving

    $ docker build -t $USER/tensorflow-serving --build-arg TF_SERVING_BUILD_IMAGE=$USER/tf-serving-devel-custom-op -f tensorflow_serving/tools/docker/Dockerfile .
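Note: the clone above creates a serving/ subdirectory, and the relative -f tensorflow_serving/tools/docker/Dockerfile path only resolves from inside it, so you may need to change into the checkout before running the docker build:

    # Run this between the git clone and the docker build above.
    $ cd serving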

  8. You can now serve your model with the $USER/tensorflow-serving image, following the Docker instructions (a typical invocation is sketched below).
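For reference, a typical invocation with the custom image follows the standard TensorFlow Serving Docker instructions; the model path and name below are placeholders:

    # Serve a SavedModel with the custom-built image; /path/to/model should
    # contain numeric version subdirectories (e.g. 1/saved_model.pb, 1/variables/).
    $ docker run -p 8501:8501 \
        --mount type=bind,source=/path/to/model,target=/models/model \
        -e MODEL_NAME=model -t $USER/tensorflow-serving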