Unable to run Docker with the exported GCP Vision AutoML model

Date: 2020-03-18 07:47:37

Tags: docker tensorflow google-cloud-automl automl

I trained an image classification model with GCP AutoML Vision and want to deploy it in my own web application using Docker. Following the tutorial from GCP, I exported the AutoML Vision model as saved_model.pb and managed to copy it to my local drive.

sudo docker run --rm --name ${CONTAINER_NAME} -p ${PORT}:8501 -v ${YOUR_MODEL_PATH}:/tmp/mounted_model/0001 -t ${CPU_DOCKER_GCR_PATH}
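
For context, this is roughly how I filled in the placeholders before running the command above; the values are only illustrative, and the CPU serving image path should be copied from the GCP tutorial rather than from this sketch:

# Illustrative values only; adjust to your own environment.
CONTAINER_NAME=automl_vision_serving
PORT=8501
YOUR_MODEL_PATH=/path/to/exported_model   # directory that contains saved_model.pb
CPU_DOCKER_GCR_PATH=...                   # CPU serving image path given in the GCP tutorial

Once the container starts, it should answer TensorFlow Serving REST requests on the mapped port. The input names below (image_bytes, key) are my assumption based on the AutoML export format and should be confirmed with saved_model_cli:

# Hedged test request (GNU base64; on macOS drop the -w0 flag).
curl -X POST "http://localhost:${PORT}/v1/models/default:predict" \
  -H "Content-Type: application/json" \
  -d "{\"instances\": [{\"image_bytes\": {\"b64\": \"$(base64 -w0 test.jpg)\"}, \"key\": \"1\"}]}"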

When I try to run the Docker image, I get an error. The error message is below:

2020-03-18 06:52:52.851811: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2020-03-18 06:52:52.851825: I tensorflow_serving/model_servers/server_core.cc:559]  (Re-)adding model: default
2020-03-18 06:52:52.859873: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: default version: 1}
2020-03-18 06:52:52.859923: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: default version: 1}
2020-03-18 06:52:52.859938: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: default version: 1}
2020-03-18 06:52:52.860387: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /tmp/mounted_model/0001
2020-03-18 06:52:52.860426: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /tmp/mounted_model/0001
2020-03-18 06:52:52.861256: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-03-18 06:52:52.861345: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:310] SavedModel load for tags { serve }; Status: fail. Took 916 microseconds.
2020-03-18 06:52:52.861357: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: default version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`

I did some research online, and it seems the problem lies in the export step of the model, but GCP did not give me any options when I exported it. Thanks everyone, I could really use some help.

1 Answer:

Answer 0 (score: 0)

It seems the model does not contain a graph corresponding to the serve tag.

I found a similar issue on the TensorFlow GitHub page. To check which tag-sets are available in the saved model, you can use the SavedModel CLI (saved_model_cli):

$ saved_model_cli show --dir ./modelDir
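
If other tag-sets are present, the same CLI can show them in more detail (./modelDir is just a placeholder for the directory that holds saved_model.pb):

# List every tag-set, signature, and input/output tensor in the SavedModel.
$ saved_model_cli show --dir ./modelDir --all

# If a "serve" tag-set turns up, inspect its default signature.
$ saved_model_cli show --dir ./modelDir --tag_set serve --signature_def serving_default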

I also found how to add the serving tag to a model from TensorFlow Hub; it seems that using transfer learning can help you export or save the model with the serve tag.