I am getting the following error message when running TensorFlow Serving in a Docker container:
2019-12-12 03:25:13.947401: I tensorflow_serving/model_servers/server.cc:85] Building single TensorFlow model file config: model_name: mymodel model_base_path: /models/mymodel
2019-12-12 03:25:13.947870: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2019-12-12 03:25:13.947891: I tensorflow_serving/model_servers/server_core.cc:573] (Re-)adding model: mymodel
2019-12-12 03:25:14.058166: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: mymodel version: 1}
2019-12-12 03:25:14.058430: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: mymodel version: 1}
2019-12-12 03:25:14.059106: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: mymodel version: 1}
2019-12-12 03:25:14.064459: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: mymodel version: 1} failed: Not found: Specified file path does not appear to contain a SavedModel bundle (should have a file called `saved_model.pb`)
Specified file path: /models/mymodel/1
The model was built with TensorFlow v1.5 and there is no *.pb file. Is it possible to serve a model from this version of TensorFlow? Any ideas are appreciated. Thanks in advance.
Answer 0 (score: 1)
Yes, you can deploy a model trained with TensorFlow v1.5 on TF Serving.
TF Serving requires the SavedModel format.
Your training script probably has a configuration issue. (Since you did not post the code it is hard to pinpoint; to help others understand your problem, always try to include your code in questions on SO.)
To get the SavedModel format, train the model with the official script.
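If you already have a trained TF 1.x graph (for example, restored from a checkpoint) and only need the SavedModel written to disk, a minimal sketch along these lines can export it. The tensor names, shapes, and the export path /models/mymodel/1 are hypothetical placeholders, not taken from your code; substitute your model's real inputs and outputs.

import tensorflow as tf

# Hypothetical export path; TF Serving expects a numeric version subdirectory.
export_dir = "/models/mymodel/1"

with tf.Session(graph=tf.Graph()) as sess:
    # Build or restore your real model here, e.g. via tf.train.Saver().restore(sess, ckpt_path).
    # The two tensors below are stand-ins for your model's actual input and output.
    inputs = tf.placeholder(tf.float32, shape=[None, 10], name="inputs")
    outputs = tf.identity(inputs, name="outputs")
    sess.run(tf.global_variables_initializer())

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={"inputs": inputs}, outputs={"outputs": outputs})
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature})
    # Writes saved_model.pb and the variables/ directory under export_dir.
    builder.save()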
After training, you will get the following directory structure under the specified model directory:
<model_dir>
|
|----- variables
| |------- variables.data-00000-of-00001
| |------- variables.index
|
|----- saved_model.pb
Then you can point TF Serving directly at the <model_dir> path and it will use this model.
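Once the container is serving the model (with the REST port 8501 published and MODEL_NAME=mymodel, as in the standard tensorflow/serving Docker setup), you can sanity-check it with a small request. The input shape below matches the hypothetical export sketch above and is only an illustration.

import json
import requests

# One instance with 10 features, matching the placeholder signature above.
payload = {"instances": [[0.0] * 10]}
resp = requests.post(
    "http://localhost:8501/v1/models/mymodel:predict",
    data=json.dumps(payload))
# A successful response looks like {"predictions": [...]}.
print(resp.json())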