How do I export a model in a serving format and use it from a client?

Asked: 2017-11-09 02:15:07

Tags: tensorflow lstm tensorflow-serving

I am using https://github.com/hzy46/TensorFlow-Time-Series-Examples and want to export the model in a serving format so that a client can use it.

Exporting an Estimator takes four steps (a rough sketch follows the list):

1. Define the Estimator's features.

2. Create a feature config.

3. Build an export_input_fn suitable for use in serving.

4. Export the model with export_savedmodel().
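
For reference, a rough sketch of these four steps as I understand them (the feature name 'times' matches my code below, but the shape and the use of tf.FixedLenFeature are my own assumptions, not taken from the linked repo):

import tensorflow as tf

# Step 2: feature config -- a dict of FixedLenFeature / VarLenFeature entries
# describing how to parse serialized tf.Example protos.
feature_spec = {
    'times': tf.FixedLenFeature(shape=[1], dtype=tf.float32),
}

# Step 3: a serving input function that parses serialized tf.Example protos.
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    feature_spec)

# Step 4: export the trained Estimator as a SavedModel.
# estimator.export_savedmodel("./serving_save_model", serving_input_fn)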

I tried:

export_dir_base = "./serving_save_model"
feature_spec = {
    'times': tf.placeholder(tf.float32, name='times')
}
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

estimator.export_savedmodel(export_dir_base, serving_input_fn)

But I get this error:

Traceback (most recent call last):
  File "E:/MyProject/Py/tensorFlow_time_series_predict/train_lstm_multivariate.py", line 231, in <module>
    estimator.export_savedmodel(export_dir_base, serving_input_fn)
  File "H:\ProgramFiles\Anaconda3\envs\tensorflow\lib\site-packages\tensorflow\python\estimator\estimator.py", line 504, in export_savedmodel
    serving_input_receiver = serving_input_receiver_fn()
  File "H:\ProgramFiles\Anaconda3\envs\tensorflow\lib\site-packages\tensorflow\python\estimator\export\export.py", line 142, in serving_input_receiver_fn
    features = parsing_ops.parse_example(serialized_tf_example, feature_spec)
  File "H:\ProgramFiles\Anaconda3\envs\tensorflow\lib\site-packages\tensorflow\python\ops\parsing_ops.py", line 577, in parse_example
    [VarLenFeature, SparseFeature, FixedLenFeature, FixedLenSequenceFeature])
  File "H:\ProgramFiles\Anaconda3\envs\tensorflow\lib\site-packages\tensorflow\python\ops\parsing_ops.py", line 291, in _features_to_raw_params
    raise ValueError("Invalid feature %s:%s." % (key, feature))
ValueError: Invalid feature times:Tensor("times:0", dtype=float32).

How should I use it correctly? Many thanks.
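
Update: from the traceback it looks like build_parsing_serving_input_receiver_fn wants FixedLenFeature / VarLenFeature entries rather than a tf.placeholder. Since my client would send raw float values rather than serialized tf.Example protos, I wonder whether a raw receiver is closer to what I need. A sketch of what I have in mind (the shape [None, 1] is a guess and I have not verified it against the model in the repo):

import tensorflow as tf

# Sketch: build the serving input from a raw placeholder instead of
# parsing serialized tf.Example protos. The shape is a guess and must
# match what the model_fn expects for the 'times' feature.
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn({
    'times': tf.placeholder(tf.float32, shape=[None, 1], name='times'),
})

# estimator.export_savedmodel("./serving_save_model", serving_input_fn)

Is this the right direction?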

0 Answers:

There are no answers yet.