So, after training the model with the TensorFlow Ranking library, I tried to deploy it by exporting the model to the SavedModel format using a serving input function:
def serving_input_receiver_fn():
    feature_names = ["{}".format(i + 1) for i in range(num_features)]
    feature_columns = [
        tf.feature_column.numeric_column(name, shape=(1,), default_value=0.0)
        for name in feature_names
    ]
    feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()

ranker.export_savedmodel('export', serving_input_receiver_fn)
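Since the parsing receiver expects serialized tf.train.Example strings on its DT_STRING input, here is a minimal sketch (my own assumption about the expected wire format, not something from the tutorial) of how one feature dict could be turned into such a string:

```python
import tensorflow as tf

def make_serialized_example(feature_dict):
    """Serialize a {feature_name: [float, ...]} dict into a tf.train.Example string."""
    feature = {
        name: tf.train.Feature(float_list=tf.train.FloatList(value=values))
        for name, values in feature_dict.items()
    }
    example = tf.train.Example(features=tf.train.Features(feature=feature))
    return example.SerializeToString()

# One document from my data, serialized for the "examples" input.
serialized = make_serialized_example({"2": [0.3333333333333333], "4": [50.0]})
```

The resulting bytes are what would be fed, one per document, into the `examples` tensor shown by saved_model_cli below.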
With this, I get saved_model.pb and the variables directory in the export folder.
Now, when I inspect it with the saved_model_cli tool, I get:
The given SavedModel SignatureDef contains the following input(s):
inputs['examples'] tensor_info:
dtype: DT_STRING
shape: (-1)
name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
outputs['output'] tensor_info:
dtype: DT_FLOAT
shape: (-1, -1)
name: groupwise_dnn_v2/accumulate_scores/truediv:0
Method name is: tensorflow/serving/predict
I am also able to get predictions by running

saved_model_cli run \
--dir ./ \
--tag_set serve \
--signature_def predict \
--input_examples 'examples=[{"2":[0.5348837209302325],"3":[0.75],"4":[33.68298368298368],"5":[14.4054],"6":[1707.0],"8":[1.0],"11":[1.0],"14":[1.0],"15":[1.0]},{"2":[0.3409090909090909],"4":[50.0],"5":[0.8047000000000001],"6":[833.0],"12":[1.0]}]'
saved_model_cli run \
--dir ./ \
--tag_set serve \
--signature_def predict \
--input_examples 'examples=[{"5":[1.67],"6":[955.0],"13":[1.0]}]'
for two and one input instances respectively. My data is in LibSVM format, and I already have the relevance label and qid, so a line like

0 qid:1040 2:0.3333333333333333 4:50 5:1.9323 6:800 12:1

is converted to

{"2":[0.3333333333333333], "4":[50], "5":[1.9323], "6":[800], "12":[1]}

for prediction.
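The conversion I use can be sketched in plain Python (the relevance label and qid fields are dropped, since only the numbered features go into the prediction request):

```python
def libsvm_to_feature_dict(line):
    """Convert one LibSVM row to a {feature_name: [value]} dict for prediction."""
    tokens = line.strip().split()
    label, qid = tokens[0], tokens[1]  # e.g. "0" and "qid:1040", unused here
    features = {}
    for token in tokens[2:]:
        key, value = token.split(":")
        features[key] = [float(value)]
    return features

print(libsvm_to_feature_dict("0 qid:1040 2:0.3333333333333333 4:50 5:1.9323 6:800 12:1"))
# → {'2': [0.3333333333333333], '4': [50.0], '5': [1.9323], '6': [800.0], '12': [1.0]}
```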
Now, I followed the tutorial at https://medium.com/devseed/technical-walkthrough-packaging-ml-models-for-inference-with-tf-serving-2a50f73ce6f8, but when I send the data with curl, or load the model in Python using tensorflow_serving_api, I get a format error.
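For context, the kind of REST request body I am trying to build looks roughly like this (the model name, the "predict" signature name, and the base64 wrapping of the serialized tf.Example bytes are my assumptions; the serialized bytes would come from tf.train.Example.SerializeToString()):

```python
import base64
import json

def build_rest_body(serialized_examples):
    """Build a TF Serving REST predict body; binary strings go in as {"b64": ...}."""
    return json.dumps({
        "signature_name": "predict",
        "instances": [
            {"examples": {"b64": base64.b64encode(s).decode("ascii")}}
            for s in serialized_examples
        ],
    })

# Placeholder bytes just to illustrate the shape of the request.
body = build_rest_body([b"\x0a\x00"])
print(body)
```

This body would then be POSTed to something like http://host:8501/v1/models/<model_name>:predict, but with this approach I still get the format error.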
Can someone help me by telling me the exact input format and how to run inference with it?
Thanks