I exported a DNNClassifier model and am running it on TensorFlow Serving with Docker. After that I wrote a Python client to interact with that TensorFlow Serving instance to get new predictions.

I wrote the following code to get a response from the TensorFlow Serving server.
import tensorflow as tf
from grpc.beta import implementations
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2


def _float_feature(value):
    return tf.train.Feature(float_list=tf.train.FloatList(value=[value]))


def _int_feature(value):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=[value]))


host, port = FLAGS.server.split(':')
channel = implementations.insecure_channel(host, int(port))
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
request = predict_pb2.PredictRequest()
request.model_spec.name = FLAGS.model
request.model_spec.signature_name = 'serving_default'
feature_dict = {'a': _float_feature(value=400),
                'b': _float_feature(value=5),
                'c': _float_feature(value=200),
                'd': _float_feature(value=30),
                'e': _float_feature(value=60),
                'f': _float_feature(value=5),
                'g': _float_feature(value=7500),
                'h': _int_feature(value=1),
                'i': _int_feature(value=1234),
                'j': _int_feature(value=1),
                'k': _int_feature(value=4),
                'l': _int_feature(value=1),
                'm': _int_feature(value=0)}
example = tf.train.Example(features=tf.train.Features(feature=feature_dict))
serialized = example.SerializeToString()
request.inputs['inputs'].CopyFrom(
    tf.contrib.util.make_tensor_proto(serialized, shape=[1]))
result_future = stub.Predict.future(request, 5.0)  # 5-second timeout
print(result_future.result())

Answer 0 (score: 6)
You can do the following:

result = stub.Predict(request, 5.0)
float_val = result.outputs['outputs'].float_val

Note that this method calls stub.Predict instead of stub.Predict.future.
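The difference mirrors the standard blocking-call versus future pattern: Predict returns the response directly, while Predict.future returns immediately with a handle whose .result() blocks until the response arrives. A minimal stand-in using Python's concurrent.futures (the predict function below is a mock, not the real gRPC stub):

```python
from concurrent.futures import ThreadPoolExecutor


def predict(request, timeout):
    # Mock of the blocking stub.Predict call.
    return {'outputs': [0.1, 0.9]}


# Blocking style: the call returns the response directly.
result = predict('request', 5.0)

# Future style: submit returns immediately; .result() blocks until done.
with ThreadPoolExecutor(max_workers=1) as pool:
    result_future = pool.submit(predict, 'request', 5.0)
    deferred_result = result_future.result()

print(result == deferred_result)  # True
```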
Answer 1 (score: 1)
This is in addition to the answer given by @Maxime De Bruyn.

In a prediction API with multiple prediction outputs using a mobilenet/inception model, the following code snippet did not work for me:
result = stub.Predict(request, 5.0)
float_val = result.outputs['outputs'].float_val
print("Output: ", float_val)
Output: []
Instead, I had to use the "prediction" key in the outputs:
result = stub.Predict(request, 5.0)
predictions = result.outputs['prediction'].float_val
print("Output: ", predictions)
Output: [0.016111543402075768, 0.2446805089712143, 0.06016387417912483, 0.12880375981330872, 0.035926613956689835, 0.026000071316957474, 0.04009509086608887, 0.35264086723327637, 0.0762331634759903, 0.019344471395015717]
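float_val comes back as a flat list of class probabilities, so picking the winning class is a plain argmax over the list. A minimal sketch using the scores printed above (the class-to-index mapping depends on your model and is not shown here):

```python
# float_val copied from the response above
predictions = [0.016111543402075768, 0.2446805089712143, 0.06016387417912483,
               0.12880375981330872, 0.035926613956689835, 0.026000071316957474,
               0.04009509086608887, 0.35264086723327637, 0.0762331634759903,
               0.019344471395015717]

# Index of the highest-scoring class.
top_index = max(range(len(predictions)), key=predictions.__getitem__)
print(top_index, predictions[top_index])  # 7 0.35264086723327637
```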
Answer 2 (score: 0)
In case you have more than one output, you can do something like the following, which builds a dictionary whose keys correspond to the output names and whose values are lists of whatever the model returns.
results = dict()
for output in output_names:
    results[output] = response.outputs[output].float_val
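With a plain-object stand-in for the PredictResponse (the output names and values below are made up for illustration), the loop produces a dict of lists:

```python
class FakeTensor:
    """Stand-in for a TensorProto carrying only float_val."""
    def __init__(self, float_val):
        self.float_val = float_val


class FakeResponse:
    """Stand-in for a PredictResponse with an outputs map."""
    def __init__(self, outputs):
        self.outputs = outputs


response = FakeResponse({'classes': FakeTensor([2.0]),
                         'scores': FakeTensor([0.1, 0.2, 0.7])})
output_names = ['classes', 'scores']

results = dict()
for output in output_names:
    results[output] = response.outputs[output].float_val

print(results)  # {'classes': [2.0], 'scores': [0.1, 0.2, 0.7]}
```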
Answer 3 (score: 0)
What you are looking for is probably tf.make_ndarray, which creates a numpy array from a TensorProto (i.e. it is the inverse of tf.make_tensor_proto). That way your output recovers the shape it is supposed to have, so building on Jasmine's answer you can store multiple outputs in a dictionary with:
response = prediction_service.Predict(request, 5.0)
results = {}
for output in response.outputs.keys():
    results[output] = tf.make_ndarray(response.outputs[output])