How do I get float_val out of a PredictResponse object?

Asked: 2017-08-07 22:16:57

Tags: python tensorflow protocol-buffers tensorflow-serving

I'm having this problem:

After running a prediction on a TensorFlow Serving model, I get this PredictResponse object back as output:

outputs {
  key: "scores"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 2
      }
    }
    float_val: 0.407728463411
    float_val: 0.592271506786
  }    
}

As that question suggests, I tried using: result.outputs['outputs'].float_val

which returns an object of type <type 'google.protobuf.pyext._message.RepeatedScalarContainer'>.

The response is produced by this code, inspired by the inception_client.py example:

from grpc.beta import implementations
from tensorflow_serving.apis import prediction_service_pb2

channel = implementations.insecure_channel(host, int(port))
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
result = stub.Predict(request, 10.0)  # 10 secs timeout
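For context, a PredictRequest for this kind of call is typically built along the following lines with the TF 1.x client API. This is only a rough sketch: the model name, signature name, input key ('inputs'), and the data variable are assumptions that depend on how the model was exported.

from tensorflow_serving.apis import predict_pb2
import tensorflow as tf

request = predict_pb2.PredictRequest()
request.model_spec.name = 'my_model'                   # assumed model name
request.model_spec.signature_name = 'serving_default'  # assumed signature
# 'inputs' and data (the feature values) depend on the exported model
request.inputs['inputs'].CopyFrom(
    tf.contrib.util.make_tensor_proto(data, shape=[1, len(data)]))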

Thanks in advance!

2 Answers:

Answer 0 (score: 5)

result.outputs['scores'].float_val[0] and result.outputs['scores'].float_val[1] are the float values in this response.
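If you want the whole vector as a regular Python list instead of indexing elements one by one, the repeated field can simply be converted. A small sketch, using the result object from the question:

scores = list(result.outputs['scores'].float_val)
# scores is now a plain list, e.g. [0.407728463411, 0.592271506786]
best = scores.index(max(scores))  # index of the highest score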

For future reference, the documentation for the python bindings to protocol buffers explains this and other issues.

Answer 1 (score: 0)

In case you have more than one output, with their names stored in an output_names list, you can do something like the following to build a dictionary with the output names as keys and, as values, a list of whatever the model returns.

results = dict()
for output in output_names:
    # each entry is the repeated float_val field for that output tensor
    results[output] = response.outputs[output].float_val
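If you also need the values as shaped arrays rather than flat repeated fields, one option is to reshape them with NumPy using the tensor_shape carried in each output proto. A sketch, assuming every output is a DT_FLOAT tensor:

import numpy as np

arrays = {}
for output in output_names:
    proto = response.outputs[output]
    shape = [dim.size for dim in proto.tensor_shape.dim]
    # float_val is flat; reshape it to the tensor's declared shape
    arrays[output] = np.array(proto.float_val).reshape(shape)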