I have successfully served a model with TensorFlow Serving and obtained predictions from it. However, to make this more efficient, I would like to try sending the input image base64-encoded. I tried the following but could not get it to work. Can anyone point me in the right direction?
I load an already trained model and prepare it for use with TF Serving:
import keras
import tensorflow
from keras import backend as K
K.set_learning_phase(0)
# model = load_model()
if fixed_shape is not None:
    shape = [None, *fixed_shape]
else:
    shape = [None, None, None, 3]
string_inp = tensorflow.placeholder(
    tensorflow.string, shape=(None,))  # string input for the base64 encoded image
imgs_map = tensorflow.map_fn(
    tensorflow.image.decode_image,
    string_inp,
    dtype=tensorflow.float32,
)  # decode the jpeg
imgs_map.set_shape(shape)
pred = model(imgs_map)
model_inputs = [string_inp]  # wrap in a list; graph tensors are not iterable
model_outputs = [pred]
inputs = {i.name + "_bytes": i for i in model_inputs}
outputs = {f"output {i}:" + o.name: o for i, o in enumerate(model_outputs)}
signature = tensorflow.saved_model.signature_def_utils.predict_signature_def(
    inputs=inputs, outputs=outputs
)
builder = tensorflow.saved_model.builder.SavedModelBuilder("out_dir")
builder.add_meta_graph_and_variables(
    sess=K.get_session(),
    tags=[tensorflow.saved_model.tag_constants.SERVING],
    signature_def_map={
        tensorflow.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            signature
    },
)
builder.save()
Then, for the client, I use:
image_byte = base64.b64encode(batch[0]).decode("utf-8")
payload = '{"instances" : [{"b64": "%s"}]}' % image_byte
response = requests.post(predict_url, data=payload)
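As a side note, the payload can also be built with the standard `json` module instead of string interpolation, which rules out any quoting/escaping mistakes in the request body (`image_bytes` below is a hypothetical stand-in for `batch[0]`):

```python
import base64
import json

# Hypothetical stand-in for batch[0] -- some raw bytes to be transmitted
image_bytes = b"\xff\xd8\xff\xe0JFIF-data..."

# base64.b64encode returns bytes; .decode("utf-8") makes it a JSON-safe string
b64_string = base64.b64encode(image_bytes).decode("utf-8")

# json.dumps handles all quoting and escaping for us
payload = json.dumps({"instances": [{"b64": b64_string}]})
```

The resulting `payload` string can be passed to `requests.post` exactly as in the snippet above.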
This results in `requests.exceptions.HTTPError: 400 Client Error: Bad Request for url:`. However, if I try the same request in Postman, I get:
{
"error": "2 root error(s) found.\n (0) Invalid argument: assertion failed: [Unable to decode bytes as JPEG, PNG, GIF, or BMP]\n\t [[{{node map_5/while/decode_image/cond_jpeg/cond_png/cond_gif/Assert_1/Assert}}]]\n\t [[GroupCrossDeviceControlEdges_0/map_5/while/decode_image/cond_jpeg/cond_png/cond_gif/DecodeBmp/_4702]]\n (1) Invalid argument: assertion failed: [Unable to decode bytes as JPEG, PNG, GIF, or BMP]\n\t [[{{node map_5/while/decode_image/cond_jpeg/cond_png/cond_gif/Assert_1/Assert}}]]\n0 successful operations.\n0 derived errors ignored."
}