Consider the following code, which creates a saved_model:
import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model
inp = Input((), dtype=tf.string, name="image_bytes")
net = Lambda(lambda t: tf.map_fn(lambda x: tf.io.decode_jpeg(x), t, dtype=tf.uint8))(inp)
net = Lambda(lambda t: tf.map_fn(lambda x: tf.io.encode_jpeg(x), t, dtype=tf.string), name="output")(net)
model = Model(inp, net, name="test_network")
tf.keras.experimental.export_saved_model(model, "runs/cmle_test_model", serving_only=True)
Now I'm trying to get an online prediction request working with the payload
{"image_bytes":{"b64":"/9j/..."}}
The full payload is at https://gist.github.com/suyash/00d6846ab1a82e74f312ebb43b384c12.
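For reference, a body in that shape can be assembled with the standard library. This is a minimal sketch; the `instances` wrapper follows the usual online-prediction REST format, and the placeholder bytes stand in for real JPEG data:

```python
import base64
import json

def build_prediction_body(image_bytes: bytes) -> str:
    # The "b64" key signals that the value is base64-encoded and should
    # be decoded server-side before being fed to the "image_bytes"
    # string input of the model.
    instance = {"image_bytes": {"b64": base64.b64encode(image_bytes).decode("ascii")}}
    return json.dumps({"instances": [instance]})

# Placeholder bytes standing in for an actual JPEG file:
body = build_prediction_body(b"\xff\xd8\xff\xe0dummy")
```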
The error:
{
"error": "Prediction failed: Error during model execution: AbortionError(code=StatusCode.INVALID_ARGUMENT, details=\"Expected input[1] == 'test_network/output/map/TensorArrayUnstack/TensorListFromTensor/element_shape:output:0' to be a control input.\n\tIn {{node test_network/lambda/map/TensorArrayV2Stack/TensorListStack}}\n\t [[{{node StatefulPartitionedCall}}]]\n\t [[{{node StatefulPartitionedCall}}]]\")"
}
However, if I simply run the model locally with
out = model(data)
it works fine.
Update:
I got something working:
inp = Input((), dtype=tf.string, name="image_bytes")
net = Lambda(lambda t: tf.expand_dims(tf.io.decode_jpeg(t[0]), 0))(inp)
net = Lambda(lambda t: tf.expand_dims(tf.io.encode_base64(tf.io.encode_jpeg(t[0])), 0), name="output")(net)
but this fixes the usable serving batch size at 1. Ideally, I'd like to use a Lambda layer together with tf.map_fn.
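One direction worth trying (an untested sketch, assuming the error comes from tf.map_fn producing elements of unknown shape) is to give every mapped element a fixed static shape, e.g. by decoding with channels=3 and resizing to a fixed size, so the TensorArray stacking inside tf.map_fn has a known element shape:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model

inp = Input((), dtype=tf.string, name="image_bytes")

def decode_and_resize(t):
    # channels=3 plus a fixed resize target makes the per-element
    # output shape static: (224, 224, 3), dtype float32.
    return tf.map_fn(
        lambda x: tf.image.resize(tf.io.decode_jpeg(x, channels=3), (224, 224)),
        t, dtype=tf.float32)

net = Lambda(decode_and_resize)(inp)
model = Model(inp, net, name="test_network_fixed")
```

This changes the output from re-encoded JPEG strings to a float image batch, so it is not a drop-in replacement for the original graph, but it keeps batched serving via tf.map_fn.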