I am trying to convert a tf2 keras model to tflite, but I get the following error:
ValueError: Invalid input size: expected 2 items got 1 items.
My network is a Siamese network - it has 2 inputs, both of which are fed into the same backbone:
import tensorflow as tf

input_shape = (image_size, image_size, 3)
left_input = tf.keras.layers.Input(shape=input_shape, name='left_input')
right_input = tf.keras.layers.Input(shape=input_shape, name='right_input')

# define the shared base model:
general_input = tf.keras.layers.Input(shape=input_shape)
x = build_mobilenet(inputs=general_input)  # builds a standard MobileNet model
backbone_model = tf.keras.Model(general_input, x)

# run both inputs through the shared backbone:
left_features = backbone_model(left_input)
right_features = backbone_model(right_input)
output = tf.keras.layers.Subtract(name='diff')([left_features, right_features])
# ... some more layers are applied to the output tensor here
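For completeness, a rough sketch of how the full model is assembled from the two named inputs (the head layers that follow the Subtract are omitted here, and output stands for the final tensor after them):

model = tf.keras.Model(inputs=[left_input, right_input], outputs=output, name='siamese')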
During training, my dataset object returns a dictionary of inputs plus a label: {'left_input': im_left, 'right_input': im_right}, label
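Roughly, the training dataset is built along these lines (samples and the preprocessing are placeholders for my real pipeline):

def train_gen():
    # samples is a placeholder for my real list of (left image, right image, label) triplets
    for im_left, im_right, label in samples:
        yield {'left_input': im_left, 'right_input': im_right}, label

train_dataset = tf.data.Dataset.from_generator(
    train_gen,
    output_signature=(
        {'left_input': tf.TensorSpec(shape=input_shape, dtype=tf.float32),
         'right_input': tf.TensorSpec(shape=input_shape, dtype=tf.float32)},
        tf.TensorSpec(shape=(), dtype=tf.float32)))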
When trying to quantize the model, I have a representative dataset object that returns only the inputs (without the label): return {'left_input': left, 'right_input': right}.
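A minimal sketch of what my DataProvider looks like (load_image_pair is a placeholder for my real image loading and preprocessing):

class DataProvider:
    def __init__(self, num_images):
        self.num_images = num_images

    def __call__(self):
        # yields one dictionary of inputs per representative sample
        for _ in range(self.num_images):
            left, right = load_image_pair()  # placeholder for my real loading code
            yield {'left_input': left, 'right_input': right}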
The tflite code used for quantization:
data_generator = DataProvider(num_images=10)
model = tf.keras.models.load_model(float32_model_path, compile=False)
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# full-integer (int8) quantization settings:
converter.optimizations = [tf.lite.Optimize.OPTIMIZE_FOR_SIZE]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
converter.representative_dataset = data_generator
tflite_model = converter.convert()
The error is raised when calling converter.convert(). Does anyone know what the problem might be?
Thanks!