I am trying to quantize a TensorFlow SavedModel with the following command line:
tflite_convert \
--output_file=/tmp/foo.tflite \
--saved_model_dir=/tmp/saved_model
But I get the following error:
ValueError: No 'serving_default' in the SavedModel's SignatureDefs. Possible values are 'my model name'
I have already checked that a signature_def_map was defined when exporting the model.
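For reference, the model was exported with a custom signature key rather than 'serving_default'. Below is a minimal, hypothetical TF 1.x sketch of that kind of export (the model body and the 'name_of_my_model' key are assumptions based on the saved_model_cli output further down, not my actual training code):

import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    # Placeholders matching the inputs reported by saved_model_cli (shapes assumed).
    is_training = tf.placeholder(tf.bool, shape=(), name='is_training')
    q1 = tf.placeholder(tf.float32, shape=(None, 35, 300), name='question1_embedding')
    q2 = tf.placeholder(tf.float32, shape=(None, 35, 300), name='question2_embedding')
    # Stand-in for the real network; produces a (-1, 1) float output.
    prediction = tf.layers.dense(tf.reduce_mean(q1 - q2, axis=1), 1, name='prediction')

    with tf.Session(graph=graph) as sess:
        sess.run(tf.global_variables_initializer())
        signature = tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={'is_training': is_training,
                    'question1_embedding': q1,
                    'question2_embedding': q2},
            outputs={'prediction': prediction})
        builder = tf.saved_model.builder.SavedModelBuilder('/tmp/saved_model')
        builder.add_meta_graph_and_variables(
            sess,
            tags=[tf.saved_model.tag_constants.SERVING],
            # The signature is registered under a custom key, so the SavedModel
            # ends up with no 'serving_default' entry, which is what
            # tflite_convert looks for by default.
            signature_def_map={'name_of_my_model': signature})
        builder.save()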
The command:
saved_model_cli show --dir /tmp/mobilenet/1 --tag_set serve
returns:
The given SavedModel MetaGraphDef contains SignatureDefs with the following keys:
SignatureDef key: 'name_of_my_model'
and:
The given SavedModel SignatureDef contains the following input(s):
inputs['is_training'] tensor_info:
dtype: DT_BOOL
shape: ()
name: is_training:0
inputs['question1_embedding'] tensor_info:
dtype: DT_FLOAT
shape: (-1, 35, 300)
name: question1_embedding:0
inputs['question2_embedding'] tensor_info:
dtype: DT_FLOAT
shape: (-1, 35, 300)
name: question2_embedding:0
The given SavedModel SignatureDef contains the following output(s):
outputs['prediction'] tensor_info:
dtype: DT_FLOAT
shape: (-1, 1)
name: prediction:0
Method name is: tensorflow/serving/predict
Answer 0 (score: 0)
When converting, you should be able to specify the signature name with --saved_model_signature_key:
tflite_convert \
--output_file=/tmp/foo.tflite \
--saved_model_dir=/tmp/saved_model \
--saved_model_signature_key='my model name'
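If you prefer to do this from Python, the TF 1.x converter exposes the same option. A minimal sketch, assuming a TF 1.x environment and that the key matches what saved_model_cli reports (replace 'name_of_my_model' with your actual signature key):

import tensorflow as tf

# Hypothetical Python equivalent of the command line above (TF 1.x API).
# signature_key must match the SignatureDef key shown by saved_model_cli.
converter = tf.lite.TFLiteConverter.from_saved_model(
    '/tmp/saved_model',
    signature_key='name_of_my_model')
tflite_model = converter.convert()

with open('/tmp/foo.tflite', 'wb') as f:
    f.write(tflite_model)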