Low accuracy when converting an SSD MobileNetv1 model to tflite in TensorFlow

Date: 2021-04-06 18:50:17

Tags: python tensorflow

I am using the TensorFlow 2.4.1 Object Detection API. I modified the pretrained SSD MobileNetv1 model and I am trying to convert it to TensorFlow Lite. However, the converted model produces different outputs and its performance is much lower than before.

import os

import cv2
import numpy as np
import tensorflow as tf
from PIL import Image

converter = tf.lite.TFLiteConverter.from_saved_model('saved_model')
converter.experimental_new_converter = True
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.int8]
converter.inference_input_type = tf.uint8
# Also tried, but currently disabled:
# converter.inference_output_type = tf.uint8
# converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8,
                                       tf.lite.OpsSet.TFLITE_BUILTINS]

images_list = os.listdir("/content/MyDrive/MyDrive/representative_data/")

def load_image_into_numpy_array(path):
    return np.array(Image.open(path))

def representative_data_gen():
    # Yield calibration images so the converter can estimate activation ranges.
    for i in range(72):
        image = cv2.imread("/content/MyDrive/MyDrive/representative_data/" + images_list[i])
        images = tf.expand_dims(tf.cast(image, tf.uint8), 0)
        images = tf.compat.v1.image.resize_bilinear(images, [640, 640], align_corners=False)
        yield [images]

converter.representative_dataset = representative_data_gen
tflite_model = converter.convert()

# Serialize the model
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
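
For reference, a quick way to check that the requested uint8 input type actually took effect is to load the written file into the TFLite interpreter and print its input and output details. This is only a minimal sketch using the standard tf.lite.Interpreter API, not part of the conversion script above:

# Sketch: inspect the converted model's input/output signature.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print('input :', detail['name'], detail['dtype'], detail['shape'])
for detail in interpreter.get_output_details():
    print('output:', detail['name'], detail['dtype'], detail['shape'])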

saved_model is the model I have stored locally. When I try to run inference with the model.tflite file, I get values that differ from the original model. For a single image, the original model gives this output:

[[0.24337725 0.6351127  0.38516122 0.7076594 ]
 [0.2673622  0.36415854 0.39408043 0.44303796]
 [0.2854549  0.15576798 0.3912888  0.21461737]
 [0.54561293 0.9263026  0.6120709  0.9571031 ]
 [0.53607523 0.93863803 0.613716   0.9708498 ]
 [0.5300186  0.95096165 0.6101491  0.9829783 ]
 [0.         0.25321186 0.21036099 0.4150495 ]
 [0.97977203 0.94243306 0.999891   0.9985258 ]
 [0.5688855  0.98845106 0.68230766 1.        ]
 [0.26439393 0.         0.3748169  0.0052964 ]
 [0.523868   0.95313215 0.5843951  0.9843085 ]
 [0.5561901  0.9268055  0.6328303  0.9596125 ]
 [0.56333405 0.9297103  0.61194354 0.97363955]
 [0.5242113  0.94434035 0.57968974 0.9733478 ]
 [0.24705899 0.58940554 0.36841512 0.6353431 ]
 [0.6998688  0.95695746 0.935374   0.9985087 ]
 [0.5676778  0.944553   0.61072737 0.989732  ]
 [0.52943265 0.9224371  0.60241354 0.9664552 ]
 [0.29492486 0.5143376  0.33912945 0.54321367]
 [0.35466743 0.38035372 0.39884788 0.43569848]
 [0.29579696 0.         0.34290215 0.00980917]
 [0.52158904 0.9574179  0.6159549  0.99213105]
 [0.272836   0.3718056  0.36658022 0.455931  ]
 [0.8567797  0.99169993 1.         0.9983983 ]
 [0.2602372  0.4269493  0.3221747  0.45679078]
 [0.30840918 0.         0.39378992 0.00409367]
 [0.54609305 0.9337664  0.5939192  0.97229046]
 [0.5369955  0.9213194  0.588888   0.9528671 ]
 [0.55414003 0.9327579  0.6188144  0.9981975 ]
 [0.24911809 0.         0.39413297 0.01658437]
 [0.54612064 0.98965347 0.63472795 1.        ]
 [0.3359316  0.6405184  0.386935   0.69778955]
 [0.28348807 0.         0.33257052 0.00864748]
 [0.55936444 0.93675387 0.63903964 0.9675981 ]
 [0.56318516 0.9523619  0.64057285 0.9842929 ]
 [0.5385444  0.9243466  0.58092284 0.9635858 ]
 [0.5591617  0.9668263  0.6213115  0.99683106]
 [0.526799   0.93967676 0.5644086  0.97707677]
 [0.2894494  0.5127831  0.3259099  0.5462169 ]
 [0.24881686 0.432522   0.3440869  0.4542246 ]
 [0.53722316 0.93652713 0.58082765 0.9754019 ]
 [0.9142558  0.8036172  0.9953465  0.87431747]
 [0.5822151  0.98919743 0.7393905  0.99906236]
 [0.550308   0.9311302  0.6526129  0.9735651 ]
 [0.9681886  0.95588744 0.9971228  0.99962986]
 [0.2510388  0.63873756 0.293042   0.70286167]
 [0.35704705 0.         0.43682608 0.00618043]
 [0.5451097  0.96104217 0.6516574  0.99736154]
 [0.24710357 0.6098395  0.3842827  0.65444416]
 [0.5442216  0.9250356  0.6096328  0.980747  ]
 [0.24251309 0.6482344  0.27971676 0.7081458 ]
 [0.5770702  0.97158587 0.6416947  0.99976385]
 [0.2676804  0.         0.3186664  0.00815568]
 [0.5273864  0.9546605  0.56628644 0.9894375 ]
 [0.54153293 0.9635044  0.6070141  0.99266726]
 [0.28929484 0.15221418 0.33980817 0.21486779]
 [0.51905644 0.8281502  0.56287026 0.8531384 ]
 [0.5725315  0.96221906 0.6105607  0.9990316 ]
 [0.32806867 0.43091613 0.39902395 0.45719367]
 [0.52576494 0.92725724 0.5765526  0.9592081 ]
 [0.5616086  0.9661043  0.6829738  1.        ]
 [0.5533599  0.94553447 0.6016622  0.9873934 ]
 [0.26665998 0.40935722 0.3285365  0.44829842]
 [0.5310731  0.9906043  0.6118139  1.        ]
 [0.34110123 0.6248279  0.38744426 0.6725014 ]
 [0.97393394 0.93869317 0.99925125 0.9814497 ]
 [0.5185562  0.9410181  0.6063331  0.9772024 ]
 [0.57235855 0.97020566 0.7334549  1.        ]
 [0.32493848 0.         0.413765   0.0036816 ]
 [0.52695143 0.9689187  0.58879185 0.9933872 ]
 [0.55055976 0.9458971  0.658028   0.99045086]
 [0.2927954  0.5268235  0.33840135 0.5597528 ]
 [0.9787069  0.9571627  1.         1.        ]
 [0.3356672  0.         0.4292863  0.0049988 ]
 [0.3076461  0.37833217 0.39782172 0.4637287 ]
 [0.568      0.9812284  0.70725244 1.        ]
 [0.5294949  0.92605036 0.5946635  0.98638374]
 [0.28798655 0.5223129  0.32700756 0.56118226]
 [0.98412776 0.92357624 0.999544   0.9758645 ]
 [0.32098186 0.         0.36190385 0.00738669]
 [0.25278595 0.         0.3067849  0.00931515]
 [0.51394624 0.87113017 0.56284195 0.8938641 ]
 [0.22352192 0.57925904 0.36973855 0.6182579 ]
 [0.5131188  0.9547114  0.5616221  0.98266846]
 [0.97893876 0.3624356  0.9991333  0.42952105]
 [0.2654712  0.38837025 0.30062047 0.447562  ]
 [0.2904047  0.80922455 0.34148172 0.83394593]
 [0.00354181 0.98609024 0.09195718 0.9972933 ]
 [0.3118856  0.62980604 0.39645123 0.65791214]
 [0.24747658 0.6742873  0.37653744 0.7318976 ]
 [0.2475342  0.65337795 0.31528282 0.703096  ]
 [0.24661763 0.43916363 0.33284774 0.46883237]
 [0.5544908  0.9149674  0.6185392  0.9745927 ]
 [0.56396806 0.918386   0.6118021  0.9565282 ]
 [0.29127085 0.50906324 0.33064878 0.53714204]
 [0.2144     0.         0.35042924 0.005808  ]
 [0.36967522 0.         0.41315717 0.010372  ]
 [0.25519958 0.414246   0.31013736 0.44864067]
 [0.50648683 0.9921565  0.59263617 1.        ]
 [0.50682306 0.96514183 0.6035181  0.99603957]]
[1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1]
[0.99707234 0.9919825  0.98964745 0.28732723 0.10270327 0.08018646
 0.05177296 0.04935146 0.04088344 0.03827675 0.03675034 0.03669814
 0.03602394 0.03536698 0.03451921 0.03371903 0.03059173 0.02965695
 0.02786072 0.02698742 0.02459854 0.02405082 0.02397973 0.02341228
 0.0230511  0.02214395 0.0220136  0.02189807 0.02145434 0.02136814
 0.02095257 0.02083707 0.020369   0.01975479 0.01961022 0.01947124
 0.01936822 0.0193326  0.01923051 0.0191756  0.01907553 0.01905495
 0.01904101 0.01900733 0.01886942 0.01877986 0.01860638 0.0182397
 0.01818053 0.0178097  0.01772521 0.01766635 0.01741816 0.01731062
 0.01728319 0.01716009 0.01715893 0.01703525 0.01689868 0.01661517
 0.01655028 0.01654574 0.01640137 0.01629853 0.01593319 0.01581809
 0.01569834 0.01559935 0.01559684 0.015499   0.0150541  0.01491728
 0.01487628 0.01473732 0.01454685 0.01452865 0.0145071  0.01448266
 0.01439441 0.01434552 0.01427445 0.0141159  0.01389657 0.01376216
 0.01351939 0.01348546 0.01337216 0.0133338  0.01326786 0.01315651
 0.01309206 0.0130407  0.01296759 0.01295712 0.01287825 0.0127016
 0.01267427 0.01267344 0.01254466 0.01246487]
100
array([[[113, 111, 116],
        [110, 108, 113],
        [ 98,  96, 101],
        ...,
        [ 40,  59, 102],
        [ 72,  98, 157],
        [ 86, 116, 186]],

       [[105, 100, 106],
        [ 99,  97, 100],
        [ 88,  86,  89],
        ...,
        [ 39,  58,  98],
        [ 48,  73, 129],
        [ 79, 109, 173]],

       [[ 96,  92,  93],
        [ 91,  87,  88],
        [ 83,  79,  80],
        ...,
        [ 38,  57,  90],
        [ 31,  57, 105],
        [ 71, 100, 157]],

       ...,

       [[ 85,  83,  88],
        [ 87,  85,  90],
        [ 91,  89,  92],
        ...,
        [ 28,  27,  32],
        [ 28,  27,  32],
        [ 29,  28,  33]],

       [[ 86,  84,  89],
        [ 89,  87,  92],
        [ 92,  90,  95],
        ...,
        [ 29,  28,  33],
        [ 29,  28,  33],
        [ 30,  29,  34]],

       [[ 93,  91,  96],
        [ 96,  94,  99],
        [100,  98, 103],
        ...,
        [ 30,  29,  34],
        [ 30,  29,  34],
        [ 31,  30,  35]]], dtype=uint8)

However, with the tflite model I get this output:

[[[ 0.5304917   0.549833    0.91950834  0.8638777 ]
  [-0.02295769  0.18331468  0.09861814  0.48379225]
  [-0.03266242  0.07500002  0.16733758  0.475     ]
  [-0.00436713  0.3388003   0.03792062  0.5033066 ]
  [ 0.9821089   0.2072595   1.0032529   0.28237888]
  [ 0.9824238   0.46158594  1.0029379   0.53446764]
  [ 0.9228462   0.71543455  1.003768    0.80956554]
  [ 0.04370334  0.97782534  0.09682078  1.0006025 ]
  [ 0.27969962  0.9863208   0.3433848   1.0042033 ]
  [ 0.9614231   0.17667551  1.001884    0.24036069]]]
[[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]]
[10.]
[[0.03125    0.01171875 0.01171875 0.00390625 0.00390625 0.00390625
  0.00390625 0.00390625 0.00390625 0.00390625]]
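
The inference code for the .tflite file is not shown here; a minimal sketch of how such a model can be run with the standard tf.lite.Interpreter API (the image path and the plain 640x640 resize are assumptions, not the exact code used for the output above) looks like this:

# Sketch: run the converted model on one image with the TFLite interpreter.
import cv2
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()

# Hypothetical test image, resized to the 640x640 input the model expects.
image = cv2.imread('/content/MyDrive/MyDrive/test_image.jpg')
image = cv2.resize(image, (640, 640))
input_data = np.expand_dims(image, 0).astype(input_details[0]['dtype'])

interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()

# Print every output tensor; the mapping of StatefulPartitionedCall:0..3 to
# boxes / classes / scores / num_detections should be checked, not assumed.
for detail in interpreter.get_output_details():
    print(detail['name'])
    print(interpreter.get_tensor(detail['index']))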

When I open this model in Netron, the outputs show:

  name: StatefulPartitionedCall:3    
  type: float32    
  quantization: -0.12643077969551086 ≤ q ≤ 1.308648824691772

  name: StatefulPartitionedCall:2
  type: float32
  quantization: 0 ≤ q ≤ 1.1754943508222875e-38

These quantization ranges seem way off, and I would like to know what I might be doing wrong during conversion (a small sketch for reading these parameters programmatically follows the listing below).

  name: StatefulPartitionedCall:1
  type: float32
  quantization: 0.0019955039024353027 ≤ q ≤ 0.07411301136016

  name: StatefulPartitionedCall:0
  type: float32
  quantization: 10 ≤ q ≤ 10
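
As mentioned above, here is a minimal sketch for reading the stored quantization parameters programmatically, to cross-check what Netron displays. It uses the same model.tflite as above and only exposes each tensor's scale and zero point, not the min/max annotation Netron draws from the flatbuffer:

# Sketch: print the scale and zero_point recorded for each output tensor.
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

for detail in interpreter.get_output_details():
    scale, zero_point = detail['quantization']
    print(detail['name'], detail['dtype'], 'scale =', scale, 'zero_point =', zero_point)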

0 Answers