Running the Smart Reply model in TFLite Python

Date: 2020-08-19 18:08:21

Tags: python tensorflow tensorflow-lite tf-lite

I am experimenting with the Smart Reply Lite model in Python. With the help of this fantastic tutorial, I used Bazel to compile the custom ops the model needs (normalize.cc, predict.cc, extract_features.cc) into TFLite, and now I am trying to run inference.

Here is the code I am using:

    import tensorflow as tf
    import numpy as np
    tflite_interpreter = tf.lite.Interpreter(model_path='smartreply.tflite')
    
    tflite_interpreter.allocate_tensors()
    input_details = tflite_interpreter.get_input_details()
    output_details = tflite_interpreter.get_output_details()
    
    # print(input_details)
    # print(output_details)
    
    tflite_interpreter.set_tensor(input_details[0]['index'], 'Where are you?')
    # Run inference
    tflite_interpreter.invoke()
    # Get prediction results
    tflite_model_predictions = tflite_interpreter.get_tensor(output_details[0]['index'])
    print("Prediction results:", tflite_model_predictions)

This produces the following error:

    Traceback (most recent call last):
      File "run.py", line 12, in <module>
        tflite_interpreter.set_tensor(input_details[0]['index'], 'Where are you?')
      File "/home/ubuntu/.local/lib/python3.6/site-packages/tensorflow/lite/python/interpreter.py", line 175, in set_tensor
        self._interpreter.SetTensor(tensor_index, value)
      File "/home/ubuntu/.local/lib/python3.6/site-packages/tensorflow/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 136, in SetTensor
        return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_SetTensor(self, i, value)
    ValueError: numpy array had 56 bytes but expected 0 bytes.

Before calling tflite_interpreter.allocate_tensors(), I tried resizing the input tensor with this line:

    tflite_interpreter.resize_tensor_input(0, [56])

but that raised ValueError: Cannot set tensor: Dimension mismatch.

My understanding is that the string needs to be converted to a numpy array before being passed in (according to the model description, the input type is int32 – 4 bytes per character). What changes do I need to make to this input for the model to run?
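The 56-bytes figure in the error is consistent with that reading: 'Where are you?' is 14 characters, and 14 × 4 bytes (int32) = 56. A minimal sketch of that encoding, assuming the model really does expect one int32 code point per character (this is an assumption based on the model description, not confirmed behavior of the Smart Reply ops):

```python
import numpy as np

sentence = 'Where are you?'

# Encode each character as its Unicode code point in an int32 array.
# NOTE: assumption — the Smart Reply custom ops may expect a different
# encoding (e.g. a serialized string tensor) than plain code points.
encoded = np.array([ord(c) for c in sentence], dtype=np.int32)

print(encoded.shape)   # (14,)
print(encoded.nbytes)  # 56 — matches the byte count in the ValueError
```

If this encoding is right, the "expected 0 bytes" part suggests the input tensor still has zero elements, so the resize would need to happen in element counts rather than bytes, i.e. `resize_tensor_input(input_details[0]['index'], [14])` followed by `allocate_tensors()` before `set_tensor` — but I am not sure whether that is the intended usage for this model.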

0 Answers