Cannot instantiate the TensorFlow Interpreter with a TFLite model in Java

Time: 2018-12-07 19:07:30

Tags: java android tensorflow

I am working on an Android app and have converted my TensorFlow model to TFLite as follows:

from tensorflow.contrib import lite
converter = lite.TFLiteConverter.from_keras_model_file("myModel.h5")
tflite_model = converter.convert()
open("myModel.tflite", "wb").write(tflite_model)

This works, and in Python I can instantiate a tensorflow.contrib.lite.Interpreter by passing it the path to myModel.tflite. However, when I try to create the Interpreter in Java, I get the following error:

java.lang.RuntimeException: java.lang.IllegalArgumentException: Contents of /file:/android_asset/myModel.tflite does not encode a valid TensorFlowLite model: Could not open '/file:/android_asset/myModel.tflite'.The model is not a valid Flatbuffer file

Here is my code:

File file = new File("file:///android_asset/myModel.tflite");
c.tfLite = new Interpreter(file);

1 Answer:

Answer 0 (score: 0):

You should not pass a File directly. An android_asset URL is not a regular filesystem path, because assets are packaged inside the APK; instead, memory-map the asset through a FileChannel:

private MappedByteBuffer loadModelFile() throws IOException {
        // Open the asset and memory-map only the region of the APK that holds the model
        AssetFileDescriptor fileDescriptor = getAssets().openFd("myModel.tflite");
        FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
        FileChannel fileChannel = inputStream.getChannel();
        long startOffset = fileDescriptor.getStartOffset();
        long declaredLength = fileDescriptor.getDeclaredLength();
        return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}

MappedByteBuffer filemap = loadModelFile();
Interpreter interpreter = new Interpreter(filemap);
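
Once the Interpreter is constructed from the mapped buffer, inference is run with Interpreter.run. A minimal sketch follows; the input and output shapes below are placeholders and depend on your Keras model:

// Hypothetical shapes -- adjust to match your model's input/output tensors.
float[][][][] input = new float[1][224][224][3];
float[][] output = new float[1][10];
interpreter.run(input, output);

Interpreter.run also accepts a direct ByteBuffer as input; whichever form you use, its shape and type must match the model's tensors.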