I am trying to load converted_tflite.tflite from the assets directory. This gives me the following error:

java.lang.IllegalArgumentException: Contents of /file:/android_asset/converted_model.tflite does not encode a valid TensorFlowLite model: Could not open '/file:/android_asset/converted_model.tflite'. The model is not a valid Flatbuffer file
File file = new File("file:///android_asset/converted_model.tflite");
try (Interpreter interpreter = new Interpreter(file)) {
    interpreter.run(inputData, output);
    Log.d("TF LOG", output);
} catch (Exception e) {
    e.printStackTrace();
}
Things I have tried based on StackOverflow answers:

aaptOptions {
    noCompress "tflite"
}

and the tensorflow-lite nightly build:

implementation 'org.tensorflow:tensorflow-lite:0.1.2-nightly'
Answer 0 (score: 1)
As the error indicates, the model is not a valid Flatbuffer file. In your implementation the model is a File; it should be converted into a flatbuffer, as in the following implementation:
FileInputStream f_input_stream = new FileInputStream(new File("file:///android_asset/converted_model.tflite"));
FileChannel f_channel = f_input_stream.getChannel();
MappedByteBuffer tflite_model = f_channel.map(FileChannel.MapMode.READ_ONLY, 0, f_channel.size());
You can then use this tflite_model to create the tflite interpreter with new Interpreter(...).
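A minimal sketch of what that end-to-end usage could look like; INPUT_SIZE, OUTPUT_SIZE and the inputData/output arrays are placeholders that must match your model's actual tensors, not values from the original answer:

// Requires org.tensorflow.lite.Interpreter, android.util.Log, java.util.Arrays.
try (Interpreter interpreter = new Interpreter(tflite_model)) {
    float[][] inputData = new float[1][INPUT_SIZE];   // INPUT_SIZE: placeholder for your model's input length
    float[][] output = new float[1][OUTPUT_SIZE];     // OUTPUT_SIZE: placeholder for your model's output length
    interpreter.run(inputData, output);
    Log.d("TF LOG", Arrays.toString(output[0]));
} catch (Exception e) {
    e.printStackTrace();
}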
Answer 1 (score: 1)
I used the tensorflow-lite nightly build, version 0.1.2, in Gradle:

implementation 'org.tensorflow:tensorflow-lite:0.1.2-nightly'

To load the model:
/** Memory-map the model file in Assets. */
private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
        throws IOException {
    AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}
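A sketch of how this helper might be called from an Activity, assuming the model file from the question ("converted_model.tflite") sits in the app's assets folder; inputData and output are placeholders that must be shaped to your model's tensors:

try {
    // getAssets() is available on any Context/Activity.
    MappedByteBuffer modelBuffer = loadModelFile(getAssets(), "converted_model.tflite");
    try (Interpreter interpreter = new Interpreter(modelBuffer)) {
        interpreter.run(inputData, output);  // buffers sized to the model's input/output tensors
    }
} catch (IOException e) {
    e.printStackTrace();
}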
Answer 2 (score: 0)
Adding to the existing answers: if you created the tflite model with the latest TensorFlow release (2.4.0) and are facing a similar issue, add the following line to the dependencies in your build.gradle file
implementation 'org.tensorflow:tensorflow-lite:2.4.0'
and use the function provided by @LalitSharma to load the model from the 'assets' directory. The latest releases can be found here: https://bintray.com/google/tensorflow/tensorflow-lite