Firebase custom model inference speed vs Tensorflow Lite

Date: 2019-08-19 09:53:56

Tags: android tensorflow-lite firebase-mlkit

I trained an object detection model (based on ssd mobilenet v1) and converted it to tflite (without quantization; the model is still FLOAT). I run the model on the camera feed (resized to 300x300) with the Firebase ML Kit model interpreter (firebase-ml-model-interpreter v20.0.1). Code in Kotlin.

Then I did the same with tensorflow-lite (v1.14.0). Code in Java.
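
For reference, the two runtimes above presumably map to these Gradle dependencies (a sketch in the Kotlin DSL; the build file is not shown in the question, so the coordinates are inferred from the versions):

dependencies {
    // Firebase ML Kit custom-model interpreter (wraps TF Lite internally)
    implementation("com.google.firebase:firebase-ml-model-interpreter:20.0.1")
    // Plain TensorFlow Lite runtime
    implementation("org.tensorflow:tensorflow-lite:1.14.0")
}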

Inference with Firebase is slower than with TF, and it shows some unpleasant spikes. I am trying to understand why, because I would really like to use Firebase, since it is already included in my project.

Here is the inference-speed comparison (100 camera frames each, captured with the phone held still in the same position). The Firebase app uses the CameraView component, while the TF app uses the Camera API directly.

[Image: TF vs MLKIT inference time]

Here is the relevant code for each:

Firebase with Kotlin

fun loadModel() {
    ...
    // Load local model
    interpreter = FirebaseModelInterpreter.getInstance(modelOptions)

    dataOptions = FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, dataType, intArrayOf(1, 300, 300, 3))
        .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 10, 4)) // Boxes
        .setOutputFormat(1, FirebaseModelDataType.FLOAT32, intArrayOf(1, 10))    // Classes
        .setOutputFormat(2, FirebaseModelDataType.FLOAT32, intArrayOf(1, 10))    // Scores
        .setOutputFormat(3, FirebaseModelDataType.FLOAT32, intArrayOf(1))        // Num detections
        .build()

    val numBytesPerChannel = 4 // Float
    imgData = ByteBuffer.allocateDirect(1 * 300 * 300 * 3 * numBytesPerChannel)
    imgData.order(ByteOrder.nativeOrder())
    ...
}

fun runModel() {
    ...
    imgData.rewind()
    for (x in 0 until 300) {
        for (y in 0 until 300) {
            imgData.putFloat(...) // R
            imgData.putFloat(...) // G
            imgData.putFloat(...) // B
        }
    }

    val inputs = FirebaseModelInputs.Builder().add(imgData).build()

    val start = System.currentTimeMillis()
    interpreter.run(inputs, dataOptions)
        .addOnSuccessListener { result ->
            val elapsed = System.currentTimeMillis() - start
            Log.i(TAG, "Detection finished $elapsed msec - MLKIT")
        }
        .addOnFailureListener {
            // Handle Error
        }
}
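
The model registration behind the first `...` in loadModel() is not shown. With firebase-ml-model-interpreter 20.0.1 it would typically be a local-model registration along these lines (a sketch, not the asker's actual code; the model name and asset path are placeholders):

// Typical local-model setup for this library version; "my_local_model" and
// "detect.tflite" are placeholder names.
val localModel = FirebaseLocalModel.Builder("my_local_model")
    .setAssetFilePath("detect.tflite")
    .build()
FirebaseModelManager.getInstance().registerLocalModel(localModel)

val modelOptions = FirebaseModelOptions.Builder()
    .setLocalModelName("my_local_model")
    .build()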

Tensorflow-Lite with Java

void loadModel() {
    ...
    tfLite = new Interpreter(modelFile);
    tfLite.setNumThreads(4);

    int numBytesPerChannel = 4; // Float
    imgData = ByteBuffer.allocateDirect(1 * 300 * 300 * 3 * numBytesPerChannel);
    imgData.order(ByteOrder.nativeOrder());
    ...
}

void runModel() {
    ...
    imgData.rewind();
    for (int i = 0; i < 300; ++i) {
        for (int j = 0; j < 300; ++j) {
            imgData.putFloat(...); // R
            imgData.putFloat(...); // G
            imgData.putFloat(...); // B
        }
    }

    outputLocations = new float[1][10][4];
    outputClasses = new float[1][10];
    outputScores = new float[1][10];
    numDetections = new float[1];

    Object[] inputArray = {imgData};
    Map<Integer, Object> outputMap = new HashMap<>();
    outputMap.put(0, outputLocations);
    outputMap.put(1, outputClasses);
    outputMap.put(2, outputScores);
    outputMap.put(3, numDetections);

    long start = System.currentTimeMillis();
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    long elapsed = System.currentTimeMillis() - start;
    LOGGER.w("Detection finished %d msec - TF", elapsed);
}
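
In both versions the per-pixel putFloat(...) body is elided. For a float SSD MobileNet it is usually the standard bitmap-to-float conversion, something like this Kotlin sketch (assuming a 300x300 ARGB_8888 bitmap and IMAGE_MEAN = IMAGE_STD = 128; the helper name is illustrative, not from the question):

import android.graphics.Bitmap
import java.nio.ByteBuffer

// Illustrative helper: fills imgData from a 300x300 bitmap with the usual
// (value - 128) / 128 float normalization, channel order R, G, B.
private val intValues = IntArray(300 * 300)

fun fillInputBuffer(bitmap: Bitmap, imgData: ByteBuffer) {
    imgData.rewind()
    bitmap.getPixels(intValues, 0, bitmap.width, 0, 0, bitmap.width, bitmap.height)
    for (pixel in intValues) {
        imgData.putFloat(((pixel shr 16 and 0xFF) - 128f) / 128f) // R
        imgData.putFloat(((pixel shr 8 and 0xFF) - 128f) / 128f)  // G
        imgData.putFloat(((pixel and 0xFF) - 128f) / 128f)        // B
    }
}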


Any idea why this happens?

0 Answers:

No answers