Real-time face detection in Flutter

Date: 2018-07-25 20:26:52

Tags: opencv flutter firebase-mlkit

I'm currently working on an app that needs real-time face detection. Right now I have the mlkit library in the app and I'm using the Firebase face detector. At the moment it throws an error every time I try to detect a face from a file:

DynamiteModule(13840): Local module descriptor class for com.google.android.gms.vision.dynamite.face not found.

For the real-time part, I tried using a RepaintBoundary in Flutter to take a screenshot of the camera widget (almost) every frame and convert it to a binary file for the face detection. But for some reason Flutter crashes every time I try to screenshot the camera widget, even though it works for other widgets.

After running into these two problems and spending quite a while trying to solve them, I've been thinking about just doing the camera part of the app in native Android/iOS code (I would use OpenCV for that, so I can do real-time detection). Is there a way to implement the camera view in Kotlin and Swift using platform channels and pull it into a Flutter widget? Or is there another, easier way to achieve this?

2 Answers:

Answer 0 (score: 2)

To access the camera image stream in real time, you want to use CameraController#startImageStream, as I answered in another question, How to access camera frames in flutter quickly:

import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';

void main() => runApp(MaterialApp(home: _MyHomePage()));

class _MyHomePage extends StatefulWidget {
  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<_MyHomePage> {
  dynamic _scanResults;
  CameraController _camera;

  bool _isDetecting = false;
  CameraLensDirection _direction = CameraLensDirection.back;

  @override
  void initState() {
    super.initState();
    _initializeCamera();
  }

  Future<CameraDescription> _getCamera(CameraLensDirection dir) async {
    return await availableCameras().then(
      (List<CameraDescription> cameras) => cameras.firstWhere(
            (CameraDescription camera) => camera.lensDirection == dir,
          ),
    );
  }

  void _initializeCamera() async {
    _camera = CameraController(
      await _getCamera(_direction),
      defaultTargetPlatform == TargetPlatform.iOS
          ? ResolutionPreset.low
          : ResolutionPreset.medium,
    );
    await _camera.initialize();
    _camera.startImageStream((CameraImage image) {
      // _isDetecting acts as a simple throttle: skip incoming frames while
      // the previous one is still being processed.
      if (_isDetecting) return;
      _isDetecting = true;
      try {
        // await doOpenCVDetectionHere(image)
      } catch (e) {
        // await handleException(e)
      } finally {
        _isDetecting = false;
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    // UI omitted here; return a placeholder so build() doesn't return null.
    return Container();
  }
}
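
To wire the doOpenCVDetectionHere placeholder above up to native code, one option is to push each frame over a platform channel. Below is a minimal sketch, assuming a hypothetical channel name and method ("com.example.facescan/frames" and "detectFaces") and a native handler that rebuilds the image from the raw plane bytes; neither is part of the camera plugin itself.

import 'package:camera/camera.dart';
import 'package:flutter/services.dart';

// Hypothetical channel; the name and the native-side handler are assumptions.
const MethodChannel _detectionChannel =
    MethodChannel('com.example.facescan/frames');

Future<dynamic> doOpenCVDetectionHere(CameraImage image) {
  // On Android the CameraImage arrives as YUV420 planes (BGRA8888 on iOS),
  // so send the raw bytes plus dimensions and let the native side rebuild
  // an OpenCV Mat from them.
  return _detectionChannel.invokeMethod('detectFaces', <String, dynamic>{
    'width': image.width,
    'height': image.height,
    'planes': image.planes
        .map((Plane plane) => <String, dynamic>{
              'bytes': plane.bytes,
              'bytesPerRow': plane.bytesPerRow,
              'bytesPerPixel': plane.bytesPerPixel,
            })
        .toList(),
  });
}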

Answer 1 (score: 0)

I've done something similar with OpenCV before. My approach was:

  1. Start a new Activity or ViewController on Android and iOS respectively via a platform channel. Example:

import android.app.Activity
import android.content.Intent
import io.flutter.plugin.common.MethodCall
import io.flutter.plugin.common.MethodChannel
import io.flutter.plugin.common.MethodChannel.MethodCallHandler
import io.flutter.plugin.common.MethodChannel.Result
import io.flutter.plugin.common.PluginRegistry
import io.flutter.plugin.common.PluginRegistry.Registrar

class FaceScanPlugin(val activity: Activity) : MethodCallHandler, PluginRegistry.ActivityResultListener {

    var result: Result? = null

    companion object {
        @JvmStatic
        fun registerWith(registrar: Registrar) {
            val channel = MethodChannel(registrar.messenger(), "com.example.facescan")
            val plugin = FaceScanPlugin(registrar.activity())
            channel.setMethodCallHandler(plugin)
            registrar.addActivityResultListener(plugin)
        }
    }

    override fun onMethodCall(call: MethodCall, result: Result) {
        if (call.method == "scan") {
            this.result = result
            showFaceScanView()
        } else {
            result.notImplemented()
        }
    }

    // FaceScannerActivity is the native camera screen that runs the OpenCV detection.
    private fun showFaceScanView() {
        val intent = Intent(activity, FaceScannerActivity::class.java)
        activity.startActivityForResult(intent, 100)
    }

    override fun onActivityResult(code: Int, resultCode: Int, data: Intent?): Boolean {
        if (code == 100) {
            if (resultCode == Activity.RESULT_OK) {
                // Reply to the pending "scan" call here, e.g. this.result?.success(...)
                return true
            }
        }
        return false
    }
}

See the Flutter QR scanner plugin for how to navigate to an Android Activity or iOS view (a Dart-side sketch for invoking this channel is included after the list).

  2. Then do the real-time OpenCV face detection via Camera2 and AVFoundation.
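
For step 1, the Dart side only needs to invoke the channel registered above. A minimal sketch follows; the channel name is taken from the Kotlin snippet, while the return payload depends entirely on how the native Activity fills in its result.

import 'package:flutter/services.dart';

const MethodChannel _faceScanChannel = MethodChannel('com.example.facescan');

// Launches the native face scan screen and waits for it to finish.
// The shape of the returned value (if any) is up to the native side.
Future<dynamic> startFaceScan() async {
  try {
    return await _faceScanChannel.invokeMethod('scan');
  } on PlatformException catch (e) {
    // Thrown when the native side reports an error.
    print('face scan failed: $e');
    return null;
  }
}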

Apart from that, I suppose you could try the new AndroidView and UiKitView if you want to embed the Android or iOS view into your Flutter app.
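
For that embedding route, a rough sketch of the AndroidView side is shown below. The viewType string is a made-up example and has to match whatever id the native plugin registers its PlatformViewFactory under; UiKitView takes the same shape on iOS.

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

class NativeCameraPreview extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    // 'com.example.facescan/cameraview' is a hypothetical viewType; it must
    // match the id the native side registers its PlatformViewFactory with.
    return AndroidView(
      viewType: 'com.example.facescan/cameraview',
      creationParams: <String, dynamic>{'lensDirection': 'back'},
      creationParamsCodec: const StandardMessageCodec(),
    );
  }
}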