How to use GoogleMobileVision with real-time frame capture?

Posted: 2017-11-17 10:30:00

Tags: ios swift image-processing swift3

I am trying to detect smiling probability in real time using GoogleMobileVision, but the app crashes because of the options parameter I pass to GMVDetector.


Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[_SwiftValue intValue]

Also, when I pass the options as nil, it gives a memory issue.

My code:

import UIKit
import GoogleMobileVision

class ViewController: UIViewController, FrameExtractorDelegate {

    @IBOutlet weak var lblSmiling: UILabel!
    @IBOutlet weak var imageView: UIImageView!
    var frameExtractor: FrameExtractor!
    var faceDetector = GMVDetector()

    @IBAction func flipButton(_ sender: UIButton) {
        frameExtractor.flipCamera()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        frameExtractor = FrameExtractor()
        frameExtractor.delegate = self
        let options: NSDictionary = [GMVDetectorFaceLandmarkType: GMVDetectorFaceLandmark.all,
                                     GMVDetectorFaceClassificationType: GMVDetectorFaceClassification.all,
                                     GMVDetectorFaceTrackingEnabled: true]

        // App crashes on this call when options are passed
        self.faceDetector = GMVDetector(ofType: GMVDetectorTypeFace, options: options as! [AnyHashable: Any])
    }

    // Getting individual frame image here
    func captured(image: UIImage) {
        processImage(image: image)
        imageView.image = image
    }

    func processImage(image: UIImage) {
        let faces: [GMVFaceFeature] = faceDetector.features(in: image, options: nil) as! [GMVFaceFeature]

        for face in faces {
            if face.hasSmilingProbability && face.smilingProbability > 0.4 {
                lblSmiling.text = String(describing: face.smilingProbability)
            }
        }
    }
}

2 Answers:

Answer 0 (score: 0)

I think you are passing the Swift enum values directly as options; you need to pass their `.rawValue` instead, like this:

let options: NSDictionary = [GMVDetectorFaceLandmarkType: GMVDetectorFaceLandmark.all.rawValue, GMVDetectorFaceClassificationType: GMVDetectorFaceClassification.all.rawValue, GMVDetectorFaceTrackingEnabled: true]
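The crash message makes sense in this light: a Swift enum value placed in a dictionary that bridges to Objective-C is boxed as an opaque `_SwiftValue`, which does not respond to `-intValue`, while the enum's `rawValue` (an `Int`) bridges to `NSNumber`, which does. A minimal sketch of the difference, using a stand-in enum rather than GMV's real types:

```swift
import Foundation

// Stand-in for GMVDetectorFaceLandmark; not the real GMV enum.
enum Landmark: Int { case all = 1 }

let bad: NSDictionary  = ["type": Landmark.all]          // boxed as an opaque _SwiftValue
let good: NSDictionary = ["type": Landmark.all.rawValue] // Int bridges to NSNumber

// Objective-C code calling -intValue only succeeds on the NSNumber:
print(bad["type"] is NSNumber)   // false — -intValue here raises NSInvalidArgumentException
print(good["type"] is NSNumber)  // true
```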

Answer 1 (score: 0)

After a lot of searching and using a flag, I solved the problem myself. Here is my working code:

import UIKit
import GoogleMobileVision

class ViewController: UIViewController, FrameExtractorDelegate {

    @IBOutlet weak var lblSmiling: UILabel!
    @IBOutlet weak var imageView: UIImageView!

    var newView = UIView()
    private let ssQ = DispatchQueue(label: "process queue")
    var frameExtractor: FrameExtractor!
    var faceDetector: GMVDetector?
    var imgIsProcessing = false
    var sessionCountToClr = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        frameExtractor = FrameExtractor()
        frameExtractor.delegate = self
        // Pass the enums' raw values so they bridge to NSNumber on the Objective-C side
        self.faceDetector = GMVDetector(ofType: GMVDetectorTypeFace,
                                        options: [GMVDetectorFaceLandmarkType: GMVDetectorFaceLandmark.all.rawValue,
                                                  GMVDetectorFaceClassificationType: GMVDetectorFaceClassification.all.rawValue,
                                                  GMVDetectorFaceMinSize: 0.3,
                                                  GMVDetectorFaceTrackingEnabled: true])
    }

    @IBAction func flipButton(_ sender: UIButton) {
        frameExtractor.flipCamera()
    }

    func captured(image: UIImage) {
        DispatchQueue.main.async {
            self.processImage(image: image)
            self.imageView.image = image
        }
    }

    func processImage(image: UIImage) {
        // Drop this frame if the previous one is still being processed
        if imgIsProcessing {
            return
        }
        imgIsProcessing = true

        ssQ.async { [weak self] in
            guard let self = self, let detector = self.faceDetector else { return }
            // Run detection off the main thread; keep the result local to avoid races
            let faces = detector.features(in: image, options: nil) as? [GMVFaceFeature] ?? []

            DispatchQueue.main.async {
                if !faces.isEmpty {
                    for face in faces {
                        let rect = CGRect(x: face.bounds.minX, y: face.bounds.minY + 100,
                                          width: face.bounds.size.width, height: face.bounds.size.height)
                        self.drawFaceIndicator(rect: rect)
                        self.lblSmiling.text = String(format: "%.3f", face.smilingProbability)
                    }
                    self.sessionCountToClr = 0
                } else if self.sessionCountToClr == 30 {
                    // No face for 30 consecutive frames: clear the overlay and the label
                    self.newView.removeFromSuperview()
                    self.lblSmiling.text = "0.0"
                    self.sessionCountToClr = 0
                } else {
                    self.sessionCountToClr += 1
                }
                self.imgIsProcessing = false
            }
        }
    }

    func drawFaceIndicator(rect: CGRect) {
        newView.removeFromSuperview()
        newView = UIView(frame: rect)
        newView.layer.cornerRadius = 10
        newView.alpha = 0.3
        newView.layer.borderColor = #colorLiteral(red: 0.3411764801, green: 0.6235294342, blue: 0.1686274558, alpha: 1).cgColor
        newView.layer.borderWidth = 4
        self.view.addSubview(newView)
    }
}
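The flag mentioned above is the `imgIsProcessing` guard: any frame that arrives while detection is still running is simply dropped, so the detector never falls behind the camera. The pattern reduced to its core might look like this (the names are illustrative, and calls are assumed to arrive on the main thread, as in `captured(image:)`):

```swift
import Dispatch

final class FrameThrottle {
    private var busy = false                    // read/written on the main thread only
    private let queue = DispatchQueue(label: "process queue")

    /// Runs `work` on a background queue unless a frame is already in flight.
    /// Returns false when the frame is dropped.
    func submit(work: @escaping () -> Void, completion: @escaping () -> Void) -> Bool {
        guard !busy else { return false }       // previous frame still processing
        busy = true
        queue.async {
            work()                              // e.g. run the face detector
            DispatchQueue.main.async {
                completion()                    // e.g. update the UI
                self.busy = false               // accept the next frame
            }
        }
        return true
    }
}
```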

I have uploaded the whole project to GitHub, feel free to use it.