Finding the pitch and yaw of a face using the Vision framework

Asked: 2018-01-17 00:28:10

Tags: ios swift computer-vision ios11

I'm using VNFaceObservation to get the bounding box and landmark information for a face, but I can't find where to get the face's pitch and yaw from the observation.

I also tried getting pitch and yaw metadata from CIDetector, but running CIDetector and the Vision framework at the same time is CPU-intensive.

    let metadataOutput = AVCaptureMetadataOutput()
    let metaQueue = DispatchQueue(label: "MetaDataSession")
    metadataOutput.setMetadataObjectsDelegate(self, queue: metaQueue)
    if captureSession.canAddOutput(metadataOutput) {
        captureSession.addOutput(metadataOutput)
        // Face metadata is only delivered if the type is requested after adding the output.
        metadataOutput.metadataObjectTypes = [.face]
    } else {
        print("Metadata output could not be added.")
    }


    let configurationOptions: [String: AnyObject] = [
        CIDetectorAccuracy: CIDetectorAccuracyHigh as AnyObject,
        CIDetectorTracking: true as AnyObject,
        CIDetectorNumberOfAngles: 11 as AnyObject
    ]
    faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: configurationOptions)
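For reference, a CIDetector configured this way only exposes rotation through `CIFaceFeature.faceAngle`, which is roll, not pitch or yaw. A minimal sketch of querying it (the `detectFaces` helper name is my own, not from the question):

```swift
import CoreImage

// Sketch: querying a face CIDetector. CIFaceFeature exposes only
// faceAngle (roll, in degrees) — there is no pitch or yaw property.
func detectFaces(in image: CIImage, with detector: CIDetector) {
    let features = detector.features(in: image)
    for case let face as CIFaceFeature in features {
        if face.hasFaceAngle {
            print("roll: \(face.faceAngle)")
        }
    }
}
```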

Is there a way to find the pitch and yaw of a face using the VNFaceObservation data?

1 Answer:

Answer 0 (score: -1)

As far as I know, CoreImage does not provide a measure of pitch and yaw (I'm not familiar with the Vision framework).

However, it can be detected with AVFoundation, as shown in the sample project in this post:
func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {

    for metadataObject in metadataObjects as! [AVMetadataFaceObject] {
        DispatchQueue.main.async {
            self.avFaceID.text = "face ID: \(metadataObject.faceID)"
            // rollAngle and yawAngle are only meaningful when the corresponding has* flag is set.
            if metadataObject.hasRollAngle {
                self.avFaceRoll.text = "roll: \(Int(metadataObject.rollAngle))"
            }
            if metadataObject.hasYawAngle {
                self.avFaceYaw.text = "yaw: \(Int(metadataObject.yawAngle))"
            }
        }
    }

}
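Note that AVMetadataFaceObject reports roll and yaw but not pitch. As an aside on the Vision side of the question: later SDKs exposed head rotation directly on VNFaceObservation (`roll` and `yaw` from iOS 12, `pitch` from iOS 15), which were not available when this was asked. A minimal sketch, assuming those OS versions (the `detectPose` helper name is my own):

```swift
import Vision

// Sketch: reading roll/yaw from a Vision face detection request (iOS 12+).
// The angles are optional NSNumbers, in radians; nil when not computed.
func detectPose(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let observations = request.results as? [VNFaceObservation] else { return }
        for face in observations {
            if let roll = face.roll, let yaw = face.yaw {
                print("roll: \(roll), yaw: \(yaw)")
            }
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```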