CoreImage CIFaceFeature for detecting facial emotions

Asked: 2016-11-16 21:44:51

Tags: ios swift core-image

I am trying to use CoreImage's CIFaceFeature to detect facial emotions, since these are native APIs. I created a sample view controller project and added the relevant code. When I launch this iOS app it opens the camera; when I look into the camera and smile, the sample code below detects the smile correctly. I also need to detect other emotions, such as surprise, sadness, and anger. I know CoreImage's CIFaceFeature has no direct API for those, but is it possible to combine the properties it does expose (e.g. hasSmile, leftEyeClosed, rightEyeClosed, etc.) in an iOS program to detect emotions like surprise, sadness, and anger?

If anyone has worked with this API and scenario, please share your suggestions and ideas.
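For reference, the `self.faceDetector` used in the code below is assumed to be created along these lines (a sketch; the accuracy and tracking options are my assumptions, not from the original post):

```swift
import CoreImage

// Hypothetical setup for the face detector referenced below as self.faceDetector.
// CIDetectorAccuracyHigh and CIDetectorTracking are assumed choices.
let detectorOptions: [String: Any] = [
    CIDetectorAccuracy: CIDetectorAccuracyHigh,  // slower, but more reliable landmarks
    CIDetectorTracking: true                     // track faces across video frames
]
let faceDetector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: detectorOptions)
```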

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    // CVImageBuffer is a type alias of CVPixelBuffer, so the buffer can be
    // used directly; no Unmanaged round-trip is needed.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let sourceImage = CIImage(cvPixelBuffer: pixelBuffer, options: nil)
    let options: [String: Any] = [CIDetectorSmile: true,
                                  CIDetectorEyeBlink: true,
                                  CIDetectorImageOrientation: 6]

    let features = self.faceDetector!.features(in: sourceImage, options: options)

    for feature in features as! [CIFaceFeature] {

        if feature.hasSmile {
            DispatchQueue.main.async {
                self.updateSmileEmotion()
            }
        }
        else {
            DispatchQueue.main.async {
                self.resetEmotionLabel()
            }
        }
    }
}

func updateSmileEmotion() {
    self.emtionLabel.text = "HAPPY"
}

func resetEmotionLabel() {
    self.emtionLabel.text = " "
}
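To the question itself: CIFaceFeature only exposes coarse booleans, but they could be combined into a naive heuristic. The following is a sketch with made-up rules; the mapping (e.g. both eyes closed with no smile meaning "SAD") is purely an assumption for illustration and will misfire often:

```swift
// Naive emotion heuristic built from CIFaceFeature-style booleans.
// The rules below are assumptions for illustration, not validated mappings.
func emotionLabel(hasSmile: Bool, leftEyeClosed: Bool, rightEyeClosed: Bool) -> String {
    switch (hasSmile, leftEyeClosed, rightEyeClosed) {
    case (true, _, _):
        return "HAPPY"    // smile dominates the other signals
    case (false, true, true):
        return "SAD"      // both eyes closed, no smile (a guess)
    case (false, false, false):
        return "NEUTRAL"  // nothing distinctive to go on
    default:
        return "UNKNOWN"  // one eye closed: a wink or blink, ambiguous
    }
}
```

Note that these three booleans genuinely cannot separate surprise from anger (both typically need eyebrow and mouth-shape information that CIFaceFeature does not report), which is why the answer below points at machine-learning services instead.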

1 Answer:

Answer 0 (score: 0)

There are many libraries that perform emotion analysis on images, and most of them rely on machine learning. You are unlikely to get comparable results just from what CIFaceFeature gives you, since it is fairly limited even compared with other face-recognition libraries. See Google Cloud Vision, the IBM Watson Cloud iOS SDK, and Microsoft Cognitive Services.
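As an illustration of the cloud route, Google Cloud Vision's FACE_DETECTION feature returns per-face emotion likelihoods (joyLikelihood, sorrowLikelihood, angerLikelihood, surpriseLikelihood). The sketch below shows the general request shape; "YOUR_API_KEY" is a placeholder, and error handling is omitted:

```swift
import Foundation

// Sketch: send a JPEG to the Cloud Vision images:annotate endpoint and read
// the emotion likelihoods from the first detected face.
func requestFaceEmotions(jpegData: Data) {
    let url = URL(string: "https://vision.googleapis.com/v1/images:annotate?key=YOUR_API_KEY")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "requests": [[
            "image": ["content": jpegData.base64EncodedString()],
            "features": [["type": "FACE_DETECTION", "maxResults": 1]]
        ]]
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let json = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any],
              let responses = json["responses"] as? [[String: Any]],
              let face = (responses.first?["faceAnnotations"] as? [[String: Any]])?.first
        else { return }
        // Likelihoods are strings such as "VERY_LIKELY" or "UNLIKELY".
        print(face["joyLikelihood"] ?? "", face["sorrowLikelihood"] ?? "",
              face["angerLikelihood"] ?? "", face["surpriseLikelihood"] ?? "")
    }.resume()
}
```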