How to detect faces in a video session with iOS 5

Asked: 2011-10-31 17:38:35

Tags: iphone uiimage detection avcapturesession face-recognition

In my app I have a live video feed from which the user can take still photos. On top of that I would like to run face detection on the video. I have looked at Apple's "SquareCam" sample, which does exactly what I am after, but getting their code to work inside my project is driving me crazy.

#import "CaptureSessionManager.h"
#import <ImageIO/ImageIO.h>

@implementation CaptureSessionManager

@synthesize captureSession;
@synthesize previewLayer;
@synthesize stillImageOutput;
@synthesize stillImage;

#pragma mark Capture Session Configuration

- (id)init {
    if ((self = [super init])) {
        [self setCaptureSession:[[AVCaptureSession alloc] init]];
    }
    return self;
}
- (void)didReceiveMemoryWarning
{
    // Note: didReceiveMemoryWarning is normally a UIViewController callback;
    // it is not invoked automatically on this class unless called explicitly.
    NSLog(@"memorywarning");

    // Release any cached data, images, etc. that aren't in use.
}
- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

}

- (void)addVideoInput {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([videoDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {

        NSError *error = nil;

        // The device has to be locked before its configuration can change.
        if ([videoDevice lockForConfiguration:&error]) {
            NSLog(@"focus");
            videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
            videoDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
            [captureSession setSessionPreset:AVCaptureSessionPresetPhoto];

            [videoDevice unlockForConfiguration];
        }
        else {
            // Respond to the failure as appropriate.
            NSLog(@"Couldn't lock device for configuration: %@", error);
        }
    }

    if (videoDevice) {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:videoIn])
                [[self captureSession] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");     
        }
        else
            NSLog(@"Couldn't create video input");
    }
    else
        NSLog(@"Couldn't create video capture device");
}

- (void)addStillImageOutput
{
    [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];
    [outputSettings release];

    // An output has no connections until it has been added to the session,
    // so the video connection is looked up later, in captureStillImage.
    [[self captureSession] addOutput:[self stillImageOutput]];
}

- (void)captureStillImage
{  
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        // The EXIF metadata travels with the sample buffer as an attachment.
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

        UIImage *image = [[UIImage alloc] initWithData:imageData];

        [self setStillImage:image];
        [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}


- (void)dealloc {

    [[self captureSession] stopRunning];

    [previewLayer release], previewLayer = nil;
    [captureSession release], captureSession = nil;
    [stillImageOutput release], stillImageOutput = nil;
    [stillImage release], stillImage = nil;

    [super dealloc];
}

@end
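
From what I can tell, SquareCam reads its frames from an AVCaptureVideoDataOutput rather than from the still image output. Here is a minimal sketch of how I think such an output would be added to the session above; the method name addVideoDataOutput is my own, and it assumes CaptureSessionManager adopts AVCaptureVideoDataOutputSampleBufferDelegate and imports <CoreVideo/CoreVideo.h>:

// Hypothetical helper, modeled on SquareCam's setup; the method name and
// queue label are placeholders.
- (void)addVideoDataOutput {
    AVCaptureVideoDataOutput *videoDataOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];

    // BGRA frames are the easiest to hand over to Core Image later on.
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:
                                      [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [videoDataOutput setVideoSettings:videoSettings];

    // Face detection is slow, so drop frames that arrive while a frame is
    // still being processed instead of queueing them up.
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];

    // Deliver frames on a serial background queue to keep the main thread free.
    dispatch_queue_t queue = dispatch_queue_create("videoDataOutputQueue", NULL);
    [videoDataOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    if ([[self captureSession] canAddOutput:videoDataOutput])
        [[self captureSession] addOutput:videoDataOutput];
    else
        NSLog(@"Couldn't add video data output");
}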

Apart from the video part, I did manage to detect faces in a UIImage that I imported into the project, following the example by @Abhinav Jha (How to properly instantiate CIDetector class object in iOS 5 face detection API):

CIImage *ciImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"Photo.JPG"]];
if (ciImage == nil)
    NSLog(@"Couldn't create a CIImage from Photo.JPG");

[imageView setImage:[UIImage imageNamed:@"Photo.JPG"]];

// CIDetectorAccuracy is the key; the accuracy constant is the value.
NSDictionary *options = [[NSDictionary alloc] initWithObjectsAndKeys:
                         CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];
CIDetector *ciDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                            context:nil
                                            options:options];
NSArray *features = [ciDetector featuresInImage:ciImage];
NSLog(@"number of faces detected: %lu", (unsigned long)[features count]);

Hopefully someone can point me in the right direction on combining these two examples! My current untested guess is sketched below.
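
If it helps, this is the delegate callback I have in mind: an AVCaptureVideoDataOutputSampleBufferDelegate method that runs the CIDetector on every frame delivered by the video data output. It assumes <CoreImage/CoreImage.h> is imported; the lazily created detector and the low-accuracy option follow what I can see in SquareCam.

// Untested sketch of the sample buffer delegate callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create the detector once; building a CIDetector per frame is far too slow.
    static CIDetector *faceDetector = nil;
    if (faceDetector == nil) {
        NSDictionary *detectorOptions = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                                    forKey:CIDetectorAccuracy];
        faceDetector = [[CIDetector detectorOfType:CIDetectorTypeFace
                                           context:nil
                                           options:detectorOptions] retain];
    }

    // Wrap the BGRA pixel buffer in a CIImage without copying the pixels.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    NSArray *features = [faceDetector featuresInImage:ciImage];
    for (CIFaceFeature *face in features) {
        // Note: bounds are in Core Image coordinates (origin bottom-left),
        // so they need converting before drawing into UIKit views.
        NSLog(@"face found at %@", NSStringFromCGRect([face bounds]));
    }
}

SquareCam additionally passes CIDetectorImageOrientation via featuresInImage:options: so detection keeps working when the device rotates; I have left that out here for brevity.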

0 Answers:

No answers yet.