Confused about which object actually contains the captured image when using AVFoundation

Asked: 2014-03-27 22:24:48

Tags: ios objective-c uiview uiimage avfoundation

I have a photo app that uses AVFoundation. So far everything works perfectly.

However, one thing confuses me: which object actually contains the captured image?

I have been NSLogging all of the objects and some of their properties, but I still can't figure out where the captured image is held.

Here is my code for setting up the capture session:


self.session = [[AVCaptureSession alloc] init];
[self.session setSessionPreset:AVCaptureSessionPresetPhoto];

self.inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error;
self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.inputDevice error:&error];

if ([self.session canAddInput:self.deviceInput]) {
    [self.session addInput:self.deviceInput];
}

self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];

self.rootLayer = [[self view] layer];
[self.rootLayer setMasksToBounds:YES];

[self.previewLayer setFrame:CGRectMake(0, 0, self.rootLayer.bounds.size.width, self.rootLayer.bounds.size.height)];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

[self.rootLayer insertSublayer:self.previewLayer atIndex:0];

self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[self.session addOutput:self.stillImageOutput];

[self.session startRunning];
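As a side note (not part of the original code): `AVCaptureStillImageOutput` also lets you specify the codec of the captured data via `outputSettings`. A minimal sketch, assuming the same `stillImageOutput` property as above, might configure JPEG output before adding the output to the session:

```objectivec
// Sketch: configure the still-image output to produce JPEG data
// before it is added to the session. Purely illustrative.
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
[output setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
if ([self.session canAddOutput:output]) {
    [self.session addOutput:output];
}
self.stillImageOutput = output;
```

With JPEG configured, `jpegStillImageNSDataRepresentation:` (used in the answer below) can turn the captured sample buffer into `NSData` directly.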

Then, here is my code for capturing a still image when the user presses the capture button:

- (IBAction)stillImageCapture {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    // Set the orientation only after the connection has been found;
    // sending this to a nil connection would silently do nothing.
    videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        [self.session stopRunning];
    }];
}

When the user presses the capture button and the code above runs, the captured image is successfully displayed on the iPhone screen, but I can't figure out which object is actually holding the captured image.

Thanks for your help.

1 Answer:

Answer 0: (score: 2)

The CMSampleBuffer is what actually contains the image.

In the completion handler of captureStillImageAsynchronouslyFromConnection, you need something like the following:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage* capturedImage = [[UIImage alloc] initWithData:imageData];

My working implementation:

- (void)captureStillImage
{
    @try {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in _stillImageOutput.connections){
            for (AVCaptureInputPort *port in [connection inputPorts]){

                if ([[port mediaType] isEqual:AVMediaTypeVideo]){

                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) {
                break;
            }
        }
        NSLog(@"About to request a capture from: %@", [self stillImageOutput]);
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

                                                                 // This is here for when we need to implement Exif stuff. 
                                                                 //CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);

                                                                 NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

                                                                 // Create a UIImage from the sample buffer data
                                                                 _capturedImage = [[UIImage alloc] initWithData:imageData];


                                                                 BOOL autoSave = YES;
                                                                 if (autoSave)
                                                                 {
                                                                     UIImageWriteToSavedPhotosAlbum(_capturedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
                                                                 }

                                                             }];
    }
    @catch (NSException *exception) {
        NSLog(@"ERROR: Unable to capture still image from AVFoundation camera: %@", exception);
    }
}
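For completeness: the `@selector(image:didFinishSavingWithError:contextInfo:)` passed to `UIImageWriteToSavedPhotosAlbum` above refers to a callback method that the answer does not show. Its name and signature are fixed by UIKit; the body below is only an illustrative sketch that logs the result:

```objectivec
// Called by UIKit once UIImageWriteToSavedPhotosAlbum has finished
// (successfully or not). The signature must match exactly.
- (void)image:(UIImage *)image
    didFinishSavingWithError:(NSError *)error
                 contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Error saving captured image: %@", error);
    } else {
        NSLog(@"Captured image saved to the photo album.");
    }
}
```

Without this method on `self`, the save callback would crash with an unrecognized-selector exception.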