Previewing the last captured image from AVCaptureStillImageOutput

Date: 2014-02-25 15:48:38

Tags: ios avcapturesession

I want to capture images in my iOS app using AVCaptureSession. Following Apple's AVCam sample code (https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html), I am trying to display a preview ImageView every time an image is captured.

// Capture a still image.
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        if (imageDataSampleBuffer)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            [self processImage:[UIImage imageWithData:imageData]];
        }
    }];

processImage is called after the capture:

- (void) processImage:(UIImage *)image
{
    [[self postButton] setEnabled:YES];
    [[self postButton] setHidden:NO];
    [[self cancelButton] setEnabled:YES];
    [[self cancelButton] setHidden:NO];

    _preview = [[UIImageView alloc] init];
    [_preview setImage:image];

    _preview.hidden = NO;
}

However, the ImageView stays unchanged/empty when it is shown.

Can anyone help me?

2 answers:

Answer 0 (score: 0)

This code works for me:

AVCaptureConnection *connection = [_currentOutput connectionWithMediaType:AVMediaTypeVideo];
[self _setOrientationForConnection:connection];

[_captureOutputPhoto captureStillImageAsynchronouslyFromConnection:connection completionHandler:
^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

    if (!imageDataSampleBuffer) {
        DLog(@"failed to obtain image data sample buffer");
        // return delegate error
        return;
    }

    if (error) {
        if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
            [_delegate vision:self capturedPhoto:nil error:error];
        }
        return;
    }

    NSMutableDictionary *photoDict = [[NSMutableDictionary alloc] init];
    NSDictionary *metadata = nil;

    // add photo metadata (ie EXIF: Aperture, Brightness, Exposure, FocalLength, etc)
    metadata = (__bridge NSDictionary *)CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
    if (metadata) {
        [photoDict setObject:metadata forKey:PBJVisionPhotoMetadataKey];
        CFRelease((__bridge CFTypeRef)(metadata));
    } else {
        DLog(@"failed to generate metadata for photo");
    }

    // add JPEG and image data
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    if (jpegData) {
        // add JPEG
        [photoDict setObject:jpegData forKey:PBJVisionPhotoJPEGKey];

        // add image
        UIImage *image = [self _uiimageFromJPEGData:jpegData];
        if (image) {
            [photoDict setObject:image forKey:PBJVisionPhotoImageKey];
        } else {
            DLog(@"failed to create image from JPEG");
            // TODO: return delegate on error
        }

        // add thumbnail
        UIImage *thumbnail = [self _thumbnailJPEGData:jpegData];
        if (thumbnail) {
            [photoDict setObject:thumbnail forKey:PBJVisionPhotoThumbnailKey];
        } else {
            DLog(@"failed to create a thumbnail");
            // TODO: return delegate on error
        }

    } else {
        DLog(@"failed to create jpeg still image data");
        // TODO: return delegate on error
    }

    if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
        [_delegate vision:self capturedPhoto:photoDict error:error];
    }

    // run a post shot focus
    [self performSelector:@selector(_focus) withObject:nil afterDelay:0.5f];
}];
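
For completeness, a minimal sketch of the receiving side of that callback: a delegate method that pulls the image out of the photo dictionary and shows it in a preview view. The `PBJVisionPhoto...` keys and the `vision:capturedPhoto:error:` selector are taken from the snippet above; the `self.preview` property is an assumption for illustration.

    // Hedged sketch: a delegate receiving the photoDict built above.
    // UI work is dispatched to the main queue, which also addresses
    // the original question's symptom of an empty image view.
    - (void)vision:(PBJVision *)vision capturedPhoto:(NSDictionary *)photoDict error:(NSError *)error
    {
        if (error || !photoDict) {
            return; // nothing to display
        }
        UIImage *image = photoDict[PBJVisionPhotoImageKey];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.preview.image = image;
            self.preview.hidden = NO;
        });
    }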

Answer 1 (score: 0)

I'd bet the problem here is that the completionHandler of captureStillImageAsynchronouslyFromConnection is not invoked on the main thread. Try dispatching your call back to the main queue:

dispatch_async(dispatch_get_main_queue(), ^{
    [self processImage:[UIImage imageWithData:imageData]];
});

I also noticed that you never add _preview as a subview of anything.
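
Putting both points together, a minimal sketch of a corrected processImage (assuming the view controller owns the _preview image view; the frame and content mode are illustrative):

    // Hedged sketch: ensure the work runs on the main thread and the
    // image view is actually in the view hierarchy before showing it.
    - (void)processImage:(UIImage *)image
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (!_preview) {
                _preview = [[UIImageView alloc] initWithFrame:self.view.bounds];
                _preview.contentMode = UIViewContentModeScaleAspectFit;
                [self.view addSubview:_preview]; // the missing step
            }
            _preview.image = image;
            _preview.hidden = NO;
        });
    }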