Decompressing a UIImage from AVCaptureStillImageOutput

Asked: 2016-04-29 09:54:42

Tags: ios uiimage avfoundation ios-camera core-video

This is how I have tried to configure the camera so far:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetInputPriority];

    AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];

    NSError *errorVideo;

    AVCaptureDeviceFormat *deviceFormat = nil;
    for (AVCaptureDeviceFormat *format in videoDevice.formats) {
        CMVideoDimensions dim = CMVideoFormatDescriptionGetDimensions(format.formatDescription);

        if (dim.width == 2592 && dim.height == 1936) {
            deviceFormat = format;
            break;
        }
    }

    if ([videoDevice lockForConfiguration:&errorVideo]) {
        if (deviceFormat) {
            videoDevice.activeFormat = deviceFormat;

            if ([videoDevice isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
                [videoDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            }

            if ([videoDevice isAutoFocusRangeRestrictionSupported]) {
                [videoDevice setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionFar];
            }
        }
        [videoDevice unlockForConfiguration];
    }

    AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&errorVideo];

    if ([session canAddInput:videoDeviceInput]) {
        [session addInput:videoDeviceInput];
    }

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];

    if ([session canAddOutput:stillImageOutput]) {
        [stillImageOutput setOutputSettings:@{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)}];
        [session addOutput:stillImageOutput];
    }

And this is how I try to get a UIImage from the CMSampleBuffer:

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer && !error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer];
            });
        }
    }];

This is Apple's sample code:

    // NOTE: this assumes the sample buffer holds a kCVPixelFormatType_32BGRA pixel buffer,
    // which matches the output settings configured above.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:quartzImage];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }

But the image is always nil. After doing some debugging, I found that CMSampleBufferGetImageBuffer(sampleBuffer) always returns nil.

Can anybody help?

1 Answer:

Answer 0 (score: 1)

This is because the CMSampleBufferRef has to be processed immediately: it is deallocated very quickly and efficiently once the completion handler returns, so by the time your dispatch_async block runs on the main queue, the buffer's image data is already gone.
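Applied to the question's Objective-C code, that means converting to a UIImage inside the completion handler itself and only dispatching the finished UIImage to the main queue. A minimal sketch (`showImage:` is a hypothetical stand-in for whatever you do with the image):

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer && !error) {
            // Convert while the buffer is still alive, inside the handler ...
            UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer];
            dispatch_async(dispatch_get_main_queue(), ^{
                // ... and hand only the UIImage (which owns its own storage) across threads.
                [self showImage:image]; // hypothetical helper
            });
        }
    }];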

Here is my code for generating the picture:

    let connection = imageFileOutput.connectionWithMediaType(AVMediaTypeVideo)

    if connection != nil {
        imageFileOutput.captureStillImageAsynchronouslyFromConnection(connection) { [weak self] (buffer, err) -> Void in
            if CMSampleBufferIsValid(buffer) {
                let imageDataJpeg = self?.imageFromSampleBuffer(buffer)
            } else {
                print(err)
            }
        }
    }

As you can see, I convert it into an image while still inside the scope of this function, not inside a dispatched block. Once it is an image, I send it off for processing.
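If you genuinely need to defer the conversion to another queue, an alternative the answer does not cover is to extend the buffer's lifetime manually with CFRetain/CFRelease. A sketch against the question's Objective-C code, under the assumption that conversion must happen later:

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer && !error) {
            CFRetain(imageDataSampleBuffer);   // keep the buffer alive past the handler's return
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer];
                CFRelease(imageDataSampleBuffer); // balance the CFRetain above
                // ... use image here ...
            });
        }
    }];

Converting inside the handler, as in the answer, is the simpler choice; explicit retain/release is only worth it when the conversion is too heavy for the capture callback queue.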