How can I convert a CMSampleBufferRef from GPUImageVideoCamera to a UIImage? Getting unwanted results

Time: 2014-03-19 16:59:37

Tags: ios objective-c ios7 gpuimage ios7.1

I am trying to grab frames from GPUImageVideoCamera's willOutputSampleBuffer:sampleBuffer delegate callback. However, I either crash or the resulting image comes out badly distorted.

Some background information:

  • I am running this on an iPhone 5 with iOS 7.1

Here is the code I use to process each frame:

- (void)processColorImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    @autoreleasepool {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        /*Lock the image buffer*/
        CVPixelBufferLockBaseAddress(imageBuffer,0);

        /*Get information about the image*/
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
//        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        /*Create a CGImageRef from the CVImageBufferRef*/
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, 4 * width, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);

        /*We release some components*/
        CGContextRelease(newContext);
        CGColorSpaceRelease(colorSpace);

        UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationUp];

        /*We release the CGImageRef*/
        CGImageRelease(newImage);

        if (image) {
            framesTaken++;
            [self imageCaptured: image];
        }

        /*We unlock the image buffer*/
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    }
}
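
For reference, the following is a minimal sketch of a variant of the routine above (the method name imageFromBGRASampleBuffer: is only illustrative). It assumes the frames really do arrive as packed 32-bit BGRA, checks that assumption with CVPixelBufferGetPixelFormatType, and passes CGBitmapContextCreate the buffer's reported bytes-per-row (the value commented out above) instead of 4 * width; row padding or a planar YUV pixel format would be enough to produce exactly this kind of skewed or crashing output.

// Minimal sketch, not the original code: same CMSampleBufferRef -> UIImage path,
// but it verifies the pixel format and uses the buffer's reported bytes-per-row.
- (UIImage *)imageFromBGRASampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL) {
        return nil;
    }

    /*Only a packed BGRA buffer can be handed directly to CGBitmapContextCreate
      with these flags; a bi-planar YUV buffer would need a different path.*/
    OSType pixelFormat = CVPixelBufferGetPixelFormatType(imageBuffer);
    if (pixelFormat != kCVPixelFormatType_32BGRA) {
        NSLog(@"Unexpected pixel format: %u", (unsigned int)pixelFormat);
        return nil;
    }

    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(imageBuffer); /*may exceed 4 * width due to padding*/
    size_t width         = CVPixelBufferGetWidth(imageBuffer);
    size_t height        = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    UIImage *image = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:UIImageOrientationUp];
    CGImageRelease(cgImage);
    return image;
}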

Any help is greatly appreciated!

0 Answers:

There are no answers yet.