UIImage converted from a CMSampleBufferRef does not render correctly

Asked: 2015-05-01 14:41:50

Tags: ios objective-c uiimage avfoundation

I'm working with AVFoundation, and I'm trying to save the output CMSampleBufferRef of a particular frame as a UIImage in a variable. I'm using the Manatee Works sample code and setting kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange for kCVPixelBufferPixelFormatTypeKey:

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];

But when I save the image, the output is just nil, or the ImageView's background. I also tried leaving the output settings unset and using the defaults, but that didn't help; the image still doesn't render. I also tried setting kCVPixelFormatType_32BGRA, but then Manatee Works stops detecting barcodes.
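
For reference, one way to get a UIImage out of a biplanar YCbCr buffer without switching the capture format to BGRA (and breaking the scanner) is to go through Core Image, which accepts a CVPixelBuffer directly. A minimal sketch, with an assumed method name, is below; in real code the CIContext would be created once and reused rather than per frame.

// Core Image performs the YCbCr-to-RGB conversion that a plain
// CGBitmapContext cannot, so the capture format can stay
// kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange for the barcode scanner.
- (UIImage *)imageFromYCbCrSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *ciContext = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ciContext createCGImage:ciImage fromRect:[ciImage extent]];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}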

For the context setup, I'm using the sample code Apple provides on the developer website:
// Create a bitmap graphics context with the sample buffer data
CGContextRef context = CGBitmapContextCreate(NULL,
                                             CVPixelBufferGetWidth(imageBuffer),
                                             CVPixelBufferGetHeight(imageBuffer),
                                             8,
                                             0,
                                             CGColorSpaceCreateDeviceRGB(),
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
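
For completeness, Apple's sample continues roughly like this after the snippet above (quoted from memory, so treat it as a sketch):

// Wrap the CGImage in a UIImage, then release the Quartz objects.
// Note: the color space created inline above is never released here;
// Apple's original keeps a named reference to it and releases it as well.
CGContextRelease(context);
UIImage *image = [UIImage imageWithCGImage:quartzImage];
CGImageRelease(quartzImage);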

Can someone help me figure out what's going wrong here? It should be simple, but I don't have much experience with the AVFoundation framework. Is it because the context uses CGColorSpaceCreateDeviceRGB() and this is some kind of color space problem?

I can provide more information if needed. I've searched StackOverflow, and there are plenty of entries about this, but none of them solved my problem.

2 Answers:

Answer 0 (score: 0)

Here's how I've done it before. The code is written in Swift, but it works. You should pay attention to the orientation parameter on the last line; it depends on your video settings.

import UIKit
import CoreMedia

extension UIImage {
    /**
    Creates a new UIImage from the video frame sample buffer passed.
    @param sampleBuffer the sample buffer to be converted into a UIImage.
    */
    convenience init?(sampleBuffer: CMSampleBufferRef) {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0)

        // Get the base address of the pixel buffer
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)

        // Get the number of bytes per row for the pixel buffer
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        // Get the pixel buffer width and height
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)

        // Create a device-dependent RGB color space
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        // Create a bitmap graphics context with the sample buffer data
        let bitmap = CGBitmapInfo(CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
        let context = CGBitmapContextCreate(baseAddress, width, height, 8,
            bytesPerRow, colorSpace, bitmap)
        // Create a Quartz image from the pixel data in the bitmap graphics context
        let quartzImage = CGBitmapContextCreateImage(context)
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0)

        // Create an image object from the Quartz image
        self.init(CGImage: quartzImage, scale: 1, orientation: UIImageOrientation.LeftMirrored)
    }
}
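
If you need the same conversion from Objective-C, a direct translation might look like this (an untested sketch; the method name is assumed). The important difference from the code in the question is that the pixel buffer's base address and bytes-per-row are passed into CGBitmapContextCreate, so the bitmap context is actually backed by the frame data. Like the snippet in the question, it assumes a 32BGRA pixel buffer:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // The context is created over the buffer's own memory, not NULL.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage
                                         scale:1.0
                                   orientation:UIImageOrientationLeftMirrored];
    CGImageRelease(quartzImage);
    return image;
}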

Answer 1 (score: 0)

I use this all the time:

UIImage *image = [UIImage imageWithData:[self imageToBuffer:sampleBuffer]];

- (NSData *)imageToBuffer:(CMSampleBufferRef)source {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}