How do I convert a CGImage to a CMSampleBufferRef?

Asked: 2010-09-20 12:57:43

Tags: ios avfoundation core-video core-media

I want to convert a CGImage to a CMSampleBufferRef and append it to an AVAssetWriterInput with appendSampleBuffer:. I've managed to build the CMSampleBufferRef using the code below, but appendSampleBuffer: simply returns NO when I feed it the result. What am I doing wrong?

- (void) appendCGImage: (CGImageRef) frame
{
    const int width = CGImageGetWidth(frame);
    const int height = CGImageGetHeight(frame);

    // Create a dummy pixel buffer to try the encoding
    // on something simple.
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
        kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    NSParameterAssert(status == kCVReturnSuccess && pixelBuffer != NULL);

    // Sample timing info.
    CMTime frameTime = CMTimeMake(1, 30);
    CMTime currentTime = CMTimeAdd(lastSampleTime, frameTime);
    CMSampleTimingInfo timing = {frameTime, currentTime, kCMTimeInvalid};

    OSStatus result = 0;

    // Sample format.
    CMVideoFormatDescriptionRef videoInfo = NULL;
    result = CMVideoFormatDescriptionCreateForImageBuffer(NULL,
         pixelBuffer, &videoInfo);
    NSParameterAssert(result == 0 && videoInfo != NULL);

    // Create sample buffer.
    CMSampleBufferRef sampleBuffer = NULL;
    result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
        pixelBuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
    NSParameterAssert(result == 0 && sampleBuffer != NULL);

    // Ship out the frame.
    NSParameterAssert(CMSampleBufferDataIsReady(sampleBuffer));
    NSParameterAssert([writerInput isReadyForMoreMediaData]);
    BOOL success = [writerInput appendSampleBuffer:sampleBuffer];
    NSParameterAssert(success); // no go :(
}

P.S. I know this code leaks memory; I've left some things out for the sake of simplicity.
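For reference, here is roughly how the CVPixelBufferRef could be filled with the CGImage's actual pixels (the dummy buffer above is left empty). This is only a sketch: the helper name CreatePixelBufferFromCGImage is made up, and the byte-order/alpha flags are an assumption chosen to match kCVPixelFormatType_32BGRA.

static CVPixelBufferRef CreatePixelBufferFromCGImage(CGImageRef image)
{
    const size_t width = CGImageGetWidth(image);
    const size_t height = CGImageGetHeight(image);

    // Allocate a BGRA pixel buffer of the same size as the image.
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
        kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    if (status != kCVReturnSuccess || pixelBuffer == NULL)
        return NULL;

    // Wrap the buffer's memory in a bitmap context and draw the image into it.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(pixelBuffer), width, height, 8,
        CVPixelBufferGetBytesPerRow(pixelBuffer), colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return pixelBuffer; // caller releases with CVPixelBufferRelease
}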

1 Answer:

Answer 0 (score: 6):

Aha, I had completely missed the AVAssetWriterInputPixelBufferAdaptor class, which exists precisely to feed pixel buffers into a writer input. Now the code works, even without the messy CMSampleBuffer business.
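A rough sketch of what that looks like, assuming the same writerInput and lastSampleTime ivars as in the question and a helper like the hypothetical CreatePixelBufferFromCGImage above; the pixel-format attribute is an assumption chosen to match the BGRA buffers:

// Created once, before the writer session starts.
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
    sourcePixelBufferAttributes:[NSDictionary
        dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
        forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

// Per frame: fill a pixel buffer from the CGImage and hand it to the adaptor
// together with its presentation time; no CMSampleBuffer is needed.
CVPixelBufferRef pixelBuffer = CreatePixelBufferFromCGImage(frame);
lastSampleTime = CMTimeAdd(lastSampleTime, CMTimeMake(1, 30));
if ([writerInput isReadyForMoreMediaData])
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:lastSampleTime];
CVPixelBufferRelease(pixelBuffer);

The adaptor also exposes a pixelBufferPool property, which can be used instead of creating a fresh CVPixelBufferRef per frame to cut down on allocations.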