Converting a CVImageBufferRef to a CVPixelBufferRef

Asked: 2013-09-06 15:03:49

Tags: iphone ios avfoundation

I'm new to iOS programming and multimedia, and I was going through a sample project named RosyWriter provided by Apple at this link. There I saw that the code contains a function named captureOutput:didOutputSampleBuffer:fromConnection:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{   
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);

if ( connection == videoConnection ) {

    // Get framerate
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp( sampleBuffer );
    [self calculateFramerateAtTimestamp:timestamp];

    // Get frame dimensions (for onscreen display)
    if (self.videoDimensions.width == 0 && self.videoDimensions.height == 0)
        self.videoDimensions = CMVideoFormatDescriptionGetDimensions( formatDescription );

    // Get buffer type
    if ( self.videoType == 0 )
        self.videoType = CMFormatDescriptionGetMediaSubType( formatDescription );

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Synchronously process the pixel buffer to de-green it.
    [self processPixelBuffer:pixelBuffer];

    // Enqueue it for preview.  This is a shallow queue, so if image processing is taking too long,
    // we'll drop this frame for preview (this keeps preview latency low).
    OSStatus err = CMBufferQueueEnqueue(previewBufferQueue, sampleBuffer);
    if ( !err ) {        
        dispatch_async(dispatch_get_main_queue(), ^{
            CMSampleBufferRef sbuf = (CMSampleBufferRef)CMBufferQueueDequeueAndRetain(previewBufferQueue);
            if (sbuf) {
                CVImageBufferRef pixBuf = CMSampleBufferGetImageBuffer(sbuf);
                [self.delegate pixelBufferReadyForDisplay:pixBuf];
                CFRelease(sbuf);
            }
        });
    }
}

CFRetain(sampleBuffer);
CFRetain(formatDescription);
dispatch_async(movieWritingQueue, ^{

    if ( assetWriter ) {

        BOOL wasReadyToRecord = (readyToRecordAudio && readyToRecordVideo);

        if (connection == videoConnection) {

            // Initialize the video input if this is not done yet
            if (!readyToRecordVideo)
                readyToRecordVideo = [self setupAssetWriterVideoInput:formatDescription];

            // Write video data to file
            if (readyToRecordVideo && readyToRecordAudio)
                [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
        }
        else if (connection == audioConnection) {

            // Initialize the audio input if this is not done yet
            if (!readyToRecordAudio)
                readyToRecordAudio = [self setupAssetWriterAudioInput:formatDescription];

            // Write audio data to file
            if (readyToRecordAudio && readyToRecordVideo)
                [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeAudio];
        }

        BOOL isReadyToRecord = (readyToRecordAudio && readyToRecordVideo);
        if ( !wasReadyToRecord && isReadyToRecord ) {
            recordingWillBeStarted = NO;
            self.recording = YES;
            [self.delegate recordingDidStart];
        }
    }
    CFRelease(sampleBuffer);
    CFRelease(formatDescription);
});
}

Here a function named pixelBufferReadyForDisplay is called, which expects a parameter of type CVPixelBufferRef.

The prototype of pixelBufferReadyForDisplay:

- (void)pixelBufferReadyForDisplay:(CVPixelBufferRef)pixelBuffer; 

But in the code above, when this function is called, the variable it passes, pixBuf, is of type CVImageBufferRef.

So my question is: do I need to use some function or cast to convert the CVImageBufferRef to a CVPixelBufferRef, or is this done implicitly by the compiler?

Thanks.

1 Answer:

Answer 0 (score: 21)

If you search for CVPixelBufferRef in the Xcode documentation, you will find the following:

typedef CVImageBufferRef CVPixelBufferRef;

So CVImageBufferRef is a synonym for CVPixelBufferRef. They are interchangeable, and no cast or conversion function is needed.

You are looking at some pretty hairy code. RosyWriter, and another sample app called "Chromakey", do some very low-level processing on pixel buffers. If you are new to iOS development and new to multimedia, you may not want to go so deep so fast. It's a bit like a first-year medical student attempting a heart-lung transplant.