Deep copy of a CMSampleBufferRef

Date: 2016-01-24 04:53:17

Tags: ios objective-c avfoundation video-processing core-video

How do I perform a deep copy of a CMSampleBufferRef for the audio and video connections? I need to hold on to this buffer for delayed processing. Can somebody help here by pointing to sample code?

Thanks.

2 answers:

Answer 0 (score: 2)

I solved this problem.

I needed to access the sample data for a long period of time.

I tried several approaches:

CVPixelBufferRetain -----> the program broke
CVPixelBufferPool -----> the program broke
CVPixelBufferCreateWithBytes ----> it can solve the problem, but it degrades performance and Apple does not recommend it

CMSampleBufferCreateCopy ---> this works fine, and Apple recommends it.

From the documentation: In order to maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device-native capture, where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory, and those samples will be dropped. If your application is causing samples to be dropped by retaining the provided CMSampleBuffer objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then calling CFRelease on the sample buffer (if it was previously retained) so that the memory it references can be reused.

REF:https://developer.apple.com/reference/avfoundation/avcapturefileoutputdelegate/1390096-captureoutput
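
If you really need a byte-for-byte copy that no longer references the capture pool (the "copy the data into a new buffer" option described in that quote), a minimal sketch might look like the helper below. It assumes a single-plane pixel format such as BGRA (planar formats like NV12 would need a per-plane copy), and the helper name is made up:

static CVPixelBufferRef ACDeepCopyPixelBuffer(CVPixelBufferRef source) {
    CVPixelBufferLockBaseAddress(source, kCVPixelBufferLock_ReadOnly);

    CVPixelBufferRef copy = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(source),
                        CVPixelBufferGetHeight(source),
                        CVPixelBufferGetPixelFormatType(source),
                        NULL,
                        &copy);
    if (copy) {
        CVPixelBufferLockBaseAddress(copy, 0);
        // Copy row by row because the two buffers may use different row padding.
        size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(source);
        size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(copy);
        size_t bytesPerRow = MIN(srcBytesPerRow, dstBytesPerRow);
        uint8_t *src = CVPixelBufferGetBaseAddress(source);
        uint8_t *dst = CVPixelBufferGetBaseAddress(copy);
        for (size_t row = 0; row < CVPixelBufferGetHeight(source); row++) {
            memcpy(dst + row * dstBytesPerRow, src + row * srcBytesPerRow, bytesPerRow);
        }
        CVPixelBufferUnlockBaseAddress(copy, 0);
    }

    CVPixelBufferUnlockBaseAddress(source, kCVPixelBufferLock_ReadOnly);
    return copy; // caller owns the copy (+1) and should release it with CVPixelBufferRelease
}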

This may be what you need:

#pragma mark - captureOutput

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
    if (connection == m_videoConnection) {        
        /* If the previous m_sampleBuffer was never consumed, it must be released here;
           retaining capture buffers for too long causes samples to be dropped. */
        if (m_sampleBuffer) {
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = nil;
        }

        // Keep a copy of the latest frame so it can be read later (e.g. by readVideoFrame:).
        OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &m_sampleBuffer);
        if (noErr != status) {
            m_sampleBuffer = nil;
        }
        NSLog(@"m_sampleBuffer = %p sampleBuffer= %p",m_sampleBuffer,sampleBuffer);
    }
}
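
For context, a delegate like the one above is normally installed on an AVCaptureVideoDataOutput. A rough sketch of that setup (the session ivar, queue label, and BGRA format are assumptions, not taken from this answer):

// Assumes m_captureSession is an already-configured AVCaptureSession.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.alwaysDiscardsLateVideoFrames = YES; // drop late frames instead of queueing them
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

dispatch_queue_t captureQueue = dispatch_queue_create("capture.video", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:captureQueue]; // captureOutput:... is then called on captureQueue

if ([m_captureSession canAddOutput:videoOutput]) {
    [m_captureSession addOutput:videoOutput];
}
m_videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];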

#pragma mark - get a CVPixelBufferRef that can be used for a long time

- (ACResult) readVideoFrame: (CVPixelBufferRef *)pixelBuffer{
    while (1) {
        dispatch_sync(m_readVideoData, ^{
            if (!m_sampleBuffer) {
                _readDataSuccess = NO;
                return;
            }

            CMSampleBufferRef sampleBufferCopy = nil;
            OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, m_sampleBuffer, &sampleBufferCopy);
            if (noErr == status) {
                // Hand the caller a +1 reference so it can call CVPixelBufferRelease when done.
                CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBufferCopy);
                CVPixelBufferRetain(buffer);
                *pixelBuffer = buffer;

                _readDataSuccess = YES;

                NSLog(@"m_sampleBuffer = %p ", m_sampleBuffer);

                // Release the temporary copy and the stored buffer so their memory can be reused.
                CFRelease(sampleBufferCopy);
                CFRelease(m_sampleBuffer);
                m_sampleBuffer = nil;
            }
            else {
                _readDataSuccess = NO;
                CFRelease(m_sampleBuffer);
                m_sampleBuffer = nil;
            }
        });

        if (_readDataSuccess) {
            _readDataSuccess = NO;
            return ACResultNoErr;
        }
        else{
            usleep(15*1000); // wait ~15 ms before polling for the next frame
            continue;
        }
    }
}

Then you can use it like this:

-(void)getCaptureVideoDataToEncode{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(){
        while (1) {
            CVPixelBufferRef buffer = NULL;
            ACResult result= [videoCapture readVideoFrame:&buffer];
            if (ACResultNoErr == result) {
                ACResult error = [videoEncode encoder:buffer outputPacket:&streamPacket];
                if (buffer) {
                    CVPixelBufferRelease(buffer);
                    buffer = NULL;
                }
                if (ACResultNoErr == error) {
                NSLog(@"encode success");
                }
            }
        }
    });
}  
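
For completeness, the snippets above assume a result type and instance variables roughly like the following (a sketch inferred from the code, not the author's actual declarations):

typedef NS_ENUM(NSInteger, ACResult) { // hypothetical definition matching the ACResultNoErr used above
    ACResultNoErr = 0,
    ACResultFail
};

@interface ACVideoCapture () { // class name is a placeholder
    CMSampleBufferRef m_sampleBuffer;     // most recent captured frame, owned (+1), nil once consumed
    AVCaptureConnection *m_videoConnection;
    dispatch_queue_t m_readVideoData;     // serial queue guarding access to m_sampleBuffer
    BOOL _readDataSuccess;
}
@end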

Answer 1 (score: -1)

I did this. CMSampleBufferCreateCopy can indeed make a deep copy, but a new problem appeared: the captureOutput delegate stopped being called.
