Keeping multiple frames in memory before sending them to AVAssetWriter

Date: 2011-06-03 12:32:25

Tags: iphone avfoundation avcapturesession avassetwriter

I need to keep some video frames from a captureSession in memory and write them to a file when "something" happens.

Similar to this solution, I use this code to put each frame into an NSMutableArray:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{       
    //...
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    uint8 *baseAddress = (uint8*)CVPixelBufferGetBaseAddress(imageBuffer);
    NSData *rawFrame = [[NSData alloc] initWithBytes:(void*)baseAddress length:(height * bytesPerRow)];
    [m_frameDataArray addObject:rawFrame];
    [rawFrame release];
    //...
}

And this is the code that writes the video file:

-(void)writeFramesToFile
{
    //...
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:640], AVVideoWidthKey,
                                    [NSNumber numberWithInt:480], AVVideoHeightKey,
                                    AVVideoCodecH264, AVVideoCodecKey,
                                    nil ];
    AVAssetWriterInput *bufferAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    AVAssetWriter *bufferAssetWriter = [[AVAssetWriter alloc]initWithURL:pathURL fileType:AVFileTypeQuickTimeMovie error:&error];
    [bufferAssetWriter addInput:bufferAssetWriterInput];

    [bufferAssetWriter startWriting];
    [bufferAssetWriter startSessionAtSourceTime:startTime];
    for (NSInteger i = 1; i < m_frameDataArray.count; i++){
        NSData *rawFrame = [m_frameDataArray objectAtIndex:i];
        CVImageBufferRef imgBuf = [rawFrame bytes];
        [pixelBufferAdaptor appendPixelBuffer:imgBuf withPresentationTime:CMTimeMake(1,10)]; //<-- EXC_BAD_ACCESS
        [rawFrame release];
    }
    //... (finishing video file)
}

But there is something wrong with the imgBuf reference. Any suggestions? Thanks in advance.

2 answers:

Answer 0 (score: 4)

You should lock the base address before accessing the properties of imageBuffer:

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
uint8 *baseAddress = (uint8*)CVPixelBufferGetBaseAddress(imageBuffer);
NSData *rawFrame = [[NSData alloc] initWithBytes:(void*)baseAddress length:(height * bytesPerRow)];
[m_frameDataArray addObject:rawFrame];
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

Answer 1 (score: 2)

This is quite old, but to help those who come after, there are a few more issues to address:

  1. Lock/unlock the base address while you copy, as suggested in Alex's answer.
  2. CVImageBufferRef is an abstract base class type. You want to use CVPixelBufferCreateWithBytes to create an instance, rather than just typecasting the raw pixel bytes. (The system needs to know the size/format of those pixels.)
  3. You should create and store a new CVPixelBuffer directly from the original data instead of going through an intermediate NSData for storage. That way you only make one copy instead of two.
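Point 2 above could be sketched roughly as follows. This is only an illustration, not code from the post: it assumes the frames were captured as 640x480 kCVPixelFormatType_32BGRA (so bytesPerRow is 640 * 4), and that `rawFrame` and `pixelBufferAdaptor` are the variables from the question's write loop.

```objectivec
// Sketch: rebuild a CVPixelBuffer from a stored NSData frame instead of
// typecasting its bytes. Assumes 640x480 32BGRA; adjust to the real format.
NSData *rawFrame = [m_frameDataArray objectAtIndex:i];
CVPixelBufferRef pixelBuffer = NULL;
CVReturn result = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                               640, 480,              // width, height
                                               kCVPixelFormatType_32BGRA,
                                               (void *)[rawFrame bytes],
                                               640 * 4,               // bytesPerRow
                                               NULL, NULL,            // no release callback: rawFrame must outlive the buffer
                                               NULL,                  // no pixel buffer attributes
                                               &pixelBuffer);
if (result == kCVReturnSuccess) {
    // Presentation times should increase per frame; a constant CMTimeMake(1,10)
    // stamps every frame at the same time.
    [pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                     withPresentationTime:CMTimeMake(i, 10)];
    CVPixelBufferRelease(pixelBuffer);
}
```

Per point 3, a leaner variant would skip NSData entirely and store CVPixelBuffers (retained, or copied via CVPixelBufferCreate plus a memcpy under the base-address lock) in the array at capture time, so only one copy of each frame is ever made.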