I'm writing video frames to a movie using an AVAssetWriter with a single video input. The mechanics of the write loop aren't the problem; memory management is.
Regarding the CMSampleBufferRef I append: a) I create a CVPixelBufferRef for some image data (raw bytes), so the pixel buffer owns the data. b) I then wrap it in a CMSampleBufferRef like so (error checking removed for brevity):
+ (CMSampleBufferRef)wrapImageBufferInCMSampleBuffer:(CVImageBufferRef)imageBuffer timingInfo:(CMSampleTimingInfo const *)timingInfo error:(NSError **)error {
    // Create a format description for the pixel buffer
    CMVideoFormatDescriptionRef formatDescription = NULL;
    OSStatus result = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, imageBuffer, &formatDescription);
    // Finally, create a CMSampleBuffer wrapper around it all
    CMSampleBufferRef sampleBuffer = NULL;
    OSStatus sampleBufferResult = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, imageBuffer, YES, NULL, NULL, formatDescription, timingInfo, &sampleBuffer);
    if (formatDescription) {
        CFRelease(formatDescription);
    }
    return sampleBuffer;
}
This gives me a buffer I can pass to the AVAssetWriter input. So far so good. The problem is that if I release the buffers after appending them to the writer, I see strange distortion.
If I do this:

[videoInput appendSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
CFRelease(pixelBuffer);
then it works with no memory leak, but I occasionally see corrupted frames. Note the occasionally out-of-sync frames, and the obvious memory corruption around frames 391 and 394. To me it looks as though the memory buffer is being freed before AVFoundation has finished encoding it.
If I remove the CFRelease(pixelBuffer), the problem goes away; the resulting movie is perfectly smooth, with no corruption at all. Of course, then I have a multi-gigabyte memory leak!
Has anyone else run into something like this?
By the way: it makes no difference if I use an AVAssetWriterInputPixelBufferAdaptor instead; I get the same results.
Here is a complete snippet that reproduces the problem:
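For reference, the adaptor variant looked roughly like this (a sketch, not my exact code; the attribute dictionary values are illustrative assumptions):

```objc
// Sketch of the AVAssetWriterInputPixelBufferAdaptor variant I also tried.
// The pixel buffer attributes here are illustrative, not my exact settings.
NSDictionary *attributes = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
    (id)kCVPixelBufferWidthKey  : @((long)size.width),
    (id)kCVPixelBufferHeightKey : @((long)size.height)
};
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput
                                                                     sourcePixelBufferAttributes:attributes];

// Inside the frame loop, instead of wrapping in a CMSampleBuffer:
CVPixelBufferRef pixelBuffer = [self createPixelBufferBufferOfColor:color size:size];
[adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
CFRelease(pixelBuffer);   // the same corruption appears with this release in place
```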
- (void)recordMovieUsingStandardAVFTo:(NSURL *)url colorSequence:(NSArray *)colorSequence frameDuration:(CMTime)frameDuration size:(NSSize)size {
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeQuickTimeMovie error:&error];
    if ([self checkForError:error doing:@"creation of asset writer"] == NO) {
        return;
    }
    NSMutableDictionary *videoSettings = [NSMutableDictionary dictionary];
    [videoSettings setValue:[NSNumber numberWithLong:(long)size.width] forKey:AVVideoWidthKey];
    [videoSettings setValue:[NSNumber numberWithLong:(long)size.height] forKey:AVVideoHeightKey];
    [videoSettings setValue:AVVideoCodecH264 forKey:AVVideoCodecKey];
    AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    [writer addInput:videoInput];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(queue, ^{
        NSError *localError = nil;
        CMTime frameTime = kCMTimeZero;
        int frameCounter = 0;
        for (int i = 0; i < 4; i++) {
            @autoreleasepool {
                for (NSColor *color in colorSequence) {
                    CMSampleTimingInfo timing = kCMTimingInfoInvalid;
                    timing.presentationTimeStamp = frameTime;
                    timing.duration = frameDuration;
                    while (videoInput.isReadyForMoreMediaData == NO) {
                        [NSThread sleepForTimeInterval:0.1];
                    }
                    CVPixelBufferRef pixelBuffer = [self createPixelBufferBufferOfColor:color size:size];
                    CMSampleBufferRef sampleBuffer = [OSGUtils wrapImageBufferInCMSampleBuffer:pixelBuffer timingInfo:&timing error:&localError];
                    BOOL recordingSuccess = [videoInput appendSampleBuffer:sampleBuffer];
                    if (recordingSuccess) {
                        frameTime = CMTimeAdd(frameTime, frameDuration);
                        frameCounter++;
                        if (frameCounter % 60 == 0) {
                            ApplicationLogInfo(@"Wrote frame at time %@", [NSString niceStringForCMTime:frameTime]);
                        }
                    } else {
                        ApplicationLogError(@"Can't write frame: %@", localError);
                    }
                    CFRelease(sampleBuffer);
                    CFRelease(pixelBuffer);   // removing this release fixes the corruption, but leaks
                }
            }
        }
        [videoInput markAsFinished];
        [writer endSessionAtSourceTime:frameTime];
        BOOL success = [writer finishWriting];
        if (!success) {
            ApplicationLogError(@"Failed to finish writing, %@, %d", writer.error, (int)writer.status);
        } else {
            ApplicationLogInfo(@"Write complete");
        }
    });
}
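For context, createPixelBufferBufferOfColor: (not shown above) does roughly the following. This is a simplified sketch assuming a 32BGRA buffer allocated with CVPixelBufferCreate, not the exact implementation:

```objc
// Sketch of the pixel buffer creation from step (a): CoreVideo allocates the
// backing store, so the returned CVPixelBufferRef owns the pixel data.
// The caller receives a +1 reference and is responsible for releasing it.
- (CVPixelBufferRef)createPixelBufferBufferOfColor:(NSColor *)color size:(NSSize)size {
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          (size_t)size.width, (size_t)size.height,
                                          kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    if (result != kCVReturnSuccess) {
        return NULL;
    }
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    // Fill every pixel with the solid color (BGRA byte order).
    uint8_t b = (uint8_t)([color blueComponent]  * 255.0);
    uint8_t g = (uint8_t)([color greenComponent] * 255.0);
    uint8_t r = (uint8_t)([color redComponent]   * 255.0);
    for (size_t row = 0; row < (size_t)size.height; row++) {
        uint8_t *pixel = base + row * bytesPerRow;
        for (size_t col = 0; col < (size_t)size.width; col++) {
            pixel[0] = b; pixel[1] = g; pixel[2] = r; pixel[3] = 0xFF;
            pixel += 4;
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return pixelBuffer;
}
```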