I'm fairly sure the problem is in my buffer attributes, but I can't work out what it is - what's supposed to go in there isn't well documented, so I'm guessing based on CVPixelBufferPoolCreate - and Core Foundation is pretty much a closed book to me.
// "width" and "height" are const ints
CFNumberRef cfWidth = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &width);
CFNumberRef cfHeight = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &height);
CFStringRef keys[] = {
kCVPixelBufferWidthKey,
kCVPixelBufferHeightKey,
kCVPixelBufferCGImageCompatibilityKey
};
CFTypeRef values[] = {
cfWidth,
cfHeight,
kCFBooleanTrue
};
int numValues = sizeof(keys) / sizeof(keys[0]);
CFDictionaryRef bufferAttributes = CFDictionaryCreate(kCFAllocatorDefault,
(const void **)keys,
(const void **)values,
numValues,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks
);
AVAssetWriterInputPixelBufferAdaptor *adaptor = [[AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:(NSDictionary*)bufferAttributes] retain];
CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
NSParameterAssert(bufferPool != NULL); // fails
Answer 0 (score: 14)
When pixelBufferPool returns NULL, check the following (each point is covered in more detail in the answers below): that the AVAssetWriterInput's outputSettings and the adaptor's sourcePixelBufferAttributes are fully specified, that you have called startWriting and startSessionAtSourceTime before reading the pool, and that the writer's status has not become AVAssetWriterStatusFailed.
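A quick diagnostic along those lines - a sketch, assuming the videoWriter and adaptor variables used in the answers below, run after startWriting and startSessionAtSourceTime have been called:

if (videoWriter.status == AVAssetWriterStatusFailed)
    NSLog(@"writer failed: %@", videoWriter.error);
else if (adaptor.pixelBufferPool == NULL)
    NSLog(@"pool is NULL - check the input's outputSettings and the adaptor's sourcePixelBufferAttributes");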
Answer 1 (score: 3)
I ran into the same problem, and I think it may be because you haven't fully configured the AVAssetWriterInput. Once I did that, my pixel buffer pool started working. In particular, the pool wouldn't give me pixel buffers until I supplied data for AVVideoCompressionPropertiesKey. First, create and fully configure the AVAssetWriter (look in /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h for the keys and values that go in outputSettings and compressionSettings):
NSError *err = nil;
AVAssetWriter * outputWriter = [AVAssetWriter
assetWriterWithURL: [NSURL fileURLWithPath:outputPath]
fileType: AVFileTypeAppleM4V
error: &err];
NSMutableDictionary * outputSettings
= [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264
forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: width_]
forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: height_]
forKey: AVVideoHeightKey];
NSMutableDictionary * compressionProperties
= [[NSMutableDictionary alloc] init];
[compressionProperties setObject: [NSNumber numberWithInt: 1000000]
forKey: AVVideoAverageBitRateKey];
[compressionProperties setObject: [NSNumber numberWithInt: 16]
forKey: AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject: AVVideoProfileLevelH264Main31
forKey: AVVideoProfileLevelKey];
[outputSettings setObject: compressionProperties
forKey: AVVideoCompressionPropertiesKey];
AVAssetWriterInput * writerInput = [AVAssetWriterInput
assetWriterInputWithMediaType: AVMediaTypeVideo
outputSettings: outputSettings];
[compressionProperties release];
[outputSettings release];
Then create the pixel buffer adaptor:
NSMutableDictionary * pixBufSettings = [[NSMutableDictionary alloc] init];
[pixBufSettings setObject: [NSNumber numberWithInt: kCVPixelFormatType_32BGRA]
forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[pixBufSettings setObject: [NSNumber numberWithInt: width_]
forKey: (NSString *) kCVPixelBufferWidthKey];
[pixBufSettings setObject: [NSNumber numberWithInt: height_]
forKey: (NSString *) kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor * outputPBA =
[AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput: writerInput
sourcePixelBufferAttributes: pixBufSettings]; // pass the pixel buffer attributes built above
[pixBufSettings release];
Then retrieve pixel buffers from its pool with:
CVPixelBufferRef outputFrame = NULL;
CVReturn res = CVPixelBufferPoolCreatePixelBuffer(NULL,
                                                  [outputPBA pixelBufferPool],
                                                  &outputFrame);
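From there, the usual next step (not part of this answer - a sketch that assumes a CMTime named presentationTime for the current frame) is to check the return code, draw into the buffer, append it, and release it:

if (res == kCVReturnSuccess && outputFrame != NULL) {
    // ... lock the buffer and render the frame into it ...
    if (![outputPBA appendPixelBuffer:outputFrame withPresentationTime:presentationTime])
        NSLog(@"append failed: %@", outputWriter.error);
    CVPixelBufferRelease(outputFrame);
}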
Answer 2 (score: 2)
From the documentation:
"This property is NULL before the first call to startSessionAtTime on the associated AVAssetWriter object."
So if you try to access the pool too early, it will be NULL. I'm only just learning this stuff myself, so I can't go into more detail yet.
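In other words, the pool only becomes available once writing has started. A minimal sketch of the ordering, assuming the videoWriter, writerInput and adaptor names used elsewhere on this page:

[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
// Only read the pool after the calls above; before them it is NULL.
CVPixelBufferPoolRef pool = adaptor.pixelBufferPool;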
Answer 3 (score: 1)
For everyone still looking for a solution: first make sure the AVAssetWriter is actually working by checking its status. I hit this problem; after checking the status I saw that, even though I had called start, the writer had never started. (In my case I was pointing the output path at an existing file; once I deleted that file, everything worked like a charm.)
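A sketch of both checks, assuming the outputPath and videoWriter names used elsewhere on this page:

// Delete any file already sitting at the destination before creating the writer;
// an existing file at the output URL can keep the writer from ever starting.
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:NULL];

// ... create and configure videoWriter as in the other answers, then:
[videoWriter startWriting];
if (videoWriter.status != AVAssetWriterStatusWriting)
    NSLog(@"writer never started: %@", videoWriter.error);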
Answer 4 (score: 0)
I got it all working! Set the compatibility keys in the options dictionary, which is supposed to let you use the buffer pool. There's a working sample below; the code here writes without the buffer pool, but it's a good starting point.
Here is the sample code: link
Here is the code you need:
- (void) testCompressionSession
{
CGSize size = CGSizeMake(480, 320);
NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
NSError *error = nil;
unlink([betaCompressionDirectory UTF8String]);
//----initialize compression engine
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
if(error)
NSLog(@"error = %@", [error localizedDescription]);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
if ([videoWriter canAddInput:writerInput])
NSLog(@"I can add this input");
else
NSLog(@"i can't add this input");
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
//---
// insert demo debugging code to write the same image repeated as a movie
CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];
dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
int __block frame = 0;
[writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
while ([writerInput isReadyForMoreMediaData])
{
if(++frame >= 120)
{
[writerInput markAsFinished];
[videoWriter finishWriting];
[videoWriter release];
break;
}
CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
if (buffer)
{
if(![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
NSLog(@"FAIL");
else
NSLog(@"Success:%d", frame);
CFRelease(buffer);
}
}
}];
NSLog(@"outside for loop");
}
- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
// CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
// Use the pixel buffer's actual bytes-per-row; CVPixelBuffer rows may be padded beyond 4*width.
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, CVPixelBufferGetBytesPerRow(pxbuffer), rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
Answer 5 (score: 0)
It works when there is no file at the outputURL of the AVAssetWriter. Here's a FileManager helper for that; call removeItemIfExist(at:) on the output URL before you create the writer:
extension FileManager {
func removeItemIfExist(at url: URL) {
do {
if FileManager.default.fileExists(atPath: url.path) {
try FileManager.default.removeItem(at: url)
}
} catch {
fatalError("\(error)")
}
}
}