Trouble with AVAssetWriter

Date: 2011-05-31 14:58:41

Tags: iphone ios avassetwriter

I am trying to use AVAssetWriter to write CGImages to a file so I can build a video out of images.

I have gotten this working successfully in three different ways on the simulator, but every one of them fails on an iPhone 4 running iOS 4.3.

It all comes down to the pixel buffers.

My first approach was to just create pixel buffers as needed, without a pool. That works, but it is too memory-intensive to run on the device.
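For reference, here is roughly what that first approach looked like per frame (a minimal sketch only; the 480x320 size and 32ARGB format are just the values I use in the setup code below):

// Sketch of approach 1: allocate a standalone pixel buffer per frame, no pool.
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 480, 320,
                                      kCVPixelFormatType_32ARGB,
                                      (CFDictionaryRef)options, &pxbuffer);
if (status != kCVReturnSuccess || pxbuffer == NULL)
    NSLog(@"CVPixelBufferCreate failed: %d", (int)status);
// ...draw the frame into pxbuffer, append it, then CVPixelBufferRelease(pxbuffer)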

My second approach was to use the recommended AVAssetWriterInputPixelBufferAdaptor and pull pixel buffers from the adaptor's pixelBufferPool with CVPixelBufferPoolCreatePixelBuffer.

That also works on the simulator, but on the device it fails because the adaptor's pixel buffer pool never gets allocated. I get no error messages.
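To be clear about what I mean by the second approach, this is the pattern I was following (a sketch; adaptor is the AVAssetWriterInputPixelBufferAdaptor created in the setup code below):

// Sketch of approach 2: pull a buffer from the adaptor's own pool.
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pxbuffer);
if (status != kCVReturnSuccess || pxbuffer == NULL)
    NSLog(@"No buffer from adaptor.pixelBufferPool (status %d)", (int)status);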

Finally, I tried creating my own pixel buffer pool with CVPixelBufferPoolCreate. That also works in the simulator, but on the device everything works until I try to append the pixel buffer with appendPixelBuffer, which fails every time.

I have found very little information about this online. I based my code on the examples I could find, but I have had no luck for days now. If ANYONE has experience doing this successfully with AVAssetWriter, please take a look and let me know if you spot anything out of place.

Note: you will see blocks of commented-out attempts.

First, the setup:

- (BOOL) openVideoFile: (NSString *) path withSize:(CGSize)imageSize {
size = CGSizeMake (480.0, 320.0);//imageSize;

NSError *error = nil;
videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                         fileType:AVFileTypeQuickTimeMovie
                                            error:&error];
if (error != nil)
    return NO;

NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                            [NSNumber numberWithDouble:size.width], AVVideoCleanApertureWidthKey,
                                            [NSNumber numberWithDouble:size.height], AVVideoCleanApertureHeightKey,
                                            [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
                                            [NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
                                            nil];


NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          [NSNumber numberWithInt:1], AVVideoPixelAspectRatioHorizontalSpacingKey,
                                          [NSNumber numberWithInt:1],AVVideoPixelAspectRatioVerticalSpacingKey,
                                          nil];



NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               //[NSNumber numberWithInt:960000], AVVideoAverageBitRateKey,
                              // [NSNumber numberWithInt:1],AVVideoMaxKeyFrameIntervalKey,
                               videoCleanApertureSettings, AVVideoCleanApertureKey,
                               videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
                               //AVVideoProfileLevelH264Main31, AVVideoProfileLevelKey,
                               nil];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               codecSettings,AVVideoCompressionPropertiesKey,
                               [NSNumber numberWithDouble:size.width], AVVideoWidthKey,
                               [NSNumber numberWithDouble:size.height], AVVideoHeightKey,
                               nil];
writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                   outputSettings:videoSettings] retain];
NSMutableDictionary * bufferAttributes = [[NSMutableDictionary alloc] init];
[bufferAttributes setObject: [NSNumber numberWithInt: kCVPixelFormatType_32ARGB]
                   forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[bufferAttributes setObject: [NSNumber numberWithInt: 480]
                   forKey: (NSString *) kCVPixelBufferWidthKey];
[bufferAttributes setObject: [NSNumber numberWithInt: 320]
                   forKey: (NSString *) kCVPixelBufferHeightKey];


//NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
//[bufferAttributes setObject: [NSNumber numberWithInt: 640]
//                   forKey: (NSString *) kCVPixelBufferWidthKey];
//[bufferAttributes setObject: [NSNumber numberWithInt: 480]
//                   forKey: (NSString *) kCVPixelBufferHeightKey];
adaptor = [[AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
            sourcePixelBufferAttributes:nil] retain];

//CVPixelBufferPoolCreate (kCFAllocatorSystemDefault,NULL,(CFDictionaryRef)bufferAttributes,&pixelBufferPool);
//Create buffer pool
NSMutableDictionary*     attributes;
attributes = [NSMutableDictionary dictionary];

int width = 480;
int height = 320;

[attributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithInt:width] forKey: (NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithInt:height] forKey: (NSString*)kCVPixelBufferHeightKey];
CVReturn theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef) attributes, &pixelBufferPool);                                           


NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];

writerInput.expectsMediaDataInRealTime = YES;

//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

buffer = NULL;
lastTime = kCMTimeZero;
presentTime = kCMTimeZero;

return YES;
}

Next, the two methods that append to the writer and create the pixel buffer to append:

- (void) writeImageToMovie:(CGImageRef)image 
{
    if([writerInput isReadyForMoreMediaData])
    {
//          CMTime frameTime = CMTimeMake(1, 20);
//          CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 24 of the loop above
//          CMTime presentTime=CMTimeAdd(lastTime, frameTime);

        buffer = [self pixelBufferFromCGImage:image];
        BOOL success = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
        if (!success) NSLog(@"Failed to appendPixelBuffer");
        CVPixelBufferRelease(buffer);

        presentTime = CMTimeAdd(lastTime, CMTimeMake(5, 1000));
        lastTime = presentTime;
    }
    else
    {
        NSLog(@"error - writerInput not ready");
    }
}

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
CVPixelBufferRef pxbuffer = NULL;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
if (pixelBufferPool == NULL) NSLog(@"pixelBufferPool is null!");
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, pixelBufferPool, &pxbuffer); 
/*if (pxbuffer == NULL) {
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                      size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
                                      &pxbuffer);

}*/
//NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);


CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
//NSParameterAssert(pxdata != NULL);

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                             size.height, 8, 4*size.width, rgbColorSpace, 
                                             kCGImageAlphaNoneSkipFirst);
//NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(90, 10, CGImageGetWidth(image), 
                                       CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);

CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

return pxbuffer;
}

2 Answers:

Answer 0 (score: 2)

I found the solution to this issue.

If you want AVAudioPlayer and AVAssetWriter to work correctly alongside each other, you must use an audio session category that is "mixable".

You can use a category that is already mixable, such as AVAudioSessionCategoryAmbient.

However, I needed to use AVAudioSessionCategoryPlayAndRecord.

You can make any category mixable by implementing the following:

OSStatus propertySetError = 0;

UInt32 allowMixing = true;

propertySetError = AudioSessionSetProperty (
                       kAudioSessionProperty_OverrideCategoryMixWithOthers,  // property to set
                       sizeof (allowMixing),                                 // size of the value
                       &allowMixing                                          // the value: allow mixing
                   );
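
For context, here is a minimal sketch of how that override can be combined with the PlayAndRecord category mentioned above (the ordering and error handling are illustrative, not a definitive recipe):

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Sketch: set the (normally non-mixable) PlayAndRecord category,
// apply the mix-with-others override, then activate the session.
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                       error:&sessionError];

UInt32 allowMixing = true;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                        sizeof(allowMixing), &allowMixing);

[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];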

Answer 1 (score: 1)

Well, first of all you need to pass some bufferAttributes when creating the adaptor object:

NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_videoWriterInput
                                                 sourcePixelBufferAttributes:bufferAttributes];

Then remove the call to CVPixelBufferPoolCreate; a pixel buffer pool is already created inside the adaptor object, so pull your buffers from it like this:

CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// ...fill the pixel buffer here

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CMTime frameTime = CMTimeMake(frameCount, (int32_t)30);

BOOL res = [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
CVPixelBufferRelease(pixelBuffer);

I think that should do it. I ran into a similar error at some point, and I solved it by creating the adaptor and the pixel buffers as shown above.