Combining audio and video with AVAssetWriter

Date: 2011-12-07 16:57:09

Tags: cocoa cocos2d-iphone

I am trying to write a simple demo that captures video together with audio on the iPhone (like a game recorder). After searching for solutions, I came up with the following:

-(void) startScreenRecording 
{   
    NSLog(@"start screen recording");

    // create the AVAssetWriter
    NSString *documentPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *moviePath = [documentPath stringByAppendingPathComponent:@"video.mov"];
    NSLog(@"moviePath:%@", moviePath);
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath]) 
    {   
        [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }

    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSError *movieError = nil;

    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType: AVFileTypeQuickTimeMovie
                                               error: &movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];

    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor =  [[AVAssetWriterInputPixelBufferAdaptor  alloc]
                                      initWithAssetWriterInput:assetWriterInput
                                      sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];

    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];

    // start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector(writeSample:)
                                                      userInfo:nil
                                                       repeats:YES];    
}
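
The code above calls a `stopScreenRecording` method that is not shown; for completeness, a minimal sketch of the matching teardown might look like this (names assume the same ivars as above, and the synchronous `finishWriting` that was current on iOS 4/5):

```objc
-(void) stopScreenRecording
{
    // stop producing frames before closing the writer
    [assetWriterTimer invalidate];
    [assetWriterTimer release];
    assetWriterTimer = nil;

    // tell the writer no more samples are coming, then finalize the file
    [assetWriterInput markAsFinished];
    [assetWriter finishWriting];

    NSLog(@"finished writing %@", assetWriter.outputURL);
}
```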

-(void) writeSample: (NSTimer*) _timer 
{   
    if (assetWriterInput.readyForMoreMediaData) 
    {
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef) [[self createARGBImageFromRGBAImage:[AWScreenshot takeAsImage]] CGImage];

        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData= CGDataProviderCopyData(CGImageGetDataProvider(image));
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32ARGB,
                                             (void*)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);

        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        //NSLog (@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime =  CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);

        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

        // release the pixel buffer and its backing data here, otherwise one
        // frame's worth of memory leaks on every timer tick (no release
        // callback was passed to CVPixelBufferCreateWithBytes)
        CVPixelBufferRelease(pixelBuffer);
        CFRelease(imageData);

        if (appended) 
        {   
            NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } 
        else 
        {   
            NSLog(@"failed to append");
            [self stopScreenRecording];
        }
    }
}

The video (.mov) file is generated successfully...

However, I now also want to capture the audio playing alongside the video (say, the sound effects and background music while the game is running)...

I searched the web, but everything I found was about "how to merge an existing sound file with an existing movie file"...

Do I have to record the audio and the video separately and then merge them after recording? Is there a way to combine them as they are being recorded?

Any advice is greatly appreciated, thanks :)
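
For reference, the direction I have been considering is adding a second `AVAssetWriterInput` of type `AVMediaTypeAudio` to the same writer, so audio and video samples land in one file. The settings below are my assumptions, and the `CMSampleBufferRef` would have to come from somewhere like an `AVCaptureAudioDataOutput` callback, which is exactly the part I have not figured out:

```objc
// Sketch only: an AAC audio input attached to the same assetWriter.
// All setting values here are guesses, not a known-good configuration.
AudioChannelLayout layout;
memset(&layout, 0, sizeof(layout));
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
    [NSNumber numberWithFloat:44100.0],            AVSampleRateKey,
    [NSNumber numberWithInt:2],                    AVNumberOfChannelsKey,
    [NSNumber numberWithInt:64000],                AVEncoderBitRateKey,
    [NSData dataWithBytes:&layout length:sizeof(layout)], AVChannelLayoutKey,
    nil];

AVAssetWriterInput *audioWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioWriterInput.expectsMediaDataInRealTime = YES;
[assetWriter addInput:audioWriterInput];   // must be added before startWriting

// later, from whatever delivers audio sample buffers:
// if (audioWriterInput.readyForMoreMediaData)
//     [audioWriterInput appendSampleBuffer:sampleBuffer];
```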

0 Answers:

There are no answers yet.