How do I use AVAssetWriter to write a movie with video and audio?

Date: 2011-03-30 02:36:35

Tags: iphone video audio export avassetwriter

I want to export a movie with AVAssetWriter, but I can't figure out how to include the video and audio tracks in sync. Exporting only video works fine, but when I add the audio, the resulting movie looks like this:

First I see the video (without audio), then the video freezes (the last image frame is shown until the end), and after a few seconds I hear the audio.

I tried a few things with CMSampleBufferSetOutputPresentationTimeStamp for the audio (subtracting the first CMSampleBufferGetPresentationTimeStamp from the current one), but none of it worked, and I don't think it is the right direction anyway, since the video and audio in the source movie should already be in sync...
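(Roughly, that attempt looked like the sketch below, applied to each audio sample buffer before appending it; firstAudioPTS is just an illustrative variable that caches the first audio timestamp:)

// Sketch of the re-timing attempt described above (illustrative only).
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if(CMTIME_IS_INVALID(firstAudioPTS))
    firstAudioPTS = pts;                    // remember the very first audio timestamp
// shift every audio buffer so that the first one starts at time zero
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, CMTimeSubtract(pts, firstAudioPTS));
[assetWriterAudioInput appendSampleBuffer:sampleBuffer];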

A short version of my setup: I create an AVAssetReader and two AVAssetReaderTrackOutput (one for video, one for audio) and add them to the AVAssetReader, then I create an AVAssetWriter and two AVAssetWriterInput (video and audio) and add them to the AVAssetWriter... I start with:

[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

Then I run two queues that do the sample buffer work:

dispatch_queue_t queueVideo=dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
     while([assetWriterVideoInput isReadyForMoreMediaData])
     {
         CMSampleBufferRef sampleBuffer=[assetReaderVideoOutput copyNextSampleBuffer];
         if(sampleBuffer)
         {
             [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
             CFRelease(sampleBuffer);
         } else
         {
             [assetWriterVideoInput markAsFinished];
             dispatch_release(queueVideo);
             videoFinished=YES;
             break;
         }
     }
}];

dispatch_queue_t queueAudio=dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    while([assetWriterAudioInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderAudioOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else
        {
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished=YES;
            break;
        }
    }
}];

In the main loop I wait for both queues until they finish:

while(!videoFinished && !audioFinished)
{
    sleep(1);
}
[assetWriter finishWriting];

Additionally, I try to save the resulting file to the photo library with the following code...

NSURL *url=[[NSURL alloc] initFileURLWithPath:path];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url])
{
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if(error)
            NSLog(@"error=%@",error.localizedDescription);
        else
            NSLog(@"completed...");
    }];
}
else
    NSLog(@"error, video not saved...");
[library release];
[url release];

...but I get an error:

The same code works without problems in another program, so something must be wrong with the movie...?

2 Answers:

Answer 0: (score: 9)

-(void)mergeAudioVideo
{

    NSString *videoOutputPath=[_documentsDirectory stringByAppendingPathComponent:@"dummy_video.mp4"];
    NSString *outputFilePath = [_documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    if ([[NSFileManager defaultManager]fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager]removeItemAtPath:outputFilePath error:nil];


    NSURL    *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    NSString *filePath = [_documentsDirectory stringByAppendingPathComponent:@"newFile.m4a"];
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSURL    *audio_inputFileUrl = [NSURL fileURLWithPath:filePath];
    NSURL    *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);

    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         if (_assetExport.status == AVAssetExportSessionStatusCompleted) {

          //Write Code Here to Continue
         }
         else {
            //Write Fail Code here     
         }
     }
     ];



}

You can use this code to merge the audio and the video.
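The method assumes a _documentsDirectory ivar pointing at the app's Documents folder; a minimal, hypothetical setup before calling it could be:

// Hypothetical setup for the _documentsDirectory ivar used above
// (retained because this code uses manual reference counting).
_documentsDirectory = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0] retain];
[self mergeAudioVideo];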

Answer 1: (score: -3)

It seems that assetWriterAudioInput ignores the sample buffer timing when the audio is written. Do it like this (a rough sketch follows the steps below).

1) Write the video track.

2) When it is done, mark it as finished, i.e. [videoWriterInput markAsFinished];

3) Do [assetWriter startSessionAtSourceTime:timeRangeStart];

4) Instantiate the audio reader and start writing the audio.
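A rough, hypothetical sketch of that sequence (not part of the original answer) might look like the code below. It assumes two separate AVAssetReader instances, videoReader and audioReader (one per track, as step 4 implies), with track outputs and writer inputs configured as in the question; note that startSessionAtSourceTime: is placed before the first append here, because the writer needs an active session before any sample can be appended:

// Sketch only: write the whole video track first, then the whole audio track.
// videoReader/audioReader, videoOutput/audioOutput and the two writer inputs
// are assumed to be configured elsewhere (not shown here).
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

// steps 1) and 2): drain the video track, then mark the video input as finished
[videoReader startReading];
CMSampleBufferRef buffer;
while((buffer = [videoOutput copyNextSampleBuffer]))
{
    while(![videoWriterInput isReadyForMoreMediaData])
        usleep(10000);                      // crude back-pressure handling
    [videoWriterInput appendSampleBuffer:buffer];
    CFRelease(buffer);
}
[videoWriterInput markAsFinished];

// steps 3) and 4): only now read the audio track and write it
[audioReader startReading];
while((buffer = [audioOutput copyNextSampleBuffer]))
{
    while(![audioWriterInput isReadyForMoreMediaData])
        usleep(10000);
    [audioWriterInput appendSampleBuffer:buffer];
    CFRelease(buffer);
}
[audioWriterInput markAsFinished];
[assetWriter finishWriting];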