exportAsynchronouslyWithCompletionHandler fails with multiple video files (Code = -11820)

Asked: 2012-12-29 11:26:19

Tags: objective-c ios video

I am recording small video clips (about a second or so each, with both the front and back cameras, possibly in different orientations) and then try to merge them using AVAssetExportSession. I basically build a composition and a video composition with the proper transforms and the audio and video tracks.

The problem is that on iOS 5 the export fails if you have more than 4 video clips, while on iOS 6 the limit seems to be 16 clips.

This seems really puzzling to me. Is AVAssetExportSession doing something weird, or does it have some undocumented limit on the number of clips that can be passed to it? Here are some excerpts from my code:

-(void)exportVideo
{
    AVMutableComposition *composition = video.composition;
    AVMutableVideoComposition *videoComposition = video.videoComposition;
    NSString * presetName = AVAssetExportPresetMediumQuality;

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:presetName];
    self.exportSession = _assetExport;

    videoComposition.renderSize = CGSizeMake(640, 480);
    _assetExport.videoComposition = videoComposition;

    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent: @"export.mov"];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

    // Delete any previously exported file if it exists
    if([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (_assetExport.status)
        {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Completed exporting!");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed:%@", _assetExport.error.description);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled:%@", _assetExport.error);
                break;
            default:
                break;
        }
    }];
}

Here is how the compositions are made:

-(void)setVideoAndExport
{
    video = nil;
    video = [[VideoComposition alloc] initVideoTracks];

    CMTime localTimeline = kCMTimeZero;

    // Create the composition from all of the video files
    for (NSURL *url in outputFileUrlArray) {
        AVAsset *asset = [[AVURLAsset alloc]initWithURL:url options:nil];
        [video setVideo:url at:localTimeline];
        localTimeline = CMTimeAdd(localTimeline, asset.duration); // Increment the timeline
    }
    [self exportVideo];
}

Here is the relevant part of the VideoComposition class:

-(id)initVideoTracks
{
    if((self = [super init]))
    {
        composition = [[AVMutableComposition alloc] init];
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instructions = [[NSMutableArray alloc] init];
        videoComposition = [AVMutableVideoComposition videoComposition];
    }
    return self;
}


-(void)setVideo:(NSURL*) url at:(CMTime)to
{
    asset = [[AVURLAsset alloc]initWithURL:url options:nil];

    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    AVMutableCompositionTrack *compositionTrackVideo = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack: assetTrack atTime:to error:nil];

    AVMutableCompositionTrack *compositionTrackAudio = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:to error:nil];

    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(to, asset.duration));

    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrackVideo];

    [layerInstruction setTransform: assetTrack.preferredTransform atTime: kCMTimeZero];
    [layerInstruction setOpacity:0.0 atTime:CMTimeAdd(to, asset.duration)];
    [instructions addObject:layerInstruction];

    mainInstruction.layerInstructions = instructions;
    videoComposition.instructions = [NSArray arrayWithObject:mainInstruction];
    videoComposition.frameDuration = CMTimeMake(1, 30);
}

2 Answers:

Answer 0 (Score: 7):

Okay, I also contacted Apple about this issue and they responded:

"This is a known condition. You are hitting a decoder limit set in AVFoundation."

They also asked me to file a bug report about the issue, since the error message AVAssetExportSession gives is vague and misleading. So I filed a bug report with Apple complaining about the bad error message.

So these limits in AVAssetExportSession are confirmed: the decoder limit is 4 on iOS 5 and was raised to 16 on iOS 6. The main problem is that the error AVAssetExportSession reports is terrible, since it only reports -11820 "Cannot Complete Export" instead of actually telling us that we hit the limit.
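
Until the message improves, the failure can at least be disambiguated in the completion handler. The following is a minimal sketch of my own, not an official API for querying the limit: it assumes the caller tracks the number of clips in a hypothetical clipCount variable, and relies on -11820 being AVErrorExportFailed plus the per-OS limits quoted above:

case AVAssetExportSessionStatusFailed: {
    NSError *error = _assetExport.error;
    // -11820 is AVErrorExportFailed ("Cannot Complete Export")
    NSInteger decoderLimit = ([[[UIDevice currentDevice] systemVersion] intValue] >= 6) ? 16 : 4;
    if (error.code == AVErrorExportFailed && clipCount > decoderLimit) {
        // Most likely cause: the composition holds more clips than the decoder limit
        NSLog(@"Export failed: %lu clips exceeds the decoder limit of %ld on this OS version",
              (unsigned long)clipCount, (long)decoderLimit);
    } else {
        NSLog(@"Failed:%@", error.description);
    }
    break;
}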

Answer 1 (Score: 2):

I ran into a similar issue as well. I managed to fix it by inserting the asset into the composition instead of inserting tracks into mutable tracks. So in the code of "setVideo", instead of this line:

[compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack: assetTrack atTime:to error:nil];

try this:

[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofAsset:asset atTime:to error:nil];
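
For context, here is roughly how setVideo: from the question might look with that change applied. This is only a sketch of my reading of the fix, not the answerer's actual code: insertTimeRange:ofAsset:atTime:error: is a method on AVMutableComposition itself, so the composition reuses compatible tracks instead of growing a new track pair per clip, which is the likely reason it stays under the decoder limit. The layer-instruction handling (shared track, transform applied at the clip's start time, no per-clip opacity ramp) is my own adaptation:

-(void)setVideo:(NSURL *)url at:(CMTime)to
{
    asset = [[AVURLAsset alloc] initWithURL:url options:nil];

    // Insert the whole asset (video + audio) at the composition level;
    // AVFoundation reuses compatible composition tracks rather than
    // creating a fresh pair of mutable tracks for every clip.
    NSError *error = nil;
    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                              ofAsset:asset
                               atTime:to
                                error:&error]) {
        NSLog(@"Failed to insert %@: %@", url, error);
        return;
    }

    // The transform instruction now targets the shared composition
    // video track instead of a dedicated per-clip track.
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableCompositionTrack *compositionTrackVideo = [[composition tracksWithMediaType:AVMediaTypeVideo] lastObject];

    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrackVideo];
    [layerInstruction setTransform:assetTrack.preferredTransform atTime:to];
    [instructions addObject:layerInstruction];

    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(to, asset.duration));
    mainInstruction.layerInstructions = instructions;
    videoComposition.instructions = [NSArray arrayWithObject:mainInstruction];
    videoComposition.frameDuration = CMTimeMake(1, 30);
}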