AVAssetExportSession intermittent error -11820 "Cannot Complete Export. Try exporting again."

Date: 2016-02-03 07:20:55

Tags: ios xcode avfoundation avassetexportsession avmutablecomposition


EXPORT STATUS 4 Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo={NSLocalizedDescription=Cannot Complete Export, NSLocalizedRecoverySuggestion=Try exporting again.}

I am running into an intermittent error when exporting with AVAssetExportSession using an AVMutableComposition, an AVMutableVideoComposition, and AVMutableVideoCompositionLayerInstruction(s).

The goal is to merge an unlimited number of videos, applying transitions between clips via layerInstructions.

P.S. The error is not consistent. It works when merging 5 clips and when merging 18 clips, but fails when merging 17 clips.

I have posted my code below. Any help is greatly appreciated.

EDIT: The problem appears to be related to creating multiple AVMutableCompositionTrack(s). If more than 15 or 16 are created, the error occurs. However, I believe creating multiple AVMutableCompositionTracks is necessary in order to overlap the videos and create the overlapping transitions.

EDIT 2: When shorter videos are selected, more videos are processed before the error occurs. So it looks like a memory issue, i.e. tracks being deallocated. However, based on the memory instruments, there does not appear to be a memory leak.

-(void)prepareMutableCompositionForPlayback{
AVMutableComposition *mutableComposition = [[AVMutableComposition alloc] init];
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.backgroundColor = [[UIColor blackColor] CGColor];

NSMutableArray *instructionsArray = [[NSMutableArray alloc] init];

videoStartTime = kCMTimeZero;

for(int i = 0; i < videoAssetsArray.count; i++){
    AVAsset *videoAsset = [videoAssetsArray objectAtIndex:i];
    CMTime currentVideoDuration = [videoAsset duration];

    AVMutableCompositionTrack *videoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *videoInsertError = nil;
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentVideoDuration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:videoStartTime error:&videoInsertError];
    if(videoInsertError){
        NSLog(@"VIDEO TRACK INSERT FAILED %@", videoInsertError);
    }

    CGSize videoSize = [videoTrack naturalSize];

    if([videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0){
        AVMutableCompositionTrack *audioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        NSError *audioInsertError = nil;
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentVideoDuration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:videoStartTime error:&audioInsertError];
        if(audioInsertError){
            NSLog(@"AUDIO TRACK INSERT FAILED %@", audioInsertError);
        }
    }

    //INSTRUCTIONS - TRANSITIONS
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

    int transitionNumber = [[videoTransitionsArray objectAtIndex:i] intValue];
    float transitionDuration = [[videoTransitionsDurationArray objectAtIndex:i] floatValue];

    if(i == 0){
        [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:CMTimeRangeMake(CMTimeSubtract(currentVideoDuration, CMTimeMakeWithSeconds(transitionDuration, 600)), CMTimeMakeWithSeconds(transitionDuration, 600))];
    }
    else{
        int previousTransitionNumber = [[videoTransitionsArray objectAtIndex:i - 1] intValue];
        float previousTransitionDuration = [[videoTransitionsDurationArray objectAtIndex:i - 1] floatValue];

        if(i < videoAssetsArray.count - 1){
            [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(videoStartTime, CMTimeMakeWithSeconds(previousTransitionDuration, 600))];

            [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:CMTimeRangeMake(CMTimeAdd(videoStartTime, CMTimeSubtract(currentVideoDuration, CMTimeMakeWithSeconds(transitionDuration, 600))), CMTimeMakeWithSeconds(transitionDuration, 600))];
        }
        else{
            [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(videoStartTime, CMTimeMakeWithSeconds(previousTransitionDuration, 600))];
        }
    }

    [instructionsArray addObject:layerInstruction];

    if(i < videoAssetsArray.count - 1){
        //TAKING INTO ACCOUNT THE TRANSITION DURATION TO OVERLAP VIDEOS
        videoStartTime = CMTimeAdd(videoStartTime, CMTimeSubtract(currentVideoDuration, CMTimeMakeWithSeconds(transitionDuration, 600)));
    }
    else{
        //TRANSITION NOT APPLIED TO THE END OF THE LAST CLIP
        videoStartTime = CMTimeAdd(videoStartTime, currentVideoDuration);
    }
}

mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,videoStartTime);
mainInstruction.layerInstructions = instructionsArray;

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = [NSArray arrayWithObjects:mainInstruction,nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(1920, 1080);

NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"videoRecordingFinalOutput.mov"];
NSURL *videoOutputURL = [[NSURL alloc] initFileURLWithPath:videoOutputPath];

//AVAssetExportSession fails if a file already exists at the output URL
if([[NSFileManager defaultManager] fileExistsAtPath:videoOutputPath]){
    [[NSFileManager defaultManager] removeItemAtPath:videoOutputPath error:nil];
}

AVAssetExportSession *videoExportSession = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
videoExportSession.outputURL = videoOutputURL;
videoExportSession.videoComposition = videoComposition;
videoExportSession.outputFileType = AVFileTypeQuickTimeMovie;

[videoExportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"EXPORT STATUS %ld %@", (long)videoExportSession.status, videoExportSession.error);

    if(videoExportSession.status == AVAssetExportSessionStatusCompleted){
        NSLog(@"EXPORT SUCCESSFUL");

        [library writeVideoAtPathToSavedPhotosAlbum:videoOutputURL
                                    completionBlock:^(NSURL *assetURL, NSError *saveError) {
                                        if(saveError){
                                            NSLog(@"SAVE TO PHOTOS ALBUM FAILED %@", saveError);
                                        }

                                        //The temporary file is deleted whether or not the save succeeded
                                        NSError *removeError = nil;
                                        if([[NSFileManager defaultManager] fileExistsAtPath:videoOutputPath]){
                                            [[NSFileManager defaultManager] removeItemAtPath:videoOutputPath error:&removeError];
                                            if(removeError){
                                                NSLog(@"VIDEO FILE DELETE FAILED");
                                            }
                                            else{
                                                NSLog(@"VIDEO FILE DELETED");
                                            }
                                        }
                                    }];
    }
    else{
        NSError *error = nil;
        if([[NSFileManager defaultManager] fileExistsAtPath:videoOutputPath]){
            [[NSFileManager defaultManager] removeItemAtPath:videoOutputPath error:&error];
            if(error){
                NSLog(@"VIDEO FILE DELETE FAILED");
            }
            else{
                NSLog(@"VIDEO FILE DELETED");
            }
        }
    }
}];
}

1 answer:

Answer 0 (score: 2)

Why not use just 2 video tracks and insert the timeRanges into those 2, instead of creating a new video track for each clip, and do the transitions between the 2 tracks?

So the first video is inserted into videoTrack1, the second goes onto videoTrack2 so the transition can be applied, then the third clip is inserted back into the first track, and so on.
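A minimal sketch of that alternating two-track (A/B roll) idea, adapted from the question's loop. This assumes a fixed transition duration and reuses the question's `videoAssetsArray`; the variable names (`trackA`, `trackB`, `cursor`) are illustrative, and the layer instructions / opacity ramps from the original code would still need to be built per clip against whichever of the two tracks the clip landed on:

```
//Only two composition tracks, regardless of how many clips there are
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *trackA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *trackB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime cursor = kCMTimeZero;
CMTime transition = CMTimeMakeWithSeconds(1.0, 600); //assumed fixed transition duration

for(int i = 0; i < videoAssetsArray.count; i++){
    AVAsset *asset = [videoAssetsArray objectAtIndex:i];

    //Alternate clips between the two tracks so adjacent clips can overlap
    AVMutableCompositionTrack *target = (i % 2 == 0) ? trackA : trackB;

    NSError *insertError = nil;
    [target insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:cursor
                      error:&insertError];
    if(insertError){
        NSLog(@"INSERT FAILED %@", insertError);
    }

    //Each clip starts before the previous one ends, by the transition duration
    cursor = CMTimeAdd(cursor, CMTimeSubtract(asset.duration, transition));
}
```

Because the composition never holds more than two video tracks, this sidesteps whatever per-track limit the question is hitting around 15-16 tracks, while still giving each pair of adjacent clips an overlap window in which the opacity ramps can cross-fade between `trackA` and `trackB`.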