AVFoundation - AVAssetExportSession - Operation Stopped on the second export attempt

Date: 2018-01-06 15:56:00

Tags: cocoa-touch avfoundation ios11 avassetexportsession

I'm creating a picture-in-picture video, and this feature has worked flawlessly (as far as I know) for 1.5 years. Now, on iOS 11, it only works the first time it is called... when it is called to create a second video (without force-closing the app), I get the error message below.

I found this post on Stack Overflow, but I am already using the asset tracks correctly as it describes: AVAssetExportSession export fails non-deterministically with error: "Operation Stopped, NSLocalizedFailureReason=The video could not be composed."

The exact method I'm using is included below. Any help would be greatly appreciated!

Error message:

Error: Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" 
UserInfo={NSLocalizedFailureReason=The video could not be composed., 
NSLocalizedDescription=Operation Stopped, 
NSUnderlyingError=0x1c04521e0 
{Error Domain=NSOSStatusErrorDomain Code=-17390 "(null)"}}

The method:

- (void) composeVideo:(NSString*)videoPIP onVideo:(NSString*)videoBG
{
@try {
    NSError *e = nil;

    AVURLAsset *backAsset, *pipAsset;

    // Load our 2 movies using AVURLAsset
    pipAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoPIP] options:nil];
    backAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoBG] options:nil];

    if ([[NSFileManager defaultManager] fileExistsAtPath:videoPIP])
    {
        NSLog(@"PIP File Exists!");
    }
    else
    {
        NSLog(@"PIP File DOESN'T Exist!");
    }

    if ([[NSFileManager defaultManager] fileExistsAtPath:videoBG])
    {
        NSLog(@"BG File Exists!");
    }
    else
    {
        NSLog(@"BG File DOESN'T Exist!");
    }

    float scaleH = VIDEO_SIZE.height / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].width;
    float scaleW = VIDEO_SIZE.width / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height;
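    // (The width/height swap in the two scale factors above is deliberate: the background
    // track is rotated 90 degrees later in this method, so its naturalSize axes are
    // transposed relative to the render size.)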

    float scalePIP = (VIDEO_SIZE.width * 0.25) / [[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width;

    // Create AVMutableComposition Object - this object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

    // Create the first AVMutableCompositionTrack by adding a new track to our AVMutableComposition.
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Set the length of firstTrack equal to the length of pipAsset and insert pipAsset at kCMTimeZero so the video plays from the start of the track.
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, pipAsset.duration) ofTrack:[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];
    if (e)
    {
        NSLog(@"Error0: %@",e);
        e = nil;
    }

    // Repeat the same process for the 2nd track and also start at kCMTimeZero so both tracks will play simultaneously.
    AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];

    if (e)
    {
        NSLog(@"Error1: %@",e);
        e = nil;
    }

    // We also need the audio track!
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&e];
    if (e)
    {
        NSLog(@"Error2: %@",e);
        e = nil;
    }


    // Create an AVMutableVideoCompositionInstruction object - Contains the array of AVMutableVideoCompositionLayerInstruction objects.
    AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

    // Set the time range to cover the longer asset.
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);
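    // NOTE: comparing CMTime objects by their raw .value fields ignores the timescale --
    // this comparison is the bug identified in the answer below.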

    // Create an AVMutableVideoCompositionLayerInstruction object to make use of CGAffinetransform to move and scale our First Track so it is displayed at the bottom of the screen in smaller size.
    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];

    //CGAffineTransform Scale1 = CGAffineTransformMakeScale(0.3f,0.3f);
    CGAffineTransform Scale1 = CGAffineTransformMakeScale(scalePIP, scalePIP);

    // Top Left
    CGAffineTransform Move1 = CGAffineTransformMakeTranslation(3.0, 3.0);

    [FirstlayerInstruction setTransform:CGAffineTransformConcat(Scale1,Move1) atTime:kCMTimeZero];

    // Repeat for the second track.
    AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];

    CGAffineTransform Scale2 = CGAffineTransformMakeScale(scaleW, scaleH);
    CGAffineTransform rotateBy90Degrees = CGAffineTransformMakeRotation( M_PI_2);
    CGAffineTransform Move2 = CGAffineTransformMakeTranslation(0.0, ([[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height) * -1);

    [SecondlayerInstruction setTransform:CGAffineTransformConcat(Move2, CGAffineTransformConcat(rotateBy90Degrees, Scale2)) atTime:kCMTimeZero];

    // Add the 2 created AVMutableVideoCompositionLayerInstruction objects to our AVMutableVideoCompositionInstruction.
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, SecondlayerInstruction, nil];

    // Create an AVMutableVideoComposition object.
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);


    // Set the render size to the screen size.
    //        MainCompositionInst.renderSize = [[UIScreen mainScreen] bounds].size;
    MainCompositionInst.renderSize = VIDEO_SIZE;


    NSString  *fileName = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"fullreaction.MP4"];

    // Make sure the video doesn't exist.
    if ([[NSFileManager defaultManager] fileExistsAtPath:fileName])
    {
        [[NSFileManager defaultManager] removeItemAtPath:fileName error:nil];
    }

    // Now we need to save the video.
    NSURL *url = [NSURL fileURLWithPath:fileName];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:QUALITY];
    exporter.videoComposition = MainCompositionInst;
    exporter.outputURL=url;
    exporter.outputFileType = AVFileTypeMPEG4;


    [exporter exportAsynchronouslyWithCompletionHandler:
     ^(void )
     {
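         // Note: this completion handler fires even when the export fails; on the second
         // run, the -11841 "Operation Stopped" error above is what exporter.error logs below.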
         NSLog(@"File Saved as %@!", fileName);
         NSLog(@"Error: %@", exporter.error);
         [self performSelectorOnMainThread:@selector(runProcessingComplete) withObject:nil waitUntilDone:false];
     }];

}
@catch (NSException *ex) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error 3" message:[NSString stringWithFormat:@"%@",ex]
                                                   delegate:self cancelButtonTitle:@"OK" otherButtonTitles: nil];
    [alert show];
}


}

1 Answer:

Answer 0 (score: 3):

Cause: It ultimately came down to the "MainInstruction" timeRange being incorrect.

You cannot compare CMTime objects by their raw 'value' fields, because two times with different timescales are not comparable that way. Instead, you must use CMTIME_COMPARE_INLINE.
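
A minimal sketch of why the raw comparison picks the wrong duration (the values here are illustrative, not taken from the original post):

CMTime a = CMTimeMake(10, 1);    // 10 seconds, expressed with timescale 1
CMTime b = CMTimeMake(900, 600); // 1.5 seconds, expressed with timescale 600

// Raw values: 10 > 900 is false, so the SHORTER time would be chosen.
BOOL longerByValue = (a.value > b.value);            // NO  (wrong)

// Actual times: 10 seconds > 1.5 seconds is true.
BOOL longerByTime  = CMTIME_COMPARE_INLINE(a, >, b); // YES (correct)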

To fix it, replace this line:

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);

with this line:

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTIME_COMPARE_INLINE(pipAsset.duration, >, backAsset.duration) ? pipAsset.duration : backAsset.duration);
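
Equivalently, if you prefer a plain function call, Core Media's CMTimeCompare performs the same timescale-aware comparison (CMTIME_COMPARE_INLINE is just a macro around it):

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (CMTimeCompare(pipAsset.duration, backAsset.duration) > 0) ? pipAsset.duration : backAsset.duration);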