Compositing 2 semi-transparent videos with AVFoundation, without CALayer — why do the videos end up appended?

Date: 2015-07-03 09:15:10

Tags: ios objective-c avfoundation

What I want: to insert multiple videos, each with its own opacity, ALL at 0:00 of the AVVideoCompositionTrack.

I have read the official AVFoundation documentation carefully, along with many WWDC sessions on this topic, but I cannot understand why the result does not match what the API documentation states.

I can achieve the overlay effect during playback using 2 AVPlayerLayers, which suggests I could achieve something similar during export with AVVideoCompositionCoreAnimationTool. But I would rather reserve CALayer for subtitle/image overlays or animations.
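For context, here is a minimal sketch of that playback-time overlay, inside a view controller (urlA, urlB, and the 0.5 opacity are placeholder values, not from my actual code):

// Playback-time overlay sketch: two AVPlayerLayers stacked in one view.
AVPlayer *playerA = [AVPlayer playerWithURL:urlA];
AVPlayer *playerB = [AVPlayer playerWithURL:urlB];
AVPlayerLayer *playerLayerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *playerLayerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
playerLayerA.frame = self.view.bounds;
playerLayerB.frame = self.view.bounds;
playerLayerB.opacity = 0.5f; // top layer is semi-transparent, so both videos show
[self.view.layer addSublayer:playerLayerA];
[self.view.layer addSublayer:playerLayerB];
[playerA play];
[playerB play];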

What I tried, for inserting each AVAsset:

- (void)addVideo:(AVAsset *)asset_in withOpacity:(float)opacity
{
    // Demo of compositing videos with opacity, so every video is inserted at time 0:00.
    [_videoCompositionTrack insertTimeRange:CMTimeRangeMake( kCMTimeZero, asset_in.duration )
                                    ofTrack:[ [asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ]
                                     atTime:kCMTimeZero error:nil ];

    AVMutableVideoCompositionInstruction *mutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    AVAssetTrack *assettrack_in = [ [asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ];
    mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake( kCMTimeZero, assettrack_in.timeRange.duration );
    AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:_videoCompositionTrack];
    [videoCompositionLayerInstruction setTransform:assettrack_in.preferredTransform atTime:kCMTimeZero];
    [videoCompositionLayerInstruction setOpacity:opacity atTime:kCMTimeZero];
    mutableVideoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
    [_arrayVideoCompositionInstructions addObject:mutableVideoCompositionInstruction];
}

Note that insertTimeRange is called with atTime:kCMTimeZero as the parameter, so I expected all the videos to be placed at the beginning of the video composition.
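A quick way to check where the clips actually ended up is to dump the composition track's segments after the inserts (a diagnostic sketch, not part of my original code):

// Log where each inserted piece actually landed on the composition track.
for (AVCompositionTrackSegment *segment in _videoCompositionTrack.segments) {
    CMTimeRange target = segment.timeMapping.target;
    NSLog( @"segment at %.2fs, duration %.2fs",
           CMTimeGetSeconds( target.start ), CMTimeGetSeconds( target.duration ) );
}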

What I tried for exporting:

- (IBAction)ExportAndPlay:(id)sender
{
    _mutableVideoComposition.instructions = [_arrayVideoCompositionInstructions copy];

    // Create a static date formatter so we only have to initialize it once.
    static NSDateFormatter *kDateFormatter;
    if (!kDateFormatter) {
        kDateFormatter = [[NSDateFormatter alloc] init];
        kDateFormatter.dateStyle = NSDateFormatterMediumStyle;
        kDateFormatter.timeStyle = NSDateFormatterShortStyle;
    }
    // Create the export session with the composition and set the preset to the highest quality.
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_mutableComposition presetName:AVAssetExportPresetHighestQuality];
    // Set the desired output URL for the file created by the export process.
    exporter.outputURL = [[[[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil] URLByAppendingPathComponent:[kDateFormatter stringFromDate:[NSDate date]]] URLByAppendingPathExtension:CFBridgingRelease(UTTypeCopyPreferredTagWithClass((CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension))];
    // Set the output file type to be a QuickTime movie.
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _mutableVideoComposition;
    // Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                {
                    NSLog(@"Export failed: %@ %@", [[exporter error] localizedDescription], [[exporter error] debugDescription]);
                    break;
                }
                case AVAssetExportSessionStatusCancelled:
                {
                    NSLog(@"Export canceled");
                    break;
                }
                case AVAssetExportSessionStatusCompleted:
                {
                    NSLog(@"Export complete!");
                    NSLog( @"Export URL = %@", [exporter.outputURL absoluteString] );
                    [self altPlayWithUrl:exporter.outputURL];
                    break;
                }
                default:
                {
                    NSLog(@"default");
                }
            }

        } );
    }];
}

The result: if I pick 2 video clips, the export produces a video in which the second clip is appended after the first.

This is different from what I read here: AVMutableCompositionTrack

Would anyone shed some light for this helpless lamb?

EDIT: Is there some missing detail that keeps anyone from helping me? If so, please leave a comment so I can fill it in.

1 Answer:

Answer 0 (score: 0)

OK, sorry. This turned out to be a misunderstanding on my part of the API around AVMutableCompositionTrack.

If, like me, you want to blend 2 videos as 2 overlapping layers, you need 2 AVMutableCompositionTrack instances, both added to the same AVMutableComposition, like this:

// 0. Set up one AVMutableCompositionTrack for EACH AVAsset!
AVMutableCompositionTrack *mutableCompositionVideoTrack1 = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *mutableCompositionVideoTrack2 = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

Insert the desired AVAssets into their own AVMutableCompositionTracks:

AVAssetTrack *videoAssetTrack1 = [ [ [_arrayVideoAssets firstObject] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ];
AVAssetTrack *videoAssetTrack2 = [ [ [_arrayVideoAssets lastObject] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ];
[ mutableCompositionVideoTrack1 insertTimeRange:CMTimeRangeMake( kCMTimeZero, videoAssetTrack1.timeRange.duration ) ofTrack:videoAssetTrack1 atTime:kCMTimeZero error:nil ];
[ mutableCompositionVideoTrack2 insertTimeRange:CMTimeRangeMake( kCMTimeZero, videoAssetTrack2.timeRange.duration ) ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:nil ];
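The snippet below refers to a videoComposition object whose creation I have not shown; here is a minimal sketch of that setup (the frame rate and render size are assumptions, match them to your footage):

// Sketch of the AVMutableVideoComposition setup (values are assumptions).
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake( 1, 30 );       // assume 30 fps
videoComposition.renderSize = videoAssetTrack1.naturalSize; // match the bottom track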

Then set up the AVMutableVideoComposition with 2 layer instructions, one per AVMutableCompositionTrack:

AVMutableVideoCompositionInstruction *compInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
compInstruction.timeRange = CMTimeRangeMake( kCMTimeZero, videoAssetTrack1.timeRange.duration );
AVMutableVideoCompositionLayerInstruction *layerInstruction1 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack1];
[layerInstruction1 setOpacity:0.5f atTime:kCMTimeZero];

AVMutableVideoCompositionLayerInstruction *layerInstruction2 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack2];
[layerInstruction2 setOpacity:0.8f atTime:kCMTimeZero];
CGAffineTransform transformScale = CGAffineTransformMakeScale( 0.5f, 0.5f );
CGAffineTransform transformTranslation = CGAffineTransformMakeTranslation( videoComposition.renderSize.width / 2, videoComposition.renderSize.height / 2 );
[ layerInstruction2 setTransform:CGAffineTransformConcat( transformScale, transformTranslation ) atTime:kCMTimeZero ];
compInstruction.layerInstructions = @[ layerInstruction1, layerInstruction2 ];
videoComposition.instructions = @[ compInstruction ];
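Then hand both objects to the export session, just as in the question's ExportAndPlay: method, e.g.:

// Wire the composition and the video composition into the exporter.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;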

Finally, the export should turn out fine. Sorry for the noise, if anyone took the time to look.