Using AVMutableVideoComposition with a CIFilter ignores the AVVideoCompositionCoreAnimationTool animationTool parameter

Date: 2017-06-05 18:14:28

Tags: ios video cifilter

I'm trying to build a video composition on iOS that combines applying a CIFilter with Core Animation layers. Each of these operations works on its own, but trying to combine them in a single pass doesn't seem to work.

When using AVMutableVideoComposition(asset:applyingCIFiltersWithHandler:), the animationTool parameter seems to be ignored. Has anyone else run into this? I've seen some people suggest adding any extra CA layers during the AVMutableVideoComposition handler, but my CALayer contains animations, so I don't see how that approach could work reliably.
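For reference, that suggested workaround amounts to snapshotting the overlay layer once and compositing it over every filtered frame inside the handler, which is only viable for static content. A rough sketch of the idea, reusing the `filter`, `asset`, and `overlayLayer` names from my code below and the standard CISourceOverCompositing filter (sizing and positioning of the overlay are glossed over):

    // Snapshot the (static) overlay layer once; animations would not be captured.
    UIGraphicsBeginImageContextWithOptions(overlayLayer.bounds.size, false, 0)
    overlayLayer.render(in: UIGraphicsGetCurrentContext()!)
    let overlaySnapshot = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    let overlayImage = CIImage(image: overlaySnapshot)!

    let videoComp = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        // Apply the CIFilter to the frame as before
        let source = request.sourceImage.clampingToExtent()
        filter.setValue(source, forKey: kCIInputImageKey)
        let filtered = filter.outputImage!.cropping(to: request.sourceImage.extent)

        // Composite the static layer snapshot over the filtered frame
        let composite = CIFilter(name: "CISourceOverCompositing")!
        composite.setValue(overlayImage, forKey: kCIInputImageKey)
        composite.setValue(filtered, forKey: kCIInputBackgroundImageKey)
        request.finish(with: composite.outputImage!, context: nil)
    })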

Here's the code I'm using:

        // Build a composition containing the selected time range of the source video track
        let clipVideoTrack = asset.tracks(withMediaType: AVMediaTypeVideo)[0]
        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        let videoRange = CMTimeRangeMake(startTime ?? kCMTimeZero, CMTimeSubtract(stopTime ?? asset.duration, startTime ?? kCMTimeZero))
        try compositionVideoTrack.insertTimeRange(videoRange, of: clipVideoTrack, at: kCMTimeZero)
        // Layer hierarchy for the animation tool: the video layer plus annotation overlays
        let parentLayer = CALayer()
        let videoLayer = CALayer()
        let overlayLayer = CALayer()

        let targetDimention: CGFloat = 900.0
        let videoWidthDivisor = clipVideoTrack.naturalSize.width / targetDimention
        let actualDimention = clipVideoTrack.naturalSize.width / videoWidthDivisor;
        let targetVideoSize = CGSize(width: actualDimention, height: actualDimention)

        parentLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)
        videoLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)
        overlayLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)

        parentLayer.addSublayer(videoLayer)

        for annotation in mediaAnnotationContainerView.mediaAnnotationViews
        {
            let renderableLayer = annotation.renderableCALayer(targetSize: targetVideoSize)
            parentLayer.addSublayer(renderableLayer)
        }


        // Sepia filter applied to every frame via the CIFilter-based composition
        let filter = CIFilter(name: "CISepiaTone")!
        filter.setDefaults()
        let videoComp = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler:
        {   request in
            let source = request.sourceImage.clampingToExtent()
            filter.setValue(source, forKey: kCIInputImageKey)
            let output = filter.outputImage!.cropping(to: request.sourceImage.extent)
            request.finish(with: output, context: nil)
        })

        videoComp.renderSize = targetVideoSize

        videoComp.frameDuration = CMTimeMake(1, 30)
        // This is the animationTool that appears to be ignored
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

        let url = AVAsset.tempMovieUrl

        let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
        exporter?.outputURL = url
        exporter?.outputFileType = AVFileTypeMPEG4
        exporter?.shouldOptimizeForNetworkUse = true
        exporter?.videoComposition = videoComp

        exporter?.exportAsynchronously
        {
            print( "Export completed" )
        }

videoComp.instructions[0] appears to be an instance of a private AVCoreImageFilterVideoCompositionInstruction class. Replacing it throws an exception, and adding an extra instruction results in the export completing without actually doing anything.

It may be that what I'm trying to do simply isn't possible, and I really do have to process the video in two passes (one for the CIFilter and another for the CALayers). But rendering to a temporary output file and then reprocessing it in a second pass feels wrong.
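If two passes turn out to be unavoidable, the flow would roughly be: export once with only the CIFilter composition to a temporary file, then load that file and export again with only the animationTool set. A rough sketch of that second pass (the intermediate `filteredURL` is hypothetical; everything else reuses names from the code above):

    // Second pass: load the already-filtered intermediate movie and burn in the CA layers.
    let filteredAsset = AVAsset(url: filteredURL)  // filteredURL: output of the first, CIFilter-only export
    let layerComp = AVMutableVideoComposition(propertiesOf: filteredAsset)
    layerComp.renderSize = targetVideoSize
    layerComp.frameDuration = CMTimeMake(1, 30)
    layerComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let secondPass = AVAssetExportSession(asset: filteredAsset, presetName: AVAssetExportPresetHighestQuality)
    secondPass?.outputURL = AVAsset.tempMovieUrl
    secondPass?.outputFileType = AVFileTypeMPEG4
    secondPass?.videoComposition = layerComp
    secondPass?.exportAsynchronously { print("Second pass completed") }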

Does anyone know how to get this to work?

Thanks,

2 Answers:

Answer 0 (score: 0)

1. Are you running the code on the Simulator? It seems that animated layers can't be rendered into the video on the Simulator (the layer's background can).

2. If you create the AVVideoCompositionInstruction yourself, make sure enablePostProcessing is set to YES.
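For reference, a hand-built composition along the lines this answer suggests might look roughly like this (just a sketch; it doesn't apply the CIFilter, it only shows where enablePostProcessing and the animation tool would be configured, reusing names from the question's code):

    // Manually assembled video composition where enablePostProcessing is settable
    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = targetVideoSize
    videoComp.frameDuration = CMTimeMake(1, 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    instruction.enablePostProcessing = true  // per point 2 above: enable post-processing so the animation tool is honored
    instruction.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: compositionVideoTrack)]
    videoComp.instructions = [instruction]

    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)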

Answer 1 (score: 0)

When initializing via AVMutableVideoComposition's init(asset: AVAsset, applyingCIFiltersWithHandler applier: @escaping (AVAsynchronousCIImageFilteringRequest) -> Void), an AVCoreImageFilterVideoCompositionInstruction is added internally. It has an enablePostProcessing property that is marked read-only, and I couldn't find a way to set it to true.
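You can confirm this from the question's code with a quick dump of the generated instructions (a debugging sketch only):

    // Inspect the instructions created by applyingCIFiltersWithHandler
    for instruction in videoComp.instructions {
        // Prints the private instruction class; enablePostProcessing is
        // get-only through AVVideoCompositionInstructionProtocol here.
        print(type(of: instruction), instruction.enablePostProcessing)
    }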