AVAssetExportSession videoComposition not displaying the video

Time: 2016-11-02 08:01:45

Tags: ios swift avassetexportsession avasset avmutablecomposition

I am creating a video from a series of UIImages. I did this successfully, and all of the images appear in the video. Exporting the video with AVAssetExportSession also works, except that when I set the AVAssetExportSession's videoComposition property, the exported video shows only the first image. Here is my code:

func mergeAudioVideoFiles(videoUrl:NSURL, audioUrl:NSURL)->NSURL
{
    let mixComposition : AVMutableComposition = AVMutableComposition()
    var mutableCompositionVideoTrack : [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack : [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction : AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()


    //start merge

    let aVideoAsset : AVAsset = AVAsset(URL: videoUrl)
    let aAudioAsset : AVAsset = AVAsset(URL: audioUrl)

    mutableCompositionVideoTrack.append(mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioTrack.append( mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))

    let aVideoAssetTrack : AVAssetTrack = aVideoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let aAudioAssetTrack : AVAssetTrack = aAudioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    do{
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aVideoAssetTrack, atTime: kCMTimeZero)

        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aAudioAssetTrack, atTime: kCMTimeZero)    
    }catch{
        print("insertTimeRange failed: \(error)")
    }
    print("\nslide duration:\(CMTimeGetSeconds(aVideoAssetTrack.timeRange.duration))\n")
    totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,aVideoAssetTrack.timeRange.duration )

    let mutableVideoComposition : AVMutableVideoComposition = AVMutableVideoComposition(propertiesOfAsset: aVideoAsset)
    mutableVideoComposition.frameDuration = aVideoAssetTrack.timeRange.duration
    mutableVideoComposition.renderSize = CGSizeMake(1280,720)

    // The exported video will be saved at this URL
    let savePathUrl : NSURL = NSURL(fileURLWithPath: documentsPath.stringByAppendingPathComponent("pandorarofinalist.mov"))

    // 4. Add subtitles (we call it theme)
    let insertTime = kCMTimeZero
    //let endTime = aVideoAssetTrack.timeRange.duration
    //let range = self.totalFrameDuration
    //let themeVideoComposition : AVMutableVideoComposition = AVMutableVideoComposition(propertiesOfAsset: aVideoAsset)
    // 4.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.

    let videolayerInstruction : AVMutableVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: aVideoAssetTrack)
    totalVideoCompositionInstruction.layerInstructions = NSArray(array: [videolayerInstruction]) as! [AVVideoCompositionLayerInstruction]
    mutableVideoComposition.instructions = NSArray(array: [totalVideoCompositionInstruction]) as! [AVVideoCompositionInstructionProtocol]

    //mutableCompositionAudioTrack[0].preferredTransform
    videolayerInstruction.setTransform(mutableCompositionVideoTrack[0].preferredTransform, atTime: insertTime)
    //videolayerInstruction.setOpacity(0.0, atTime: endTime)

    // 4.3 - Add instructions


   // mutableVideoComposition.renderScale = 1.0
    //themeVideoComposition.renderSize = CGSizeMake(aVideoAssetTrack.naturalSize.width, aVideoAssetTrack.naturalSize.height)
   //themeVideoComposition.frameDuration = self.totalFrameDuration

    // add text

    let title = String("my video")

    let titleLayer = CATextLayer()
    titleLayer.string = title
    titleLayer.frame =  CGRect(x: 0, y: 0, width: aVideoAssetTrack.naturalSize.width, height: 100)
    let fontName: CFStringRef = "Helvetica-Bold"
    let fontSize = CGFloat(50)
    titleLayer.font = CTFontCreateWithName(fontName, fontSize, nil)
    titleLayer.alignmentMode = kCAAlignmentCenter
    titleLayer.foregroundColor = UIColor.orangeColor().CGColor

    let backgroundLayer = CALayer()
    backgroundLayer.frame = CGRect(x: 0, y: 0, width: aVideoAssetTrack.naturalSize.width, height: aVideoAssetTrack.naturalSize.height)
    backgroundLayer.masksToBounds = true
    backgroundLayer.addSublayer(titleLayer)

    // 2. set parent layer and video layer

    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame =  CGRect(x: 0, y: 0, width: aVideoAssetTrack.naturalSize.width, height: aVideoAssetTrack.naturalSize.height)
    videoLayer.frame =  CGRect(x: 0, y: 0, width: aVideoAssetTrack.naturalSize.width, height: aVideoAssetTrack.naturalSize.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(backgroundLayer)

    //backgroundLayer.opacity = 1.0

    // 3. make animation

    mutableVideoComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

    // Remove the file if it already exists (merger does not overwrite)

    do{
        let fileManager = NSFileManager.defaultManager()
        try fileManager.removeItemAtURL(savePathUrl)
    }catch{
    }

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true
    assetExport.videoComposition = mutableVideoComposition

    assetExport.exportAsynchronouslyWithCompletionHandler { () -> Void in
        switch assetExport.status {

        case AVAssetExportSessionStatus.Completed:

            PHPhotoLibrary.sharedPhotoLibrary().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideoAtFileURL(savePathUrl)
            }) { success, error in
                if !success {
                    print("Error saving video: \(error)")
                }
            }

            //Uncomment this if you want to store your video with ALAssetsLibrary instead

            //let assetsLib = ALAssetsLibrary()
            //assetsLib.writeVideoAtPathToSavedPhotosAlbum(savePathUrl, completionBlock: nil)

            print("success")
        case  AVAssetExportSessionStatus.Failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("complete")
        }
    }

    return savePathUrl
}

The problem is the line assetExport.videoComposition = mutableVideoComposition. If I omit this line, the output video is fine; but when I add it, the output video shows only the first image I added to the video. I have to set videoComposition because I add title text to the video as a CALayer. I am using Swift 2.2 in my project. Any help? Thanks in advance.

1 Answer:

Answer 0 (score: 3)

I think the problem is this line:

mutableVideoComposition.frameDuration = aVideoAssetTrack.timeRange.duration

frameDuration is supposed to represent the duration of a single frame of the video, not the total duration of the video. The line above makes one frame last the entire duration of the original video track, so you only see one frame, as if it were a still image.
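To see why this collapses the composition to a single frame, note that the number of rendered frames is the track duration divided by frameDuration. A minimal pure-Swift sketch of that arithmetic, using a plain rational type instead of CoreMedia's CMTime (this is an illustration only, not the actual framework type; the 10-second track length is an assumed example value):

```swift
// Simplified model of CMTime's rational seconds: value / timescale.
// Illustration only; not the CoreMedia CMTime type itself.
struct RationalTime {
    let value: Int64
    let timescale: Int32
    var seconds: Double { return Double(value) / Double(timescale) }
}

// Suppose the source track is 10 seconds long (hypothetical value).
let trackDuration = RationalTime(value: 10, timescale: 1)

// Bug: frameDuration set to the whole track duration -> 1 frame total.
let buggyFrameDuration = trackDuration
let buggyFrameCount = Int(trackDuration.seconds / buggyFrameDuration.seconds)

// Fix: frameDuration of 1/30 s -> 300 frames for a 10 s track.
let fixedFrameDuration = RationalTime(value: 1, timescale: 30)
let fixedFrameCount = Int(trackDuration.seconds / fixedFrameDuration.seconds)

print(buggyFrameCount)  // 1
print(fixedFrameCount)  // 300
```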

For a 30 fps video, you should set frameDuration to 1/30 of a second, like this:

mutableVideoComposition.frameDuration = CMTime(value: 1, timescale: 30)

Warning: Be careful not to use the other initializer, CMTime(seconds: 1.0, preferredTimescale: 30), because that would make your frameDuration 1 second.
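The difference between the two initializers comes down to how the rational time is built: CMTime(value: 1, timescale: 30) stores the numerator directly, giving 1/30 s, while CMTime(seconds: 1.0, preferredTimescale: 30) converts 1.0 second into that timescale, giving value 30 over timescale 30, i.e. a full second. A hedged sketch of the same arithmetic in plain Swift (mirroring the semantics, not calling CoreMedia):

```swift
// CMTime(value: 1, timescale: 30): numerator stored directly -> 1/30 s.
let fromValue = Double(1) / Double(30)              // ≈ 0.0333 s per frame

// CMTime(seconds: 1.0, preferredTimescale: 30): seconds are converted
// into that timescale -> value 30 over timescale 30 -> 1.0 s per frame.
let storedValue = Int64(1.0 * 30.0)                 // 30
let fromSeconds = Double(storedValue) / Double(30)  // 1.0 s
```

So with the seconds-based initializer the composition would still render only one frame per second, which is why the value/timescale form is the one to use here.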