How do I export a video with UIViews on top?

Date: 2016-04-09 20:26:23

Tags: ios iphone swift uiview calayer

I asked this question: Take snapshot of a UIView except some buttons, and learned how to export an image from UIViews.

Now I'd like to know how to export a video with UIViews composited on top. Here is the code I'm using:

  func createFinalVideo(){

    let composition = AVMutableComposition()
    let vidAsset = AVURLAsset(URL: myURL, options: nil)

    // get video track
    let vtrack =  vidAsset.tracksWithMediaType(AVMediaTypeVideo)
    let videoTrack:AVAssetTrack = vtrack[0]
    let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)

    let audioTrack = vidAsset.tracksWithMediaType(AVMediaTypeAudio)[0]

    // pass kCMPersistentTrackID_Invalid so AVFoundation assigns a unique track ID
    let compositionVideoTrack:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
    let compositionAudioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
    do {
        try compositionVideoTrack.insertTimeRange(vid_timerange, ofTrack: videoTrack, atTime: kCMTimeZero)
        try compositionAudioTrack.insertTimeRange(vid_timerange, ofTrack: audioTrack, atTime: kCMTimeZero)
    }
    catch let error as NSError {
        print("Failed to insert track time ranges: \(error)")
        return
    }

    compositionVideoTrack.preferredTransform = videoTrack.preferredTransform

    // Watermark Effect

    let contentLayer = CALayer()
    contentLayer.addSublayer(self.myTextView.layer)
    contentLayer.frame = CGRectMake(0, 0, self.view.bounds.width, self.view.bounds.height)

    let titleLayer = CATextLayer()
    titleLayer.string = "DO YOU HEAR THE PEOPLE SING?"
    titleLayer.fontSize = 18
    titleLayer.foregroundColor = UIColor.redColor().CGColor
    titleLayer.alignmentMode = kCAAlignmentCenter
    titleLayer.frame = CGRectMake(20, 10, self.view.bounds.width - 40, 20);
    titleLayer.displayIfNeeded()

    let parentlayer = CALayer()
    parentlayer.frame = CGRectMake(0, 0, self.view.bounds.width, self.view.bounds.height)
    parentlayer.addSublayer(self.playerLayer)
    parentlayer.addSublayer(titleLayer)
    self.view.layer.addSublayer(parentlayer)

    let layercomposition = AVMutableVideoComposition()
    layercomposition.frameDuration = CMTimeMake(1, 30)
    layercomposition.renderSize = CGSize(width: self.view.bounds.width, height: self.view.bounds.height)
    layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: playerLayer, inLayer: parentlayer)

    // instruction for watermark
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    let videotrack = composition.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack
    let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)
    instruction.layerInstructions = [layerinstruction]
    layercomposition.instructions = [instruction]
    //  create new file to receive data
    let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsDir = dirPaths[0] as NSString
    let movieFilePath = docsDir.stringByAppendingPathComponent("completeFinalMovie.mov")
    let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)

    // removing a nonexistent file throws, so only attempt it when the file is there;
    // otherwise the early return would abort the export on the first run
    if NSFileManager.defaultManager().fileExistsAtPath(movieFilePath) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(movieFilePath)
        }
        catch let error as NSError {
            print(error)
            return
        }
    }

    // use AVAssetExportSession to export video
    let assetExport = AVAssetExportSession(asset: composition, presetName:AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeQuickTimeMovie
    assetExport.outputURL = movieDestinationUrl
    assetExport.exportAsynchronouslyWithCompletionHandler({
        switch assetExport.status{
        case AVAssetExportSessionStatus.Failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("Movie complete")

            // play video
            NSOperationQueue.mainQueue().addOperationWithBlock({ () -> Void in
              // self.playVideo(movieDestinationUrl!)
                UISaveVideoAtPathToSavedPhotosAlbum(movieFilePath, self, #selector(self.handleCompletionOfVideoToGallery), nil)
            })
        }
    })
}
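One detail worth noting about the code above: `contentLayer.addSublayer(self.myTextView.layer)` reparents the live layer out of the view hierarchy, and the export renders its layer tree off the main screen, so live on-screen layers often come out empty. A hedged sketch (the helper name is my own, Swift 2 syntax to match the question) of snapshotting a UIView into a standalone, image-backed CALayer instead:

```swift
// Render a UIView into a detached CALayer whose contents is a bitmap,
// so the export's Core Animation tree never touches the live layer.
func snapshotLayer(view: UIView) -> CALayer {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    let layer = CALayer()
    layer.contents = image.CGImage
    layer.frame = view.frame  // keep the view's on-screen position
    return layer
}
```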

So, I get a crash at the end about firstResponder, but that doesn't matter much, since the titleLayer I created was only for testing.

My main question is how UIViews and CALayer can solve my problem. As I understand it, I need a parentLayer, a videoLayer (in this case my playerLayer, an AVPlayerLayer that is playing the video), and a contentLayer that holds the UIViews currently on top of the video (like Snapchat's emoji and text placed over a recorded video). But if I just do contentLayer.addSublayer(self.emoji.layer) or contentLayer.addSublayer(self.myTextView.layer), the recorded video gets saved to my gallery without any of the overlays (as if you recorded a Snapchat video, added emoji, text and anything else, but when you save it to your device's gallery, the saved video is identical to the recorded one, with no emoji or text on it). So, any ideas?
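For what it's worth, the documented pattern for `AVVideoCompositionCoreAnimationTool` uses a fresh, offscreen CALayer as the video placeholder that the exporter draws frames into, rather than the live AVPlayerLayer, and the parent layer is never added to the visible view. A minimal sketch under that assumption (`overlayLayer` stands in for image-backed copies of the on-screen UIViews):

```swift
let renderSize = CGSizeMake(self.view.bounds.width, self.view.bounds.height)

let videoLayer = CALayer()  // offscreen placeholder the exporter draws video frames into
videoLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height)

let overlayLayer = CALayer()  // snapshots of the UIViews (emoji, text) go here
overlayLayer.frame = videoLayer.frame

let parentLayer = CALayer()  // note: NOT added to self.view.layer
parentLayer.frame = videoLayer.frame
parentLayer.addSublayer(videoLayer)    // video underneath
parentLayer.addSublayer(overlayLayer)  // overlays on top

layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
```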

I know both Swift and Objective-C, so feel free to post code in either.

0 Answers:

No answers yet