Exporting video at the size of the UIImagePickerController view

Date: 2015-10-03 03:56:57

Tags: swift video uiimagepickercontroller cgaffinetransform avassetexportsession

I have an app in which I want the user to record a video with UIImagePickerController.

I then add a watermark on top of the recorded video and export it with AVAssetExportSession (the full code is below).

My guess is that I need CGAffineTransformScale, followed by another transform, to get the export to the right size, but I'm not very comfortable with transforms.

What kind of transform can I use to crop the video to the size of the UIImagePickerController view?
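To make the question concrete, this is roughly the kind of thing I have in mind. The target size is just a placeholder (I don't know the picker view's real dimensions), clipVideoTrack refers to the track in the code below, and I'm not at all sure the math or the order of operations is right:

    // Placeholder size standing in for whatever the UIImagePickerController view measures.
    let targetSize = CGSize(width: 720, height: 1280)

    // naturalSize is the pre-rotation size of the recorded clip (e.g. 1920x1080).
    let naturalSize = clipVideoTrack.naturalSize

    // Scale so the rotated (portrait) video fills the target, cropping whatever overflows.
    let scale = max(targetSize.width / naturalSize.height,
                    targetSize.height / naturalSize.width)

    // Start from the track's own rotation, then scale it up or down to the target.
    let zoomTransform = CGAffineTransformConcat(clipVideoTrack.preferredTransform,
                                                CGAffineTransformMakeScale(scale, scale))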

func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {

        let videoPath = info[UIImagePickerControllerMediaURL] as! NSURL

        let stringVideoPath = videoPath.path

        //add watermark starting here

        let videoAsset = AVURLAsset(URL: videoPath)
        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: clipVideoTrack, atTime: kCMTimeZero)
        } catch {
            print("error inserting video track: \(error)")
        }


        compositionVideoTrack.preferredTransform = clipVideoTrack.preferredTransform


        //create the watermark image
        let myImage = UIImage(named: "watermarkImage.png")
        let aLayer = CALayer()
        aLayer.contents = myImage?.CGImage
        aLayer.frame = CGRectMake(5, 25, 100, 60)
        aLayer.opacity = 1.0

        // arrange the layers: video underneath, watermark on top
        let videoSize = clipVideoTrack.naturalSize
        let parentLayer = CALayer()
        let videoLayer = CALayer()
        parentLayer.frame = CGRectMake(0, 0, videoSize.height, videoSize.width)
        videoLayer.frame = CGRectMake(0, 0, videoSize.height, videoSize.width)
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(aLayer)

        //create composition and add instructions to insert the layer

        let videoComp = AVMutableVideoComposition()
        videoComp.renderSize = CGSize(width: videoSize.height, height: videoSize.width)
        videoComp.frameDuration = CMTimeMake(1, 30)
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

        //instructions
        let mainInstruction = AVMutableVideoCompositionInstruction()
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
        let videoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0]


        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)

        // I'm guessing I would have something like
        //   let zoomTransform = CGAffineTransformScale(compositionVideoTrack.preferredTransform, 1, 1)
        // and then apply that transform below instead of compositionVideoTrack.preferredTransform.

        layerInstruction.setTransform(compositionVideoTrack.preferredTransform, atTime: kCMTimeZero)
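        // If that guess is right, I imagine the call above would become something like
        // this (commented out because 1, 1 are obviously not the real scale factors):
        // let zoomTransform = CGAffineTransformScale(compositionVideoTrack.preferredTransform, 1, 1)
        // layerInstruction.setTransform(zoomTransform, atTime: kCMTimeZero)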


        mainInstruction.layerInstructions = [layerInstruction] 
        videoComp.instructions = [mainInstruction]

        let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetMediumQuality)
        assetExport?.videoComposition = videoComp
        let exportPath = NSTemporaryDirectory().stringByAppendingString("TestVideo.mp4")
        let exportURL = NSURL(fileURLWithPath: exportPath)

        if NSFileManager.defaultManager().fileExistsAtPath(exportPath) {
            do { try NSFileManager.defaultManager().removeItemAtPath(exportPath)} catch{}
        }

        assetExport?.outputFileType = AVFileTypeMPEG4
        assetExport?.outputURL = exportURL
        assetExport?.shouldOptimizeForNetworkUse = true
        assetExport?.exportAsynchronouslyWithCompletionHandler({ () -> Void in
            print("done")
            UISaveVideoAtPathToSavedPhotosAlbum(exportURL.path!, self, nil, nil)
        })

        picker.dismissViewControllerAnimated(true, completion: nil)
        picker.view.superview?.removeFromSuperview()

    }

0 Answers:

No answers yet