I have a function that takes a video URL as input and, on completion, hands back the export session for the edited video. Essentially, the function produces a square video, trimmed to 30 seconds if the source is longer than that.
This is the function:
import AVFoundation

func compressVideo(inputURL: URL, outputURL: URL, handler: @escaping (_ exportSession: AVAssetExportSession?) -> Void) {
    let urlAsset = AVURLAsset(url: inputURL, options: nil)
    // Bail out (with a nil session) if the export session can't be created
    // or the asset has no video track, instead of force-unwrapping.
    guard let exportSession = AVAssetExportSession(asset: urlAsset, presetName: AVAssetExportPresetMediumQuality),
          let clipVideoTrack = urlAsset.tracks(withMediaType: .video).first else {
        handler(nil)
        return
    }

    // Render a square frame whose side is the track's natural height.
    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSize(width: clipVideoTrack.naturalSize.height,
                                         height: clipVideoTrack.naturalSize.height)
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30) // 30 fps

    let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: clipVideoTrack)
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: CMTimeMakeWithSeconds(60, preferredTimescale: 30))

    // Translate, then rotate 90°, so the portrait-oriented track is centered
    // inside the square render area.
    let translate = CGAffineTransform(translationX: clipVideoTrack.naturalSize.height,
                                      y: (clipVideoTrack.naturalSize.width - clipVideoTrack.naturalSize.height) / 2)
    let finalTransform = translate.rotated(by: .pi / 2)
    transformer.setTransform(finalTransform, at: .zero)
    instruction.layerInstructions = [transformer]
    videoComposition.instructions = [instruction]

    // Trim the export to the first 30 seconds when the source is longer.
    let timescale = urlAsset.duration.timescale
    let duration = CMTimeGetSeconds(urlAsset.duration)
    if duration < 31.0 {
        print("minore")   // shorter than the cap: export the whole clip
    } else {
        let startTime = CMTimeMakeWithSeconds(0, preferredTimescale: timescale)
        let stopTime = CMTimeMakeWithSeconds(30, preferredTimescale: timescale)
        exportSession.timeRange = CMTimeRangeFromTimeToTime(start: startTime, end: stopTime)
        print("maggiore") // longer: trim to 30 s
    }

    exportSession.outputURL = outputURL
    exportSession.outputFileType = .mov
    exportSession.videoComposition = videoComposition
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.exportAsynchronously {
        handler(exportSession)
    }
}
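For context, this is roughly how I call it — a minimal sketch; the file names and the temporary directory here are just assumptions for illustration:

```swift
import AVFoundation
import Foundation

// Hypothetical call site: "input.mov" is assumed to already exist.
let inputURL = FileManager.default.temporaryDirectory.appendingPathComponent("input.mov")
let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("output.mov")

// AVAssetExportSession fails if a file already exists at outputURL,
// so remove any stale result first.
try? FileManager.default.removeItem(at: outputURL)

compressVideo(inputURL: inputURL, outputURL: outputURL) { exportSession in
    guard let session = exportSession else {
        print("Could not create export session")
        return
    }
    switch session.status {
    case .completed:
        print("Exported to \(session.outputURL?.path ?? "")")
    case .failed, .cancelled:
        print("Export failed: \(String(describing: session.error))")
    default:
        break
    }
}
```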
So far I have only tested it in the Simulator. The function works, but I noticed that CPU usage is very high: it reaches 300% or more, and memory usage climbs to 414 MB. Am I missing something? Is this expected, or have I made a mistake somewhere? Is there a way to reduce the CPU usage, and with it the waiting time? I am new to Swift and this is my first time working with video and AVFoundation.
Thanks.