I've spent most of the day digging through StackOverflow, and while there are plenty of good posts on this topic, I haven't found anything that solves my problem.
I have no problem writing video files with AVAssetWriter. If I save the resulting video file to my camera roll, it plays back correctly and in the expected orientation. Here is how I set it up:
init(fileUrl: URL!, height: Int, width: Int) {
    // Setup the file writer instance
    fileWriter = try? AVAssetWriter(outputURL: fileUrl, fileType: AVFileType.mov)
    // Setup the video settings
    let videoOutputSettings: Dictionary<String, AnyObject> = [
        AVVideoCodecKey  : AVVideoCodecType.hevc as AnyObject,
        AVVideoWidthKey  : width as AnyObject,
        AVVideoHeightKey : height as AnyObject
    ]
    // Setup the source pixel buffer attributes
    let sourcePixelBufferAttributesDictionary = [
        String(kCVPixelBufferPixelFormatTypeKey) : Int(kCVPixelFormatType_32BGRA),
        String(kCVPixelBufferWidthKey)  : Int(width),
        String(kCVPixelBufferHeightKey) : Int(height),
        String(kCVPixelFormatOpenGLESCompatibility) : kCFBooleanTrue
    ] as [String : Any]
    // Setup the video input
    videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoOutputSettings)
    // Data should be expected in real time
    videoInput.expectsMediaDataInRealTime = true
    // Rotate the output 90 degrees
    videoInput.transform = CGAffineTransform(rotationAngle: CGFloat.pi / 2.0)
    // Setup the pixel buffer adaptor
    assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoInput,
                                                                       sourcePixelBufferAttributes: sourcePixelBufferAttributesDictionary)
    // Add the input
    fileWriter.add(videoInput)
}
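For completeness, a rough sketch of how a writer configured this way is typically driven; pixelBuffer and presentationTime below are placeholder values rather than names from the original code:

    // Begin the writing session
    if fileWriter.startWriting() {
        fileWriter.startSession(atSourceTime: kCMTimeZero)
    }
    // Append frames as they arrive (pixelBuffer / presentationTime are placeholders)
    if videoInput.isReadyForMoreMediaData {
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)
    }
    // Finish once all frames have been appended
    videoInput.markAsFinished()
    fileWriter.finishWriting {
        print("Finished writing")
    }

Per the AVAssetWriterInput.transform documentation, the transform set in the init above is stored as an orientation hint in the output track's metadata rather than being applied to the pixel data itself.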
Then I want to use AVMutableComposition to save the video and apply an image overlay. That part works, but the video orientation comes out wrong:
func postProcessVideo(toFPS: Double, sourceVideo: URL, destination: URL, filterImage: UIImage?, completionHandler: @escaping (_ response: Bool) -> ()) {
    // Log
    print("Received call to begin post-processing video at:", sourceVideo)
    // Instantiate the AVMutableComposition
    let composition = AVMutableComposition()
    // Setup the video asset
    let vidAsset = AVURLAsset(url: sourceVideo, options: [:])
    // Get the video tracks
    let vtrack = vidAsset.tracks(withMediaType: AVMediaType.video)
    // Use the first video track as the asset track
    let videoTrack: AVAssetTrack = vtrack[0]
    // Setup the video time range
    let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)
    // Setup the composition video track
    let compositionvideoTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: CMPersistentTrackID())!
    // Insert the expected time range
    do {
        try compositionvideoTrack.insertTimeRange(vid_timerange, of: videoTrack, at: kCMTimeZero)
    } catch {
        print("Failed to insert time range:", error)
    }
    // Setup the preferred transform
    compositionvideoTrack.preferredTransform = videoTrack.preferredTransform
    // Stretch the time range to 3x the original duration
    let finalTimeScale: Int64 = vidAsset.duration.value * 3
    // Adjust the video track duration
    compositionvideoTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, vidAsset.duration), toDuration: CMTimeMake(finalTimeScale, vidAsset.duration.timescale))
    // Setup the effect size
    let size = videoTrack.naturalSize
    // Setup the overlay image layer
    let imglogo = UIImage(named: "gif1.png")
    let imglayer = CALayer()
    imglayer.contents = imglogo?.cgImage
    imglayer.frame = CGRect(x: 0, y: 0, width: size.width, height: size.height)
    imglayer.opacity = 0.0
    // Setup the video layer
    let videolayer = CALayer()
    // Setup the video layer frame
    videolayer.frame = CGRect(x: 0, y: 0, width: size.width, height: size.height)
    // Setup the parent layer
    let parentlayer = CALayer()
    // Setup the parent layer frame
    parentlayer.frame = CGRect(x: 0, y: 0, width: size.width, height: size.height)
    // Add the video layer
    parentlayer.addSublayer(videolayer)
    // Add the overlay layer
    parentlayer.addSublayer(imglayer)
    // Setup the layer composition
    let layercomposition = AVMutableVideoComposition()
    // Setup the desired frame rate
    layercomposition.frameDuration = CMTimeMake(1, Int32(toFPS))
    // Setup the render size
    layercomposition.renderSize = size
    // Setup the animation tool
    layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videolayer, in: parentlayer)
    // Setup the instruction for the overlay
    let instruction = AVMutableVideoCompositionInstruction()
    // Setup the desired time range
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    // Grab the composition's video track
    let videotrack = composition.tracks(withMediaType: AVMediaType.video)[0]
    // Setup the layer instruction
    let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)
    // Attach the layer instructions
    instruction.layerInstructions = [layerinstruction]
    // Attach the composition instructions
    layercomposition.instructions = [instruction]
    // Instantiate the asset export session
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    // Setup the video composition
    assetExport?.videoComposition = layercomposition
    // Setup the output file type
    assetExport?.outputFileType = AVFileType.mov
    // Setup the destination
    assetExport?.outputURL = destination
    // Export the video
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport?.status {
        case .failed?:
            print("failed \(assetExport!.error)")
        case .cancelled?:
            print("cancelled \(assetExport!.error)")
        default:
            print("Movie complete")
            completionHandler(true)
        }
    })
}
Apologies for the length of this post, but does anything stand out that could explain the orientation change during export?
Thanks!
Answer 0 (score: 1)
I had this same orientation problem, and this is how I solved it:
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack setPreferredTransform:CGAffineTransformRotate(CGAffineTransformMakeScale(-1, 1), M_PI)];
By rotating and scaling it. It's in Objective-C, but you can easily convert it. You only need to change this:
// Setup the preferred transform
compositionvideoTrack.preferredTransform = videoTrack.preferredTransform
and supply the transform manually instead of relying on preferredTransform; a rough Swift equivalent is sketched below.
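A minimal Swift conversion of that Objective-C snippet might look like the following; this is only a sketch, and the exact mirror/rotation combination you need depends on how the source video was captured:

    // Same transform as the Objective-C snippet above: mirror horizontally, then rotate by pi
    // (compositionvideoTrack is the AVMutableCompositionTrack from the question's postProcessVideo)
    compositionvideoTrack.preferredTransform = CGAffineTransform(scaleX: -1, y: 1).rotated(by: CGFloat.pi)

If the export still comes out rotated or mirrored, tweak the scale and rotation values until playback matches the orientation the video was recorded in.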