I'm trying to apply a CIFilter to an AVAsset and then save the asset with the filter applied. The way I'm doing this is by using an AVAssetExportSession with its videoComposition set to an AVMutableVideoComposition object that uses a custom AVVideoCompositing class.
I also set the instructions of the AVMutableVideoComposition object to a custom composition instruction class (subclassing AVMutableVideoCompositionInstruction). This class is passed a track ID, along with a few other unimportant variables.
Unfortunately, I've run into a problem: the startVideoCompositionRequest: function in my custom video compositor class (conforming to AVVideoCompositing) is not being called correctly.
When I set the passthroughTrackID variable of my custom instruction class to the track ID, the startVideoCompositionRequest(request) function in my AVVideoCompositing is not called.
However, when I don't set the passthroughTrackID variable of the custom instruction class, startVideoCompositionRequest(request) is called, but not correctly: printing request.sourceTrackIDs results in an empty array, and request.sourceFrameByTrackID(trackID) results in a nil value.
Something I find interesting is that the cancelAllPendingVideoCompositionRequests: function is always called twice when trying to export the video with the filter. It is either called once before startVideoCompositionRequest: and once after, or simply twice in a row when startVideoCompositionRequest: isn't called.
I've created three classes for exporting the video with the filter. Here's the utility class, which basically just contains an export function and calls all of the required code:
class VideoFilterExport{
    let asset: AVAsset
    init(asset: AVAsset){
        self.asset = asset
    }

    func export(toURL url: NSURL, callback: (url: NSURL?) -> Void){
        guard let track: AVAssetTrack = self.asset.tracksWithMediaType(AVMediaTypeVideo).first else{callback(url: nil); return}

        let composition = AVMutableComposition()
        let compositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        do{
            try compositionTrack.insertTimeRange(track.timeRange, ofTrack: track, atTime: kCMTimeZero)
        }
        catch _{callback(url: nil); return}

        let videoComposition = AVMutableVideoComposition(propertiesOfAsset: composition)
        videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
        videoComposition.frameDuration = CMTimeMake(1, 30)
        videoComposition.renderSize = compositionTrack.naturalSize

        let instruction = VideoFilterCompositionInstruction(trackID: compositionTrack.trackID)
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.asset.duration)
        videoComposition.instructions = [instruction]

        let session: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetMediumQuality)!
        session.videoComposition = videoComposition
        session.outputURL = url
        session.outputFileType = AVFileTypeMPEG4
        session.exportAsynchronouslyWithCompletionHandler(){
            callback(url: url)
        }
    }
}
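For context, a minimal caller for the utility class above might look like the following. This is a sketch of my own: the input resource name, the output path, and the completion handling are all hypothetical, and only the VideoFilterExport API shown above is assumed.

```swift
// Hypothetical caller - assumes a bundled video named "input.mp4"
// and writes the filtered copy into the temporary directory.
let asset = AVAsset(URL: NSBundle.mainBundle().URLForResource("input", withExtension: "mp4")!)
let outputURL = NSURL(fileURLWithPath: NSTemporaryDirectory() + "filtered.mp4")

let exporter = VideoFilterExport(asset: asset)
exporter.export(toURL: outputURL){ url in
    // url is nil if any step of the export failed
    print(url)
}
```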
Here are the other two classes; I've put them into one code block to shorten this post:
// Video Filter Composition Instruction Class - from what I gather,
// AVVideoCompositionInstruction is used only to pass values to
// the AVVideoCompositing class
class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: ImageFilterGroup
    let context: CIContext

    // When I leave this line as-is, startVideoCompositionRequest: isn't called.
    // When commented out, startVideoCompositionRequest(request) is called, but there
    // are no valid CVPixelBuffers provided by request.sourceFrameByTrackID(below value)
    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return []}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: ImageFilterGroup, context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        //self.timeRange = timeRange
        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
// My custom AVVideoCompositing class. This is where the problem lies -
// although I don't know if this is the root of the problem
class VideoFilterCompositor : NSObject, AVVideoCompositing{
    var requiredPixelBufferAttributesForRenderContext: [String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA), // The video is in 32 BGRA
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]
    var sourcePixelBufferAttributes: [String : AnyObject]? = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA),
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]

    let renderQueue = dispatch_queue_create("co.getblix.videofiltercompositor.renderingqueue", DISPATCH_QUEUE_SERIAL)

    override init(){
        super.init()
    }

    func startVideoCompositionRequest(request: AVAsynchronousVideoCompositionRequest){
        // This code block is never executed when the
        // passthroughTrackID variable is in the above class
        autoreleasepool(){
            dispatch_async(self.renderQueue){
                guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction else{
                    request.finishWithError(NSError(domain: "getblix.co", code: 760, userInfo: nil))
                    return
                }
                guard let pixels = request.sourceFrameByTrackID(instruction.passthroughTrackID) else{
                    // This code block is executed when I comment out the
                    // passthroughTrackID variable in the above class
                    request.finishWithError(NSError(domain: "getblix.co", code: 761, userInfo: nil))
                    return
                }
                // I have not been able to get the code to reach this point
                // This function is either not called, or the guard
                // statement above executes
                let image = CIImage(CVPixelBuffer: pixels)
                let filtered: CIImage = //apply the filter here

                let width = CVPixelBufferGetWidth(pixels)
                let height = CVPixelBufferGetHeight(pixels)
                let format = CVPixelBufferGetPixelFormatType(pixels)

                var newBuffer: CVPixelBuffer?
                CVPixelBufferCreate(kCFAllocatorDefault, width, height, format, nil, &newBuffer)

                if let buffer = newBuffer{
                    instruction.context.render(filtered, toCVPixelBuffer: buffer)
                    request.finishWithComposedVideoFrame(buffer)
                }
                else{
                    request.finishWithComposedVideoFrame(pixels)
                }
            }
        }
    }

    func renderContextChanged(newRenderContext: AVVideoCompositionRenderContext){
        // I don't have any code in this block
    }

    // This is interesting - this is called twice,
    // Once before startVideoCompositionRequest is called,
    // And once after. In the case when startVideoCompositionRequest
    // Is not called, this is simply called twice in a row
    func cancelAllPendingVideoCompositionRequests(){
        dispatch_barrier_async(self.renderQueue){
            print("Cancelled")
        }
    }
}
I've been looking to Apple's AVCustomEdit sample project for guidance on this, but I can't seem to find the cause.
How can I get the request.sourceFrameByTrackID: function to be called correctly, and get a valid CVPixelBuffer for each frame?
Answer 0 (score: 8)
It turns out that the requiredSourceTrackIDs variable in the custom AVVideoCompositionInstruction class (VideoFilterCompositionInstruction in the question) must be set to an array containing the track IDs:
override var requiredSourceTrackIDs: [NSValue]{
    get{
        return [
            NSNumber(value: Int(self.trackID))
        ]
    }
}
So the final custom composition instruction class is:
class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: [CIFilter]
    let context: CIContext

    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return [NSNumber(value: Int(self.trackID))]}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: [CIFilter], context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder){
        fatalError("init(coder:) has not been implemented")
    }
}
All of the code for this utility is also on GitHub.
Answer 1 (score: 5)
As you've noted, having passthroughTrackID return the track you want to filter isn't the right approach; you need to return the track to be filtered from requiredSourceTrackIDs instead. (And it looks like once you do that, it doesn't matter whether you also return it from passthroughTrackID.) To answer the remaining question of why it works this way...
The docs for passthroughTrackID and requiredSourceTrackIDs certainly aren't Apple's clearest writing ever. (File a bug about it and they might improve.) But if you look closely at the description of the former, there's a hint (emphasis added):
"If for the duration of the instruction, the video composition result is one of the source frames, this property returns the corresponding track ID. The compositor won't be run for the duration of the instruction, and the proper source frame will be used instead."
So, you use passthroughTrackID only when making an instruction class that passes a single track through without processing.
If you plan to perform any image processing, even if it's just to a single track with no compositing, specify that track in requiredSourceTrackIDs instead.
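To make the distinction concrete, here's a minimal sketch of the two kinds of instruction (the class names are mine, for illustration only; the overrides mirror the ones discussed above):

```swift
// Case 1: pure pass-through. The custom compositor is never invoked
// for this instruction; the source frame is used directly.
class PassthroughInstruction: AVMutableVideoCompositionInstruction {
    let trackID: CMPersistentTrackID
    override var passthroughTrackID: CMPersistentTrackID { return self.trackID }
    override var requiredSourceTrackIDs: [NSValue] { return [] }

    init(trackID: CMPersistentTrackID) {
        self.trackID = trackID
        super.init()
    }
    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

// Case 2: processing. Listing the track in requiredSourceTrackIDs is
// what makes request.sourceFrameByTrackID(_:) return a valid buffer
// inside the custom compositor.
class ProcessingInstruction: AVMutableVideoCompositionInstruction {
    let trackID: CMPersistentTrackID
    override var requiredSourceTrackIDs: [NSValue] { return [NSNumber(value: Int(self.trackID))] }

    init(trackID: CMPersistentTrackID) {
        self.trackID = trackID
        super.init()
    }
    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```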