Applying a CIFilter on Cocoa

Date: 2016-10-21 13:17:32

Tags: macos cocoa avfoundation core-image cifilter

Apple's docs give an example of applying a CIFilter to an AVAsset:

let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output to the bounds of the original image
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Provide the filter output to the composition
    request.finish(with: output, context: nil)
})

This works well on some videos (it seems more efficient with ones using the AAC codec), but on others the CPU usage spikes and the video never finishes processing. Is there a way to move this onto the GPU to speed it up / not hog the CPU so much? I saw this question for iOS, but CIContext's contextWithEAGLContext: isn't available on OS X. I'm new to AVFoundation / video processing — is there an equivalent on OS X?

Note: I don't want to do this in real time; I just want to apply the filter and export the file to the filesystem using the GPU.
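For reference, the export-to-file step the question describes goes through AVAssetExportSession with the composition attached. A minimal sketch, assuming `asset` and `composition` from the snippet above; the preset and output path are placeholders:

```swift
import AVFoundation

// Assumes `asset` and `composition` already exist (see the example above).
// The output URL and preset are illustrative choices, not from the question.
let export = AVAssetExportSession(asset: asset,
                                  presetName: AVAssetExportPresetHighestQuality)!
export.outputFileType = .mov
export.outputURL = URL(fileURLWithPath: "/tmp/filtered.mov")

// Attaching the composition makes the exporter run the CIFilter handler per frame.
export.videoComposition = composition

export.exportAsynchronously {
    switch export.status {
    case .completed:
        print("Export finished")
    case .failed:
        print("Export failed: \(String(describing: export.error))")
    default:
        break
    }
}
```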

1 answer:

Answer 0: (score: 2)

On macOS, use contextWithCGLContext for OpenGL instead:

+ (CIContext *)contextWithCGLContext:(CGLContextObj)cglctx
                         pixelFormat:(nullable CGLPixelFormatObj)pixelFormat
                          colorSpace:(nullable CGColorSpaceRef)colorSpace
                             options:(nullable NSDictionary<NSString*,id> *)options;
Or, if you'd rather use Metal, there's contextWithMTLDevice::

+ (CIContext *)contextWithMTLDevice:(id<MTLDevice>)device;
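Putting the answer together with the question's snippet: create one GPU-backed CIContext up front and pass it to request.finish(with:context:) instead of nil. A sketch, assuming `asset` is already loaded; the variable names are my own:

```swift
import AVFoundation
import CoreImage
import Metal

// Create a single Metal-backed CIContext and reuse it for every frame.
let device = MTLCreateSystemDefaultDevice()!
let gpuContext = CIContext(mtlDevice: device)

let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Passing the Metal-backed context (rather than nil) keeps rendering on the GPU.
    request.finish(with: output, context: gpuContext)
})
```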