iOS 10 breaks custom CIFilter

Date: 2016-09-25 00:31:24

Tags: swift glsl avfoundation ios10 cifilter

I wrote a chroma-key filter that makes the background of an MPEG movie transparent, so a movie file can be used for longer animations without a lengthy PNG sequence (as is commonly done for certain kinds of iOS animations).

I'm using AVPlayer, AVVideoComposition, and a custom CIFilter to render the video over a background picture. The background picture can be changed dynamically by the user interacting with the app.

This worked fine until iOS 10 came out; now it is broken.

What happens now is that the video plays, but no chroma keying takes place, and Xcode repeatedly emits the following error:

need a swizzler so that YCC420v can be written.

This is the image the CIFilter is supposed to produce:

[Image: result with custom CIFilter working (pre iOS 10)]

And this is what it produces instead (as of iOS 10):

[Image: result with broken CIFilter (post iOS 10)]

Here is the part of my code that creates the EAGLContext and applies the custom CIFilter:

    let myEAGLContext = EAGLContext.init(API: EAGLRenderingAPI.OpenGLES2)
    //let cicontext = CIContext.init(EAGLContext: myEAGLContext, options: [kCIContextWorkingColorSpace: NSNull()])
    let cicontext = CIContext.init(EAGLContext: myEAGLContext)

    let filter = ChromaKeyFilter()
    filter.activeColor = CIColor.init(red: 0, green:1.0, blue: 0.0)
    filter.threshold = self.threshold

    //most of below comes from the "WWDC15 What's New In Core Image" slides
    let vidComp = AVVideoComposition(asset: videoAsset!,
                                     applyingCIFiltersWithHandler:
        {
            request in
            let input = request.sourceImage.imageByClampingToExtent()

            filter.inputImage = input

            let output = filter.outputImage!.imageByClampingToExtent()
            request.finishWithImage(output, context: cicontext)
            self.reloadInputViews()

    })

    let playerItem = AVPlayerItem(asset: videoAsset!)
    playerItem.videoComposition = vidComp
    self.player = AVPlayer(playerItem: playerItem)
    self.playerInitialized = true
    let layer = AVPlayerLayer(player: player)

    self.subviews.forEach { subview in
        subview.removeFromSuperview()
    }

    layer.frame = CGRect(x: 0.0, y: 0.0, width: self.frame.size.width, height: self.frame.size.height)
    self.layer.addSublayer(layer)

And here is the code for the custom CIFilter:

    private class ChromaKeyFilter : CIFilter {
        private var kernel: CIColorKernel!
        var inputImage: CIImage?
        var activeColor = CIColor(red: 0.0, green: 1.0, blue: 0.0)
        var threshold: Float = 0.05

        override init() {
            super.init()
            kernel = createKernel()
        }

        required init(coder aDecoder: NSCoder) {
            super.init(coder: aDecoder)!
            kernel = createKernel()
        }

        override var outputImage: CIImage? {
            if let inputImage = inputImage {
                let dod = inputImage.extent
                let args = [inputImage as AnyObject, activeColor as AnyObject, threshold as AnyObject]
                return kernel.applyWithExtent(dod, arguments: args)
            }
            return nil
        }

        private func createKernel() -> CIColorKernel {
            let kernelString =
                "kernel vec4 chromaKey( __sample s, __color c, float threshold ) { \n" +
                    //below kernel was adapted from the GPUImage custom chromakey filter:
                    //https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m#L30
                    "  float maskY = 0.2989 * c.r + 0.5866 * c.g + 0.1145 * c.b;\n" +
                    "  float maskCr = 0.7132 * (c.r - maskY);\n" +
                    "  float maskCb = 0.5647 * (c.b - maskY);\n" +
                    "  float Y = 0.2989 * s.rgb.r + 0.5866 * s.rgb.g + 0.1145 * s.rgb.b;\n" +
                    "  float Cr = 0.7132 * (s.rgb.r - Y);\n" +
                    "  float Cb = 0.5647 * (s.rgba.b - Y);\n" +
                    "  float blendValue = smoothstep(threshold, threshold + 0.5, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));\n" +
                    "  return blendValue * vec4( s.rgb, 1.0 ); \n" +
            "}"
            let kernel = CIColorKernel(string: kernelString)
            return kernel!
        }
    }
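
For illustration, the filter can be exercised on a single still frame outside the video pipeline like this (a rough sketch only; `greenScreenFrame` stands in for a CIImage with a green background and is not part of the code above):

    // Sketch: run ChromaKeyFilter on one CIImage and read the result back.
    let testContext = CIContext(options: nil)
    let testFilter = ChromaKeyFilter()
    testFilter.activeColor = CIColor(red: 0.0, green: 1.0, blue: 0.0)
    testFilter.threshold = 0.05
    testFilter.inputImage = greenScreenFrame   // placeholder CIImage

    if let keyed = testFilter.outputImage {
        // Pixels close to pure green should come out with alpha near 0.
        let cgImage = testContext.createCGImage(keyed, fromRect: keyed.extent)
        let result = UIImage(CGImage: cgImage)
        print(result)
    }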

Does anyone have any idea why this breaks only now? Interestingly, it is only broken on an actual phone; it still works in the simulator, although much more slowly than before iOS 10.

1 Answer:

Answer (score: 4)

It looks like some part of the iOS 10 (device) pipeline, perhaps the player layer, has switched to YUV.

Setting the AVPlayerLayer's pixelBufferAttributes to BGRA fixes the missing alpha and silences the logged error:

    layer.pixelBufferAttributes = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]
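
To see where this goes in the question's setup, the fix slots into the layer creation roughly as follows (a sketch in the answer's Swift 3 syntax, not the answerer's verbatim code; the "YCC420v" in the log presumably refers to kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, whose FourCC is '420v'):

    let layer = AVPlayerLayer(player: player)

    // Ask the layer for BGRA buffers instead of the YUV ('420v') buffers it
    // appears to request by default on iOS 10 hardware, so the filter's alpha
    // channel survives all the way to the screen.
    layer.pixelBufferAttributes =
        [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]

    layer.frame = CGRect(x: 0.0, y: 0.0, width: self.frame.size.width, height: self.frame.size.height)
    self.layer.addSublayer(layer)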