How to chain filters in Metal for iOS?

Date: 2016-10-20 18:33:55

Tags: ios swift ios10 metal

I completed this tutorial by Simon Gladman (@flexmonkey) to capture images from AVFoundation and apply a filter to the output. However, I'm struggling to find a way to replace the blur filter with my own compute shader. In other words, I need to concatenate my custom shader after the YCbCrColorConversion filter mentioned there.

let commandBuffer = commandQueue.makeCommandBuffer()
let commandEncoder = commandBuffer.makeComputeCommandEncoder()
// pipelineState holds the compiled YCbCrColorConversion kernel
commandEncoder.setComputePipelineState(pipelineState)
commandEncoder.setTexture(ytexture, at: 0)
commandEncoder.setTexture(cbcrTexture, at: 1)
commandEncoder.setTexture(drawable.texture, at: 2) // out texture

commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadGroupCount)
commandEncoder.endEncoding()

let inPlaceTexture = UnsafeMutablePointer<MTLTexture>.allocate(capacity: 1)
inPlaceTexture.initialize(to: drawable.texture)

// How to replace this blur with my own filter?????
blur.encode(commandBuffer: commandBuffer, inPlaceTexture: inPlaceTexture, fallbackCopyAllocator: nil)

commandBuffer.present(drawable)
commandBuffer.commit()

Should I create a new command buffer, command encoder, and a separate pipeline state that compiles the second kernel function, so that the output of the first filter becomes the input to the second? Is there a more efficient way to do this, or is this optimal?

I'm a beginner with Metal, so any explanations on how the pipeline works are highly appreciated.

1 Answer:

Answer 0 (score: 6)

You don't need to create a new command buffer or an additional compute command encoder, but you do need to create a compute pipeline state that uses your own kernel function. You should do this once, during initialization, wherever you're currently creating the YCbCr conversion pipeline state.
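For illustration, a minimal sketch of that one-time setup, assuming your kernel function is called myCustomKernel and lives in the default library (the function and property names are placeholders, not taken from the tutorial):

import Metal

// Illustrative properties; the second pipeline state sits alongside the existing one.
var pipelineState: MTLComputePipelineState!        // YCbCrColorConversion (already exists)
var customPipelineState: MTLComputePipelineState!  // your own kernel

func setUpPipelines(device: MTLDevice) {
    // makeDefaultLibrary() was named newDefaultLibrary() in Swift 3 / iOS 10.
    guard let library = device.makeDefaultLibrary(),
          let ycbcrFunction = library.makeFunction(name: "YCbCrColorConversion"),
          let customFunction = library.makeFunction(name: "myCustomKernel") else {
        fatalError("Kernel functions not found in the default Metal library")
    }
    pipelineState = try! device.makeComputePipelineState(function: ycbcrFunction)
    customPipelineState = try! device.makeComputePipelineState(function: customFunction)
}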

To chain the effects together, you need to create an intermediate texture that acts as the output texture of the YCbCr conversion and the input texture of your kernel. The drawable's texture is then the output texture of your kernel function. You can dispatch your own kernel's work in the same way you currently dispatch the YCbCr conversion work (i.e., with the same threads per threadgroup and the same threadgroup count).
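A sketch of how the question's encoding could change, in the same Swift 3 style the question uses. It assumes intermediateTexture and customPipelineState exist as described above, and that the custom kernel takes its input at texture index 0 and its output at index 1 (those indices depend on your kernel's signature):

let commandBuffer = commandQueue.makeCommandBuffer()
let commandEncoder = commandBuffer.makeeComputeCommandEncoder()

// Pass 1: YCbCr conversion writes into the intermediate texture instead of the drawable.
commandEncoder.setComputePipelineState(pipelineState)
commandEncoder.setTexture(ytexture, at: 0)
commandEncoder.setTexture(cbcrTexture, at: 1)
commandEncoder.setTexture(intermediateTexture, at: 2)
commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadGroupCount)

// Pass 2: the custom kernel reads the intermediate texture and writes to the drawable.
commandEncoder.setComputePipelineState(customPipelineState)
commandEncoder.setTexture(intermediateTexture, at: 0)
commandEncoder.setTexture(drawable.texture, at: 1)
commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadGroupCount)

commandEncoder.endEncoding()
commandBuffer.present(drawable)
commandBuffer.commit()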

The intermediate texture should have the same dimensions and format as the drawable. You can create it lazily and keep a reference to it, recreating it whenever the drawable's size changes.
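A minimal sketch of that lazy creation, again in Swift 3 style; the stored property, helper name, and usage flags are illustrative assumptions:

import Metal
import QuartzCore

var intermediateTexture: MTLTexture?

func intermediateTexture(matching drawable: CAMetalDrawable, device: MTLDevice) -> MTLTexture {
    let target = drawable.texture
    // Reuse the cached texture while its size and format still match the drawable.
    if let existing = intermediateTexture,
       existing.width == target.width,
       existing.height == target.height,
       existing.pixelFormat == target.pixelFormat {
        return existing
    }
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: target.pixelFormat,
                                                              width: target.width,
                                                              height: target.height,
                                                              mipmapped: false)
    // The first kernel writes to it and the second reads from it.
    descriptor.usage = [.shaderRead, .shaderWrite]
    let texture = device.makeTexture(descriptor: descriptor) // optional on newer SDKs; unwrap there
    intermediateTexture = texture
    return texture
}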