CIImage back to CMSampleBuffer

Date: 2018-03-01 15:51:14

Tags: ios swift core-image video-recording

I am recording video (an .mp4 file) with AVAssetWriter, using CMSampleBuffer data from the video and audio inputs.

While recording I want to process the frames, so I am converting each CMSampleBuffer to a CIImage and processing that.

But how do I get the newly processed image back into a CMSampleBuffer?

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if output == videoOutput {
        let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let ciimage: CIImage = CIImage(cvPixelBuffer: imageBuffer)
        ... // my code to process the CIImage (for example, add augmented reality)
        // but how do I convert it back to a CMSampleBuffer?
        // because AVAssetWriterInput needs a CMSampleBuffer to encode video/audio into the file
        ...
    }
    ...
}

1 answer:

Answer 0 (score: 2):

You need to use CIContext's render(_:to:bounds:colorSpace:) method to render the CIImage into a CVPixelBuffer.
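
A minimal sketch of that rendering step is shown below; the ciContext, processedImage and pixelBuffer names are placeholders for your own objects, and the pixel buffer is assumed to be writable and the same size as the image:

import CoreImage
import CoreVideo

// Reuse one CIContext; creating a new context per frame is expensive.
let ciContext = CIContext()

// Render the processed CIImage into the pixel buffer's memory.
func render(_ processedImage: CIImage, into pixelBuffer: CVPixelBuffer) {
    ciContext.render(processedImage,
                     to: pixelBuffer,
                     bounds: processedImage.extent,
                     colorSpace: CGColorSpaceCreateDeviceRGB())
}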

Then you can create a CMSampleBuffer from that CVPixelBuffer, for example with CMSampleBufferCreateReadyWithImageBuffer(_:_:_:_:_:).
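
A sketch of that step, assuming the argument labels of recent Swift SDKs (the helper name is just for illustration); it wraps the rendered CVPixelBuffer in a new CMSampleBuffer and copies the timing from the original sample buffer so the AVAssetWriterInput receives the correct timestamps:

import CoreMedia
import CoreVideo

// Wrap a processed CVPixelBuffer in a CMSampleBuffer, reusing the original timing.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      timingFrom original: CMSampleBuffer) -> CMSampleBuffer? {
    // Describe the pixel buffer's format for the new sample buffer.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // Copy duration and timestamps from the original buffer.
    var timing = CMSampleTimingInfo(
        duration: CMSampleBufferGetDuration(original),
        presentationTimeStamp: CMSampleBufferGetPresentationTimeStamp(original),
        decodeTimeStamp: CMSampleBufferGetDecodeTimeStamp(original))

    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}

Alternatively, for the video track you can skip the CMSampleBuffer round-trip entirely and append the CVPixelBuffer directly through an AVAssetWriterInputPixelBufferAdaptor, passing the original presentation timestamp.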

For efficiency you will probably want to use a pool of CVPixelBuffers; an example of this is shown in Apple's AVCamPhotoFilter sample code. In particular, look at the RosyCIRenderer class.
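
A sketch of creating and using such a pool (the width, height and 32BGRA pixel format here are assumptions; match them to your capture format):

import CoreVideo

// Create a pool of reusable, IOSurface-backed pixel buffers.
func makePixelBufferPool(width: Int, height: Int) -> CVPixelBufferPool? {
    let attributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: width,
        kCVPixelBufferHeightKey as String: height,
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]
    ]
    var pool: CVPixelBufferPool?
    CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attributes as CFDictionary, &pool)
    return pool
}

// Dequeue a buffer from the pool for each frame instead of allocating a new one.
func dequeuePixelBuffer(from pool: CVPixelBufferPool) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
    return pixelBuffer
}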

Also see this answer, which may help you: Applying a CIFilter to a Video File and Saving it.