Capturing a photo in RosyWriter

Date: 2017-07-29 22:00:38

Tags: objective-c iphone camera ios10 avcapturesession

I've been making good use of Apple's RosyWriter sample, available from this link, which lets you capture video that is passed through a GLSL layer.

I'd like to extend it so that it can capture not only video but also still photos, using the same capture session, the same video settings, the same resolution, and so on (essentially capturing a single video frame into an image).

It should be straightforward, but I can't seem to work out what I need in order to grab the buffer and save it to the photo library.

As far as I can tell, I should be able to use the delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

but I haven't had any success. Can anyone point me in the right direction?

1 Answer:

Answer 0: (score: 0)

I found a solution.

Rather than using the sampleBuffer, which contains the unmodified sample from the camera, I had to use the renderedPixelBuffer.

The issue was that while the sampleBuffer is a CMSampleBufferRef, the renderedPixelBuffer is a CVPixelBufferRef.

Using CMSampleBufferCreateForImageBuffer, I converted it into a CMSampleBufferRef that I could then use to save an image.
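A minimal sketch of that approach might look like the following. This is not the exact RosyWriter code: the helper name SaveRenderedPixelBuffer is hypothetical, and the save step goes through Core Image to a UIImage rather than through any RosyWriter-specific path; it assumes the rendered buffer is a BGRA pixel buffer as produced by the sample's renderers.

```objective-c
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Hypothetical helper: wraps the rendered CVPixelBufferRef in a
// CMSampleBufferRef via CMSampleBufferCreateForImageBuffer, then
// renders it to a UIImage and saves it to the photo library.
static void SaveRenderedPixelBuffer(CVPixelBufferRef renderedPixelBuffer)
{
    // A format description is required to build the sample buffer.
    CMVideoFormatDescriptionRef formatDescription = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                                 renderedPixelBuffer,
                                                 &formatDescription);

    // Timing can be left invalid for a one-off still frame.
    CMSampleTimingInfo timing = {kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid};

    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                       renderedPixelBuffer,
                                       true,        // dataReady
                                       NULL, NULL,  // no make-ready callback
                                       formatDescription,
                                       &timing,
                                       &sampleBuffer);

    // Render the pixel buffer to a CGImage via Core Image and save it.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:renderedPixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);

    if (sampleBuffer) CFRelease(sampleBuffer);
    if (formatDescription) CFRelease(formatDescription);
}
```

In practice you would call something like this from the rendering path once per requested still, after the GLSL pass has produced renderedPixelBuffer; saving to the photo library also requires the appropriate photo-library usage permission in the app's Info.plist.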