Reducing unwanted motion blur when capturing with GPUImage

Date: 2015-07-02 03:11:22

Tags: ios swift camera gpuimage

I'm writing an app in Swift and using GPUImage to capture and manipulate images. I'm looking for a way to shorten the exposure time to reduce motion blur: if something moves quickly through the frame, it looks very blurry. My lighting is good, so I'm not sure why the exposure isn't fast enough.

This is how I currently set up GPUImage:

self.stillCamera = GPUImageStillCamera(sessionPreset: AVCaptureSessionPreset640x480, cameraPosition: .Front)
self.stillCamera!.outputImageOrientation = .Portrait

Then I set up the filters I want (a crop and some optional effects), and start the preview:

self.stillCamera?.startCameraCapture()

To capture a frame:

self.finalFilter?.useNextFrameForImageCapture()
var capturedImage = self.finalFilter?.imageFromCurrentFramebuffer()
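For context, the snippets above can be assembled into one sketch. This uses the era-appropriate GPUImage 1.x Swift syntax from the question; the `GPUImageCropFilter` and its crop region are hypothetical stand-ins for the "crop and optional effects" filters mentioned but not shown.

```swift
// Sketch of the setup described above (GPUImage 1.x, Swift 1.x-era syntax).
// GPUImageCropFilter here is an assumed placeholder for the question's filters.
import AVFoundation
import GPUImage

class CaptureController {
    var stillCamera: GPUImageStillCamera?
    var finalFilter: GPUImageCropFilter?

    func setUp() {
        stillCamera = GPUImageStillCamera(sessionPreset: AVCaptureSessionPreset640x480,
                                          cameraPosition: .Front)
        stillCamera!.outputImageOrientation = .Portrait

        // Example crop: a centered region in normalized (0...1) coordinates.
        finalFilter = GPUImageCropFilter(cropRegion: CGRect(x: 0.0, y: 0.125,
                                                            width: 1.0, height: 0.75))
        stillCamera!.addTarget(finalFilter!)

        // Start the live preview feed.
        stillCamera!.startCameraCapture()
    }

    func captureFrame() -> UIImage? {
        // Ask the filter to retain its next rendered frame, then read it back.
        finalFilter?.useNextFrameForImageCapture()
        return finalFilter?.imageFromCurrentFramebuffer()
    }
}
```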

1 Answer:

Answer 0 (score: 0)

The reason you're seeing such long exposure times is that you're using a GPUImageStillCamera and its preview to capture frames. A GPUImageStillCamera uses an AVCaptureStillImageOutput under the hood and enables the live preview feed from that. The photo preview feed runs at ~15 FPS or lower on the various devices, and doesn't provide as clear an image as a GPUImageVideoCamera will.

You either want to capture photos from the AVCaptureStillImageOutput by triggering an actual photo capture (via -capturePhotoProcessedUpToFilter: or the like) or use a GPUImageVideoCamera and capture individual frames like you do above.
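The two approaches the answer suggests might be sketched as follows, in the same GPUImage 1.x Swift style the question uses. Method names are taken from GPUImage's Objective-C headers as I understand them and may differ slightly between versions, so treat this as an assumption-laden sketch rather than a verified API reference.

```swift
import AVFoundation
import GPUImage

// Option 1: keep GPUImageStillCamera, but trigger a real photo capture so the
// full-quality AVCaptureStillImageOutput path is used instead of the preview feed.
// (Exact method name assumed from the Objective-C header; check your version.)
func captureStillPhoto(stillCamera: GPUImageStillCamera, finalFilter: GPUImageOutput) {
    stillCamera.capturePhotoAsImageProcessedUpToFilter(finalFilter) { image, error in
        if error != nil {
            println("Capture failed: \(error)")
        } else {
            // image is the filtered, full-resolution still photo.
        }
    }
}

// Option 2: switch to GPUImageVideoCamera, whose video feed runs at a higher
// frame rate with shorter per-frame exposures, and grab frames as before with
// useNextFrameForImageCapture() / imageFromCurrentFramebuffer().
func makeVideoCamera() -> GPUImageVideoCamera {
    let videoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPreset640x480,
                                          cameraPosition: .Front)
    videoCamera.outputImageOrientation = .Portrait
    return videoCamera
}
```

Option 2 is the smaller change to the code in the question: only the camera class changes, and the existing frame-capture calls stay the same.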