iPhone photo capture with AVFoundation: the image is captured later than the flash fires

Date: 2018-07-31 07:02:51

Tags: iphone swift camera avfoundation ios11

We are building a camera app with flash, rear camera, and photo capture, using current Swift code and the AVFoundation API. The code needs to support iOS 10 and later.

The problem we are running into is that the camera flash fires, but it is not captured in the photo (essentially the flash fires before the photo is taken, or the photo is taken too late relative to the flash), which makes our flash feature unusable.

Here is our camera capture code:

//Function to capture the image from the camera session -> this gets called from the ViewController Outlet action OnCapture
func capture() throws {
    guard captureSession.isRunning else {
        throw CameraRuntimeError.captureSessionIsMissing
    }
    let settings = AVCapturePhotoSettings()
    if getCurrentCamera().isFlashAvailable {
        settings.flashMode = self.flashMode
    }
    self.photoOutput?.capturePhoto(with: settings, delegate: self)
}
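
For reference, this is roughly how the capture session and `photoOutput` referenced above might be wired up. This is a minimal sketch only; the question does not show the real setup code, so `configureSession()` and its details are illustrative assumptions built around the names used above (`captureSession`, `photoOutput`, `getCurrentCamera()`):

```swift
// Illustrative sketch only -- the question does not include the actual session setup.
func configureSession() throws {
    captureSession.beginConfiguration()
    captureSession.sessionPreset = .photo

    // getCurrentCamera() is assumed to return the active AVCaptureDevice (e.g. the back camera).
    let camera = getCurrentCamera()
    let input = try AVCaptureDeviceInput(device: camera)
    if captureSession.canAddInput(input) {
        captureSession.addInput(input)
    }

    let output = AVCapturePhotoOutput()
    if captureSession.canAddOutput(output) {
        captureSession.addOutput(output)
        photoOutput = output
    }

    captureSession.commitConfiguration()
    captureSession.startRunning()
}
```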

And here is the delegate:

extension CameraFunctions: AVCapturePhotoCaptureDelegate {
    private static let failedToConvertToJPEGErrorCode = "JPEGERROR"
    private static let failedToCaptureImage = "CAMERROR"

    public func photoOutput(_ captureOutput: AVCapturePhotoOutput,
                            didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?,
                            previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                            resolvedSettings: AVCaptureResolvedPhotoSettings,
                            bracketSettings: AVCaptureBracketedStillImageSettings?, error: Swift.Error?) {
        if error != nil {
            onPhotoCaptured(StringResult(error: ServicesError(CameraFunctions.failedToCaptureImage, error!.localizedDescription)))
        }
        if let buffer = photoSampleBuffer, let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil) {
            let encodedString = //DO ENCODING OF THE PHOTO
            onPhotoCaptured(encodedString)
        } else {
            onPhotoCaptured(StringResult(error: ServicesError(CameraFunctions.failedToConvertToJPEGErrorCode, CameraRuntimeError.failedToConvertImageToJPEG.localizedDescription)))
        }
        closeCaptureSession()
    }
}
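
Note that `photoOutput(_:didFinishProcessingPhoto:previewPhoto:resolvedSettings:bracketSettings:error:)` is the iOS 10 callback and is deprecated on iOS 11. For comparison only, a minimal sketch of the iOS 11+ replacement (error reporting and encoding abbreviated; this is not part of the question's code):

```swift
// iOS 11+ variant of the callback used above (sketch only).
@available(iOS 11.0, *)
public func photoOutput(_ output: AVCapturePhotoOutput,
                        didFinishProcessingPhoto photo: AVCapturePhoto,
                        error: Error?) {
    guard error == nil, let data = photo.fileDataRepresentation() else {
        // Report the failure to the caller, as in the iOS 10 callback above.
        return
    }
    // `data` holds the encoded photo (JPEG here, given the requested settings).
}
```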

onPhotoCaptured is defined in the ViewController.

Please let us know if we are doing something wrong.

1 Answer:

Answer 0 (score: 1)

Setting the prepared photo settings on the photo output resolves the issue:

func capture(_ delegate: AVCapturePhotoCaptureDelegate, _ onError: @escaping (Error) -> Void) throws {
    guard captureSession.isRunning else {
        throw CameraRuntimeError.captureSessionIsMissing
    }
    let settings: AVCapturePhotoSettings
    if #available(iOS 11.0, *) {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        settings.isAutoStillImageStabilizationEnabled = true
    } else {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
    }
    if getCurrentCamera().isFlashAvailable {
        settings.flashMode = self.flashMode
    }

    //This statement did the magic
    self.photoOutput?.setPreparedPhotoSettingsArray([settings]) { (suc: Bool, err: Error?) -> Void in
        if suc {
            self.photoOutput?.capturePhoto(with: settings, delegate: delegate)
        }
        if err != nil {
            onError(err!)
        }
    }
}
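
Calling `setPreparedPhotoSettingsArray(_:completionHandler:)` lets the photo output allocate resources for the requested settings ahead of the capture, and triggering `capturePhoto(with:delegate:)` only from its completion handler appears to give the flash and exposure time to synchronize. A hypothetical call site in the view controller (the `cameraFunctions` property and the error handling here are assumptions, not part of the answer):

```swift
@IBAction func onCapture(_ sender: Any) {
    do {
        // CameraFunctions itself acts as the AVCapturePhotoCaptureDelegate.
        try cameraFunctions.capture(cameraFunctions) { error in
            print("Capture failed: \(error.localizedDescription)")
        }
    } catch {
        print("Capture session is not running: \(error)")
    }
}
```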