Using AVCapturePhotoOutput in iOS 10 - NSGenericException

Asked: 2017-03-28 01:47:48

Tags: ios nsexception avcapture

I am currently trying to figure out how to use iOS 10's AVCapturePhotoOutput API, but I'm running into trouble. I feel like I'm close to getting it right, but I keep getting this error:

Terminating app due to uncaught exception 'NSGenericException', reason: '-[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] No active and enabled video connection'

I have tried placing this block of code in both the AVCapturePhotoCaptureDelegate callback and my didPressTakePhoto function:

if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo) {
     videoConnection.videoOrientation = AVCaptureVideoOrientation.portrait;
     ...
}

Here is the code I have so far:

import AVFoundation
import UIKit

class Camera: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate, AVCapturePhotoCaptureDelegate {

    @IBOutlet weak var cameraView: UIView!
    @IBOutlet weak var imageView: UIImageView!

    var captureSession : AVCaptureSession?
    var stillImageOutput : AVCapturePhotoOutput?
    var stillImageOutputSettings : AVCapturePhotoSettings?
    var previewLayer : AVCaptureVideoPreviewLayer?

    var didTakePhoto = Bool();

    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated);

        previewLayer?.frame = cameraView.bounds;
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated);

        captureSession = AVCaptureSession();
        captureSession?.sessionPreset = AVCaptureSessionPreset1920x1080;

        stillImageOutput = AVCapturePhotoOutput();

        let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo);

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)

            if (captureSession?.canAddInput(input))! {
                captureSession?.addInput(input);

                if (captureSession?.canAddOutput(stillImageOutput))! {
                    captureSession?.canAddOutput(stillImageOutput);

                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession);
                    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect;
                    previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait;
                    cameraView.layer.addSublayer(previewLayer!);
                    captureSession?.startRunning();
                }
            }
        } catch {
            print(error);
        }
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription);
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any);

            let dataProvider = CGDataProvider(data: dataImage as CFData);
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent);
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right);

            self.imageView.image = image;
            self.imageView.isHidden = false;
        }
    }

    func didPressTakePhoto() {
            stillImageOutputSettings = AVCapturePhotoSettings();

            let previewPixelType = stillImageOutputSettings?.availablePreviewPhotoPixelFormatTypes.first!;
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160];
            stillImageOutputSettings?.previewPhotoFormat = previewFormat;

            stillImageOutput.capturePhoto(with: stillImageOutputSettings!, delegate: self);
    }

    func didPressTakeAnother() {
        if (didTakePhoto == true) {
            imageView.isHidden = true;
            didTakePhoto = false;
        } else {
            captureSession?.startRunning();
            didTakePhoto = true;
            didPressTakePhoto();
        }
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        didPressTakeAnother();
    }
}

Any suggestions?

Thanks in advance!

3 Answers:

Answer 0 (score: 5)

There is a bug in your code. Inside the `canAddOutput` check you call `canAddOutput` again instead of `addOutput`, so the photo output is never actually attached to the session, which is why `capturePhoto(with:delegate:)` has no video connection:

if (captureSession?.canAddOutput(stillImageOutput))! {
    captureSession?.canAddOutput(stillImageOutput);  // only checks; never adds
}

It should be:

if (captureSession?.canAddOutput(stillImageOutput))! {
    captureSession?.addOutput(stillImageOutput)
}
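To see the fix in context, here is a minimal sketch of the corrected setup from the question's `viewWillAppear` (iOS 10-era Swift 3 AVFoundation API; the local names are illustrative, not from the original post). The key difference is that `addOutput` is actually called after the `canAddOutput` check:

```swift
import AVFoundation

let session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPreset1920x1080

let photoOutput = AVCapturePhotoOutput()

if let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
   let input = try? AVCaptureDeviceInput(device: backCamera),
   session.canAddInput(input) {
    session.addInput(input)

    if session.canAddOutput(photoOutput) {
        session.addOutput(photoOutput)  // add the output, don't just re-check it
        session.startRunning()
    }
}
```

With the output attached, the session exposes an active video connection and `capturePhoto(with:delegate:)` no longer throws the NSGenericException.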

Answer 1 (score: 0)

For anyone else who runs into this, the following resources helped me:

For reference and code layout:

SnapChat: Camera 1

SnapChat: Camera 2

For a practical implementation and use of the new iOS 10 camera features:

AV Foundation: iOS 10

iOS Custom Camera

Answer 2 (score: 0)

Change AVCaptureSessionPreset1920x1080 to AVCaptureSessionPresetHigh and try again. Not every device supports the fixed 1920x1080 preset, and an unsupported preset can leave the session without an active video connection; AVCaptureSessionPresetHigh adapts to the highest quality the current device supports.
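The suggestion above can be made defensive rather than unconditional: a short sketch (assuming the same iOS 10-era API as the question) that keeps 1080p where available and falls back otherwise:

```swift
import AVFoundation

// Prefer the fixed 1080p preset, but fall back to the
// device-adaptive preset when the hardware doesn't support it.
let session = AVCaptureSession()
if session.canSetSessionPreset(AVCaptureSessionPreset1920x1080) {
    session.sessionPreset = AVCaptureSessionPreset1920x1080
} else {
    session.sessionPreset = AVCaptureSessionPresetHigh
}
```

Checking `canSetSessionPreset(_:)` first avoids configuring the session with a preset the current camera cannot deliver.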