Getting depth data from a custom camera

Date: 2018-09-27 18:28:17

Tags: swift camera ios11 iphone-x

I have been following Capturing Photos with Depth and have worked through all the suggestions in the similar question, but I cannot get any depth data from my custom camera. Below is my latest revision of the code. Do you have any ideas about this problem?

When I tap the camera button, I get:


libc++abi.dylib: terminating with uncaught exception of type NSException

I have also reviewed existing solutions. They mostly relate to segues, but I double-checked that part of the code and the storyboard, and they seem fine. (I had no problems before adding the depth code!)

class CameraViewController : UIViewController {
  @IBOutlet weak var cameraButton: UIButton!

  var captureSession = AVCaptureSession()
  var captureDevice: AVCaptureDevice?
  var photoOutput: AVCapturePhotoOutput?
  var cameraPreviewLayer: AVCaptureVideoPreviewLayer?

  var image: UIImage?

  var depthDataMap: CVPixelBuffer?
  var depthData: AVDepthData?

  override func viewDidLoad() {
    super.viewDidLoad()

    setupDevice()
    setupIO()
    setupPreviewLayer()
    startRunningCaptureSession()
  }

  func setupDevice() {
    self.captureDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)
  }

  func setupIO() {
    guard let captureInputDevice = try? AVCaptureDeviceInput(device: self.captureDevice!),
      self.captureSession.canAddInput(captureInputDevice)
      else { fatalError("Can't add video input.") }
    self.captureSession.beginConfiguration()
    self.captureSession.addInput(captureInputDevice)

    self.photoOutput = AVCapturePhotoOutput()
    self.photoOutput!.isDepthDataDeliveryEnabled = photoOutput!.isDepthDataDeliverySupported
    guard self.captureSession.canAddOutput(photoOutput!)
      else { fatalError("Can't add photo output.") }
    self.captureSession.addOutput(photoOutput!)
    self.captureSession.sessionPreset = .photo
    self.captureSession.commitConfiguration()
  }

  func setupPreviewLayer() {
    self.cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.cameraPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
    self.cameraPreviewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
    self.cameraPreviewLayer?.frame = self.view.frame
    self.view.layer.insertSublayer(self.cameraPreviewLayer!, at: 0)             
  }
  func startRunningCaptureSession() {
    self.captureSession.startRunning()
  }

  @IBAction func cameraButtonDidTap(_ sender: Any) {    
    let setting = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    setting.isDepthDataDeliveryEnabled = self.photoOutput!.isDepthDataDeliverySupported
    self.photoOutput?.capturePhoto(with: setting, delegate: self)
  }

  override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    if segue.identifier == "showPhoto" {
      let nav = segue.destination as! UINavigationController
      let previewVC = nav.topViewController as! PhotoViewController

      previewVC.image = self.image
      previewVC.depthData = self.depthData
      previewVC.depthDataMap = self.depthDataMap
    }
  }
}

extension CameraViewController: AVCapturePhotoCaptureDelegate{
  func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let imageData = photo.fileDataRepresentation() {
      image = UIImage(data: imageData)
      let imageSource = CGImageSourceCreateWithData(imageData as CFData, nil)
      let auxiliaryData = CGImageSourceCopyAuxiliaryDataInfoAtIndex(imageSource!, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]

      let depthData = try? AVDepthData(fromDictionaryRepresentation: auxiliaryData!)
      self.depthDataMap = depthData?.depthDataMap

      self.performSegue(withIdentifier: "showPhoto", sender: self)
    }
  }
}

1 answer:

Answer 0 (score: 0)

This is the problem with my code:

DepthDataDelivery will not be supported unless the photo output is added to a session whose inputs are properly configured to deliver depth.

  1. Set the session preset first:

    self.captureSession.sessionPreset = .photo

  2. Add the photo output after adding the dual-camera input.

    guard self.captureSession.canAddOutput(photoOutput!)

  3. Depth delivery is now enabled:

    self.photoOutput!.isDepthDataDeliveryEnabled = photoOutput!.isDepthDataDeliverySupported
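Putting the three steps together, a corrected setupIO() might look like the sketch below. It reuses the property names from the question (captureSession, captureDevice, photoOutput) and is only a sketch of the ordering the answer describes: preset first, then the dual-camera input, then the photo output, and only then the depth flag. It assumes the same dual-camera device set up in setupDevice().

```swift
import AVFoundation

func setupIO() {
    self.captureSession.beginConfiguration()

    // 1. Set the session preset before adding the output; depth delivery
    //    requires the .photo preset.
    self.captureSession.sessionPreset = .photo

    // 2a. Add the dual-camera input first, so the session's inputs are
    //     configured for depth before the output is attached.
    guard let device = self.captureDevice,
          let input = try? AVCaptureDeviceInput(device: device),
          self.captureSession.canAddInput(input)
        else { fatalError("Can't add video input.") }
    self.captureSession.addInput(input)

    // 2b. Then add the photo output.
    let output = AVCapturePhotoOutput()
    guard self.captureSession.canAddOutput(output)
        else { fatalError("Can't add photo output.") }
    self.captureSession.addOutput(output)

    // 3. Only now does isDepthDataDeliverySupported report true on capable
    //    hardware, so enable depth delivery last.
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    self.photoOutput = output

    self.captureSession.commitConfiguration()
}
```

In the question's original setupIO(), isDepthDataDeliveryEnabled was set before addOutput(_:) and the preset was set afterwards, which is why isDepthDataDeliverySupported was false and the capture settings later raised the NSException.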