Use an existing view as a UIImagePickerController's cameraOverlayView

Date: 2015-07-07 19:44:50

Tags: objective-c iphone swift uiscrollview uiimagepickercontroller

I have a scroll view with a transparent opening near the top of its content. The opening starts offscreen. (The black rectangle represents the device window):

[Mock-up: the scroll view's transparent opening, still outside the device window]

When the scroll view starts scrolling down, I want to start the device's camera, with the image picker's view behind the scroll view:

[Mock-up: the camera feed visible behind the scroll view as it scrolls]

The scroll view will then lock into place:

[Mock-up: the scroll view locked in place, with the opening showing the camera feed]

The transparent opening and the scroll view behavior are not a problem. I have also successfully managed to present a UIImagePickerController, but with an overlay that is not the scroll view already on screen.

What I can't seem to figure out is how to show the image picker's view behind the existing content, as mocked up above.
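For reference, what already works is roughly the following: presenting the picker full screen with a freshly created overlay rather than the existing scroll view (the view controller and overlay here are just placeholders):

import UIKit

class CameraTestViewController: UIViewController {

  func presentCameraWithPlaceholderOverlay() {
    if UIImagePickerController.isSourceTypeAvailable(.Camera) {
      let picker = UIImagePickerController()
      picker.sourceType = .Camera
      picker.showsCameraControls = false

      // Placeholder overlay; the goal is to use the scroll view already on screen instead.
      let overlay = UIView(frame: view.bounds)
      overlay.backgroundColor = UIColor.clearColor()
      picker.cameraOverlayView = overlay

      presentViewController(picker, animated: true, completion: nil)
    }
  }
}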

1 Answer:

Answer 0 (score: 0):

After several more hours of research, I've concluded that what I want to do just isn't possible with UIImagePickerController. However, looking further down the framework stack, I found that AVCaptureSession is far more powerful (if considerably more complicated).

Solution outline

Using the WWDC video "Capturing from the Camera using AV Foundation on iOS 5", I built a class called VideoCaptureView that encapsulates the input and camera-preview side of the equation. I then put a blank UIView behind the "porthole" view that provides the circular see-through mask, and set that UIView's class to VideoCaptureView. Using an outlet in the view controller, I call captureView.beginCapture() when the scroll view moves the porthole into place (a rough sketch of that follows below). (Stopping video capture isn't handled yet... that's left as an exercise for the reader.)
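Here's a rough sketch of the view-controller side for context; the outlet names, scroll-offset threshold, and the empty delegate method are illustrative rather than the exact code from my project:

import UIKit
import AVFoundation

class PortholeViewController: UIViewController, UIScrollViewDelegate,
                              AVCaptureVideoDataOutputSampleBufferDelegate {

  @IBOutlet weak var scrollView: UIScrollView!
  @IBOutlet weak var captureView: VideoCaptureView!  // the blank view behind the porthole

  private var captureStarted = false
  private let portholeTriggerOffset: CGFloat = 100.0  // illustrative threshold

  override func viewDidLoad() {
    super.viewDidLoad()
    // beginCapture() asserts that this delegate is hooked up.
    captureView.videoDataCaptureDelegate = self
  }

  func scrollViewDidScroll(scrollView: UIScrollView) {
    // Start the camera once the porthole has been scrolled into place.
    if !captureStarted && scrollView.contentOffset.y >= portholeTriggerOffset {
      captureStarted = true
      if let failure = captureView.beginCapture() {
        println("video capture failed: \(failure)")
      }
    }
  }

  func captureOutput(captureOutput: AVCaptureOutput!,
                     didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                     fromConnection connection: AVCaptureConnection!) {
    // Frames arrive here on the serial queue set up by VideoCaptureView.
  }
}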

Code

For anyone else who's interested, here is my current implementation of VideoCaptureView... also available on GitHub as a gist:

import UIKit
import AVFoundation

typealias VideoCaptureFailure = String

class VideoCaptureView: UIView {

  var videoDataCaptureDelegate: AVCaptureVideoDataOutputSampleBufferDelegate?

  private var previewLayer: AVCaptureVideoPreviewLayer?
  private lazy var captureSession:AVCaptureSession = {
    let session = AVCaptureSession()
    println("created capture session: \(session)")
    session.sessionPreset = AVCaptureSessionPresetPhoto
    return session
  }()

  func beginCapture() -> VideoCaptureFailure? {

    // Stop any previous session so the capture pipeline starts from a known state.
    println("Stopping session: \(captureSession)")
    captureSession.stopRunning()

    if let device = captureDevice() {
      var err: NSError? = nil

      // Attach the camera as the session's input (only the first time through).
      if captureSession.inputs.isEmpty {
        if let input = AVCaptureDeviceInput(device: device, error: &err) {
          captureSession.addInput(input)
        }
      }

      // Attach a video data output that delivers frames to the delegate on a serial queue.
      if captureSession.outputs.isEmpty {
        assert(videoDataCaptureDelegate != nil, "Hey, you forgot to hook up the damn delegate!")
        let output = AVCaptureVideoDataOutput()
        captureSession.addOutput(output)
        let serial_queue = dispatch_queue_create("smilesVideoOutputQueue", nil)
        output.setSampleBufferDelegate(videoDataCaptureDelegate, queue: serial_queue)
      }

      if err != nil {
        println("error: \(err?.localizedDescription)")
      }

      // Create the preview layer that shows the live camera feed inside this view.
      if previewLayer == nil {
        let preview = AVCaptureVideoPreviewLayer(session: captureSession)
        preview.videoGravity = AVLayerVideoGravityResizeAspectFill
        preview.frame = self.layer.bounds // bounds, not frame: the sublayer lives in this view's coordinate space
        self.layer.addSublayer(preview)
        previewLayer = preview
      }

      captureSession.startRunning()
      return nil // Success.
    } else {
      return "No capture device available"
    }
  }

  private func captureDevice() -> AVCaptureDevice? {
    // Return the front-facing camera, provided it supports video.
    for device in AVCaptureDevice.devices() as! [AVCaptureDevice] {
      if device.hasMediaType(AVMediaTypeVideo) && device.position == .Front {
        return device
      }
    }
    return nil
  }
}
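As for the stopping side that's left as an exercise, one possible companion method (not part of the gist above) could look like this; in Swift 1.2, private is file-scoped, so an extension in the same file can reach the private captureSession:

extension VideoCaptureView {

  // Hypothetical companion to beginCapture(): tears down the preview and stops the session.
  func endCapture() {
    if captureSession.running {
      captureSession.stopRunning()
    }
    previewLayer?.removeFromSuperlayer()
    previewLayer = nil
  }
}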