iOS Swift - AVCaptureSession - Capturing frames while respecting the frame rate

Asked: 2016-01-11 10:11:06

Tags: ios swift avcapturesession avcapturedevice

I am trying to build an app that captures frames from the camera, processes them with OpenCV, and then saves them to the device, but at a specific frame rate.

What I am currently stuck on is that AVCaptureVideoDataOutputSampleBufferDelegate does not seem to respect the AVCaptureDevice.activeVideoMinFrameDuration and AVCaptureDevice.activeVideoMaxFrameDuration settings.

captureOutput is firing much faster than the 2 frames per second that the settings above should produce.

Do you happen to know how this can be achieved, with or without a delegate?

ViewController:

override func viewDidLoad() {
    super.viewDidLoad()

}

override func viewDidAppear(animated: Bool) {
    setupCaptureSession()
}

func setupCaptureSession() {

    let session : AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset1280x720

    let videoDevices : [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]

    for device in videoDevices {
        if device.position == AVCaptureDevicePosition.Back {
            let captureDevice : AVCaptureDevice = device

            do {
                try captureDevice.lockForConfiguration()
                captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
                captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
                captureDevice.unlockForConfiguration()

                let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

                if session.canAddInput(input) {
                    session.addInput(input)
                }

                let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

                let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
                output.setSampleBufferDelegate(self, queue: dispatch_queue)

                session.addOutput(output)

                session.startRunning()

                let previewLayer = AVCaptureVideoPreviewLayer(session: session)
                previewLayer.connection.videoOrientation = .LandscapeRight

                let previewBounds : CGRect = CGRectMake(0,0,self.view.frame.width/2,self.view.frame.height+20)
                previewLayer.backgroundColor = UIColor.blackColor().CGColor
                previewLayer.frame = previewBounds
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                self.imageView.layer.addSublayer(previewLayer)

                self.previewMat.frame = CGRectMake(previewBounds.width, 0, previewBounds.width, previewBounds.height)

            } catch _ {

            }
            break
        }
    }

}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
}
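As a side note (not part of the original post): one way to confirm what rate the delegate is actually delivering is to log the gap between presentation timestamps inside captureOutput. A minimal sketch in the same Swift 2 style as the question's code; the lastTimestamp property name is my own:

```swift
import AVFoundation
import CoreMedia

// Tracks the previous frame's presentation timestamp so the
// inter-frame interval can be measured.
var lastTimestamp = kCMTimeZero

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if lastTimestamp != kCMTimeZero {
        // Seconds elapsed since the previous frame; its reciprocal
        // is the delivered frame rate.
        let delta = CMTimeGetSeconds(CMTimeSubtract(timestamp, lastTimestamp))
        print("interval: \(delta)s -> \(1.0 / delta) fps")
    }
    lastTimestamp = timestamp
}
```

At 2 fps the logged interval should hover around 0.5 s; anything much smaller means the frame-duration settings are not being honored.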

1 Answer:

Answer 0 (score: 10):

So I found the problem.

In the comment block above the activeVideoMinFrameDuration property in AVCaptureDevice.h, it states:


On iOS, the receiver's activeVideoMinFrameDuration resets to its default value under the following conditions:

- The receiver's activeFormat changes
- The receiver's AVCaptureDeviceInput's session's sessionPreset changes
- The receiver's AVCaptureDeviceInput is added to a session

The last bullet point was causing my problem, so doing the following solved it:

        do {

            let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

            if session.canAddInput(input) {
                session.addInput(input)
            }

            try captureDevice.lockForConfiguration()
            captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
            captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
            captureDevice.unlockForConfiguration()

            let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

            let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
            output.setSampleBufferDelegate(self, queue: dispatch_queue)

            session.addOutput(output)

            session.startRunning()

        } catch _ {

        }
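Not from the original answer, but worth noting for anyone who cannot rely on configuration order: the target rate can also be enforced in software by dropping frames inside the delegate. A sketch in the same Swift 2 style, aiming for the question's 2 fps; the targetFrameInterval and lastProcessedTime names are my own:

```swift
import AVFoundation
import CoreMedia

// Desired spacing between processed frames: 0.5 s = 2 fps.
let targetFrameInterval = 0.5
var lastProcessedTime = kCMTimeZero

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let elapsed = CMTimeGetSeconds(CMTimeSubtract(timestamp, lastProcessedTime))
    if lastProcessedTime != kCMTimeZero && elapsed < targetFrameInterval {
        return // too soon since the last processed frame; drop it
    }
    lastProcessedTime = timestamp
    // process the frame here (e.g. hand it to OpenCV)
}
```

The camera still captures at its native rate with this approach, so it costs more power than setting activeVideoMinFrameDuration correctly; it is only a fallback when the device-level setting cannot be applied.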