Sample buffer delegate in Swift 2 for a real-time video filter

Time: 2016-03-30 15:55:27

Tags: ios swift camera avfoundation avcapturesession

I'm trying to make a light-intensity reader in Swift using the camera on an iPhone. The idea is that it takes the intensity component of every pixel and averages them, giving me a single value. I don't need to preview the camera. I've been piecing together a few tutorials to try to get this working, and so far have come up with the code below. camDeviceSetup() runs on viewDidLoad, and cameraSetup() runs when a button is pressed.

I'm hitting an error on the line that starts with "videoDeviceOutput!.setSampleBufferDelegate": it says it cannot convert a value of type FirstViewController (the view controller) to the expected argument type.

let captureSession = AVCaptureSession()
// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?
var videoDeviceOutput: AVCaptureVideoDataOutput?
// AVCaptureVideoPreviewLayer is a subclass of CALayer that you use to display video as it is being captured by an input device.
var previewLayer = AVCaptureVideoPreviewLayer()

func camDeviceSetup() {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera
            if(device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if captureDevice != nil {
        do {
            // AVCaptureDeviceInput(device:) throws in Swift 2, so use do/catch
            // rather than a leftover NSError variable that is never set.
            captureSession.addInput(try AVCaptureDeviceInput(device: captureDevice))
        } catch let error as NSError {
            print("error: \(error.localizedDescription)")
        }
    }
}

func cameraSetup() {
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    videoDeviceOutput = AVCaptureVideoDataOutput()
    videoDeviceOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
    videoDeviceOutput!.alwaysDiscardsLateVideoFrames = true

// This is the line that errors out, and I'm not sure why
    videoDeviceOutput!.setSampleBufferDelegate(self, queue: dispatch_queue_create("VideoBuffer", DISPATCH_QUEUE_SERIAL))

    if captureSession.canAddOutput(videoDeviceOutput) {
        captureSession.addOutput(videoDeviceOutput)
    }

    captureSession.startRunning() 
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Once the delegate is set correctly, my algorithm for finding light intensity goes here

}
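
Once the delegate fires, the averaging step could look something like the sketch below. It assumes the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format set in cameraSetup(): plane 0 of that bi-planar buffer is the Y (luma) plane, one byte of intensity per pixel, so averaging its bytes yields the single value described above.

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Lock the buffer before reading its memory (0 = default lock flags).
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)

    // Plane 0 holds the luma (intensity) values, one byte per pixel.
    let lumaBase = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0))
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)

    // Sum every pixel's intensity, then divide to get the single average.
    var total: UInt64 = 0
    for row in 0..<height {
        for col in 0..<width {
            total += UInt64(lumaBase[row * bytesPerRow + col])
        }
    }
    let averageIntensity = Double(total) / Double(width * height)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)

    print("average intensity: \(averageIntensity)")
}

Indexing rows with bytesPerRow rather than width accounts for any padding the buffer adds at the end of each row.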

1 Answer:

Answer 0 (score: 0):

The problem with that line came down to the fact that I hadn't declared AVCaptureVideoDataOutputSampleBufferDelegate in the class declaration at the top of my ViewController.
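
Concretely, that means the class declaration has to adopt the protocol, something like the following (FirstViewController is the type named in the error message; its UIViewController superclass is assumed):

class FirstViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // the capture properties and methods shown in the question go here
}

With the conformance declared, self is accepted as the argument to setSampleBufferDelegate, and captureOutput(_:didOutputSampleBuffer:fromConnection:) starts receiving frames.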