AVCaptureVideoDataOutputSampleBufferDelegate's captureOutput is never called

Asked: 2016-03-07 03:53:39

Tags: ios objective-c swift avfoundation ios-camera

I currently have an in-house framework (MySDK) and an iOS application (MyApp) that uses MySDK.

Inside MySDK, I have a class (Scanner) that processes images from the device camera's video output.

Here is a sample of my code:

Scanner.swift

import UIKit
import AVFoundation

class Scanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    var captureDevice : AVCaptureDevice?
    var captureOutput : AVCaptureVideoDataOutput?
    var previewLayer : AVCaptureVideoPreviewLayer?
    var captureSession : AVCaptureSession?

    var rootViewController : UIViewController?

    func scanImage (viewController: UIViewController)
    {
        NSLog("%@", "scanning begins!")

        if (captureSession == nil) { captureSession = AVCaptureSession() }

        rootViewController = viewController;

        captureSession!.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        for device in devices {
            if (device.hasMediaType(AVMediaTypeVideo)) {
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }

        if (captureDevice != nil) {
            NSLog("%@", "beginning session!")

            beginSession()
        }
    }

    func beginSession()
    {
        if (captureSession == nil) { captureSession = AVCaptureSession() }
        if (captureOutput == nil) { captureOutput = AVCaptureVideoDataOutput() }
        if (previewLayer == nil) { previewLayer = AVCaptureVideoPreviewLayer() }

        let queue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL)

        captureOutput!.setSampleBufferDelegate(self, queue: queue)
        captureOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: Int(kCVPixelFormatType_32BGRA)]

        captureSession!.addInput(try! AVCaptureDeviceInput(device: captureDevice))
        captureSession!.addOutput(captureOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer!.frame = rootViewController!.view.layer.frame

        rootViewController!.view.layer.addSublayer(previewLayer!)

        captureSession!.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!)
    {
        NSLog("%@", "captured!")
    }
}

Inside MyApp, I have a ViewController with an IBAction in which the Scanner class is initialized and the scanImage function is triggered.

MyApp.m

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    Scanner * scanner = [[Scanner alloc] init];

    [scanner scanImage:self];
}

The camera view appears inside the app, but the captureOutput function is never triggered, and the console contains only these two lines:

2016-03-07 11:11:45.860 myapp[1236:337377] scanning begins!
2016-03-07 11:11:45.984 myapp[1236:337377] beginning session!

Creating a standalone app and embedding the code from Scanner.swift directly in the ViewController works fine; the captureOutput function fires correctly.

Does anyone know what I am doing wrong here?

1 Answer:

Answer 0 (score: -1)

After much trial and error, I finally found the solution to the problem.

Apparently, I had created the Scanner object only as a local variable, not as an instance variable. Because nothing retained it, ARC deallocated the Scanner (and with it the sample buffer delegate) as soon as the IBAction returned, before any frames could be delivered.

Once the Scanner object was created as an instance variable, the delegate method captureOutput fired correctly.
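
A minimal sketch of the fix in the view controller, assuming ARC and a strong property (the class name MyViewController and the property name scanner are my own; adapt them to your project):

```objc
// MyApp.m -- hold the Scanner in a strong property so ARC keeps it
// (and therefore the sample buffer delegate) alive while the session runs.

@interface MyViewController ()
@property (nonatomic, strong) Scanner *scanner;  // hypothetical property name
@end

@implementation MyViewController

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    // Previously `scanner` was a local variable, so it was deallocated as
    // soon as this method returned and captureOutput was never delivered.
    self.scanner = [[Scanner alloc] init];
    [self.scanner scanImage:self];
}

@end
```

The equivalent in Swift would be a stored property on the view controller, e.g. `let scanner = Scanner()`, rather than a local inside the action method.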