AVCaptureDeviceInput drops frames after the first second of running an AVCaptureSession with NativeScript

Date: 2019-07-19 19:18:46

Tags: objective-c nativescript avcapturesession avcapturedevice

I am trying to build a video recorder in a NativeScript plugin on iOS, which means I am using native Objective-C classes inside the plugin so that the NativeScript app can share a common interface with the Android implementation.

I have the camera view loaded and I am trying to access the video frames from the AVCaptureSession. I created an object that implements the AVCaptureVideoDataOutputSampleBufferDelegate protocol to receive the frames, and during the first second the captureOutput:didOutputSampleBuffer:fromConnection: callback does deliver frames. After that, every frame is dropped and I can't tell why. I know they are being dropped because the captureOutput:didDropSampleBuffer:fromConnection: callback from the protocol fires for every frame.
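One way I could try to narrow this down is to log the drop reason that AVFoundation attaches to each dropped buffer, using CMGetAttachment with kCMSampleBufferAttachmentKey_DroppedFrameReason from CoreMedia. I'm assuming those are exposed as globals to the NativeScript iOS runtime exactly as written, so this is only a sketch of the didDrop handler in the VideoDelegate shown further down:

    captureOutputDidDropSampleBufferFromConnection(captureOutput: any, sampleBuffer: any, connection: any): void {
        // Assumption: CMGetAttachment and the CoreMedia constants are available as globals.
        // Typical reasons are kCMSampleBufferDroppedFrameReason_FrameWasLate,
        // ..._OutOfBuffers and ..._Discontinuity.
        const reason = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_DroppedFrameReason, null);
        console.log("Dropping frame, reason:", reason);
    },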

I have tried changing the initialization order of the AVCaptureSession, but nothing changed.

Below is the main function with the code that creates the capture session and the capture output. Although it is TypeScript, NativeScript lets you call native Objective-C functions and classes, so the logic is the same as it would be in Objective-C. I also create a VideoDelegate object with NativeScript, which corresponds to a class in Objective-C and lets me implement the delegate protocol for receiving video frames from the capture device's output.

    this._captureSession = AVCaptureSession.new();
    this._captureSession.sessionPreset = AVCaptureSessionPreset640x480;

    // Get the camera
    this._cameraDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo);

    // Get the camera input
    this._captureInput = AVCaptureDeviceInput.deviceInputWithDeviceError(this._cameraDevice);

    if (this._captureSession.canAddInput(this._captureInput)) {
        this._captureSession.addInput(this._captureInput);
    }
    else {
        console.log("couldn't add input");
    }

    let self = this;
    // Delegate class that implements AVCaptureVideoDataOutputSampleBufferDelegate
    const VideoDelegate = (NSObject as any).extend({
        captureOutputDidOutputSampleBufferFromConnection(captureOutput: any, sampleBuffer: any, connection: any): void {
            console.log("Capturing Frames");
            if (self.startRecording) {
                self._mp4Writer.appendVideoSample(sampleBuffer);
                console.log("Appending Video Samples");
            }
        },
        captureOutputDidDropSampleBufferFromConnection(captureOutput: any, sampleBuffer: any, connection: any): void {
            console.log("Dropping Frames");
        },
        videoCameraStarted(date) {
            // console.log("CAMERA STARTED");
        }
    }, {
        protocols: [AVCaptureVideoDataOutputSampleBufferDelegate]
    });

    this._videoDelegate = VideoDelegate.new();

    // Set up the camera output for frames; the delegate is called on a serial dispatch queue
    this._captureOutput = AVCaptureVideoDataOutput.new();
    this._captureQueue = dispatch_queue_create("capture Queue", null);
    this._captureOutput.setSampleBufferDelegateQueue(this._videoDelegate, this._captureQueue);

    this._captureOutput.alwaysDiscardsLateVideoFrames = false;

    // Ask the output for BGRA pixel buffers
    this._framePixelFormat = NSNumber.numberWithInt(kCVPixelFormatType_32BGRA);
    this._captureOutput.videoSettings = NSDictionary.dictionaryWithObjectForKey(this._framePixelFormat, kCVPixelBufferPixelFormatTypeKey);

    this._captureSession.addOutput(this._captureOutput);

    this._captureSession.startRunning();
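Another isolation test I could run, assuming the problem is the output's small fixed pool of sample buffers being starved when buffers are held too long, is a delegate that never touches the sample buffer at all, just to see whether frames keep arriving past the first second. The ProbeDelegate name and counter below are only for illustration:

    // Minimal sketch: a delegate that does nothing with the sample buffer, used only to
    // check whether frames keep arriving when nothing retains them.
    let probeFrameCount = 0;
    const ProbeDelegate = (NSObject as any).extend({
        captureOutputDidOutputSampleBufferFromConnection(captureOutput: any, sampleBuffer: any, connection: any): void {
            // Intentionally do not store or append sampleBuffer anywhere.
            probeFrameCount++;
        },
        captureOutputDidDropSampleBufferFromConnection(captureOutput: any, sampleBuffer: any, connection: any): void {
            console.log("Still dropping frames with an empty delegate, frames received so far:", probeFrameCount);
        }
    }, {
        protocols: [AVCaptureVideoDataOutputSampleBufferDelegate]
    });

    // Swap it in place of this._videoDelegate for the test:
    // this._captureOutput.setSampleBufferDelegateQueue(ProbeDelegate.new(), this._captureQueue);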

0 Answers:

No answers yet