AVCaptureSession with multiple outputs

Posted: 2018-08-09 07:52:39

Tags: objective-c, avfoundation

I have an application that wants to process the captured sample buffers in two ways:

  • YUV, for processing and video streaming.
  • BGRA, for "preview images" that can be sent over the network to interested clients.

The setup is:

// ...
AVCaptureDeviceInput *videoIn
        = [AVCaptureDeviceInput deviceInputWithDevice:self.cameraDevice error:nil];

if ([captureSession canAddInput:videoIn])
    [captureSession addInput:videoIn];

AVCaptureVideoDataOutput *videoOut = [AVCaptureVideoDataOutput new];
videoOut.videoSettings = @{
  (id) kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8Planar)
};
videoOut.alwaysDiscardsLateVideoFrames = YES;

NSString *queueName 
   = [NSString stringWithFormat:@"videocap.%@", self.customCameraUID];
dispatch_queue_t videoCaptureQueue = CustomQueueSerial(queueName);

[videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];

if ([captureSession canAddOutput:videoOut])
    [captureSession addOutput:videoOut];

videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];

BOOL deinterlace = [self toggleDeinterlace];
if (deinterlace) {
    videoConnection.videoFieldMode = AVVideoFieldModeDeinterlace;
}

/* PREVIEW IMAGES*/
if (self.previewImagesEnabled) {
    self.videoOutBGRA = [AVCaptureVideoDataOutput new];
    self.videoOutBGRA.videoSettings = @{
            (id) kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)
    };
    self.videoOutBGRA.alwaysDiscardsLateVideoFrames = YES;


    NSString *qname = [NSString stringWithFormat:@"captureprv.%@", self.customCameraUID];
    dispatch_queue_t videoCaptureQueueBgra = CustomQueueSerial(qname);
    [self.videoOutBGRA setSampleBufferDelegate:self queue:videoCaptureQueueBgra];

    if ([captureSession canAddOutput:self.videoOutBGRA]) {
        [captureSession addOutput:self.videoOutBGRA];
    }
}
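
CustomQueueSerial is a project-specific helper that is not shown in the question; presumably it just creates a serial GCD queue with the given label. A minimal sketch under that assumption:

// Assumed behaviour of the CustomQueueSerial helper used above:
// wrap dispatch_queue_create with a serial attribute.
static dispatch_queue_t CustomQueueSerial(NSString *label) {
    return dispatch_queue_create([label UTF8String], DISPATCH_QUEUE_SERIAL);
}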

The capture callback is then implemented as:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
@autoreleasepool {
    if (s_preflight_skip(&frameCounter)) {
        return;
    }

    if (!CMSampleBufferIsValid(sampleBuffer)) {
        NSLog(@"Invalid %s frame discarded.", (connection == videoConnection) ? "video" : "audio");
        return;
    }

    if ([captureOutput isEqual:self.videoOutBGRA] && [self.delegate sendPreviewImage:self]) {
        s_processPreviewImages(self, sampleBuffer);
        return;
    }

    if (connection == videoConnection) {
        [self pushVideoToAssetWriter:sampleBuffer];

    // ...
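
s_processPreviewImages and s_preflight_skip are project-specific helpers that are not shown. For context, here is a rough sketch of what the preview path might do with a BGRA buffer; the function name comes from the call above, but the Core Image / ImageIO conversion and the hand-off at the end are assumptions, not the original implementation:

#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>
// kUTTypeJPEG lives in CoreServices (macOS) / MobileCoreServices (iOS).

static void s_processPreviewImages(id owner, CMSampleBufferRef sampleBuffer) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) return;

    // Render the BGRA pixel buffer into a CGImage via Core Image.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    if (!cgImage) return;

    // Encode the CGImage as JPEG with ImageIO so it can be sent over the network.
    NSMutableData *jpegData = [NSMutableData data];
    CGImageDestinationRef dest =
        CGImageDestinationCreateWithData((__bridge CFMutableDataRef)jpegData,
                                         kUTTypeJPEG, 1, NULL);
    if (dest) {
        CGImageDestinationAddImage(dest, cgImage, NULL);
        CGImageDestinationFinalize(dest);
        CFRelease(dest);
    }
    CGImageRelease(cgImage);

    // Hand jpegData to the delegate / network layer (omitted here).
}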

But the videoOutBGRA callback is sometimes invoked and sometimes not, and sometimes it stops being called at random times.

Is this the correct approach, or should I be doing something differently?
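
Not part of the original post, but one way to see what is happening on the BGRA output: since alwaysDiscardsLateVideoFrames is YES, frames that arrive while the delegate queue is busy are silently thrown away, and the drop callback of AVCaptureVideoDataOutputSampleBufferDelegate shows whether that is what is occurring. A minimal diagnostic sketch:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Log which output is dropping frames, so a starved BGRA queue becomes visible.
    const char *which = (captureOutput == self.videoOutBGRA) ? "BGRA preview" : "YUV video";
    NSLog(@"Dropped a late %s frame.", which);
}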
