AVCapturePhotoOutput not providing the preview buffer

Date: 2017-05-29 18:56:06

Tags: swift avfoundation avcapturesession avcaptureoutput

I am setting up a custom camera using AVCapturePhotoOutput. In addition to the main JPEG buffer, I configure AVCapturePhotoOutput to also deliver a preview buffer (thumbnail).

The problem is that I receive the preview buffer only once (on the first capture), and from the second capture on it is nil (the main photoSampleBuffer is always received correctly).

Here is how I trigger a capture (PhotoCaptureDelegate implements AVCapturePhotoCaptureDelegate):

func capturePhoto() {

    guard let videoPreviewLayerOrientation = deviceOrientation.videoOrientation else { return }

    sessionQueue.async {
        if let photoOutputConnection = self.photoOutput.connection(withMediaType: AVMediaTypeVideo) {
            photoOutputConnection.videoOrientation = videoPreviewLayerOrientation
        }

        // each photo captured requires a brand new setting object and capture delegate
        let photoSettings = AVCapturePhotoSettings()

        // Capture a high-resolution JPEG photo.
        photoSettings.isHighResolutionPhotoEnabled = true

        //configure to receive a preview image (thumbnail)
        if let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first {
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String : previewPixelType,
                                 kCVPixelBufferWidthKey as String : NSNumber(value: 160),
                                 kCVPixelBufferHeightKey as String : NSNumber(value: 160)]
            photoSettings.previewPhotoFormat = previewFormat
        }

        // TODO: photoSettings.flashMode = .auto 

        // Use a separate object for the photo capture delegate to isolate each capture life cycle.
        let photoCaptureDelegate = PhotoCaptureDelegate(with: photoSettings, willCapturePhotoAnimation: { [unowned self] in
            // show shutter animation
            self.shutterAnimation()
            }, completed: { [unowned self] (photoCaptureDelegate, photoData, previewThumbnail) in

                self.captureCompleted(photoCaptureDelegate: photoCaptureDelegate, data: photoData, thumbnail: previewThumbnail)
            }
        )
        // The photo output keeps only a weak reference to its capture delegate, so we store it
        // in a dictionary to maintain a strong reference until the capture is completed.
        self.inProgressPhotoCaptureDelegates[photoCaptureDelegate.requestedPhotoSettings.uniqueID] = photoCaptureDelegate
        self.photoOutput.capturePhoto(with: photoSettings, delegate: photoCaptureDelegate)
    }
}
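The strong-reference bookkeeping above can be sketched in isolation. This is a minimal, hypothetical model (MockPhotoSettings, MockCaptureDelegate, and CameraController are stand-ins for AVCapturePhotoSettings, the capture delegate, and the real controller) showing why the dictionary keyed by uniqueID is needed and when the entry is released:

```swift
import Foundation

// Stand-in for AVCapturePhotoSettings: each instance carries a unique ID,
// analogous to AVCapturePhotoSettings.uniqueID.
final class MockPhotoSettings {
    static var nextID: Int64 = 0
    let uniqueID: Int64
    init() {
        MockPhotoSettings.nextID += 1
        uniqueID = MockPhotoSettings.nextID
    }
}

// Stand-in for the per-capture delegate object.
final class MockCaptureDelegate {
    let requestedPhotoSettings: MockPhotoSettings
    let completed: (MockCaptureDelegate) -> Void
    init(with settings: MockPhotoSettings, completed: @escaping (MockCaptureDelegate) -> Void) {
        requestedPhotoSettings = settings
        self.completed = completed
    }
    // Simulates the final delegate callback arriving from the photo output.
    func finish() { completed(self) }
}

final class CameraController {
    // Strong references keyed by uniqueID, released when the capture completes.
    var inProgressDelegates = [Int64: MockCaptureDelegate]()

    func capturePhoto() {
        let settings = MockPhotoSettings()
        let delegate = MockCaptureDelegate(with: settings) { [unowned self] delegate in
            // Capture finished: drop the strong reference so the delegate can deallocate.
            self.inProgressDelegates[delegate.requestedPhotoSettings.uniqueID] = nil
        }
        inProgressDelegates[settings.uniqueID] = delegate
        // A real controller would now call photoOutput.capturePhoto(with:delegate:);
        // here we complete immediately to show the life cycle.
        delegate.finish()
    }
}

let controller = CameraController()
controller.capturePhoto()
print(controller.inProgressDelegates.count) // expect 0 once the capture has completed
```

Since the real AVCapturePhotoOutput does not retain the delegate, skipping this bookkeeping would let the delegate deallocate before the callbacks fire.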

The first time I capture, I receive both photoSampleBuffer and previewPhotoSampleBuffer. From the second capture on I receive only photoSampleBuffer, and previewPhotoSampleBuffer is nil, even though resolvedSettings.previewDimensions reports CMVideoDimensions(width: 160, height: 120). My delegate callback:

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let photoBuffer = photoSampleBuffer {
        photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoBuffer, previewPhotoSampleBuffer: nil)
    }

    if let previewBuffer = previewPhotoSampleBuffer {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(previewBuffer) {
            photoThumbnail = CIImage(cvPixelBuffer: pixelBuffer)
        }
    }
}

If I switch cameras (front to back) by reconfiguring the capture session, the first capture after the switch is fine, and then the preview buffer goes missing again: the previewPhotoSampleBuffer parameter in the delegate callback is always nil.

Tested on an iPhone 6 running iOS 10.3.1.

1 Answer:

Answer 0 (score: 0)

Found a solution, though I don't fully understand how the original code caused the problem.

I changed the conversion of the preview sample buffer in my photo capture delegate to render it into a UIImage through a CIContext. Previously I simply created a CIImage and sent it to the UI (on a different thread). It seems the CIImage keeps a reference to the original buffer, and whatever the UI later does with it somehow interferes with the next capture (again, I don't understand why).

The new code creates a new bitmap copy of the image (via a CGImage) and sends that to the UI, so nothing downstream touches the original buffer:

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let photoBuffer = photoSampleBuffer {
        photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoBuffer, previewPhotoSampleBuffer: nil)
    }

    let previewWidth = Int(resolvedSettings.previewDimensions.width)
    let previewHeight = Int(resolvedSettings.previewDimensions.height)

    if let previewBuffer = previewPhotoSampleBuffer {
        if let imageBuffer = CMSampleBufferGetImageBuffer(previewBuffer) {
            let ciImagePreview = CIImage(cvImageBuffer: imageBuffer)
            let context = CIContext()
            if let cgImagePreview = context.createCGImage(ciImagePreview, from: CGRect(x: 0, y: 0, width: previewWidth, height: previewHeight)) {
                photoThumbnail = UIImage(cgImage: cgImagePreview)
            }
        }
    }
}