What is the best/fastest way to convert a CMSampleBufferRef to an OpenCV IplImage?

Time: 2011-03-03 03:05:42

Tags: iphone opencv real-time avfoundation

I'm writing an iPhone app that does some kind of real-time image detection with OpenCV. What is the best way to convert the CMSampleBufferRef image coming from the camera (I'm using AVFoundation's AVCaptureVideoDataOutputSampleBufferDelegate) into an IplImage that OpenCV understands? The conversion needs to be fast enough to run in real time.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
    fromConnection:(AVCaptureConnection *)connection
{
  NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

  // Convert CMSampleBufferRef into IplImage
  IplImage *openCVImage = ???(sampleBuffer);

  // Do OpenCV computations realtime
  // ...

  [pool release];
} 

Thanks in advance.

2 Answers:

Answer 0 (score: 12)

This sample code is based on Apple's example for managing a CMSampleBuffer's data pointer:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    IplImage *iplimage = 0;
    if (sampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // get information of the image in the buffer
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);

        // create IplImage
        if (bufferBaseAddress) {
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4);
            iplimage->imageData = (char*)bufferBaseAddress;
        }

        // release memory
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
    else
        DLog(@"No sampleBuffer!!");

    return iplimage;
}

You need to create a 4-channel IplImage because the phone's camera delivers its buffers in BGRA.
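If your capture output is not already configured for BGRA, something along these lines can request that pixel format. This is a minimal sketch, not part of the original answer: the method name and the session/queue passed in are assumed to come from your own capture setup, and it follows the question's manual retain/release style.

- (void)addVideoOutputToSession:(AVCaptureSession *)session queue:(dispatch_queue_t)queue {
    // Sketch only: ask AVFoundation for BGRA frames so the 4-channel
    // IplImage created above matches the pixel buffer layout.
    // Requires <AVFoundation/AVFoundation.h> and <CoreVideo/CoreVideo.h>.
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // Deliver sample buffers to this object's delegate callback on the given queue.
    [videoDataOutput setSampleBufferDelegate:self queue:queue];

    if ([session canAddOutput:videoDataOutput]) {
        [session addOutput:videoDataOutput];
    }
    [videoDataOutput release]; // manual retain/release, matching the question's code
}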

In my experience, this conversion is fast enough to run in a real-time application, but of course anything you add on top of it will cost time, especially with OpenCV.

Answer 1 (score: 2)

The line "iplimage->imageData = (char*)bufferBaseAddress;" causes a memory leak: cvCreateImage already allocates a pixel buffer for imageData, so overwriting that pointer leaks the allocation (and it leaves the image pointing at memory that is only valid while the pixel buffer is locked).

It should be "memcpy(iplimage->imageData, (char*)bufferBaseAddress, iplimage->imageSize);" instead.

So the complete code is:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    IplImage *iplimage = 0;

    if (sampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // get information of the image in the buffer
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);

        // create IplImage
        if (bufferBaseAddress) {
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4);

            //iplimage->imageData = (char*)bufferBaseAddress;
            memcpy(iplimage->imageData, (char*)bufferBaseAddress, iplimage->imageSize);
        }

        // release memory
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
    else {
        DLog(@"No sampleBuffer!!");
    }

    return iplimage;
}
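Because the memcpy version gives the IplImage its own copy of the pixels, the caller owns that image and has to free it. A minimal usage sketch inside the delegate callback (assuming the conversion method above lives on the same class) might look like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
    fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Convert the incoming frame into an IplImage that this callback owns.
    IplImage *frame = [self createIplImageFromSampleBuffer:sampleBuffer];
    if (frame) {
        // ... run your OpenCV processing on "frame" here ...

        // The image holds its own copy of the pixel data, so release it.
        cvReleaseImage(&frame);
    }

    [pool release];
}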