I'm doing real-time video processing with OpenCV on iOS, without using CvVideoCamera. My app is crashing due to memory pressure.

The native iOS camera calls this function every time it captures a frame:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert the frame to a UIImage:
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Convert the UIImage to a Mat:
    Mat srcMat = [self cvMatFromUIImage:image];
    // Process the Mat:
    Mat dst, cdst;
    Canny(srcMat, dst, 50, 200, 3);
    cvtColor(dst, cdst, COLOR_GRAY2BGR);
}
The app crashes after about 15 seconds due to memory pressure. I am using Apple's code for imageFromSampleBuffer: and OpenCV's code for cvMatFromUIImage. And yes, I am using ARC.
I profiled the app with the Allocations instrument and found that the crash is caused by large numbers of UIImage objects being created and never released. After some investigation, I found that the call to Canny() is responsible: when the call to Canny() is commented out, the UIImage objects do not leak.

Why does calling Canny() keep the UIImage objects in memory?
Answer 0 (score: 2):
The delegate callback blocks the capture queue while Canny() runs, so autoreleased UIImage objects pile up faster than the queue's autorelease pool drains. Move the work to a background queue, drain a pool per frame, and retain the sample buffer so it survives until the block runs. Note that the queue should be created once, not on every callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Retain the sample buffer so it stays valid until the async block finishes.
    CFRetain(sampleBuffer);

    // Create the processing queue once, not on every frame.
    static dispatch_queue_t myQueue;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        myQueue = dispatch_queue_create("my.dispatch.q", DISPATCH_QUEUE_SERIAL);
    });

    dispatch_async(myQueue, ^{
        // Drain autoreleased objects (the intermediate UIImage) every frame.
        @autoreleasepool {
            // Convert the frame to a UIImage:
            UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
            // Convert the UIImage to a Mat:
            Mat srcMat = [self cvMatFromUIImage:image];
            // Process the Mat:
            Mat dst, cdst;
            Canny(srcMat, dst, 50, 200, 3);
            cvtColor(dst, cdst, COLOR_GRAY2BGR);
        }
        CFRelease(sampleBuffer);
    });
}
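An alternative worth considering is skipping the UIImage round-trip entirely and wrapping the camera's pixel buffer in a cv::Mat directly, so no autoreleased images are created at all. This is an untested sketch, and it assumes the AVCaptureVideoDataOutput is configured to deliver kCVPixelFormatType_32BGRA frames:

    // Inside the delegate callback (or the async block):
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    void  *base   = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Wraps the buffer without copying; clone() if the Mat must
    // outlive the lock below.
    cv::Mat srcMat((int)height, (int)width, CV_8UC4, base, stride);

    cv::Mat gray, edges;
    cv::cvtColor(srcMat, gray, cv::COLOR_BGRA2GRAY);
    cv::Canny(gray, edges, 50, 200, 3);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

Because the Mat only borrows the buffer's memory, keep the base address locked for as long as srcMat is in use.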