I'm trying to convert a CVPixelBufferRef from a video source into a CGImageRef using the vImage conversion routines on 10.10. This works in most respects, but every time I initialize a new vImage_Buffer from a CVPixelBufferRef, memory is consumed and never returned.

Here is a simplified version of the conversion which, ideally, should end up using no additional memory:
CVPixelBufferRef pixelBuffer = ...; // retained CVPixelBufferRef from somewhere else
vImage_Buffer buffer;
vImage_CGImageFormat format = {
    .bitsPerComponent = 8,
    .bitsPerPixel = 32,
    .colorSpace = NULL,
    .bitmapInfo = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little,
    .version = 0,
    .decode = NULL,
    .renderingIntent = kCGRenderingIntentAbsoluteColorimetric
};
vImage_Error imageError = vImageBuffer_InitWithCVPixelBuffer(&buffer, &format, pixelBuffer, NULL, NULL, kvImagePrintDiagnosticsToConsole);
// Do conversion here
free(buffer.data);
With the last two lines (the init and the free) commented out, the code uses no more memory than I started with. With those two lines in place, however, each pass consumes about 6 MB.

If I comment out only the free, even more memory is consumed, so the free is clearly doing something; I can only assume vImageBuffer_InitWithCVPixelBuffer is holding on to more memory than expected. Has anyone seen this?
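For reference, my understanding of the ownership model is that the vImageBuffer_Init… family allocates buffer.data on the caller's behalf, and the caller is responsible for freeing it. A minimal sketch with plain vImageBuffer_Init (the dimensions here are made up, just to show the pairing):

// Sketch of the alloc/free pairing I believe vImage expects:
// vImageBuffer_Init allocates buffer.data, and the caller frees it.
vImage_Buffer scratch;
vImage_Error err = vImageBuffer_Init(&scratch, 1080, 1920, 32, kvImageNoFlags); // height, width, bits per pixel
if (err == kvImageNoError) {
    // ... use scratch here ...
    free(scratch.data); // caller owns scratch.data and must free it
}

I assumed vImageBuffer_InitWithCVPixelBuffer follows the same pattern, which is why the free(buffer.data) is there.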
For completeness, here is the entire conversion method from a CVPixelBufferRef to an NSImage:
CVPixelBufferRef pixelBuffer = ...; // retained CVPixelBufferRef from somewhere else
NSImage *image = nil;
vImage_Buffer buffer;
vImage_CGImageFormat format = {
    .bitsPerComponent = 8,
    .bitsPerPixel = 32,
    .colorSpace = NULL,
    .bitmapInfo = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little,
    .version = 0,
    .decode = NULL,
    .renderingIntent = kCGRenderingIntentAbsoluteColorimetric
};
vImage_Error imageError = vImageBuffer_InitWithCVPixelBuffer(&buffer, &format, pixelBuffer, NULL, NULL, kvImagePrintDiagnosticsToConsole);
if (imageError != 0) {
    NSLog(@"vImageBuffer_InitWithCVPixelBuffer Error: %zd", imageError);
} else {
    CGImageRef imageRef = vImageCreateCGImageFromBuffer(&buffer, &format, NULL, NULL, kvImagePrintDiagnosticsToConsole | kvImageHighQualityResampling, &imageError);
    if (!imageRef) {
        NSLog(@"vImageCreateCGImageFromBuffer Error: %zd", imageError);
    } else {
        image = [[NSImage alloc] initWithCGImage:imageRef size:NSMakeSize(CGImageGetWidth(imageRef), CGImageGetHeight(imageRef))];
        CGImageRelease(imageRef);
        NSAssert(image != nil, @"Creating the image failed!");
    }
}
free(buffer.data);
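This is roughly how I'm driving it while watching memory grow; a sketch only, where convertPixelBufferToImage: is a hypothetical wrapper around the method above and the @autoreleasepool per iteration is my assumption for ruling out autoreleased intermediates piling up:

// Hypothetical driver loop for reproducing the per-frame growth.
// convertPixelBufferToImage: is assumed to wrap the conversion shown above.
for (int i = 0; i < 100; i++) {
    @autoreleasepool {
        CVPixelBufferRef pixelBuffer = ...; // next retained frame from the video source
        NSImage *image = [self convertPixelBufferToImage:pixelBuffer];
        (void)image; // observe memory in Instruments while this runs
        CVPixelBufferRelease(pixelBuffer);
    }
}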