I am developing a graphics-rich iOS application. At one point our application's memory footprint is 250 MB. I take each frame from the camera, process it with OpenGL shaders and extract some data. Each time I use the camera to capture frames for processing, I see the memory increase to 280 MB. When I stop capturing frames, memory comes back to normal at 250 MB. If I repeat the cycle of starting the camera and exiting, say, 10 times, I receive a memory warning (though no memory leak is observed). I am not using ARC here. I maintain an autorelease pool that encloses the entire processing of a frame. While profiling I do not see any leaks, and after the 10 runs memory appears to sit back at 250 MB, so I am not sure what causes the memory warning. Any insights? I am happy to provide further information. OpenGL version - ES 2.0, iOS version - 7.0
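For reference, the per-frame pattern I am describing is roughly the following (a minimal sketch under MRC; the delegate callback is the standard AVCaptureVideoDataOutput one, and processFrame: is just a placeholder for the shader pass and data extraction, not the actual project code):

// Each frame is processed inside its own autorelease pool so that
// per-frame temporaries are drained before the next frame arrives.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Placeholder for the OpenGL shader processing and data extraction.
        [self processFrame:cameraFrame];
    } // Temporaries created while handling this frame are released here.
}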
Answer 0 (score: 0)
You have to use ARC; it releases memory automatically and keeps your application optimized.
Answer 1 (score: 0)
According to a couple of other questions (Crash running OpenGL on iOS after memory warning, and instruments with iOS: Why does Memory Monitor disagree with Allocations?), the problem may be that you are not deleting your OpenGL resources (VBOs, textures, renderbuffers, and so on) when you are done with them.
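As a hedged illustration of that cleanup (the ivar names below are assumptions, not code from the question), the teardown when capture stops might look something like this:

- (void)tearDownGL
{
    [EAGLContext setCurrentContext:_context];

    // Delete GL objects explicitly instead of waiting for the context to go away.
    if (_cameraTextureName) { glDeleteTextures(1, &_cameraTextureName); _cameraTextureName = 0; }
    if (_vertexBuffer)      { glDeleteBuffers(1, &_vertexBuffer); _vertexBuffer = 0; }
    if (_colorRenderbuffer) { glDeleteRenderbuffers(1, &_colorRenderbuffer); _colorRenderbuffer = 0; }
    if (_framebuffer)       { glDeleteFramebuffers(1, &_framebuffer); _framebuffer = 0; }

    // If a CVOpenGLESTextureCache feeds camera frames to GL, flush and release it
    // so it drops the pixel buffers it is holding on to.
    if (_textureCache) {
        CVOpenGLESTextureCacheFlush(_textureCache, 0);
        CFRelease(_textureCache);
        _textureCache = NULL;
    }
}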
Answer 2 (score: 0)
Without seeing the code, who knows? Are you simply rendering the framebuffer with the EAGLContext's presentRenderbuffer method? If so, what do you do with the pixelBuffer you pass to CVOpenGLESTextureCacheCreateTextureFromImage? In a typical usage scenario, the pixel buffer is the only substantial source of memory.
However, if you are copying the data in the render buffer to another buffer with, for example, glReadPixels, then you have introduced one of several potential memory hogs. If the buffer you copy into is a CoreGraphics buffer via, say, a CGDataProvider, did you include a data release callback, or did you pass nil as that parameter when you created the provider? Did you glFlush after swapping buffers?
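Going back to the texture-cache question above: for illustration, here is a hypothetical per-frame sketch of that path. The essential memory point is releasing the CVOpenGLESTextureRef and flushing the cache once the frame has been processed (the _textureCache ivar and the width/height variables are assumptions):

CVOpenGLESTextureRef cameraTexture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            _textureCache,
                                                            pixelBuffer,      // the camera frame
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            width,
                                                            height,
                                                            GL_BGRA_EXT,      // BGRA, matching kCVPixelFormatType_32BGRA camera output
                                                            GL_UNSIGNED_BYTE,
                                                            0,
                                                            &cameraTexture);
if (err == kCVReturnSuccess) {
    glBindTexture(CVOpenGLESTextureGetTarget(cameraTexture), CVOpenGLESTextureGetName(cameraTexture));
    // ... run the shaders / read back the extracted data ...
    glFlush();
    CFRelease(cameraTexture);                       // otherwise every frame's texture stays alive
    CVOpenGLESTextureCacheFlush(_textureCache, 0);  // let the cache recycle its buffers
}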
I could ascertain the answers if you provided the code; if you think you can solve this without doing so, but would like to see working code that successfully manages memory in about the most demanding use case there is, see:
https://demonicactivity.blogspot.com/2016/11/tech-serious-ios-developers-use-every.html
For your convenience, I've provided some code below. Place it after the call to the presentRenderbuffer method, commenting that call out if you do not want to render the buffer to the display in the CAEAGLLayer (as I have done in the sample below):
// [_context presentRenderbuffer:GL_RENDERBUFFER];
dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {
        // To capture the output to an OpenGL render buffer...
        NSInteger myDataLength = _backingWidth * _backingHeight * 4;
        GLubyte *buffer = (GLubyte *)malloc(myDataLength);
        // glReadPixels honors GL_PACK_ALIGNMENT (not GL_UNPACK_ALIGNMENT);
        // 4 matches the tightly packed RGBA rows allocated above.
        glPixelStorei(GL_PACK_ALIGNMENT, 4);
        glReadPixels(0, 0, _backingWidth, _backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

        // To swap the pixel buffer to a CoreGraphics context (as a CGImage)
        CGDataProviderRef provider = NULL;
        CGColorSpaceRef colorSpaceRef = NULL;
        CGImageRef imageRef = NULL;
        CVPixelBufferRef pixelBuffer = NULL;
        @try {
            provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, &releaseDataCallback);
            int bitsPerComponent = 8;
            int bitsPerPixel = 32;
            int bytesPerRow = 4 * _backingWidth;
            colorSpaceRef = CGColorSpaceCreateDeviceRGB();
            CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
            CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
            imageRef = CGImageCreate(_backingWidth, _backingHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
        } @catch (NSException *exception) {
            NSLog(@"Exception: %@", [exception reason]);
        } @finally {
            if (imageRef) {
                // To convert the CGImage to a pixel buffer (for writing to a file using AVAssetWriter)
                pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:imageRef];
                // To verify the integrity of the pixel buffer (by converting it back to a CGImage and then displaying it in a layer)
                CGImageRef verifyImage = [CVCGImageUtil cgImageFromPixelBuffer:pixelBuffer context:_ciContext];
                imageLayer.contents = (__bridge id)verifyImage;
                CGImageRelease(verifyImage);       // the layer retains its own reference
                CVPixelBufferRelease(pixelBuffer); // pixelBufferFromCGImage returns a +1 reference
            }
            CGDataProviderRelease(provider);
            CGColorSpaceRelease(colorSpaceRef);
            CGImageRelease(imageRef);
        }
    }
});
. . .
The callback that frees the data for the CGDataProvider instance:
static void releaseDataCallback(void *info, const void *data, size_t size) {
    free((void *)data);
}
The CVCGImageUtil class interface and implementation files are, respectively:
@import Foundation;
@import CoreMedia;
@import CoreGraphics;
@import QuartzCore;
@import CoreImage;
@import UIKit;
@interface CVCGImageUtil : NSObject
+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context;
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image;
+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image;
@end
#import "CVCGImageUtil.h"
@implementation CVCGImageUtil
+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context
{
    // CVPixelBuffer to CoreImage
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(M_PI)];
    CGPoint origin = [image extent].origin;
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];

    // CoreImage to CGImage via CoreImage context
    CGImageRef cgImage = [context createCGImage:image fromRect:[image extent]];

    // CGImage to UIImage (OPTIONAL)
    //UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
    //return (CGImageRef)uiImage.CGImage;

    return cgImage;
}
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width, frameSize.height,
                                          kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, frameSize.height,
                                                 8, CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image
{
    CVPixelBufferRef pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:image];
    CMSampleBufferRef newSampleBuffer = NULL;
    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                       pixelBuffer,
                                       true,
                                       NULL,
                                       NULL,
                                       videoInfo,
                                       &timingInfo,
                                       &newSampleBuffer);
    // The sample buffer retains the pixel buffer and the format description,
    // so the local references can be released here.
    CVPixelBufferRelease(pixelBuffer);
    if (videoInfo) {
        CFRelease(videoInfo);
    }
    return newSampleBuffer;
}
@end
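For context, a hypothetical usage sketch of the class above: converting a CGImage to a pixel buffer and handing it to an existing AVAssetWriterInputPixelBufferAdaptor (writerInput, pixelBufferAdaptor and presentationTime are assumed to exist elsewhere in your writing pipeline):

CVPixelBufferRef pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:imageRef];
if (writerInput.readyForMoreMediaData) {
    [pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
}
CVPixelBufferRelease(pixelBuffer); // pixelBufferFromCGImage returns a +1 reference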