I'm capturing several photos in rapid succession and processing them with the GPUImage framework. I have a helper class that mainly runs a GPUImageSubtractBlendFilter. Here is what I have:
#import "ImageProcessor.h"

@interface ImageProcessor ()
@end

@implementation ImageProcessor

GPUImageSubtractBlendFilter *subFilter;

- (id)init {
    self = [super init];
    subFilter = [[GPUImageSubtractBlendFilter alloc] init];
    return self;
}

- (UIImage *)flashSubtract:(UIImage *)image1 :(UIImage *)image2 {
    UIImage *processedImage;
    // @autoreleasepool {

    // CAUSING MEMORY ISSUE
    GPUImagePicture *img1 = [[GPUImagePicture alloc] initWithImage:image1];
    GPUImagePicture *img2 = [[GPUImagePicture alloc] initWithImage:image2];
    // MEMORY ISSUE END

    [img1 addTarget:subFilter];
    [img2 addTarget:subFilter];

    [img1 processImage];
    [img2 processImage];

    [subFilter useNextFrameForImageCapture];
    processedImage = [subFilter imageFromCurrentFramebuffer];
    // }

    // consider modifications to filter possibly?
    return processedImage;
}
Even with ARC enabled, memory keeps growing and is never deallocated. I debugged it and narrowed it down to these two allocations as the core of the problem:
img1 = [[GPUImagePicture alloc] initWithImage:[imagesArray objectAtIndex:1]];
img2 = [[GPUImagePicture alloc] initWithImage:[imagesArray objectAtIndex:0]];
What am I missing here, or is there something I should be doing differently? Can I not allocate GPUImagePicture instances in rapid succession like this?
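For reference, this is the kind of cleanup I have been considering after each blend. It is only a sketch and I am not sure it is the right approach; it assumes removeAllTargets (on GPUImageOutput) and [[GPUImageContext sharedFramebufferCache] purgeAllUnassignedFramebuffers] are the relevant calls, and it moves useNextFrameForImageCapture ahead of processImage to match GPUImage's still-image capture example:

- (UIImage *)flashSubtract:(UIImage *)image1 :(UIImage *)image2 {
    UIImage *processedImage;
    @autoreleasepool {
        GPUImagePicture *img1 = [[GPUImagePicture alloc] initWithImage:image1];
        GPUImagePicture *img2 = [[GPUImagePicture alloc] initWithImage:image2];

        [img1 addTarget:subFilter];
        [img2 addTarget:subFilter];

        // Per GPUImage's still-image example, request capture before processing.
        [subFilter useNextFrameForImageCapture];
        [img1 processImage];
        [img2 processImage];

        processedImage = [subFilter imageFromCurrentFramebuffer];

        // Detach the source pictures so the filter does not keep them (and
        // their framebuffers) alive between calls.
        [img1 removeAllTargets];
        [img2 removeAllTargets];
    }
    // Ask GPUImage to drop framebuffers that are no longer assigned to anything.
    [[GPUImageContext sharedFramebufferCache] purgeAllUnassignedFramebuffers];
    return processedImage;
}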
Here is where the images come from:
- (void)burstModeCapture:(AVCaptureConnection *)videoConnection :(int)i { // start capturing pictures rapidly and cache them in RAM
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    NSLog(@"time entering: %d", i);

    [photoOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (error)
            NSLog(@"%s", [[error localizedDescription] UTF8String]);

        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(imageSampleBuffer);
        CVPixelBufferLockBaseAddress(cameraFrame, 0);
        Byte *rawImageBytes = CVPixelBufferGetBaseAddress(cameraFrame);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
        size_t width = CVPixelBufferGetWidth(cameraFrame);
        size_t height = CVPixelBufferGetHeight(cameraFrame);
        NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];
        // Do whatever with your bytes

        // Create suitable color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // Create suitable context (suitable for camera output setting kCVPixelFormatType_32BGRA)
        CGContextRef newContext = CGBitmapContextCreate(rawImageBytes, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CVPixelBufferUnlockBaseAddress(cameraFrame, 0);

        // Release color space
        CGColorSpaceRelease(colorSpace);

        // Create a CGImageRef from the CVImageBufferRef
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
        UIImage *FinalImage = [[UIImage alloc] initWithCGImage:newImage];
        [imagesArray addObject:FinalImage]; // append image to array

        dispatch_group_leave(group);
    }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{ // execute function recursively to shoot n photos
        // base case to stop shooting pictures
        shootCounter--;
        if (shootCounter <= 0) {
            [flash turnOffFlash];
            shootCounter = NUMSHOTS;
            UIImage *output = [self processImages]; // THIS IS WHERE MEMORY STARTS ACCUMULATING
            [self updateUIWithOutput:output];
            NSLog(@"Done shooting!");
        }
        else {
            [NSThread sleepForTimeInterval:0.1];
            [self burstModeCapture:videoConnection :shootCounter];
        }
    });
}
I run this function recursively, twice, to capture a pair of images. [imageProcessor flashSubtract] is where the problem shows up.
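For context, processImages (whose full body is not shown above) essentially just hands the two cached frames to the helper. A simplified sketch, with the local variable names being placeholders:

- (UIImage *)processImages {
    // Simplified: feed the two cached frames into the subtract helper,
    // newest frame first, matching the indices used above.
    UIImage *newest = [imagesArray objectAtIndex:1];
    UIImage *previous = [imagesArray objectAtIndex:0];
    return [imageProcessor flashSubtract:newest :previous];
}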
Answer 0 (score: 0):
You are missing CGContextRelease(newContext); after the line CGImageRef newImage = CGBitmapContextCreateImage(newContext);. That could be causing a memory leak.
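Applied to the capture handler in the question, that fix would look roughly like this. The CGImageRelease line is an extra step not mentioned in the answer, added here because CGBitmapContextCreateImage also returns a +1 reference that the UIImage does not take over:

CGImageRef newImage = CGBitmapContextCreateImage(newContext);
CGContextRelease(newContext); // release the bitmap context once the CGImage has been made
UIImage *FinalImage = [[UIImage alloc] initWithCGImage:newImage];
CGImageRelease(newImage);     // not from the answer: the image from CGBitmapContextCreateImage also follows the Create rule
[imagesArray addObject:FinalImage]; // append image to array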