I'm using a plain AVCaptureSession to capture the camera view. Everything works and I don't have any leaks, but when I use Allocations to look for abandoned memory after starting and then closing the AVCaptureDevice, it shows me about 230 objects that are still alive.
Here is my code:
Controller.h:
@interface Controller : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate> {
    AVCaptureSession *captureSession;
    AVCaptureDevice *device;
    IBOutlet UIView *previewLayer;
}

@property (nonatomic, retain) AVCaptureSession *captureSession;
@property (nonatomic, retain) UIView *previewLayer;

- (void)setupCaptureSession;
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer;

@end
Controller.m:
- (void)setupCaptureSession {
    NSError *error = nil;
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings =
        [NSDictionary dictionaryWithObject:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];
}
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance]) {
        // Create a UIImage from the sample buffer data
        mutex = NO;
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        image = [Tools rotateImage:image andRotateAngle:UIImageOrientationUp];

        CGRect rect;
        rect.size.width = 210;
        rect.size.height = 50;
        rect.origin.x = 75;
        rect.origin.y = 175;

        UIImage *croppedImage = [image resizedImage:image.size interpolationQuality:kCGInterpolationHigh];
        croppedImage = [croppedImage croppedImage:rect];
        croppedImage = [self processImage:croppedImage];
        [NSThread detachNewThreadSelector:@selector(threadedReadAndProcessImage:) toTarget:self withObject:croppedImage];
    }
}
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    return image;
}
I clean everything up with this code:
- (void)cancelTapped {
    [[self captureSession] stopRunning], self.captureSession = nil;
    for (UIView *view in self.previewLayer.subviews) {
        [view removeFromSuperview];
    }
    [self dismissModalViewControllerAnimated:YES];
}
- (void)dealloc {
    [super dealloc];
    [captureSession release];
    [device release];
    [previewLayer release];
}
Instruments shows me something like this: http://i.stack.imgur.com/NBWgZ.png
http://i.stack.imgur.com/1GB6C.png
Any idea what I'm doing wrong?
Answer 0 (score: 2):
- (void)setupCaptureSession {
    NSError *error = nil;
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    ...
This leaks the capture session, which keeps all of its inputs and outputs, and all of their internal helpers, alive.
Two options:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
self.captureSession = session;
[session release], session = nil;
// or:
self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
Answer 1 (score: 1):
- (void)dealloc {
    [super dealloc];
    [captureSession release];
    [device release];
    [previewLayer release];
}
[super dealloc] should be called after the other releases; otherwise your instance's memory may no longer contain valid pointers to the objects you are releasing, so you won't actually release them, particularly if they have been set to nil.
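To illustrate, a -dealloc following this advice (a minimal sketch under manual reference counting, matching the ivars declared in the question's Controller.h) would release the instance's own objects first and call [super dealloc] last:

```objc
- (void)dealloc {
    // Release our own ivars while the instance's memory is still valid.
    [captureSession release];
    [device release];
    [previewLayer release];

    // Only then let the superclass tear the object down.
    [super dealloc];
}
```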