I got my AVCaptureSession working, and it duplicates the Camera.app UI almost perfectly. After a few seconds, however, the application crashes, and I just cannot find what I am doing wrong. I really hope someone knows how to optimize this!
I AM using ARC, and again, the whole session runs well but crashes after a bit. The AVCaptureSession delegate method gets called what seems like every second. If there is a way to call that method only when the user presses the "take picture" button, how would I do that while still keeping the "live" preview layer?
Thanks in advance!
Setting up the session:
NSError *error = nil;
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

// Hook the default camera up to the session.
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

// Deliver raw video frames to a background serial queue.
output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// 'version' is assumed to be the iOS version, determined elsewhere;
// minFrameDuration was the iOS 4.x way to cap the frame rate (deprecated in iOS 5).
if (version >= 4.0 && version < 5.0) {
    output.minFrameDuration = CMTimeMake(1, 15); // cap at 15 fps
}
output.alwaysDiscardsLateVideoFrames = YES;

// Live preview layer, with the custom overlay on top of it.
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer];
[self.view addSubview:camera_overlay];

[session startRunning];
The delegate method that AVCaptureSession keeps calling:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
return capture_image;
}
The method that gets a UIImage from the sample buffer:
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the pixel buffer so we can read its base address directly.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}
Answer 0 (score: 5)
Take a look at Apple's AVCam demo application for a complete example.
The method
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection {
is called every time a camera frame is ready; in your case that is 15 times per second, or at least it should be, since you set the frame rate with output.minFrameDuration = CMTimeMake(1, 15);
From the code you provided, the only cause I can think of is that the UIImage *capture_image objects are never released.
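Under ARC you cannot send release yourself, but you can keep those per-frame temporaries from piling up on the background queue. A minimal sketch (assuming the rest of the delegate stays exactly as posted) that wraps the per-frame work in an autorelease pool:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool {
        // The UIImage and the CGImage backing it are autoreleased temporaries;
        // draining the pool at the end of every frame keeps them from accumulating.
        UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
        // ... use capture_image here; note the delegate method returns void ...
    }
}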
You can profile your application with Xcode Instruments to see why it happens: Instruments Guide
The Leaks instrument is your first stop. There are plenty of tutorials online; here is one, Tracking iPhone Memory Leaks, written by SO user OwenGross if I am not mistaken.
Answer 1 (score: 0)
The post looks old, but in case anyone stumbles on this:
Why are you returning an image from the delegate method (
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
return capture_image;
}
)?
You could use a button that raises a flag; in the delegate method, check whether the flag is raised and only then create the image. The image should be an instance variable, otherwise it will be lost anyway.
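A minimal sketch of that flag approach (assuming ARC; the ivar names takePicture and stillImage and the action takePhotoPressed: are placeholders, not from the original post):

// Instance variables, e.g. in the class extension:
// BOOL takePicture;
// UIImage *stillImage;

- (IBAction)takePhotoPressed:(id)sender {
    takePicture = YES; // raise the flag; the next frame becomes the photo
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!takePicture) {
        return; // preview frames: skip the expensive UIImage conversion entirely
    }
    takePicture = NO;
    stillImage = [self imageFromSampleBuffer:sampleBuffer]; // the ivar keeps it alive
}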
There is also a dedicated API for capturing a still image: AVCaptureStillImageOutput's captureStillImageAsynchronouslyFromConnection:completionHandler: (a method on the output, not a delegate callback).
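A rough sketch of that approach, with the still image output added to the same session (variable names are illustrative):

// During setup, alongside the video data output:
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey];
[session addOutput:stillOutput];

// Later, when the user taps "take picture":
AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        // Convert the JPEG sample buffer into a UIImage.
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *photo = [[UIImage alloc] initWithData:jpegData];
        // ... display or save photo ...
    }
}];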