I'm trying to use the iOS 7 QR-code reading functionality of the AVFoundation framework with the following code:
- (void)setupCaptureSession_iOS7
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        NSLog(@"Error: %@", error);
        return;
    }
    [session addInput:input];

    // Turn on point autofocus for the middle of the view
    if ([device lockForConfiguration:&error]) {
        if (device.focusPointOfInterestSupported) {
            device.focusPointOfInterest = CGPointMake(0.5, 0.5);
        }
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        }
        [device unlockForConfiguration];
    }

    // Add the metadata output
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];

    NSLog(@"%lu", (unsigned long)output.availableMetadataObjectTypes.count);
    for (NSString *s in output.availableMetadataObjectTypes) {
        NSLog(@"%@", s);
    }

    // You should check availableMetadataObjectTypes here; requesting a type
    // the session doesn't support raises an exception
    output.metadataObjectTypes = @[AVMetadataObjectTypeEAN13Code,
                                   AVMetadataObjectTypeEAN8Code,
                                   AVMetadataObjectTypeUPCECode];

    // rectOfInterest is in normalized coordinates (0..1), not pixels
    output.rectOfInterest = CGRectMake(0, 0, 1, 1);

    [session startRunning];

    // Assign the session to an ivar
    [self setSession:session];
}
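
For reference, the metadata delegate wired up above would be implemented along these lines (a minimal sketch; the logging body is just illustrative):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    // Called on the main queue (as requested above) with any recognized barcodes
    for (AVMetadataMachineReadableCodeObject *code in metadataObjects) {
        NSLog(@"Scanned %@: %@", code.type, code.stringValue);
    }
}

Note that this only delivers AVMetadataMachineReadableCodeObject instances, with no pixel data attached.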
Obviously, this code doesn't render the frames to the screen (yet). That's deliberate: instead of using an AVCaptureVideoPreviewLayer to show the preview, I need to display the frames as UIImage objects (because I want to display each frame in several places in my view).
If I use an AVCaptureVideoDataOutput as the output, I can export frames by grabbing them in the captureOutput:didOutputSampleBuffer:fromConnection: callback. But when I use an AVCaptureMetadataOutput as the output, I can't find an equivalent way to get hold of a frame buffer; the sketch below shows the kind of conversion I have in mind.
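
For context, this is the standard CMSampleBuffer-to-UIImage conversion I'd use in that callback (a sketch, assuming the AVCaptureVideoDataOutput's videoSettings are configured to deliver kCVPixelFormatType_32BGRA pixel buffers):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(imageBuffer),
                                                 CVPixelBufferGetWidth(imageBuffer),
                                                 CVPixelBufferGetHeight(imageBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(imageBuffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    // ...hand `image` off to the views that display it...
}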
Does anyone know how to do this?