Optimizing the camera for QR codes

Date: 2015-04-15 09:35:06

Tags: iphone ios8 camera qr-code camera-calibration

I have an AVCaptureDevice set up specifically to scan QR codes (using AVMetadataObjectTypeQRCode). My goal is to make QR-code scanning as fast as possible.

Several settings of the AVCaptureDevice camera (for example focus and exposure) can be adjusted programmatically in iOS.

Which camera optimizations can I make to minimize the time it takes to capture a QR code on an iPhone?
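
For reference, a minimal sketch of how focus and exposure are adjusted, assuming AVFoundation is imported and `device` is the AVCaptureDevice feeding the session (support for each mode should be checked first):

    // Sketch only: adjust focus/exposure on "device", the capture device used by the session.
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        }
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        }
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }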

2 Answers:

Answer 0 (score: 1)

The optimal values for most of these settings vary with the environment (for example, a dark vs. bright room, a near vs. far QR code), so unless you know your users' environment (say, if the app is only used on a factory assembly line), the defaults are best.

However (according to this source), if you know the QR code will be close to the camera, you can speed up autofocus by setting autoFocusRangeRestriction to the near range. You can also make sure smoothAutoFocusEnabled is set to false.
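
A sketch of those two settings, assuming `device` is the AVCaptureDevice driving the scanning session; both properties must be changed inside lockForConfiguration: and only where the device reports support:

    // Bias autofocus toward near subjects and disable smooth (video-style) focusing.
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if ([device isAutoFocusRangeRestrictionSupported]) {
            device.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
        }
        if ([device isSmoothAutoFocusSupported]) {
            device.smoothAutoFocusEnabled = NO;
        }
        [device unlockForConfiguration];
    }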

Answer 1 (score: -4)

I have used AVCaptureDevice. The following code works for me; I use it in my barcode app.

-(void)BarcodeStart
{

    // Highlight view drawn around the detected barcode
    _highlightView = [[UIView alloc] init];

    _highlightView.autoresizingMask = UIViewAutoresizingFlexibleTopMargin|UIViewAutoresizingFlexibleLeftMargin|UIViewAutoresizingFlexibleRightMargin|UIViewAutoresizingFlexibleBottomMargin;

    _highlightView.layer.borderColor = [UIColor lightGrayColor].CGColor;
    _highlightView.layer.borderWidth = 3;

    [barcameraView addSubview:_highlightView];

    // Label at the bottom of the screen that displays the decoded string
    _label = [[UILabel alloc] init];
    _label.frame = CGRectMake(0, self.view.bounds.size.height - 40, self.view.bounds.size.width, 40);
    _label.autoresizingMask = UIViewAutoresizingFlexibleTopMargin;
    _label.backgroundColor = [UIColor colorWithWhite:0.15 alpha:0.65];
    _label.textColor = [UIColor whiteColor];
    _label.textAlignment = NSTextAlignmentCenter;
    _label.text = @"(none)";
    [self.view addSubview:_label];
    // Back button shown near the top of the preview
    UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
    //[button addTarget:self action:@selector(aMethod:) forControlEvents:UIControlEventTouchDown];
    button.frame = CGRectMake(3, 19, 30, 30);
    [button setImage:[UIImage imageNamed:@"backBtnImg.png"] forState:UIControlStateNormal];
    [_highlightView addSubview:button];
    // Capture session with the default video (back-facing) camera
    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input) {
        [_session addInput:_input];
    } else {
        NSLog(@"Error: %@", error);
    }

    // Metadata output: decoded barcodes are delivered to the delegate on the main queue
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output];

    // Listen for every barcode type the device supports (must be set after adding the output)
    _output.metadataObjectTypes = [_output availableMetadataObjectTypes];

    // Preview layer showing the live camera feed inside barcameraView
    _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    // _prevLayer.frame = CGRectMake(20, 70, 280, 280);
    _prevLayer.frame = barcameraView.bounds;
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [barcameraView.layer addSublayer:_prevLayer];

    [_session startRunning];

    [barcameraView bringSubviewToFront:_highlightView];
    [self.view bringSubviewToFront:_label];
}
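
Note that this example listens for every metadata type the device reports. If only QR codes are needed, as in the question, restricting the output should cut down the decoding work per frame, e.g.:

    // Decode QR codes only (set after the output has been added to the session)
    _output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];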