Integrating the ZBar reader into a project: creating a custom view controller

Asked: 2015-03-30 11:24:03

Tags: ios objective-c uiviewcontroller zbar

I have added the code below to my project. It works fine: it creates an instance of ZBarReaderViewController and presents it from the current view.

However, I would like to define a custom region of the current view controller and display the ZBarReaderViewController inside that region, while still showing my previous/other views. The code below presents the view controller full-screen (a containment-based sketch follows the code).

In Interface Builder I can only add UIViews to an existing view controller, so I cannot associate a custom view region with the ZBarReaderViewController.

The only thing I can do is associate a view with a ZBarReaderView instance, but since ZBarReaderViewController is closed source (I can only see the header files from the ZBar reader project I am using), I cannot modify its behavior.
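For reference, driving a ZBarReaderView directly looks roughly like this (a minimal sketch based on the public ZBarReaderView header; "previewContainer" is a hypothetical UIView outlet):

    ZBarReaderView *readerView = [[ZBarReaderView alloc] init];
    readerView.readerDelegate = self; // self must conform to ZBarReaderViewDelegate
    readerView.frame = self.previewContainer.bounds;
    [readerView.scanner setSymbology:ZBAR_I25 config:ZBAR_CFG_ENABLE to:0];
    [self.previewContainer addSubview:readerView];
    [readerView start]; // starts the camera preview and decoding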

How can I solve this problem?

    - (IBAction)startScanning:(id)sender {
        NSLog(@"Scanning..");
        resultTextView.text = @"Scanning..";

        ZBarReaderViewController *codeReader = [ZBarReaderViewController new];
        codeReader.readerDelegate = self;
        codeReader.supportedOrientationsMask = ZBarOrientationMaskAll;

        ZBarImageScanner *scanner = codeReader.scanner;
        [scanner setSymbology:ZBAR_I25 config:ZBAR_CFG_ENABLE to:0];

        [self presentViewController:codeReader animated:YES completion:nil];
    }
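For comparison, UIKit's view controller containment API (iOS 5+) can place any view controller's view, including this one, inside a sub-region. A minimal sketch, assuming a hypothetical "scanRegion" UIView outlet, used in place of the presentViewController: call:

    // Embed codeReader's view inside the scanRegion container view.
    [self addChildViewController:codeReader];
    codeReader.view.frame = self.scanRegion.bounds;
    [self.scanRegion addSubview:codeReader.view];
    [codeReader didMoveToParentViewController:self];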

1 Answer:

Answer 0 (score: 1)

Here is an example of a scanner view controller. I used a storyboard to create the view, but you can also do it programmatically or with a regular nib.

First, create your view (let's say in a storyboard) and place a UIView in it where you want the scanner to appear.

Now, let's look at the view controller (see the comments inside):

#import <AVFoundation/AVFoundation.h>
#import "ScannerViewController.h"

@interface ScannerViewController () <AVCaptureMetadataOutputObjectsDelegate>

// UI
@property (weak, nonatomic) IBOutlet UIView *viewPreview; // Connect it to the view you created in the storyboard, for the scanner preview

// Video
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@property (nonatomic, strong) AVAudioPlayer *audioPlayer;
@property (nonatomic, strong) AVCaptureSession *flashLightSession;
@property (nonatomic) BOOL isReading;

@end

@implementation ScannerViewController

- (void)viewDidLoad
{
     [super viewDidLoad];

     // Initially make the captureSession object nil.
     _captureSession = nil;

    // Set the initial value of the flag to NO.
    _isReading = NO;
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    [self startStopReading:nil];
}

- (IBAction)startStopReading:(id)sender
{
    if (!_isReading) {
        [self startReading];
    }
    else {
        // In this case the app is currently reading a QR code and it should stop doing so.
        [self stopReading];

    }

    // Flip the flag to the opposite of its current value.
    _isReading = !_isReading;
}

#pragma mark - Private

- (BOOL)startReading
{
    NSError *error;

    // Get an instance of the AVCaptureDevice class to initialize a device object,
    // using video as the media type.
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Get an instance of the AVCaptureDeviceInput class using the previous device object.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];

    if (!input) {
        // If any error occurs, simply log its description and stop here.
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }

    // Initialize the captureSession object.
    _captureSession = [[AVCaptureSession alloc] init];
    // Set the input device on the capture session.
    [_captureSession addInput:input];
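    // (A more defensive version would first check [_captureSession canAddInput:input].)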

    // Initialize an AVCaptureMetadataOutput object and set it as the capture session's output.
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];

    // Create a new serial dispatch queue.
    dispatch_queue_t dispatchQueue;
    dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]]; // Add all the types you need, currently it is just QR code
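    // e.g. AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeCode128Code, or
    // (from iOS 8) AVMetadataObjectTypeInterleaved2of5Code, the AVFoundation
    // counterpart of the ZBAR_I25 symbology used in the question.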

    // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
    [_viewPreview.layer addSublayer:_videoPreviewLayer];

    // Start video capture.
    [_captureSession startRunning];

    return YES;
}

- (void)stopReading
{
    // Stop video capture and make the capture session object nil.
    [_captureSession stopRunning];
    _captureSession = nil;

    // Remove the video preview layer from the viewPreview view's layer.
    [_videoPreviewLayer removeFromSuperlayer];
}

#pragma mark - AVCaptureMetadataOutputObjectsDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    // Check that the metadataObjects array is not nil and contains at least one object.
    if (metadataObjects != nil && [metadataObjects count] > 0) {

        // This delegate method runs on the serial queue created above, so any UI work
        // (done inside stopReading) is pushed back to the main thread.
        [self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];

        _isReading = NO;

        // If the audio player is not nil, play the sound effect.
        if (_audioPlayer) {
            [_audioPlayer play];
        }

        // This was my result, but you can search the metadataObjects array for exactly what you need.
        NSString *code = [(AVMetadataMachineReadableCodeObject *)[metadataObjects objectAtIndex:0] stringValue];
    }
}

@end
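To do something with the decoded string, hand it back to the main thread before touching the UI. A minimal sketch that would go right after the stringValue line above ("resultLabel" is a hypothetical UILabel outlet):

dispatch_async(dispatch_get_main_queue(), ^{
    self.resultLabel.text = code; // update the label with the scanned value
});

If the preview view can resize (e.g. on rotation), also keep the layer in sync by overriding viewDidLayoutSubviews and resetting _videoPreviewLayer.frame to _viewPreview.layer.bounds.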