Child ViewController won't display the camera preview

Posted: 2015-08-18 18:09:40

Tags: ios objective-c avfoundation

I'm trying to get a camera preview working in two different view controllers. This is necessary as a workaround for another SDK I use in this app, which requires its own app delegate.

Basically, I have a camera preview displayed in a UIView, and it loads fine in viewDidLoad at launch. I then need to press a button and load another camera preview that can read and process QR codes. I have separate xibs for the two views, and the child view controller presents without any problem, but its camera preview never loads, which makes scanning a QR code rather difficult.

Here's what I have:

This is the parent ViewController's viewDidLoad. Its camera preview works fine.

- (void)viewDidLoad {
    [super viewDidLoad];
    [DRDouble sharedDouble].delegate = self;
    NSLog(@"SDK Version: %@", kDoubleBasicSDKVersion);

    // capture live preview
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = cameraPreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

    captureVideoPreviewLayer.frame = cameraPreview.bounds;
    [cameraPreview.layer addSublayer:captureVideoPreviewLayer];

    NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device = [possibleDevices lastObject];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }

    [session addInput:input];

    [session startRunning];
}

Then I call the child ViewController with this button:

- (IBAction)switchScanView {
    [session stopRunning];
    [session release];

    [self presentViewController:[[ViewController alloc] init] animated:true completion:nil];
}
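
A fuller teardown before presenting would presumably look something like the sketch below, though I haven't confirmed whether the extra steps matter. It assumes the preview layer is kept in an ivar (here called captureVideoPreviewLayer) instead of the local variable shown in viewDidLoad above:

- (IBAction)switchScanView {
    // Stop the first session and detach everything that holds on to the camera,
    // so the child controller's session can claim the device.
    [session stopRunning];

    AVCaptureInput *oldInput = [session.inputs firstObject];
    if (oldInput) {
        [session removeInput:oldInput];
    }
    [captureVideoPreviewLayer removeFromSuperlayer];   // assumes the layer was kept in an ivar

    NSLog(@"parent session still running? %d", [session isRunning]);

    [session release];
    session = nil;

    [self presentViewController:[[ViewController alloc] init] animated:true completion:nil];
}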

And this is the viewDidLoad from the child view controller:

NSError *error = nil;

AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];

if (!input) {
    NSLog(@"%@", [error localizedDescription]);
    return;
}

_captureSession = [[AVCaptureSession alloc] init];
[_captureSession addInput:input];

AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
[_captureSession addOutput:captureMetadataOutput];

dispatch_queue_t dispatchQueue;
dispatchQueue = dispatch_queue_create("myQueue", NULL);
[captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
[captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];

// Initialize the video preview layer and add it as a sublayer to the preview view's layer.
_videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[_videoPreviewLayer setFrame:_mviewPreview.layer.bounds];
[_mviewPreview.layer addSublayer:_videoPreviewLayer];

// Start video capture.
[_captureSession startRunning];
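
For completeness, the metadata delegate on the child controller is the standard AVCaptureMetadataOutputObjectsDelegate callback, roughly like this (the logging and stop handling here are just illustrative):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    if (metadataObjects.count == 0) {
        return;
    }
    AVMetadataMachineReadableCodeObject *code = [metadataObjects firstObject];
    if ([code.type isEqualToString:AVMetadataObjectTypeQRCode]) {
        // The delegate fires on the background "myQueue", so hop back to the
        // main thread before touching UI or stopping the session.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"Scanned QR payload: %@", code.stringValue);
            [_captureSession stopRunning];
        });
    }
}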

I suspect it has something to do with running an AVCaptureSession on two different controllers, but as you can see I call [stopRunning] and [release] on the AVCaptureSession object in the button action, and the preview still never shows up.

I don't know what else to try. Does anyone see what's going on?

1 Answer:

Answer 0 (score: 0)

From what you describe, and from looking at the code, it seems like you're trying to capture from two cameras at the same time, right? You don't say so explicitly, but I can see in the code that for the first capture you do

AVCaptureDevice *device = [possibleDevices lastObject];

and then for the second capture:

AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 

Unfortunately, capturing from two cameras simultaneously is not allowed on iOS; you simply can't do it. Documentation here: https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW14
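
If the intent is just to hand the camera off from the first controller to the second, make sure the first session has fully stopped (and given up its input) before the child's session starts, and look the device up the same way in both places instead of relying on lastObject, which may give you a different camera than defaultDeviceWithMediaType:. A rough sketch of picking the back camera explicitly:

// Pick the back camera explicitly, so both controllers end up asking for the
// same physical device. devicesWithMediaType: lists both front and back cameras.
AVCaptureDevice *backCamera = nil;
for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (d.position == AVCaptureDevicePositionBack) {
        backCamera = d;
        break;
    }
}
if (!backCamera) {
    // Fall back to whatever the system considers the default video device.
    backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}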

If that's not the intent, then my apologies.