When I create a second AVCaptureSession, autofocus no longer works on the first AVCaptureSession. Whichever session is created second is the one whose autofocus works; the session created first will not autofocus.
I would expect either session to be able to autofocus once the other has been stopped, just as auto white balance and auto exposure work for both sessions. If you watch the log window while running the sample code below, you can see the key-value observing messages coming through; but a focus-change message never arrives while the top session is running.
Side note: unfortunately there is a bug in a third-party library I'm using that prevents me from simply recreating the sessions as I switch between them (it leaks its AVCaptureSessions, which eventually gets the app killed). The full story is that this library creates a capture session for me and exposes a public API to start and stop it, and I want to create another session of my own. The code below demonstrates the problem without using the third-party library.
I created a test app containing the code listed below and a XIB with two views, one above the other, plus a button wired to the switchSessions method, to demonstrate the problem.
This may be related to the issue described here, Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession), although two capture sessions are not mentioned there.
Header file:
#import <UIKit/UIKit.h>
@class AVCaptureSession;
@class AVCaptureStillImageOutput;
@class AVCaptureVideoPreviewLayer;
@class AVCaptureDevice;
@class AVCaptureDeviceInput;
@interface AVCaptureSessionFocusBugViewController : UIViewController {
    IBOutlet UIView *_topView;
    IBOutlet UIView *_bottomView;

    AVCaptureDevice *_device;

    AVCaptureSession *_topSession;
    AVCaptureStillImageOutput *_outputTopSession;
    AVCaptureVideoPreviewLayer *_previewLayerTopSession;
    AVCaptureDeviceInput *_inputTopSession;

    AVCaptureSession *_bottomSession;
    AVCaptureStillImageOutput *_outputBottomSession;
    AVCaptureVideoPreviewLayer *_previewLayerBottomSession;
    AVCaptureDeviceInput *_inputBottomSession;
}
- (IBAction)switchSessions:(id)sender;
@end
Implementation file:
#import "AVCaptureSessionFocusBugViewController.h"
#import <AVFoundation/AVFoundation.h>
@interface AVCaptureSessionFocusBugViewController ()
- (void)setupCaptureSession:(AVCaptureSession **)session
                     output:(AVCaptureStillImageOutput **)output
               previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                      input:(AVCaptureDeviceInput **)input
                       view:(UIView *)view;
- (void)tearDownSession:(AVCaptureSession **)session
                 output:(AVCaptureStillImageOutput **)output
           previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                  input:(AVCaptureDeviceInput **)input
                   view:(UIView *)view;
@end
@implementation AVCaptureSessionFocusBugViewController
- (IBAction)switchSessions:(id)sender
{
    if ([_topSession isRunning]) {
        [_topSession stopRunning];
        [_bottomSession startRunning];
        NSLog(@"Bottom session now running.");
    }
    else {
        [_bottomSession stopRunning];
        [_topSession startRunning];
        NSLog(@"Top session now running.");
    }
}
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    NSLog(@"Observed value for key at key path %@.", keyPath);

    // Enable to confirm that the focusMode is set correctly.
    //NSLog(@"Autofocus for the device is set to %d.", [_device focusMode]);
}
- (void)viewDidLoad {
    [super viewDidLoad];

    _device = [[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] retain];

    [self setupCaptureSession:&_topSession
                       output:&_outputTopSession
                 previewLayer:&_previewLayerTopSession
                        input:&_inputTopSession
                         view:_topView];
    [self setupCaptureSession:&_bottomSession
                       output:&_outputBottomSession
                 previewLayer:&_previewLayerBottomSession
                        input:&_inputBottomSession
                         view:_bottomView];

    // NB: We only need to observe one device, since the top and bottom sessions use the same device.
    [_device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
    [_device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:nil];
    [_device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:nil];

    [_topSession startRunning];
    NSLog(@"Starting top session.");
}
- (void)setupCaptureSession:(AVCaptureSession **)session
                     output:(AVCaptureStillImageOutput **)output
               previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                      input:(AVCaptureDeviceInput **)input
                       view:(UIView *)view
{
    *session = [[AVCaptureSession alloc] init];

    // Create the preview layer.
    *previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession:*session] retain];
    [*previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [*previewLayer setFrame:[view bounds]];
    [[view layer] addSublayer:*previewLayer];

    // Configure the inputs and outputs.
    [*session setSessionPreset:AVCaptureSessionPresetMedium];

    NSError *error = nil;
    *input = [[AVCaptureDeviceInput deviceInputWithDevice:_device error:&error] retain];
    if (!*input) {
        NSLog(@"Error creating input device:%@", [error localizedDescription]);
        return;
    }
    [*session addInput:*input];

    *output = [[AVCaptureStillImageOutput alloc] init];
    [*session addOutput:*output];

    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [*output setOutputSettings:outputSettings];
    [outputSettings release];
}
- (void)viewDidUnload {
    [_topView release];
    _topView = nil;
    [_bottomView release];
    _bottomView = nil;
    [_device release];
    _device = nil;

    [self tearDownSession:&_topSession
                   output:&_outputTopSession
             previewLayer:&_previewLayerTopSession
                    input:&_inputTopSession
                     view:_topView];
    [self tearDownSession:&_bottomSession
                   output:&_outputBottomSession
             previewLayer:&_previewLayerBottomSession
                    input:&_inputBottomSession
                     view:_bottomView];
}
- (void)tearDownSession:(AVCaptureSession **)session
                 output:(AVCaptureStillImageOutput **)output
           previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                  input:(AVCaptureDeviceInput **)input
                   view:(UIView *)view
{
    if ([*session isRunning]) {
        [*session stopRunning];
    }

    [*session removeOutput:*output];
    [*output release];
    *output = nil;

    [*session removeInput:*input];
    [*input release];
    *input = nil;

    [*previewLayer removeFromSuperlayer];
    [*previewLayer release];
    *previewLayer = nil;

    [*session release];
    *session = nil;
}
@end
Answer (score: 8):
Apple Technical Support has confirmed that creating two simultaneous capture sessions is not supported. You have to tear one down and then create the other.
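Applied to the demo above, that means keeping only one AVCaptureSession alive at a time: tear down whichever session is running, then build and start the other on demand. Below is a minimal sketch of switchSessions rewritten this way, reusing the setupCaptureSession:/tearDownSession: helpers from the question's code; the _topSessionActive BOOL ivar is an assumption added for illustration, and viewDidLoad would then set up and start only the top session.

// Sketch only: exactly one AVCaptureSession exists at any moment.
// _topSessionActive is a hypothetical BOOL ivar tracking which view is live.
- (IBAction)switchSessions:(id)sender
{
    if (_topSessionActive) {
        // Destroy the top session before the bottom one is created.
        [self tearDownSession:&_topSession
                       output:&_outputTopSession
                 previewLayer:&_previewLayerTopSession
                        input:&_inputTopSession
                         view:_topView];
        [self setupCaptureSession:&_bottomSession
                           output:&_outputBottomSession
                     previewLayer:&_previewLayerBottomSession
                            input:&_inputBottomSession
                             view:_bottomView];
        [_bottomSession startRunning];
        NSLog(@"Bottom session now running.");
    }
    else {
        // Destroy the bottom session before the top one is recreated.
        [self tearDownSession:&_bottomSession
                       output:&_outputBottomSession
                 previewLayer:&_previewLayerBottomSession
                        input:&_inputBottomSession
                         view:_bottomView];
        [self setupCaptureSession:&_topSession
                           output:&_outputTopSession
                     previewLayer:&_previewLayerTopSession
                            input:&_inputTopSession
                             view:_topView];
        [_topSession startRunning];
        NSLog(@"Top session now running.");
    }
    _topSessionActive = !_topSessionActive;
}

With this arrangement the device is never attached to two sessions at once, which is the configuration Apple says is supported.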