No AVCapture session found. Can't add video input

Asked: 2012-01-06 21:02:43

Tags: ios image avfoundation avcapturesession

I'm trying to build a camera app using AVCaptureSession. For now I just want to check whether the video input works, but it seems there is no input, and I can't figure out why.

- (void)viewDidLoad
{
    [super viewDidLoad];

    session = [[AVCaptureSession alloc] init];

    [self addVideoPreviewLayer];

    CGRect layerRect = [[[self view] layer] bounds];

    [[self previewLayer] setBounds:layerRect];
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                                 CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:[self previewLayer]];

    UIButton *myButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    myButton.frame = CGRectMake(80, 320, 200, 44);
    [myButton setTitle:@"Click Me!" forState:UIControlStateNormal];
    [myButton addTarget:self action:@selector(scanButtonPressed) forControlEvents:UIControlEventTouchDown];
    [self.view addSubview:myButton];
}

-(void)addVideoPreviewLayer
{
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self session]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

-(void) addVideoInput
{
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];   
    if (videoDevice) 
    {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) 
        {
            if ([[self session] canAddInput:videoIn])
                [[self session] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");     
        }
        else
            NSLog(@"Couldn't create video input");
    }
    else
        NSLog(@"Couldn't create video capture device");
}

-(IBAction)scanButtonPressed
{
    [self addVideoInput];
}

1 answer:

Answer 0: (score: 0)

Here's how I do it. This was condensed from several separate functions, so it may not compile as-is, and most of the error handling has been stripped out.

captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureDevice *videoDevice;
videoDevice = [self frontFacingCamera];
if (videoDevice == nil) {
    videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];    
}

if ( videoDevice ) {
    NSError *error;
    videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

    [captureSession addInput:videoInput];
}

videoOutput = [[AVCaptureVideoDataOutput alloc] init];

[videoOutput setAlwaysDiscardsLateVideoFrames:NO];

AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];

if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(1, frameRate);
if (conn.supportsVideoMaxFrameDuration)
    conn.videoMaxFrameDuration = CMTimeMake(1, frameRate);

NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                          forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[videoOutput setVideoSettings:videoSettings];
[videoOutput setSampleBufferDelegate:self queue:capture_queue];

if ([captureSession canAddOutput:videoOutput])
    [captureSession addOutput:videoOutput];
else
    NSLog(@"Couldn't add video output");    

[self.captureSession startRunning];

previewLayer.session = captureSession;
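
For reference, the pieces above fit together in roughly this order: create the session, add the input, attach the preview layer, then start the session. A minimal sketch (not compiler-tested here; uses the same MRC-era style as the question, and assumes it runs inside a view controller):

```objectivec
// Sketch only: session -> input -> preview layer -> startRunning.
// Without -startRunning the preview layer stays black, which matches
// the symptom in the question, where the session is never started.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

// 1. Add the camera input before wiring anything else to the session.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                    error:&error];
if (input && [session canAddInput:input])
    [session addInput:input];
else
    NSLog(@"Couldn't add video input: %@", error);

// 2. Attach the preview layer to the session that now has an input.
AVCaptureVideoPreviewLayer *preview =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
preview.frame = self.view.layer.bounds;
[self.view.layer addSublayer:preview];

// 3. Start the session so frames begin flowing to the preview.
[session startRunning];
```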