Capturing video with AVCaptureSession, no visible output with EAGLContext

Time: 2014-03-25 17:32:23

Tags: ios opengl-es avcapturesession core-image yuv

I am capturing live video with the back camera on an iPhone using AVCaptureSession, applying some filters with Core Image, and then trying to display the resulting video with OpenGL ES. Most of the code comes from the WWDC 2012 session 'Core Image Techniques'.

Displaying the output of the filter chain with [UIImage imageWithCIImage:...], or by creating a CGImageRef for each frame, works fine. However, when I try to display it with OpenGL ES, all I get is a black screen.
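For reference, here is a minimal sketch of the CGImageRef-per-frame path that does work, assuming a default (CPU-backed) CIContext and a plain UIImageView; filteredImage and imageView are placeholder names, not code from this project:

// Render each filtered CIImage into a CGImage and hand it to a UIImageView.
CIContext *cpuContext = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [cpuContext createCGImage:filteredImage fromRect:[filteredImage extent]];
self.imageView.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);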

In the session they use a custom view class to display the output, but the code for that class is not available. My view controller class extends GLKViewController, and the class of its view is set to GLKView.

I have searched for and downloaded every GLKit tutorial and example I could find, but nothing helps. In particular, I cannot get any video output when I try to run the example from here. Can anyone point me in the right direction?

#import "VideoViewController.h"

@interface VideoViewController () <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession *_session;

    EAGLContext *_eaglContext;
    CIContext *_ciContext;

    CIFilter *_sepia;
    CIFilter *_bumpDistortion;
}

- (void)setupCamera;
- (void)setupFilters;

@end

@implementation VideoViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    GLKView *view = (GLKView *)self.view;

    _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
    [EAGLContext setCurrentContext:_eaglContext];

    view.context = _eaglContext;

    // Configure renderbuffers created by the view
    view.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
    view.drawableStencilFormat = GLKViewDrawableStencilFormat8;

    [self setupCamera];
    [self setupFilters];
}

- (void)setupCamera {
    _session = [AVCaptureSession new];
    [_session beginConfiguration];

    [_session setSessionPreset:AVCaptureSessionPreset640x480];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    [_session addInput:input];

    AVCaptureVideoDataOutput *dataOutput = [AVCaptureVideoDataOutput new];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES];

    NSDictionary *options = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] };

    [dataOutput setVideoSettings:options];

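    // Deliver frames on the main queue so the drawing in the delegate callback happens on the UI thread.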
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    [_session addOutput:dataOutput];
    [_session commitConfiguration];
}

#pragma mark Setup Filters
- (void)setupFilters {
    _sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [_sepia setValue:@0.7 forKey:@"inputIntensity"];

    _bumpDistortion = [CIFilter filterWithName:@"CIBumpDistortion"];
    [_bumpDistortion setValue:[CIVector vectorWithX:240 Y:320] forKey:@"inputCenter"];
    [_bumpDistortion setValue:[NSNumber numberWithFloat:200] forKey:@"inputRadius"];
    [_bumpDistortion setValue:[NSNumber numberWithFloat:3.0] forKey:@"inputScale"];
}

#pragma mark Main Loop
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Grab the pixel buffer
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);

    // null colorspace to avoid colormatching
    NSDictionary *options = @{ (id)kCIImageColorSpace : (id)kCFNull };
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer options:options];

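    // Rotate the landscape camera frame into portrait, then translate so the image extent's origin is back at (0, 0).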
    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(-M_PI/2.0)];
    CGPoint origin = [image extent].origin;
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];

    // Pass it through the filter chain
    [_sepia setValue:image forKey:@"inputImage"];
    [_bumpDistortion setValue:_sepia.outputImage forKey:@"inputImage"];

    // Grab the final output image
    image = _bumpDistortion.outputImage;

    // draw to GLES context
    [_ciContext drawImage:image inRect:CGRectMake(0, 0, 480, 640) fromRect:[image extent]];

    // and present to screen
    [_eaglContext presentRenderbuffer:GL_RENDERBUFFER];

    NSLog(@"frame hatched");

    [_sepia setValue:nil forKey:@"inputImage"];
}

- (void)loadView {
    [super loadView];

    // Initialize the CIContext with a null working space
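    // NB: loadView runs before viewDidLoad, so _eaglContext is still nil at this point
    // (as it turns out, this is the bug; see the answer below).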
    NSDictionary *options = @{ (id)kCIContextWorkingColorSpace : (id)kCFNull };
    _ciContext = [CIContext contextWithEAGLContext:_eaglContext options:options];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    [_session startRunning];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

1 Answer:

Answer 0 (score: 2):

Wow, I actually figured it out myself. Maybe this line of work suits me after all ;)

First, for whatever reason, this code only works with OpenGL ES 2, not ES 3. I have yet to figure out why.

Second, I was setting up the CIContext in the loadView method, which apparently runs before the viewDidLoad method, so it was using an EAGLContext that had not been initialized yet.
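To make both fixes concrete, here is a sketch of the corrected viewDidLoad (with the loadView override removed, since the CIContext is now created here):

- (void)viewDidLoad
{
    [super viewDidLoad];

    GLKView *view = (GLKView *)self.view;

    // Fix 1: request an OpenGL ES 2 context; the ES 3 context produced no output.
    _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:_eaglContext];

    view.context = _eaglContext;

    // Configure renderbuffers created by the view
    view.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
    view.drawableStencilFormat = GLKViewDrawableStencilFormat8;

    // Fix 2: create the CIContext here, after _eaglContext is initialized,
    // instead of in loadView, which runs before viewDidLoad.
    NSDictionary *options = @{ (id)kCIContextWorkingColorSpace : (id)kCFNull };
    _ciContext = [CIContext contextWithEAGLContext:_eaglContext options:options];

    [self setupCamera];
    [self setupFilters];
}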