Applying a CIFilter to the AVFoundation camera feed fails

Date: 2015-08-04 07:59:26

Tags: ios objective-c swift avfoundation core-image

I'm trying to apply a CIFilter to the live camera feed (and to be able to capture filtered still images).

I've seen some code related to this problem on StackOverflow, but I haven't been able to get it to work.

My problem is that inside the captureOutput method the filter appears to be applied correctly (I set a breakpoint there and checked the image with Quick Look), but I can't see it in my UIView (I see the original feed, with no filter).

Also, I'm not sure which output I should be adding to the session:

[self.session addOutput: self.stillOutput]; //AVCaptureStillImageOutput
[self.session addOutput: self.videoDataOut]; //AVCaptureVideoDataOutput

and which one's connections I should iterate over when looking for the video connection (in findVideoConnection).
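
For reference, my understanding is that a session can carry both outputs at the same time, each guarded by canAddOutput:. Here is a minimal sketch of what I mean (reusing the properties from my code below):

if ([self.session canAddOutput: self.stillOutput]) {
    [self.session addOutput: self.stillOutput];   // for still captures
}
if ([self.session canAddOutput: self.videoDataOut]) {
    [self.session addOutput: self.videoDataOut];  // for per-frame sample buffers to filter
}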

I'm completely confused.

Here is some of the code:

viewDidLoad

-(void)viewDidLoad {
    [super viewDidLoad];

    self.shutterButton.userInteractionEnabled = YES;
    self.context = [CIContext contextWithOptions: @{kCIContextUseSoftwareRenderer : @(YES)}];

    self.filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [self.filter setValue:@15 forKey:kCIInputRadiusKey];
}
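
A side note on the context: I know the software renderer is usually too slow for per-frame video, so switching to a GPU-backed context is something I plan to try as well. This is just a sketch of that alternative, not what I'm running now:

self.context = [CIContext contextWithOptions: nil]; // nil options default to GPU rendering where available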

Preparing the session

-(void)prepareSessionWithDevicePosition: (AVCaptureDevicePosition)position {

    AVCaptureDevice* device = [self videoDeviceWithPosition: position];
    self.currentPosition = position;        

    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;

    NSError* error = nil;
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice: device error: &error];

    if ([self.session canAddInput: self.deviceInput]) {
        [self.session addInput: self.deviceInput];
    }

    AVCaptureVideoPreviewLayer* previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession: self.session];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    self.videoDataOut = [AVCaptureVideoDataOutput new];
    [self.videoDataOut setSampleBufferDelegate: self queue:dispatch_queue_create("bufferQueue", DISPATCH_QUEUE_SERIAL)];
    self.videoDataOut.alwaysDiscardsLateVideoFrames = YES;

    CALayer* rootLayer = [[self view] layer];
    rootLayer.masksToBounds = YES;
    CGRect frame = self.previewView.frame;
    previewLayer.frame = frame;
    [rootLayer insertSublayer: previewLayer atIndex: 1];

    self.stillOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary* outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    self.stillOutput.outputSettings = outputSettings;

    [self.session addOutput: self.stillOutput];
    //also tried [self.session addOutput: self.videoDataOut] instead,
    //which didn't work (the filtered image didn't show, and I also couldn't take pictures)
    [self findVideoConnection];
}
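
One thing I'm starting to suspect: previewLayer is an AVCaptureVideoPreviewLayer, which always shows the raw feed, and I insert it above the root layer, so it may simply be sitting on top of the filtered frames I later write into previewView. A sketch of what I could try (assuming the filtered frames are meant to be the only visible feed):

// Hide the raw preview layer so it cannot cover previewView's
// filtered contents (or skip adding it to rootLayer entirely).
previewLayer.hidden = YES;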

Finding the video connection

-(void)findVideoConnection {

    for (AVCaptureConnection* connection in self.stillOutput.connections) {
//also tried self.videoDataOut.connections        
        for (AVCaptureInputPort* port in [connection inputPorts]) {

            if ([[port mediaType] isEqualToString: AVMediaTypeVideo]) {

                self.videoConnection = connection;
                break;
            }
        }

        if (self.videoConnection != nil) {
            break;
        }
    }
}
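
For context, the connection found here is what I later pass to the still-image capture. A sketch of how I expect to use it, following the usual AVCaptureStillImageOutput pattern (simplified, error handling omitted):

[self.stillOutput captureStillImageAsynchronouslyFromConnection: self.videoConnection
                                               completionHandler: ^(CMSampleBufferRef imageDataSampleBuffer, NSError* error) {
    if (imageDataSampleBuffer == NULL) {
        return;
    }
    // Convert the captured sample buffer into JPEG data, then a UIImage
    NSData* jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: imageDataSampleBuffer];
    UIImage* still = [UIImage imageWithData: jpegData];
    // ... save or display the still
}];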

Capture output: applying the filter and putting it into the CALayer

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // turn buffer into an image we can manipulate
    CIImage *result = [CIImage imageWithCVPixelBuffer:imageBuffer];

    // filter
    [self.filter setValue: result forKey: kCIInputImageKey];

    // render image
    CGImageRef blurredImage = [self.context createCGImage:self.filter.outputImage fromRect:result.extent];

    UIImage* img = [UIImage imageWithCGImage: blurredImage];
    //Created this UIImage only to check in the debugger whether
    //the image was actually filtered. And surprisingly it was.

    dispatch_async(dispatch_get_main_queue(), ^{

        //The image present in my UIView is for some reason not blurred.
        self.previewView.layer.contents = (__bridge id)blurredImage;
        CGImageRelease(blurredImage);
    });
}
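
Since CIFilter isn't documented as thread-safe and I'm mutating self.filter from the buffer queue, an alternative I've been considering is applying the filter functionally per frame instead (a sketch; assumes iOS 8+):

// Produces a blurred CIImage without touching the shared CIFilter instance
CIImage* blurred = [result imageByApplyingFilter: @"CIGaussianBlur"
                             withInputParameters: @{kCIInputRadiusKey : @15}];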

0 Answers:

No answers yet