iOS GPUImage filter is not blended into the video buffer

Date: 2016-05-03 11:54:56

Tags: ios objective-c gpuimage avcapturesession

I'm developing a live-streaming app and need to apply a filter to the video buffer. I used the GPUImage framework and wrote a filter. The preview looks fine, but the sample buffer delivered to the `willOutputSampleBuffer:` delegate method has no filter effect applied.

Here is the key code:

    self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:self.sessionPreset cameraPosition:AVCaptureDevicePositionFront];
    self.videoCamera.delegate = self;
    self.videoCamera.horizontallyMirrorFrontFacingCamera = YES;

    self.filterView = [[GPUImageView alloc] init];

    GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
    [self.videoCamera addTarget:beautifyFilter];
    [beautifyFilter addTarget:self.filterView];


    dispatch_async(dispatch_get_main_queue(), ^{
        [self.view insertSubview:self.filterView atIndex:1];
        [self.filterView mas_makeConstraints:^(MASConstraintMaker *make) {
            make.edges.equalTo(self.view);
        }];
        [self.videoCamera startCameraCapture];
    });
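For context, this is roughly what the delegate callback in question looks like. GPUImageVideoCamera invokes it with the raw capture output, before any filter targets have run, which is why no filter effect appears there (a sketch; the exact delivery behavior depends on the GPUImage version):

```objectivec
// GPUImageVideoCameraDelegate callback: sampleBuffer here is the
// unprocessed frame straight from AVCaptureVideoDataOutput, so filters
// added as targets never affect it.
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // To get filtered pixels, read them from a target further down the
    // filter chain (e.g. a GPUImageRawDataOutput) instead.
}
```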

Is there some detail I've overlooked? Thanks!

1 answer:

Answer 0 (score: 0)

I needed to add a new output as a filter target, so I added the code below to the project, and then I got the buffer with the filter applied.

    GPUImageRawDataOutput *rawDataOutput = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(720, 1280) resultsInBGRAFormat:YES];
    [self.beautifyFilter addTarget:rawDataOutput];

    __weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
    [rawDataOutput setNewFrameAvailableBlock:^{
        __strong GPUImageRawDataOutput *strongOutput = weakOutput;
        if (!strongOutput) return;

        [strongOutput lockFramebufferForReading];
        GLubyte *outputBytes = [strongOutput rawBytesForImage];
        NSInteger bytesPerRow = [strongOutput bytesPerRowInOutput];

        // Wrap the filtered BGRA bytes in a CVPixelBuffer. Note that the
        // bytes are not copied: they are only valid while the framebuffer
        // is locked, so finish with the pixel buffer before unlocking.
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 720, 1280, kCVPixelFormatType_32BGRA, outputBytes, bytesPerRow, NULL, NULL, NULL, &pixelBuffer);
        if (status == kCVReturnSuccess) {
            // Do something with pixelBuffer

            CVPixelBufferRelease(pixelBuffer);
        }
        [strongOutput unlockFramebufferAfterReading];
    }];
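As an illustration of the "Do something with pixelBuffer" step, a common use is appending the filtered frame to an AVAssetWriter for recording. This is only a sketch: `self.pixelBufferAdaptor` and `self.lastFrameTime` are hypothetical properties not present in the original answer, assumed to be configured elsewhere:

```objectivec
// Hypothetical example: append the filtered frame to a recording session.
// self.pixelBufferAdaptor is assumed to be an
// AVAssetWriterInputPixelBufferAdaptor, and self.lastFrameTime a CMTime
// updated from the capture callbacks.
if (self.pixelBufferAdaptor.assetWriterInput.isReadyForMoreMediaData) {
    [self.pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                          withPresentationTime:self.lastFrameTime];
}
```

Because the pixel buffer wraps memory owned by the framebuffer, any such use must happen inside the block, before `unlockFramebufferAfterReading` is called.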