Replace part of a pixel buffer with white pixels in iOS

Date: 2016-04-25 10:54:54

Tags: ios objective-c avfoundation cvpixelbuffer

I'm using the iPhone camera to capture live video and feeding the pixel buffers into a network that performs some object recognition. Here is the relevant code (I won't post the code for setting up the AVCaptureSession, since that part is very standard):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    OSType sourcePixelFormat = CVPixelBufferGetPixelFormatType( pixelBuffer );
    // Record whether the channel order needs to be reversed downstream (ARGB vs. BGRA).
    int doReverseChannels;
    if ( kCVPixelFormatType_32ARGB == sourcePixelFormat ) {
        doReverseChannels = 1;
    } else if ( kCVPixelFormatType_32BGRA == sourcePixelFormat ) {
        doReverseChannels = 0;
    } else {
        assert(false);
    }

    const int sourceRowBytes = (int)CVPixelBufferGetBytesPerRow( pixelBuffer );
    const int width = (int)CVPixelBufferGetWidth( pixelBuffer );
    const int fullHeight = (int)CVPixelBufferGetHeight( pixelBuffer );
    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
    unsigned char* sourceBaseAddr = CVPixelBufferGetBaseAddress( pixelBuffer );
    // If the frame is taller than it is wide, crop vertically to a centered square
    // so the region handed to the network is always width x width.
    int height;
    unsigned char* sourceStartAddr;
    if (fullHeight <= width) {
        height = fullHeight;
        sourceStartAddr = sourceBaseAddr;
    } else {
        height = width;
        const int marginY = ((fullHeight - width) / 2);
        sourceStartAddr = (sourceBaseAddr + (marginY * sourceRowBytes));
    }
}

The network then takes sourceStartAddr, width, height, sourceRowBytes & doReverseChannels as inputs.

My question is this: what is the simplest and/or most efficient way to replace or delete part of the image data with all-white pixels? Is it possible to overwrite part of the pixel buffer data directly, and if so, how?

I only have a very basic understanding of how this pixel buffer works, so I apologize if I'm missing something obvious here. The most closely related question I found on Stack Overflow is this one, where an EAGLContext is used to add text to a video frame. While that would actually work for my goal, which only requires replacing a single image, I assume this step would kill performance if applied to every video frame, and I'd like to know whether there is another way. Any help here would be greatly appreciated.
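
For reference, the AVCaptureSession setup I omitted above looks roughly like the following. This is only a minimal sketch rather than my exact code; the method name, preset, queue label, and the lack of error handling are placeholders:

- (void)setupCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium; // placeholder preset

    // Use the default video camera as input.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) {
        [session addInput:input];
    }

    // Deliver frames as BGRA so the delegate above takes the doReverseChannels == 0 path.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL)];
    [session addOutput:output];

    [session startRunning];
}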

3 Answers:

Answer 0 (score: 6)

Here is a simple way to manipulate a CVPixelBufferRef without using any other libraries such as Core Graphics or OpenGL:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    const int kBytesPerPixel = 4;
    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
    int bufferWidth = (int)CVPixelBufferGetWidth( pixelBuffer );
    int bufferHeight = (int)CVPixelBufferGetHeight( pixelBuffer );
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow( pixelBuffer );
    uint8_t *baseAddress = CVPixelBufferGetBaseAddress( pixelBuffer );

    for ( int row = 0; row < bufferHeight; row++ )
    {
        uint8_t *pixel = baseAddress + row * bytesPerRow;
        for ( int column = 0; column < bufferWidth; column++ )
        {
            if ((row < 100) && (column < 100)) {
                pixel[0] = 255; // BGRA, Blue value
                pixel[1] = 255; // Green value
                pixel[2] = 255; // Red value
            }
            pixel += kBytesPerPixel;
        }
    }

    CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );

    // Do whatever needs to be done with the pixel buffer
}

This overwrites the top-left 100 x 100 pixel patch of the image with white pixels.

I found this solution in this Apple Developer Example called RosyWriter.

A bit surprised I didn't get any answers here, considering how easy this turned out to be. Hope this helps someone.
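
A side note on the loop above (an addition for this write-up, not part of the original answer): because the fill color is pure white, every byte of a BGRA pixel can simply be set to 0xFF, so the per-pixel bounds check can be replaced with one memset per row covering only the patch. A minimal sketch, assuming the patch sits in the top-left corner and using hypothetical patchWidth/patchHeight values:

    // memset is declared in <string.h>.
    const int patchWidth  = 100; // example values; must not exceed the buffer dimensions
    const int patchHeight = 100;

    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow( pixelBuffer );
    uint8_t *baseAddress = CVPixelBufferGetBaseAddress( pixelBuffer );

    for ( int row = 0; row < patchHeight; row++ )
    {
        // Write 0xFF into the first patchWidth pixels (4 bytes each) of this row.
        // Note: unlike the loop above, this also sets the alpha byte to 255.
        memset( baseAddress + row * bytesPerRow, 255, patchWidth * 4 );
    }

    CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );

The explicit loop remains the more flexible option when the fill color differs per channel; the memset variant only works because all four bytes of a white, opaque BGRA pixel are identical.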

Answer 1 (score: 2)

I also had to process frames from the iPhone camera using captureOutput and a CVPixelBuffer. Using your code (thanks!) to loop over roughly 200k pixels in the pixel buffer at 15 frames per second, I kept running into dropped frames. It turned out that, at least in my case, a while loop in Swift is about 10 times faster than a for ... in loop.

Like so:

0.09 seconds:

    for row in 0..<bufferHeight {
        for col in 0..<bufferWidth {
            // process pixels
        }
    }
0.01 seconds:

    var x = 0
    var y = 0

    while y < bufferHeight
    {
        x = 0
        while x < bufferWidth
        {
            // process pixels
            x += 1
        }
        y += 1
    }

Answer 2 (score: 1)

Updating this with a Swift implementation.

    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
    let bufferWidth = Int(CVPixelBufferGetWidth(pixelBuffer))
    let bufferHeight = Int(CVPixelBufferGetHeight(pixelBuffer))
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let bytesPerPixel = 4
    guard let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }

    for row in 0..<bufferHeight {
        // Walk the row pixel by pixel and write 255 into each BGRA channel.
        var pixel = baseAddress + row * bytesPerRow
        for _ in 0..<bufferWidth {
            let blue = pixel
            blue.storeBytes(of: 255, as: UInt8.self)

            let green = pixel + 1
            green.storeBytes(of: 255, as: UInt8.self)

            let red = pixel + 2
            red.storeBytes(of: 255, as: UInt8.self)

            let alpha = pixel + 3
            alpha.storeBytes(of: 255, as: UInt8.self)

            pixel += bytesPerPixel
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

Since CVPixelBufferGetBaseAddress gives you the base address as an UnsafeMutableRawPointer, which doesn't support subscripting, you have to write the individual bytes with storeBytes(of:as:) instead. That's basically the only major difference from the Objective-C version above.