How to encode video with a transparent background

Date: 2014-05-05 15:47:11

Tags: macos video h.264 avassetwriter

I am encoding video in H.264 using Cocoa on OS X (with AVAssetWriter). This is the configuration:

// Configure video writer
AVAssetWriter *m_videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:@(outputFile)] fileType:AVFileTypeMPEG4 error:NULL];

// configure video input
NSDictionary *videoSettings = @{ AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : @(m_width), AVVideoHeightKey : @(m_height) };
AVAssetWriterInput* m_writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

// Add video input into video writer
[m_videoWriter addInput:m_writerInput];

// Start video writer
[m_videoWriter startWriting];
[m_videoWriter startSessionAtSourceTime:kCMTimeZero];

I am using an 'AVAssetWriterInputPixelBufferAdaptor' element to add frames to the composition, like this:

AVAssetWriterInputPixelBufferAdaptor *m_pixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:m_writerInput sourcePixelBufferAttributes:NULL];

uint8_t* videobuffer = m_imageRGBA.data;


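// Create an empty 32-bit ARGB pixel buffer to hold one video frame.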
CVPixelBufferRef pixelBuffer = NULL;
CVReturn status = CVPixelBufferCreate (NULL, m_width, m_height, kCVPixelFormatType_32ARGB, NULL, &pixelBuffer);
if ((pixelBuffer == NULL) || (status != kCVReturnSuccess))
{
    NSLog(@"Error CVPixelBufferPoolCreatePixelBuffer[pixelBuffer=%@][status=%d]", pixelBuffer, status);
    return;
}
else
{
    uint8_t *videobuffertmp = videobuffer;
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    GLubyte *pixelBufferData = (GLubyte *)CVPixelBufferGetBaseAddress(pixelBuffer);

    //printf("Video frame pixel: %d, %d, %d, %d\n", videobuffertmp[0], videobuffertmp[1], videobuffertmp[2], videobuffertmp[3]);

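    // Reorder each 4-byte source pixel into the A, R, G, B byte order
    // expected by kCVPixelFormatType_32ARGB.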
    for( int row=0 ; row<m_width ; ++row )
    {
        for( int col=0 ; col<m_height ; ++col )
        {
            memcpy(&pixelBufferData[0], &videobuffertmp[3], sizeof(uint8_t));       // alpha
            memcpy(&pixelBufferData[1], &videobuffertmp[2], sizeof(uint8_t));       // red
            memcpy(&pixelBufferData[2], &videobuffertmp[1], sizeof(uint8_t));       // green
            memcpy(&pixelBufferData[3], &videobuffertmp[0], sizeof(uint8_t));       // blue

            pixelBufferData += 4*sizeof(uint8_t);
            videobuffertmp  += 4*sizeof(uint8_t);
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

// Append the new frame's pixel buffer at its presentation time
[m_pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                   withPresentationTime:CMTimeMake(m_frameNumber, m_framesPerSecond)];

CFRelease(pixelBuffer);
pixelBuffer = nil;

In the pixel data, the alpha values of some pixels are set to transparent, but the resulting video has no transparent areas.

I am not sure whether the encoder is ignoring the alpha values, or whether it simply cannot encode video with transparent areas. Is there any way to include the alpha channel values during encoding?

1 Answer:

Answer 0: (score: 2)

You probably can't, at least not in H.264. See: How to create an h264 video with an alpha channel for use with HTML5 Canvas?

My guess is that transparency can be used for blending and effects before encoding to H.264, but it cannot be carried through to the final output.

One workaround could be to set the transparent areas to a solid green value and use that color as a mask later in the video editing process (like a green screen in a weather forecast). Obviously this only works if the output is destined for such an editing step rather than being the final output.
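As a rough sketch of that workaround (this is not part of the original answer; it reuses the variable names pixelBufferData, videobuffertmp, m_width and m_height and the 32ARGB layout from the question's code, and the alpha threshold is an arbitrary choice), the copy loop could write an opaque pure-green pixel wherever the source alpha falls below some threshold, so the green areas can later be keyed back out:

// Sketch only: replace (nearly) transparent source pixels with opaque green
// while copying into the kCVPixelFormatType_32ARGB buffer, so that a later
// editing step can chroma-key the green areas back to transparency.
for( int row=0 ; row<m_height ; ++row )
{
    for( int col=0 ; col<m_width ; ++col )
    {
        uint8_t alpha = videobuffertmp[3];
        if( alpha < 16 )    // threshold is arbitrary; tune as needed
        {
            pixelBufferData[0] = 0xFF;    // alpha: opaque
            pixelBufferData[1] = 0x00;    // red
            pixelBufferData[2] = 0xFF;    // green
            pixelBufferData[3] = 0x00;    // blue
        }
        else
        {
            pixelBufferData[0] = videobuffertmp[3];    // alpha
            pixelBufferData[1] = videobuffertmp[2];    // red
            pixelBufferData[2] = videobuffertmp[1];    // green
            pixelBufferData[3] = videobuffertmp[0];    // blue
        }

        pixelBufferData += 4;
        videobuffertmp  += 4;
    }
}

Using a pure, saturated green that never appears in the real content makes the later keying step more reliable.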