420YpCbCr8 to RGB conversion

Date: 2014-12-21 08:46:46

Tags: video ios7 opengl-es-2.0 rgb fragment-shader

I'm trying to render, as an OpenGL texture, the CVPixelBuffers I get from a video stream. The video output is configured like this:
NSDictionary* videoOutputOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:videoOutputOptions];
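
For context, the pixel buffers are pulled from this output once per frame. A minimal sketch of that step, following the pattern of Apple's AVBasicVideoOutput sample (the CADisplayLink callback name and the ownership convention are assumptions, not code from the question):

// Elsewhere, once: [playerItem addOutput:self.videoOutput];
- (void)displayLinkCallback:(CADisplayLink *)sender
{
    // Ask the output which item time corresponds to "now" on the host clock.
    CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer =
            [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        if (pixelBuffer != NULL) {
            CGSize size = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                     CVPixelBufferGetHeight(pixelBuffer));
            // displayPixelBuffer:frameSize: (shown further below) releases the buffer.
            [self displayPixelBuffer:pixelBuffer frameSize:size];
        }
    }
}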

To draw each frame I use the following fragment shader:

varying lowp vec2 v_texCoord;
precision mediump float;

uniform sampler2D SamplerUV;
uniform sampler2D SamplerY;
uniform mat3 colorConversionMatrix;

void main()
{
    mediump vec3 yuv;
    lowp vec3 rgb;

    // Subtract constants to map the video range start at 0
    yuv.x = (texture2D(SamplerY, v_texCoord).r - (16.0/255.0));
    yuv.yz = (texture2D(SamplerUV, v_texCoord).rg - vec2(0.5, 0.5));

    rgb = colorConversionMatrix * yuv;

    gl_FragColor = vec4(rgb,1);
}
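
For the shader to see both planes and the matrix, the uniforms have to be bound on the CPU side. A minimal sketch of that setup, assuming a linked program handle _program (the uniform names must match the shader source exactly):

// Query the uniform locations once after linking the program.
// glGetUniformLocation returns -1 for a name that does not exist, and
// writes to location -1 are silently ignored.
GLint uniformY      = glGetUniformLocation(_program, "SamplerY");
GLint uniformUV     = glGetUniformLocation(_program, "SamplerUV");
GLint uniformMatrix = glGetUniformLocation(_program, "colorConversionMatrix");

glUseProgram(_program);
glUniform1i(uniformY, 0);   // luma texture bound on GL_TEXTURE0
glUniform1i(uniformUV, 1);  // chroma texture bound on GL_TEXTURE1
glUniformMatrix3fv(uniformMatrix, 1, GL_FALSE, _preferredConversion);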

The conversion matrix:

// BT.709, which is the standard for HDTV.
static const GLfloat kColorConversion709[] = {
    1.164,  1.164, 1.164,
    0.0, -0.213, 2.112,
    1.793, -0.533,   0.0,
};
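
Note that GLSL (and glUniformMatrix3fv) reads this array in column-major order, so colorConversionMatrix * yuv expands to the usual BT.709 video-range equations. A CPU-side sketch of the same math, for reference only:

// Same conversion as the shader, written out per channel (illustrative only).
static void yuv709VideoRangeToRgb(float y, float cb, float cr, float rgb[3])
{
    y  -= 16.0f / 255.0f;   // video-range luma starts at 16
    cb -= 0.5f;             // chroma planes are centred on 0.5
    cr -= 0.5f;
    rgb[0] = 1.164f * y               + 1.793f * cr;   // R
    rgb[1] = 1.164f * y - 0.213f * cb - 0.533f * cr;   // G
    rgb[2] = 1.164f * y + 2.112f * cb;                  // B
}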

But as a result I get a greenish texture, which I assume means I'm using the wrong conversion. My result:

(screenshot of the greenish output)

So I also tried changing the conversion matrix to another one (here), as well as the variants from this resource. But it looks like the problem may not only be in the conversion matrix, it could also be in the fragment shader itself?

Any suggestions as to why I'm getting a greenish image?

EDIT

Here is how I get the textures (based on Apple's AVBasicVideoOutput sample - this):

- (void)displayPixelBuffer:(CVPixelBufferRef)pixelBuffer frameSize:(CGSize)presentationSize
{
    CVReturn err;
    if (pixelBuffer != NULL) {
        int frameWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
        int frameHeight = (int)CVPixelBufferGetHeight(pixelBuffer);

        if (!_videoTextureCache) {
            NSLog(@"No video texture cache");
            return;
        }
        [self cleanUpTextures];
        //Use the color attachment of the pixel buffer to determine the appropriate color conversion matrix.
        CFTypeRef colorAttachments = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, NULL);

        if (colorAttachments == kCVImageBufferYCbCrMatrix_ITU_R_601_4) {
            _preferredConversion = kColorConversion601;
        }
        else {
            _preferredConversion = kColorConversion709;
        }
         //CVOpenGLESTextureCacheCreateTextureFromImage will create GLES texture optimally from CVPixelBufferRef.
         //Create Y and UV textures from the pixel buffer. These textures will be drawn on the frame buffer Y-plane.
        glActiveTexture(GL_TEXTURE0);
        err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL,  GL_TEXTURE_2D, GL_LUMINANCE, frameWidth, frameHeight, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &_lumaTexture);
        if (err) {
            NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
        }

        glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        // UV-plane.
        glActiveTexture(GL_TEXTURE1);
        err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, frameWidth / 2, frameHeight / 2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &_chromaTexture);
        if (err) {
            NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
        }

        glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        glBindFramebuffer(GL_FRAMEBUFFER, _vertexBufferID);

        CFRelease(pixelBuffer);
    }
}
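
For completeness, the [self cleanUpTextures] call above releases the previous frame's CVOpenGLESTexture objects. In Apple's sample the method looks roughly like this (a sketch, assuming the same _lumaTexture / _chromaTexture / _videoTextureCache ivars):

- (void)cleanUpTextures
{
    if (_lumaTexture) {
        CFRelease(_lumaTexture);
        _lumaTexture = NULL;
    }
    if (_chromaTexture) {
        CFRelease(_chromaTexture);
        _chromaTexture = NULL;
    }
    // Flush the texture cache once per frame so stale textures are recycled.
    CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);
}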

EDIT

If anyone downvotes, please add a comment explaining why and what is wrong - maybe the problem is obvious to some, but it isn't to others.

1 Answer:

Answer (score: 0)

Finally I found my mistake, and it was a silly one:

Instead of SamplerUV I was using SamplerUY in the code when assigning the uniform. After changing it, everything works!
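
A typo like that fails silently: glGetUniformLocation returns -1 for a name that is not in the program, writes to location -1 are ignored, and the real SamplerUV uniform keeps its default value 0, so the chroma lookup ends up sampling the luma texture. A quick check (illustrative only, assuming the _program handle):

// Check every uniform lookup: a misspelled name such as "SamplerUY" comes back
// as -1, and any glUniform* call on it is silently ignored.
GLint uniformUV = glGetUniformLocation(_program, "SamplerUV");
if (uniformUV < 0) {
    NSLog(@"SamplerUV uniform not found - name does not match the shader source");
}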

The following conversion matrix can also be used:

    1.1643,  0.0000,  1.2802,
    1.1643, -0.2148, -0.3806,
    1.1643,  2.1280,  0.0000

So if you get an incorrect image, check each component - chroma and luma - one of them may be hooked up incorrectly.

Maybe this information will be useful to someone.