Converting YUV420 to RGB with OpenCVSharp

Time: 2019-07-02 09:57:46

Tags: c# wpf opencv

I want to convert a YUV stream into RGB bytes so that I can display them through a WPF Image.

For each frame, the Y, U, and V values are placed in their respective arrays. The chroma width and height are half of the corresponding luma dimensions.
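As a quick sanity check on that layout (a sketch assuming a hypothetical 640×480 frame; substitute the real dimensions):

```csharp
using System;

class PlaneSizes
{
    static void Main()
    {
        // Assumed example resolution; replace with the actual frame dimensions.
        int lumaWidth = 640, lumaHeight = 480;

        // In YUV 4:2:0 each chroma plane is subsampled 2x in both directions.
        int chromaWidth = lumaWidth / 2;    // 320
        int chromaHeight = lumaHeight / 2;  // 240

        int ySize = lumaWidth * lumaHeight;     // 307200 bytes
        int uSize = chromaWidth * chromaHeight; // 76800 bytes
        int vSize = uSize;                      // 76800 bytes

        Console.WriteLine(ySize + uSize + vSize); // 460800
    }
}
```

The total is lumaWidth * lumaHeight * 3 / 2 bytes per frame, which is a useful check against the incoming buffer lengths.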

byte[] y = (byte[])yData;
byte[] u = (byte[])uData;
byte[] v = (byte[])vData;

var ym = new Mat(new[] { lumaHeight, lumaWidth }, MatType.CV_8UC1, y, new long[] { lumaStride });
var um = new Mat(new[] { chromaWidth, chromaHeight }, MatType.CV_8UC1, u, new long[] { chromaStride });
var vm = new Mat(new[] { chromaWidth, chromaHeight }, MatType.CV_8UC1, v, new long[] { chromaStride });

I pass the data to OpenCV with the following code:

var combinedSource = new[] { ym, um, vm };

var m = new Mat();
var src = InputArray.Create(combinedSource);
var @out = OutputArray.Create(m);
Cv2.CvtColor(src, @out, ColorConversionCodes.YUV2BGR);
ImageData = @out.GetMat().ToBytes();

But I get the error {"!_src.empty()"}. The y, u, and v arrays are definitely not empty.

I tried another approach:

var combinedOut = new Mat(new[] { lumaHeight, lumaWidth }, MatType.CV_8UC3);
Cv2.Merge(combinedSource, combinedOut);
var bgra = combinedOut.CvtColor(ColorConversionCodes.YUV2BGR);
ImageData = bgra.ToBytes();

But I get the error {"mv[i].size == mv[0].size && mv[i].depth() == depth"}.

Why am I getting these errors, and what is the correct way to do the conversion?

2 answers:

Answer 0: (score: 0)

Why not try converting by iterating over the source and output arrays? For this approach to work, you must combine all of the arrays into one and then pass that array to the following function:

private static unsafe void YUV2RGBManaged(byte[] YUVData, byte[] RGBData, int width, int height)
    {
        fixed(byte* pRGBs = RGBData, pYUVs = YUVData)
        {
            for (int r = 0; r < height; r++)
            {
                byte* pRGB = pRGBs + r * width * 3;
                byte* pYUV = pYUVs + r * width * 2;

                //process two pixels at a time
                for (int c = 0; c < width; c += 2)
                {
                    int C1 = pYUV[1] - 16;
                    int C2 = pYUV[3] - 16;
                    int D = pYUV[2] - 128;
                    int E = pYUV[0] - 128;

                    int R1 = (298 * C1 + 409 * E + 128) >> 8;
                    int G1 = (298 * C1 - 100 * D - 208 * E + 128) >> 8;
                    int B1 = (298 * C1 + 516 * D + 128) >> 8;

                    int R2 = (298 * C2 + 409 * E + 128) >> 8;
                    int G2 = (298 * C2 - 100 * D - 208 * E + 128) >> 8;
                    int B2 = (298 * C2 + 516 * D + 128) >> 8;
#if true
                    //check for overflow
                    //unsurprisingly this takes the bulk of the time.
                    pRGB[0] = (byte)(R1 < 0 ? 0 : R1 > 255 ? 255 : R1);
                    pRGB[1] = (byte)(G1 < 0 ? 0 : G1 > 255 ? 255 : G1);
                    pRGB[2] = (byte)(B1 < 0 ? 0 : B1 > 255 ? 255 : B1);

                    pRGB[3] = (byte)(R2 < 0 ? 0 : R2 > 255 ? 255 : R2);
                    pRGB[4] = (byte)(G2 < 0 ? 0 : G2 > 255 ? 255 : G2);
                    pRGB[5] = (byte)(B2 < 0 ? 0 : B2 > 255 ? 255 : B2);
#else
                    pRGB[0] = (byte)(R1);
                    pRGB[1] = (byte)(G1);
                    pRGB[2] = (byte)(B1);

                    pRGB[3] = (byte)(R2);
                    pRGB[4] = (byte)(G2);
                    pRGB[5] = (byte)(B2);
#endif

                    pRGB += 6;
                    pYUV += 4;
                }
            }
        }
    }
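Note that the loop above assumes a packed 4:2:2 layout (every four bytes carry two Y samples plus one shared U and V), not the planar 4:2:0 described in the question, so the three planes would need to be interleaved into that layout first. The arithmetic itself is the common BT.601 fixed-point expansion; a minimal standalone check of the same coefficients (the helper name is mine):

```csharp
using System;

class YuvPixel
{
    // Same BT.601 integer coefficients as the loop above, applied to one pixel.
    static (byte R, byte G, byte B) ToRgb(byte y, byte u, byte v)
    {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = (298 * c + 409 * e + 128) >> 8;
        int g = (298 * c - 100 * d - 208 * e + 128) >> 8;
        int b = (298 * c + 516 * d + 128) >> 8;
        static byte Clamp(int x) => (byte)(x < 0 ? 0 : x > 255 ? 255 : x);
        return (Clamp(r), Clamp(g), Clamp(b));
    }

    static void Main()
    {
        var black = ToRgb(16, 128, 128);  // video black -> (0, 0, 0)
        var white = ToRgb(235, 128, 128); // video white -> (255, 255, 255)
        Console.WriteLine($"{black} {white}");
    }
}
```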

Answer 1: (score: 0)

Following this post, I arrived at the following solution:

    private byte[] YuvToRgbOpenCv(object luma, object chroma, object yData, object uData, object vData)
    {
        int[] lumaArray = (int[])luma;
        int[] chromaArray = (int[])chroma;

        int lumaWidth = lumaArray[0];
        int lumaHeight = lumaArray[1];

        int chromaWidth = chromaArray[0];
        int chromaHeight = chromaArray[1];

        byte[] y = (byte[])yData;
        byte[] u = (byte[])uData;
        byte[] v = (byte[])vData;

        var ym = new Mat(new[] { lumaHeight, lumaWidth }, MatType.CV_8UC1, y);
        var um = new Mat(new[] { chromaHeight, chromaWidth }, MatType.CV_8UC1, u);
        var vm = new Mat(new[] { chromaHeight, chromaWidth }, MatType.CV_8UC1, v);

        var umResized = um.Resize(new OpenCvSharp.Size(lumaWidth, lumaHeight), 0, 0, InterpolationFlags.Nearest);
        var vmResized = vm.Resize(new OpenCvSharp.Size(lumaWidth, lumaHeight), 0, 0, InterpolationFlags.Nearest);

        var yuvMat = new Mat();
        var resizedChannels = new[] { ym, umResized, vmResized };

        Cv2.Merge(resizedChannels, yuvMat);

        var bgr = yuvMat.CvtColor(ColorConversionCodes.YUV2BGR);

        var result = bgr.ToBytes();

        return result;
    }

I had to resize the U and V planes to match the size of the Y plane.
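An alternative worth noting (a sketch, not verified against the original stream): OpenCV also ships dedicated conversion codes for planar 4:2:0 input, so the three planes can be stacked into a single (height * 3/2) × width Mat and converted in one call, with no chroma resizing. This assumes tightly packed planes (stride equals width); padded rows would need a row-by-row copy first:

```csharp
using System;
using OpenCvSharp;

class I420Convert
{
    static void Main()
    {
        // Assumed example resolution with synthetic planes for the sketch;
        // substitute the real frame dimensions and the decoded y/u/v arrays.
        int lumaWidth = 640, lumaHeight = 480;
        byte[] y = new byte[lumaWidth * lumaHeight];
        byte[] u = new byte[y.Length / 4];
        byte[] v = new byte[y.Length / 4];

        // Stack Y, then U, then V contiguously (I420 plane order).
        byte[] yuv = new byte[y.Length + u.Length + v.Length];
        Buffer.BlockCopy(y, 0, yuv, 0, y.Length);
        Buffer.BlockCopy(u, 0, yuv, y.Length, u.Length);
        Buffer.BlockCopy(v, 0, yuv, y.Length + u.Length, v.Length);

        // One single-channel Mat holding all three planes, then one conversion.
        using var yuvMat = new Mat(lumaHeight * 3 / 2, lumaWidth, MatType.CV_8UC1, yuv);
        using var bgr = yuvMat.CvtColor(ColorConversionCodes.YUV2BGR_I420);

        Console.WriteLine($"{bgr.Width}x{bgr.Height} channels={bgr.Channels()}");
    }
}
```

The dedicated code handles the 2×2 chroma upsampling internally, so the Resize calls in the answer above become unnecessary.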