How to scale video using WebRTC.NET

Date: 2018-06-13 23:10:02

Tags: webrtc scaling yuv libyuv

I am trying to scale a video buffer before pushing it to WebRTC. My code calls the ScaleFrom method of the webrtc::I420Buffer class. When I apply the scaling, the screen shows up green on the other end.

I am not sure what I am doing wrong.

Here is some of my code:

        capturer->scaled_buffer->ScaleFrom(*capturer->video_buffer);
        width_ = 1024;
        height_ = 768;
        rtc::LogMessage::LogToDebug(rtc::LoggingSeverity::INFO);
        LOG(INFO) << "Scaling buffer now...AMRIT";

        rtc::scoped_refptr<webrtc::I420Buffer> yuv_buffer =
            webrtc::I420Buffer::Create(1024, 728);
        yuv_buffer->ScaleFrom(*capturer->video_buffer);
        rtc::scoped_refptr<webrtc::I420Buffer> axx_buffer =
            webrtc::I420Buffer::Create(1024, 768);
        libyuv::ScalePlane(capturer->video_buffer->DataY(), capturer->video_buffer->StrideY(), capturer->video_buffer->width(),
            capturer->video_buffer->height(), axx_buffer->MutableDataY(),
            axx_buffer->StrideY(), 1024, 768,
            libyuv::kFilterNone);


        libyuv::Scale(capturer->video_buffer->DataY(),
            capturer->video_buffer->DataU(),
            capturer->video_buffer->DataV(),
            capturer->video_buffer->StrideY(),
            capturer->video_buffer->StrideU(),
            capturer->video_buffer->StrideV(),
            capturer->video_buffer->width(),
            capturer->video_buffer->height(),
            (uint8*)yuv_buffer->DataY(), (uint8*)yuv_buffer->DataU(),
            (uint8*)yuv_buffer->DataV(), yuv_buffer->StrideY(),
            yuv_buffer->StrideU(), yuv_buffer->StrideV(),
            1024, 768, LIBYUV_TRUE);

        /*rtc::scoped_refptr<webrtc::I420ABufferInterface> merged_buffer = webrtc::WrapI420Buffer(
            yuv_buffer->width(), yuv_buffer->height(), yuv_buffer->DataY(),
            yuv_buffer->StrideY(), yuv_buffer->DataU(), yuv_buffer->StrideU(),
            yuv_buffer->DataV(), yuv_buffer->StrideV(), axx_buffer->DataY(),
            axx_buffer->StrideY(),
            nullptr);*/

        //rtc::scoped_refptr<webrtc::VideoFrameBuffer> srcBuf(capturer->video_buffer);

        // capturer->video_buffer->CropAndScaleFrom(*srcBuf, 0, 0, 100, 100);
        //auto yuv = (uint8_t*)capturer->video_buffer->DataY();

        //LOG(INFO) << "Setting yuv...AMRIT";
        /*auto yuv = (uint8_t*)capturer->scaled_buffer->DataY();

        LOG(INFO) << "Scaled yuv: " << yuv;*/

        auto yuv = (uint8_t*)yuv_buffer->DataY();

This code includes some commented-out things I have tried. Can anyone help me understand what I am doing wrong, and what I might try in order to fix it?

0 Answers:

No answers yet.