How to open a GStreamer pipeline from OpenCV with VideoWriter

Time: 2017-09-14 12:39:44

Tags: c++ opencv video gstreamer

I am capturing video frames with OpenCV's VideoCapture. The capture works fine, since I am able to use the frames like this:

cv::VideoCapture cap("v4l2src device=/dev/video1 ! videoscale ! videorate ! video/x-raw, width=640, height=360, framerate=30/1 ! videoconvert ! appsink");
cv::Mat frame;
cap >> frame;
cv::imshow("feed", frame);

I would also like to send the stream over the network, and this is where I am stuck. Somehow I am failing in the appsrc part of the pipeline. I want to encode the stream as JPEG and send it via UDP. This is what I have:

cv::VideoWriter writer;
writer.open("appsrc ! videoconvert ! jpegenc ! jpegparse ! rtpjpegpay pt=96 ! udpsink host=192.168.1.25 port=5000", 0, (double)30, cv::Size(640, 360), true);

It seems like the line above does nothing; writer << frame does nothing. This GStreamer command also shows nothing:

gst-launch-1.0 udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96" ! rtpjpegdepay ! jpegdec ! decodebin ! videoconvert ! autovideosink

I don't know where in the writer.open part I am failing. If I run GStreamer commands like these, they work:

gst-launch-1.0 v4l2src device=/dev/video1 ! videoscale ! videorate ! video/x-raw, width=640, height=360, framerate=30/1 ! jpegenc ! jpegparse ! rtpjpegpay pt=96 ! udpsink host=192.168.1.25 port=5000
gst-launch-1.0 udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96" ! rtpjpegdepay ! jpegdec ! decodebin ! videoconvert ! autovideosink

1 Answer:

Answer 0 (score: 8)

Before using OpenCV's GStreamer API, we need a working pipeline using the GStreamer command-line tool.

Sender: The OP is using JPEG encoding, so this pipeline uses the same encoding.

gst-launch-1.0 -v v4l2src \
! video/x-raw,format=YUY2,width=640,height=480 \
! jpegenc \
! rtpjpegpay \
! udpsink host=127.0.0.1 port=5000

Receiver: The sink caps of rtpjpegdepay need to match the src caps of rtpjpegpay in the sender pipeline.

gst-launch-1.0 -v udpsrc port=5000 \
! application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26 \
! rtpjpegdepay \
! jpegdec \
! xvimagesink sync=0

Now that we have working pipelines for sender and receiver, we can port them to OpenCV.

Sender:

#include <opencv2/opencv.hpp>
#include <iostream>

using namespace cv;
using namespace std;

void sender()
{
    // VideoCapture: Getting frames using 'v4l2src' plugin, format is 'BGR' because
    // the VideoWriter class expects a 3 channel image since we are sending colored images.
    // Both 'YUY2' and 'I420' are single channel images. 
    VideoCapture cap("v4l2src ! video/x-raw,format=BGR,width=640,height=480,framerate=30/1 ! appsink",CAP_GSTREAMER);

    // VideoWriter: 'videoconvert' converts the 'BGR' images into 'YUY2' raw frames to be fed to
    // 'jpegenc' encoder since 'jpegenc' does not accept 'BGR' images. The 'videoconvert' is not
    // in the original pipeline, because in there we are reading frames in 'YUY2' format from 'v4l2src'
    VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000",CAP_GSTREAMER,0,30,Size(640,480),true);

    if(!cap.isOpened() || !out.isOpened())
    {
        cout<<"VideoCapture or VideoWriter not opened"<<endl;
        exit(-1);
    }

    Mat frame;

    while(true) {

        cap.read(frame);

        if(frame.empty())
            break;

        out.write(frame);

        imshow("Sender", frame);
        if(waitKey(1) == 's')
            break;
    }
    destroyWindow("Sender");
}
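
Receiver: the OpenCV side reads the RTP/JPEG stream back through udpsrc. Below is a minimal sketch of that counterpart, assuming the same caps as the gst-launch receiver pipeline above and the same headers/namespaces as sender(); the function name receiver(), the window title, and the 'r' quit key are illustrative.

void receiver()
{
    // VideoCapture: the caps on 'udpsrc' must match the output of 'rtpjpegpay' in the sender.
    // 'videoconvert' hands the decoded frames to 'appsink', where OpenCV requests 'BGR'
    // so that 3-channel images come out of VideoCapture.
    VideoCapture in("udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! appsink", CAP_GSTREAMER);

    if(!in.isOpened())
    {
        cout<<"VideoCapture not opened"<<endl;
        exit(-1);
    }

    Mat frame;

    while(true) {

        in.read(frame);

        if(frame.empty())
            break;

        imshow("Receiver", frame);
        if(waitKey(1) == 'r')
            break;
    }
    destroyWindow("Receiver");
}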