Ubuntu 14.04 Gstreamer autovideosink

Date: 2016-07-22 01:24:49

Tags: ubuntu gstreamer pipeline

I am trying to run a GStreamer video stream on Ubuntu 14.04, but the receiver does not display the video. I have a sender pipeline that transmits MJPEG images, which I start like this:

gst-launch-1.0 -v videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5200

Its output is:

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)30/1, format=(string)I420, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)30/1, format=(string)I420, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:src: caps = image/jpeg, sof-marker=(int)0, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96, ssrc=(uint)1970481773, timestamp-offset=(uint)1012832172, seqnum-offset=(uint)21614
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96, ssrc=(uint)1970481773, timestamp-offset=(uint)1012832172, seqnum-offset=(uint)21614
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, sof-marker=(int)0, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 1012832172
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 21614
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

The receiver pipeline is:

gst-launch-1.0 udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96" ! rtpjpegdepay ! jpegdec ! xvimagesink

Note that I copied the caps from the sender's output.

However, I keep getting a "could not send sticky events" error and the receiver terminates immediately. What am I doing wrong? My output is below:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.016961467 12229      0x266a400 WARN                GST_PADS gstpad.c:3669:gst_pad_peer_query:<jpegdec0:src> could not send sticky events
0:00:00.017297845 12229      0x266a400 WARN                 basesrc gstbasesrc.c:2865:gst_base_src_loop:<udpsrc0> error: Internal data flow error.
0:00:00.017306308 12229      0x266a400 WARN                 basesrc gstbasesrc.c:2865:gst_base_src_loop:<udpsrc0> error: streaming task paused, reason not-negotiated (-4)
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.000651039
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

I have also tried autovideosink and ximagesink, with the same result.

Thanks.

1 Answer:

Answer 0 (score: 0)

Okay, you just need to add videoconvert: jpegdec produces video in a format that xvimagesink cannot handle directly:

gst-launch-1.0 udpsrc port=5200 ! application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)JPEG\,\ a-framerate\=\(string\)\"30\\\,000000\"\,\ payload\=\(int\)26\,\ ssrc\=\(uint\)2512101225\,\ timestamp-offset\=\(uint\)1627080146\,\ seqnum-offset\=\(uint\)4727 ! rtpjpegdepay ! jpegdec ! videoconvert ! xvimagesink
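
If you would rather keep the caps you already copied from your sender, the same fix should look roughly like this (an untested sketch on my side, reusing the question's port 5200 and payload=96 caps rather than the values above):

gst-launch-1.0 udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96" ! rtpjpegdepay ! jpegdec ! videoconvert ! xvimagesink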

How did I figure this out? By using fakesink like this:

gst-launch-1.0 -v udpsrc port=5200 ! fakesink

worked, and then

gst-launch-1.0 -v udpsrc port=5200 ! rtpjpegdepay ! fakesink

worked again. So I kept going like that, element by element, and found that the problem was actually in xvimagesink. It is odd that we do not get a clearer error message for this.
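
Continuing the same way, the next step is to append jpegdec in front of fakesink; roughly like this (my reconstruction of that step, using the port and caps from the question rather than the exact command I ran):

gst-launch-1.0 -v udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96" ! rtpjpegdepay ! jpegdec ! fakesink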

You can also debug the application with:

GST_DEBUG=4 gst-launch-1.0 udpsrc .....

Then scroll to the not-negotiated part of the log and look around there.
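
If the level-4 log is too noisy, GST_DEBUG also accepts per-category levels, so you can raise only the caps-related output, for example (a suggestion of mine, not something from the run above):

GST_DEBUG=2,GST_CAPS:5 gst-launch-1.0 udpsrc .....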