GStreamer throws an error when trying to stream side-by-side video from a stereo UVC camera.
I have a stereo camera connected to an ARM board over USB, but at the highest resolution setting the camera offers, GStreamer raises an "Invalid Dimension 0x0" error.
$ v4l2-ctl --list-formats-ext -d /dev/video2
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'MJPG' (compressed)
    Name        : Motion-JPEG
        Size: Discrete 2560x960
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 2560x720
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1280x480
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 640x240
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! "image/jpeg, width=2560, height=960, framerate=60/1" ! progressreport ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstProgressReport:progressreport0.GstPad:src: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)60.000000, x-dimensions=(string)"2560\,960", payload=(int)26, ssrc=(uint)1656850644, timestamp-offset=(uint)2590317031, seqnum-offset=(uint)18356
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)60.000000, x-dimensions=(string)"2560\,960", payload=(int)26, ssrc=(uint)1656850644, timestamp-offset=(uint)2590317031, seqnum-offset=(uint)18356
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstProgressReport:progressreport0.GstPad:sink: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
0:00:00.163107430 7652 0x55d47a920a30 WARN v4l2bufferpool gstv4l2bufferpool.c:790:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 2590320450
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 18356
progressreport0 (00:00:05): 4 seconds
progressreport0 (00:00:10): 9 seconds
progressreport0 (00:00:15): 14 seconds
0:00:15.770955137 7652 0x55d47a920a30 WARN v4l2src gstv4l2src.c:968:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 2 - ts: 0:00:15.622372937
progressreport0 (00:00:20): 19 seconds
progressreport0 (00:00:25): 24 seconds
progressreport0 (00:00:30): 29 seconds
progressreport0 (00:00:35): 34 seconds
Then, on the viewing machine (for now just localhost on the same laptop):
$ gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
WARNING: from element /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0: Invalid Dimension 0x0.
Additional debug info:
gstrtpjpegdepay.c(741): gst_rtp_jpeg_depay_process (): /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0
WARNING: from element /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0: Invalid Dimension 0x0.
Both of the two lowest-resolution modes work with this configuration, but the 720p side-by-side mode throws the error above.
What am I doing wrong here? Does this have something to do with gst-launch-1.0 not supporting the full-resolution modes?
Thanks in advance.
Answer 0 (score: 0)
For the width and height, see RFC 2435 (https://tools.ietf.org/html/rfc2435):
3.1.5. Width: 8 bits
This field encodes the width of the image in 8-pixel multiples (e.g.,
a width of 40 denotes an image 320 pixels wide). The maximum width
is 2040 pixels.
3.1.6. Height: 8 bits
This field encodes the height of the image in 8-pixel multiples
(e.g., a height of 30 denotes an image 240 pixels tall). When
encoding interlaced video, this is the height of a video field, since
fields are individually JPEG encoded. The maximum height is 2040
pixels.
Here you can see that both dimensions are limited to 2040 pixels. This is a limitation of the protocol itself.
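To make the limit concrete (the arithmetic follows directly from the RFC text quoted above):

255 * 8  = 2040   largest width/height encodable in the 8-bit field
2560 / 8 = 320    does not fit into 8 bits (maximum 255)

So your 2560-pixel-wide frames cannot be represented in the RTP/JPEG header at all; the payloader presumably falls back to sending 0 for both fields, which matches the "Invalid Dimension 0x0" warning on the receiver.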
Checking the GStreamer source code (https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/blob/master/gst/rtp/gstrtpjpegdepay.c):
  /* allow frame dimensions > 2040, passed in SDP session or media attributes
   * from gstrtspsrc.c (gst_rtspsrc_sdp_attributes_to_caps), or in caps */
  if (!width)
    width = rtpjpegdepay->media_width;
  if (!height)
    height = rtpjpegdepay->media_height;
Here you can see that the GStreamer developers provide a way around this protocol limitation: when receiving images larger than 2040 pixels in either dimension, you probably have to add the width and height information to the caps yourself.
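As an untested sketch for this particular 2560x960 stream: the sender's negotiated caps above already carry x-dimensions=(string)"2560\,960" (normally delivered via an RTSP SDP), so supplying the same attribute by hand in the udpsrc caps should let rtpjpegdepay recover the real frame size:

$ gst-launch-1.0 -e -v udpsrc port=5000 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26, x-dimensions=(string)"2560,960"' ! rtpjpegdepay ! jpegdec ! autovideosink

The dimensions have to match what the sender actually produces; for the 2560x720 side-by-side mode, change the attribute to "2560,720" accordingly.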