Background:
My GStreamer pipeline reads RTP packets, converts them from YUV to raw RGB, and then saves them. So far it does this correctly:
gst-launch --gst-debug=2 -vv udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=33314 ! rtph264depay ! h264parse ! ffdec_h264 ! ffmpegcolorspace ! "video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false" ! filesink location="privateryan.rgb"
Here is the debug output of that command; it works correctly and saves the raw RGB video output. The file becomes huge after only a few seconds (as expected), so be careful if you run this command:
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)704, height=(int)240, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, width=(int)704, height=(int)240, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)704, height=(int)240, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)5/11
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, width=(int)704, height=(int)240, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)5/11
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false, width=(int)704, height=(int)240, pixel-aspect-ratio=(fraction)5/11, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
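(For a rough sense of the data rate, based on the negotiated caps above: 704 × 240 pixels × 3 bytes ≈ 507 KB per frame, which at 25 fps is roughly 12.7 MB of raw RGB per second.)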
Once the recording is done, I inspect and play back the file with the Vooya raw-sequence player. However, the file does not play correctly unless I tell Vooya that the video is interlaced.
I need flat, packed frames that I can extract later for a computer vision application.
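Roughly, this is how I intend to read the frames back later (just a sketch, assuming the 704×240, 24 bpp packed RGB caps from the log above; the numpy usage is only illustrative):

import numpy as np

WIDTH, HEIGHT, CHANNELS = 704, 240, 3   # taken from the negotiated caps above
FRAME_SIZE = WIDTH * HEIGHT * CHANNELS  # bytes per packed RGB24 frame

with open("privateryan.rgb", "rb") as f:
    raw = f.read(FRAME_SIZE)            # one frame's worth of bytes
frame = np.frombuffer(raw, dtype=np.uint8).reshape(HEIGHT, WIDTH, CHANNELS)
# 'frame' is now a height x width x 3 array I can hand to my CV code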
Here you can see the video playing, although in the wrong (interlaced) format:
http://i.imgur.com/QDZSWdJ.png
And here you can see that the video does not play at all when I change the settings to the ones I actually need:
http://i.imgur.com/zSrOvFj.png
So I tried adding the deinterlace plugin to my pipeline, but without success. What could I be doing wrong?
Here is my new pipeline, with deinterlace right before the filesink:
gst-launch --gst-debug=1 -v udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,frame-rate=(fraction)25/1" port=33314 ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! "video/x-raw-rgb, bpp=(int)24, depth=(int)24, framerate=(fraction)25/1, interlaced=(boolean)false" ! deinterlace ! filesink location="privateryan.rgb"
0:00:00.096569700 12969 0x607080 ERROR GST_PIPELINE ./grammar.y:614:gst_parse_perform_link: could not link ffmpegcsp0 to deinterlace0
WARNING: erroneous pipeline: could not link ffmpegcsp0 to deinterlace0
Why does my video still come out interlaced after all this processing, and what might I be doing wrong with the deinterlace plugin?
I think the video recorder or the B&W cameras may be interlacing the video, but I'm not sure. Even if they are, I can't change that, and I still need to deinterlace.
Thanks!