I have a camera feed over RTSP, for example: rtsp://172.16.1.177:8554/test
This is what I get when I run ffmpeg -i rtsp://172.16.1.177:8554/test:
Input #0, rtsp, from 'rtsp://172.16.1.177:8554/test':
Metadata:
title : Session streamed with GStreamer
comment : rtsp-server
Duration: N/A, start: 0.710544, bitrate: N/A
Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1920x1080, 15 tbr, 90k tbn, 180k tbc
Now I apply a chromakey to the stream above, which gives me a perfect output as an mp4:
ffmpeg -i background.jpg -i rtsp://172.16.1.177:8554/test -filter_complex "[1:v]colorkey=0x26ff0b:0.3:0.2[ckout];[0:v][ckout]overlay[out]" -map "[out]" output.mp4
After that, I created and successfully started ffserver with the following configuration file:
HTTPPort 8091
RTSPPort 8092
HTTPBindAddress 0.0.0.0
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 2048M
ACL allow localhost
</Feed>
<Stream live.sdp>
Feed feed1.ffm
Format rtp
NoAudio
VideoCodec libx264
VideoFrameRate 15
VideoBitRate 1000
VideoSize 1920x1080
ACL allow 172.16.1.30 172.16.0.2
</Stream>
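For completeness, ffserver is launched by pointing it at that configuration file; the path below is only an assumption about where the file is saved:

ffserver -f /etc/ffserver.conf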
I am now trying to export the filtered output stream to that feed with the following command:
ffmpeg -i background.jpg -i rtsp://172.16.1.177:8554/test -filter_complex "[1:v]colorkey=0x26ff0b:0.3:0.2[ckout];[0:v][ckout]overlay[out]" -map "[out]" http://localhost:8091/feed1.ffm
which gives me the error below:
Input #0, image2, from 'background.jpg':
Duration: 00:00:00.04, start: 0.000000, bitrate: 23866 kb/s
Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 854x480 [SAR 72:72 DAR 427:240], 25 tbr, 25 tbn, 25 tbc
Input #1, rtsp, from 'rtsp://172.16.1.177:8554/test':
Metadata:
title : Session streamed with GStreamer
comment : rtsp-server
Duration: N/A, start: 0.711933, bitrate: N/A
Stream #1:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1920x1080, 15 tbr, 90k tbn, 180k tbc
[tcp @ 0x7fdb88706680] Connection to tcp://localhost:8091 failed (Connection refused), trying next address
[tcp @ 0x7fdb88402920] Connection to tcp://localhost:8091 failed (Connection refused), trying next address
Filter overlay has an unconnected output
I don't have much experience with either ffmpeg or ffserver, so I don't fully understand why the unconnected output is a problem:
Filter overlay has an unconnected output
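Not sure whether it is related, but the "Connection refused" lines suggest the command never reaches ffserver at all. A quick sanity check of whether anything is listening on the HTTP port (just a sketch, assuming netcat is available) would be:

nc -vz localhost 8091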