ffmpeg cannot stream to a remote client

Time: 2014-09-03 23:42:20

Tags: ffmpeg streaming video-streaming

I am putting together a simple ffmpeg command line on my laptop to stream from its camera. The command line reads (verbose output shown further below):

host1> ffmpeg -v verbose \
              -f dshow \
              -i video="Camera":audio="Microphone" \
              -r 30 -g 0 -vcodec h264 -acodec libmp3lame \
              -tune zerolatency \
              -preset ultrafast \
              -f mpegts udp://12.34.56.78:12345

First, it works locally. That is, I can use ffplay on the same host to view the output:

host1> ffplay -hide_banner -v udp://12.34.56.78:12345

Now, what does not work is doing the same thing from another machine on the same network. It just shows nan progress:

host2> ffplay -hide_banner -v udp://12.34.56.78:12345
    nan    :  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0   

I used ncat to dump the raw content, but there was no output:

host2> ncat -v -u 12.34.56.78 12345
Ncat: Version 5.59BETA1 ( http://nmap.org/ncat )
Ncat: Connected to 12.34.56.78:12345.
(...and nothing happens...)

Note that I can rule out a firewall issue, because I used ncat over the same port and protocol (UDP) to talk across the wire. That works; the two hosts can chat with each other:

host1> ncat -l -u -p 12345
host2> ncat -u 12.34.56.78 12345
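As an extra sanity check (a diagnostic sketch, not something from the original session; the port number is from my setup), you can confirm on the receiving host which sockets are actually bound to the port while the receiver is running:

```shell
:: On host2 (Windows), while ffplay or ncat is running,
:: verify that a UDP socket is bound to port 12345.
netstat -a -n -p UDP | findstr 12345
```

On Linux the equivalent would be something like `ss -lun | grep 12345`.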

Any hints?

I am on Windows x64 with the 64-bit FFmpeg build installed from here. Below is the output of my ffmpeg command:

C:\ffmpeg\bin>ffmpeg -v verbose -f dshow -i video="Integrated Camera":audio="Microphone (Realtek High Definition Audio)" -r 30 -g 0 -vcodec h264 -acodec libmp3lame -tune zerolatency -preset ultrafast -f mpegts udp://12.34.56.78:12345
ffmpeg version N-66012-g97b8809 Copyright (c) 2000-2014 the FFmpeg developers
  built on Sep  1 2014 00:21:15 with gcc 4.8.3 (GCC)
  configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-decklink --enable-zlib
  libavutil      54.  7.100 / 54.  7.100
  libavcodec     56.  1.100 / 56.  1.100
  libavformat    56.  3.100 / 56.  3.100
  libavdevice    56.  0.100 / 56.  0.100
  libavfilter     5.  0.103 /  5.  0.103
  libswscale      3.  0.100 /  3.  0.100
  libswresample   1.  1.100 /  1.  1.100
  libpostproc    53.  0.100 / 53.  0.100
Guessed Channel Layout for  Input Stream #0.1 : stereo
Input #0, dshow, from 'video=Integrated Camera:audio=Microphone (Realtek High Definition Audio)':
  Duration: N/A, start: 171840.657000, bitrate: N/A
    Stream #0:0: Video: rawvideo, bgr24, 640x480, 30 fps, 30 tbr, 10000k tbn, 30 tbc
    Stream #0:1: Audio: pcm_s16le, 44100 Hz, 2 channels, s16, 1411 kb/s
Matched encoder 'libx264' for codec 'h264'.
[graph 0 input from stream 0:0 @ 0000000000470aa0] w:640 h:480 pixfmt:bgr24 tb:1/10000000 fr:10000000/333333 sar:0/1 sws_param:flags=2
[auto-inserted scaler 0 @ 0000000004326d00] w:iw h:ih flags:'0x4' interl:0
[format @ 0000000004325a00] auto-inserting filter 'auto-inserted scaler 0' between the filter 'Parsed_null_0' and the filter 'format'
[auto-inserted scaler 0 @ 0000000004326d00] w:640 h:480 fmt:bgr24 sar:0/1 -> w:640 h:480 fmt:yuv444p sar:0/1 flags:0x4
No pixel format specified, yuv444p for H.264 encoding chosen.
Use -pix_fmt yuv420p for compatibility with outdated media players.
[graph 1 input from stream 0:1 @ 0000000000460c20] tb:1/44100 samplefmt:s16 samplerate:44100 chlayout:0x3
[audio format for output stream 0:1 @ 00000000004601a0] auto-inserting filter 'auto-inserted resampler 0' between the filter 'Parsed_anull_0' and the filter 'audio format for output stream 0:1'
[auto-inserted resampler 0 @ 00000000004604a0] ch:2 chl:stereo fmt:s16 r:44100Hz -> ch:2 chl:stereo fmt:s16p r:44100Hz
[libx264 @ 000000000081bb20] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 000000000081bb20] profile High 4:4:4 Intra, level 3.0, 4:4:4 8-bit
[mpegts @ 000000000081abe0] muxrate VBR, pcr every 3 pkts, sdt every 200, pat/pmt every 40 pkts
Output #0, mpegts, to 'udp://12.34.56.78:12345':
  Metadata:
    encoder         : Lavf56.3.100
    Stream #0:0: Video: h264 (libx264), yuv444p, 640x480, q=-1--1, 30 fps, 90k tbn, 30 tbc
    Metadata:
      encoder         : Lavc56.1.100 libx264
    Stream #0:1: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s16p
    Metadata:
      encoder         : Lavc56.1.100 libmp3lame
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
  Stream #0:1 -> #0:1 (pcm_s16le (native) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
*** 1 dup!
frame=  241 fps= 31 q=28.0 Lsize=    3439kB time=00:00:08.03 bitrate=3506.4kbits/s dup=1 drop=0
video:3035kB audio:125kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 8.791966%
Input file #0 (video=Integrated Camera:audio=Microphone (Realtek High Definition Audio)):
  Input stream #0:0 (video): 240 packets read (221184000 bytes); 240 frames decoded;
  Input stream #0:1 (audio): 16 packets read (1411200 bytes); 16 frames decoded (352800 samples);
  Total: 256 packets (222595200 bytes) demuxed
Output file #0 (udp://12.34.56.78:12345):
  Output stream #0:0 (video): 241 frames encoded; 241 packets muxed (3108187 bytes);
  Output stream #0:1 (audio): 306 frames encoded (352512 samples); 307 packets muxed (128313 bytes);
  Total: 548 packets (3236500 bytes) muxed
[libx264 @ 000000000081bb20] frame I:241   Avg QP:27.97  size: 12897
[libx264 @ 000000000081bb20] mb I  I16..4: 100.0%  0.0%  0.0%
[libx264 @ 000000000081bb20] coded y,u,v intra: 26.3% 0.5% 0.0%
[libx264 @ 000000000081bb20] i16 v,h,dc,p: 19% 28% 21% 31%
[libx264 @ 000000000081bb20] kb/s:3095.29
[dshow @ 0000000000467720] real-time buffer[Integrated Camera] too full (90% of size: 3041280)! frame dropped!
Received signal 2: terminating. (I pressed CTRL-C)

1 Answer:

Answer 0 (score: 6)

OK, I got it working. The problem was my incorrect understanding of how FFmpeg and FFplay work. When we say:

host1> ffmpeg -i INPUT protocol://ip:port

this does not mean that ffmpeg binds to and listens on ip:port. Instead, it tries to "publish" its output to that endpoint.

Similarly,

host2> ffplay -i protocol://ip:port

means that ffplay actually binds to ip and listens on port for incoming content.

So, to make this work, ffmpeg should publish to ip:port where ip:port is the remote host and port on which ffplay is listening, not the IP address of the local machine, because ffmpeg is the client, not the server.
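To illustrate the fix, a corrected pair of commands could look like the sketch below. Here 12.34.56.78 is assumed to be host2's own address (the machine running ffplay), and the dshow device names are placeholders that will differ on other machines:

```shell
# host2 (receiver): bind to its own address/port and wait for the stream.
ffplay -hide_banner udp://12.34.56.78:12345

# host1 (sender): publish to host2's address. Note that 12.34.56.78 now
# refers to the *remote* machine, not to host1 itself.
ffmpeg -f dshow \
       -i video="Camera":audio="Microphone" \
       -r 30 -vcodec h264 -acodec libmp3lame \
       -tune zerolatency \
       -preset ultrafast \
       -f mpegts udp://12.34.56.78:12345
```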