FFMPEG audio and video stream merging and restreaming

Date: 2015-02-25 18:19:14

Tags: ffmpeg streaming video-streaming audio-streaming live-streaming

I have multiple streaming sources that I need to combine and restream as a single source.

My sources are:

  • A local low-bitrate RTP audio stream
  • A camera

I need to redistribute the combined stream within the local network (via UDP multicast).
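
(Not part of the original question, but for context: a receiver on the same LAN can subscribe to such an MPEG-TS multicast stream with any ffmpeg-based player, for example

ffplay udp://225.1.1.15:30000

where 225.1.1.15:30000 is the multicast address and port used in the command below.)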

The problem I am seeing is that ffmpeg routinely appears to lock up and stop processing the combined stream after an indeterminate amount of time (sometimes as little as 15 minutes, sometimes almost an hour). However, if I restream the sources independently (audio only or video only), there appears to be no problem and they run indefinitely.

Command

ffmpeg -f rtp -i rtp://127.0.0.1:6666 -f video4linux2 -standard NTSC -s 704x480 -i /dev/video1 -strict experimental -vcodec libx264 -acodec ac3 -preset ultrafast -r 3 -g 3 -keyint_min 6 -x264opts "keyint=6:min-keyint=6:no-scenecut" -b:v 200k -ac 1 -b:a 64k -f mpegts udp://225.1.1.15:30000
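
As a side note that is not part of the original post: the log below warns that ffmpeg is guessing the RTP payload format ("you need an SDP file describing it"). One variant worth trying is to describe the audio stream in a small SDP file and point ffmpeg at that file instead of the bare rtp:// URL. This is only a sketch: the file name audio.sdp is arbitrary, and it assumes the source sends µ-law audio on the standard static payload type 0 (PCMU/8000), consistent with the pcm_mulaw, 8000 Hz stream detected in the output.

# Hypothetical audio.sdp describing the RTP input (assumes PCMU/8000 on payload type 0, port 6666)
cat > audio.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Local RTP audio
c=IN IP4 127.0.0.1
t=0 0
m=audio 6666 RTP/AVP 0
a=rtpmap:0 PCMU/8000
EOF

# Same encoding options as above; only the audio input is taken from the SDP file.
# Newer ffmpeg builds may also require -protocol_whitelist file,udp,rtp before -i audio.sdp.
ffmpeg -i audio.sdp \
       -f video4linux2 -standard NTSC -s 704x480 -i /dev/video1 \
       -strict experimental -vcodec libx264 -acodec ac3 -preset ultrafast \
       -r 3 -g 3 -keyint_min 6 -x264opts "keyint=6:min-keyint=6:no-scenecut" \
       -b:v 200k -ac 1 -b:a 64k -f mpegts udp://225.1.1.15:30000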

Output

ffmpeg version 2.5.1- http://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2014 the FFmpeg developers
  built on Dec 18 2014 09:06:26 with gcc 4.8 (Debian 4.8.3-19)
  configuration: --enable-gpl --enable-version3 --disable-shared --disable-debug --enable-runtime-cpudetect --enable-libmp3lame --enable-libx264 --enable-libx265 --enable-libwebp --enable-libspeex --enable-libvorbis --enable-libvpx --enable-libfreetype --enable-fontconfig --enable-libxvid --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-gray --enable-libopenjpeg --enable-libopus --disable-ffserver --enable-libass --enable-gnutls --cc=gcc-4.8
  libavutil      54. 15.100 / 54. 15.100
  libavcodec     56. 13.100 / 56. 13.100
  libavformat    56. 15.102 / 56. 15.102
  libavdevice    56.  3.100 / 56.  3.100
  libavfilter     5.  2.103 /  5.  2.103
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  1.100 /  1.  1.100
  libpostproc    53.  3.100 / 53.  3.100
[rtp @ 0xb61abe0] Guessing on RTP content - if not received properly you need an SDP file describing it
Guessed Channel Layout for  Input Stream #0.0 : mono
Input #0, rtp, from 'rtp://127.0.0.1:6666':
  Duration: N/A, start: 0.000000, bitrate: 64 kb/s
    Stream #0:0: Audio: pcm_mulaw, 8000 Hz, 1 channels, s16, 64 kb/s
Input #1, video4linux2,v4l2, from '/dev/video1':
  Duration: N/A, start: 1424887596.039777, bitrate: 162039 kb/s
    Stream #1:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 704x480, 162039 kb/s, 29.97 fps, 29.97 tbr, 1000k tbn, 1000k tbc
No pixel format specified, yuv422p for H.264 encoding chosen.
Use -pix_fmt yuv420p for compatibility with outdated media players.
[libx264 @ 0xb61f900] using cpu capabilities: MMX2 SSE2Fast SSSE3 Cache64
[libx264 @ 0xb61f900] profile High 4:2:2, level 2.2, 4:2:2 8-bit
Output #0, mpegts, to 'udp://225.1.1.15:30000':
  Metadata:
    encoder         : Lavf56.15.102
    Stream #0:0: Video: h264 (libx264), yuv422p, 704x480, q=-1--1, 200 kb/s, 3 fps, 90k tbn, 3 tbc
    Metadata:
       encoder         : Lavc56.13.100 libx264
    Stream #0:1: Audio: ac3, 8000 Hz, mono, fltp, 64 kb/s
    Metadata:
       encoder         : Lavc56.13.100 ac3
Stream mapping:
  Stream #1:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
  Stream #0:0 -> #0:1 (pcm_mulaw (native) -> ac3 (native))
Press [q] to stop, [?] for help
frame=    5 fps=0.0 q=12.0 size=       0kB time=00:00:00.33 bitrate=   0.0kbits/s dup=0 drop=12 

1 Answer:

Answer 0 (score: 0):

It turns out that the only way I was able to keep the processes running for extended periods was to split them into two independent streams.
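
The answer does not include the final commands, but a split along those lines could look like the sketch below. The second multicast port (30001) and the exact option split are assumptions for illustration, reusing the settings from the question.

# Video only: V4L2 capture -> H.264 -> MPEG-TS over UDP multicast
ffmpeg -f video4linux2 -standard NTSC -s 704x480 -i /dev/video1 \
       -vcodec libx264 -preset ultrafast -r 3 -g 3 -keyint_min 6 \
       -x264opts "keyint=6:min-keyint=6:no-scenecut" -b:v 200k \
       -f mpegts udp://225.1.1.15:30000

# Audio only: local RTP -> AC-3 -> MPEG-TS over UDP multicast on a second port (assumed)
ffmpeg -f rtp -i rtp://127.0.0.1:6666 \
       -strict experimental -acodec ac3 -ac 1 -b:a 64k \
       -f mpegts udp://225.1.1.15:30001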