I've been playing with FFmpeg and FFserver for the last few days because I consider them candidates for live streaming between an embedded device (with an integrated HD camera) and various clients (smartphones).
I've already managed to get a stream working with FFserver using the following configuration:
HTTPPort 1234
RTSPPort 1235
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 512000 # Maximum bandwidth per client
# set this high enough to exceed stream bitrate
CustomLog -
NoDaemon # Remove this if you want FFserver to daemonize after start
<Feed feed.ffm> # This is the input feed where FFmpeg will send
File /tmp/feed.ffm # video stream.
FileMaxSize 512K
</Feed>
<Stream test.h264> # Output stream URL definition
Feed feed.ffm # Feed from which to receive video
Format rtp
# Video settings
VideoCodec libvpx
VideoSize 720x576 # Video resolution
VideoFrameRate 60 # Video FPS
AVOptionVideo flags +global_header # Parameters passed to encoder
# (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400 # Video bitrate
NoAudio
</Stream>
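FFserver is started with that file passed via -f (the path below is simply where I saved the configuration):

ffserver -f /etc/ffserver.conf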
The following FFmpeg command sends the stream to FFserver:
ffmpeg -rtbufsize 2100M -f dshow -i video="Integrated Camera" -vcodec libx264 http://127.0.0.1:1234/feed.ffm
I also have a simple Android client that plays the RTSP stream using this URL:
rtsp://mylocalnetworkip:1235/test.h264
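The client is essentially just a VideoView pointed at that URL; a minimal sketch of it (class name chosen for illustration, relying on the platform MediaPlayer's built-in RTSP support):

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Use a VideoView as the whole content view and hand it the RTSP URL.
        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        // mylocalnetworkip is the LAN address of the machine running FFserver.
        videoView.setVideoURI(Uri.parse("rtsp://mylocalnetworkip:1235/test.h264"));
        videoView.start();
    }
}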
But now I'm trying to establish a P2P connection between the embedded device and the smartphone. This has to work over the Internet (not just the local network) and be able to do UDP hole punching (the way Skype does for P2P video calls, for example).
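As far as I understand it, the general idea is the classic hole-punching handshake: both peers register with a publicly reachable rendezvous server, learn each other's public IP:port from it, and then fire UDP packets at each other so their NATs open the mappings that let traffic flow directly. A rough sketch of that idea in Java follows (the rendezvous host, port, and message format are assumptions for illustration, not a real protocol, and this is not what Skype actually does internally):

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class HolePunchSketch {
    // Hypothetical rendezvous server that both peers can reach.
    static final String RENDEZVOUS_HOST = "rendezvous.example.com";
    static final int RENDEZVOUS_PORT = 9000;

    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(); // the NAT maps this local port

        // 1. Register with the rendezvous server so it records our public endpoint.
        byte[] hello = "HELLO".getBytes(StandardCharsets.UTF_8);
        InetAddress server = InetAddress.getByName(RENDEZVOUS_HOST);
        socket.send(new DatagramPacket(hello, hello.length, server, RENDEZVOUS_PORT));

        // 2. The server replies with the other peer's public "ip:port".
        byte[] buf = new byte[256];
        DatagramPacket reply = new DatagramPacket(buf, buf.length);
        socket.receive(reply);
        String[] peer = new String(reply.getData(), 0, reply.getLength(),
                StandardCharsets.UTF_8).split(":");
        InetAddress peerAddr = InetAddress.getByName(peer[0]);
        int peerPort = Integer.parseInt(peer[1]);

        // 3. Punch: send a few packets to the peer's public endpoint. They may be
        //    dropped by the peer's NAT, but they open our own NAT mapping; the
        //    peer does the same from its side.
        byte[] punch = "PUNCH".getBytes(StandardCharsets.UTF_8);
        for (int i = 0; i < 5; i++) {
            socket.send(new DatagramPacket(punch, punch.length, peerAddr, peerPort));
        }

        // 4. Once a packet from the peer arrives, the direct UDP path is open and
        //    could carry the RTP media instead of these test messages.
        socket.receive(new DatagramPacket(buf, buf.length));
        System.out.println("Direct UDP path established");
    }
}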