I am trying to use the example script found here to take OpenCV images and turn them into an RTP/RTSP stream:
https://github.com/madams1337/python-opencv-gstreamer-examples/blob/master/gst_device_to_rtp.py
This is the script's description:
"gst_device_to_rtp grabs VideoCapture(0), encodes the frames and streams them to rtp://localhost:5000"
Here is the code I am trying to use:
import time

import cv2

# Cam properties
fps = 30.
frame_width = 1920
frame_height = 1080

# Create capture
cap = cv2.VideoCapture(0)

# Set camera properties
cap.set(cv2.CAP_PROP_FRAME_WIDTH, frame_width)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, frame_height)
cap.set(cv2.CAP_PROP_FPS, fps)

# Define the GStreamer pipeline used as the sink
gst_str_rtp = "appsrc ! videoconvert ! x264enc noise-reduction=10000 tune=zerolatency byte-stream=true threads=4 " \
              " ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=5000"

# Create a VideoWriter that feeds the GStreamer pipeline
out = cv2.VideoWriter(gst_str_rtp, 0, fps, (frame_width, frame_height), True)

# Loop it
while True:
    # Get the frame
    ret, frame = cap.read()
    # Check
    if ret is True:
        # Flip frame
        frame = cv2.flip(frame, 1)
        # Push the frame into the pipeline
        out.write(frame)
    else:
        print("Camera error.")
        time.sleep(10)

cap.release()
The key part is the following code, which defines the GStreamer pipeline configuration:
# Define the GStreamer pipeline used as the sink
gst_str_rtp = "appsrc ! videoconvert ! x264enc noise-reduction=10000 tune=zerolatency byte-stream=true threads=4 " \
              " ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=5000"
# Create a VideoWriter that feeds the GStreamer pipeline
out = cv2.VideoWriter(gst_str_rtp, 0, fps, (frame_width, frame_height), True)
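To make it easier to experiment with the host and port, the same pipeline string can be assembled from parameters. A minimal sketch (the helper name and its parameters are my own, not part of the original script):

```python
# Hypothetical helper that builds the same GStreamer sink pipeline
# from host/port/thread parameters (names are illustrative only).
def make_rtp_pipeline(host="127.0.0.1", port=5000, threads=4):
    return (
        "appsrc ! videoconvert ! "
        f"x264enc noise-reduction=10000 tune=zerolatency byte-stream=true threads={threads} ! "
        "h264parse ! mpegtsmux ! rtpmp2tpay ! "
        f"udpsink host={host} port={port}"
    )

print(make_rtp_pipeline())
```

The resulting string is what gets passed as the first argument to cv2.VideoWriter (note that this only works if OpenCV was built with GStreamer support).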
As I understand it, this should send the OpenCV video frames to "rtp://localhost:5000".
However, whenever I run this command in a terminal while the script is running:
ffplay 'rtp://localhost:5000'
it just hangs forever. I can't really tell what this means. Does it mean that ffplay can connect to localhost on that port but finds nothing there? I honestly don't know. The same ffplay command seems to work with a different RTSP URL, but not with this one.
If I try "ffplay 'rtsp://localhost:5000'" instead, I just get a "Connection refused" error (maybe because nothing is actually being served on that stream).
Does the script actually output an RTP stream to localhost:5000? Is something wrong with ffplay on my machine? Or do I need to run a special ffplay command? What should I do?
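For reference, one thing I read is that since the pipeline wraps the H.264 stream in MPEG-TS carried over RTP (rtpmp2tpay, whose static RTP payload type is 33 per RFC 3551), players generally need an SDP description of the stream rather than a bare rtp:// URL. A minimal sketch of writing such an SDP file, assuming the udpsink defaults above (the file name "stream.sdp" is my own choice):

```python
# Sketch: write an SDP file describing the MP2T-over-RTP stream,
# so a player can be pointed at the file instead of a raw URL.
# Assumes the udpsink settings above: 127.0.0.1, port 5000, and
# rtpmp2tpay's static payload type 33 (MP2T, RFC 3551).
sdp = """v=0
o=- 0 0 IN IP4 127.0.0.1
s=OpenCV GStreamer stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5000 RTP/AVP 33
a=rtpmap:33 MP2T/90000
"""

with open("stream.sdp", "w") as f:
    f.write(sdp)
```

The idea would then be to run something like "ffplay stream.sdp" while the script is streaming, but I am not sure whether that is the intended way to consume this pipeline's output.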