To speed up turning a series of matplotlib-generated images into a video, I would like to convert the following (working) example to gstreamer, so that I can use the hardware encoding capabilities of the Raspberry Pi 2 GPU. The starting point is the code below, which streams raw ARGB video frames to ffmpeg via stdin. How do I replace the command string to use gst-launch-1.0 instead?
import numpy as np
import matplotlib.pylab as plt
import time
import subprocess

# Number of frames
nframes = 200

# Generate data
x = np.linspace(0, 100, num=nframes)
y = np.random.random_sample(np.size(x))

def testSubprocess(x, y):
    start_time = time.time()

    # Set up the figure
    fig = plt.figure(figsize=(15, 9))
    canvas_width, canvas_height = fig.canvas.get_width_height()

    # First frame
    ax0 = plt.plot(x, y)
    pointplot, = plt.plot(x[0], y[0], 'or')

    def update(frame):
        # your matplotlib code goes here
        pointplot.set_data(x[frame], y[frame])

    # Open an ffmpeg process
    outf = 'testSubprocess.mp4'
    cmdstring = ('ffmpeg',
                 '-y', '-r', '1',  # overwrite, 1 fps
                 '-s', '%dx%d' % (canvas_width, canvas_height),  # size of image string
                 '-pix_fmt', 'argb',  # input pixel format
                 '-f', 'rawvideo', '-i', '-',  # tell ffmpeg to expect raw video from the pipe
                 '-vcodec', 'mpeg4', outf)  # output encoding
    p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)

    # Draw frames and write to the pipe
    for frame in range(nframes):
        # draw the frame
        update(frame)
        fig.canvas.draw()

        # extract the image as an ARGB byte string
        string = fig.canvas.tostring_argb()

        # write to pipe
        p.stdin.write(string)

    # Finish up
    p.communicate()
    print("Movie written in %s seconds" % (time.time() - start_time))

if __name__ == "__main__":
    testSubprocess(x, y)
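Whatever consumes the pipe (ffmpeg's rawvideo input, or gstreamer later) must agree with the writer on the exact frame size, so it helps to know how many bytes each raw ARGB frame occupies. The dimensions below are an assumption: with matplotlib's default 100 dpi, figsize=(15, 9) yields a 1500x900 canvas, but the real values come from fig.canvas.get_width_height().

```python
# Frame-size sanity check for the raw ARGB pipe.
# 1500x900 assumes matplotlib's default 100 dpi with figsize=(15, 9);
# use fig.canvas.get_width_height() for the actual canvas size.
canvas_width, canvas_height = 1500, 900
bytes_per_frame = canvas_width * canvas_height * 4  # 4 bytes (A, R, G, B) per pixel
print(bytes_per_frame)
```

If the reader's declared width/height do not match this byte count per frame, the output video comes out sheared or corrupted.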
The component I want in the pipeline is omxh264enc, but that can come as a second step once I understand how to feed the data into the pipeline. An answer based on gst-python would also be perfectly acceptable.
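For the gst-python route, the usual pattern is an appsrc element whose caps describe the raw ARGB frames, parsed from a launch-style description string. The sketch below (untested on a Pi; omxh264enc availability, the `src` element name, and the caps string are assumptions) only builds that description; the commented lines show how it would be fed to `Gst.parse_launch` with PyGObject installed.

```python
def make_pipeline_desc(width, height, fps, outf):
    # Caps must match the raw ARGB frames produced by
    # fig.canvas.tostring_argb(); omxh264enc is Raspberry-Pi-specific.
    return ("appsrc name=src caps=video/x-raw,format=ARGB,"
            "width=%d,height=%d,framerate=%d/1 "
            "! videoconvert ! omxh264enc ! h264parse "
            "! mp4mux ! filesink location=%s" % (width, height, fps, outf))

# With PyGObject/GStreamer installed, the description would be used as:
#   import gi; gi.require_version('Gst', '1.0')
#   from gi.repository import Gst
#   Gst.init(None)
#   pipeline = Gst.parse_launch(make_pipeline_desc(1500, 900, 1, 'out.mp4'))
#   appsrc = pipeline.get_by_name('src')
#   appsrc.emit('push-buffer', Gst.Buffer.new_wrapped(frame_bytes))
```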
Answer 0 (score: 1)
"videoparse" looks like the key element you want here, to tell GStreamer what format your input is:
gst-launch-1.0 fdsrc ! videoparse width=128 height=128 format=argb framerate=5/1 ! videoconvert ! omxh264enc ! h264parse ! mp4mux ! filesink location=OUTPUTFILE
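Applied to the original script, that pipeline would replace the ffmpeg cmdstring like the sketch below. This is an assumption-laden sketch, not a tested Pi setup: fdsrc reads the raw ARGB frames from stdin, the width/height values stand in for fig.canvas.get_width_height(), and `-e` asks gst-launch to finalize the mp4 cleanly on shutdown.

```python
import subprocess

canvas_width, canvas_height = 1500, 900  # hypothetical; from fig.canvas.get_width_height()
outf = 'testSubprocess.mp4'

# gst-launch-1.0 reading raw ARGB frames from stdin via fdsrc;
# videoparse must match the writer's frame geometry exactly.
cmdstring = ('gst-launch-1.0', '-e',
             'fdsrc', '!',
             'videoparse',
             'width=%d' % canvas_width,
             'height=%d' % canvas_height,
             'format=argb',
             'framerate=1/1', '!',
             'videoconvert', '!',
             'omxh264enc', '!',
             'h264parse', '!',
             'mp4mux', '!',
             'filesink', 'location=%s' % outf)

# Then, exactly as in the ffmpeg version:
# p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)
# ...write frames to p.stdin, then p.communicate()
```

Closing stdin at the end makes fdsrc emit end-of-stream, which lets mp4mux write the file's index so the result is playable.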