I'm trying to put together a parallel-processing Python script. It uses futures to run the heavy processing tasks, while the main thread builds the tasks to execute and puts them into a multiprocessing.Queue. There is another process that handles the results of the futures (it reads each result and compiles the final output). I went with a JoinableQueue because it lets me join/wait on it. However, I'm getting an error about not being able to pickle _thread.RLock objects:
Traceback (most recent call last):
  File "/usr/lib64/python3.6/multiprocessing/queues.py", line 234, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib64/python3.6/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
TypeError: can't pickle _thread.RLock objects
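As far as I can tell, this happens because the Future returned by submit() carries an internal lock, so the multiprocessing.Queue's feeder thread can't pickle it. A stripped-down snippet that shows the same failure for me (dummy() below is just a placeholder, not my real plot()):

import concurrent.futures
from multiprocessing import JoinableQueue

def dummy(x):
    # placeholder task, stands in for the real plot()
    return x * 2

if __name__ == '__main__':
    q = JoinableQueue()
    with concurrent.futures.ProcessPoolExecutor() as exe:
        fut = exe.submit(dummy, 1)
        q.put(fut)  # the queue's feeder thread fails with:
                    # TypeError: can't pickle _thread.RLock objects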
Here is my current code; I can't come up with an approach that works well. The order of the results matters: if I submit job A and then job B, I need A to come out first, then B. There must be a simple solution I'm not seeing here (the one idea I've sketched so far is shown after the code).
import concurrent.futures
import csv
import os
from multiprocessing import JoinableQueue, Process

import cv2

# Does video encoding in the background
def worker():
    fourcc = cv2.VideoWriter_fourcc(*'XVID')
    out = cv2.VideoWriter('out.avi', fourcc, 10, (1000, 75))
    n = 0
    while True:
        task = workqueue.get()  # task is expected to be an image file path
        print(task)
        img = cv2.imread(task)
        out.write(img)
        os.remove(task)
        workqueue.task_done()
    out.release()
    cv2.destroyAllWindows()
workqueue = JoinableQueue(maxsize=50)

with open('runtime_trace.csv') as f:
    t = Process(target=worker)
    t.daemon = True
    t.start()

    max_size = get_max_alloc_size()
    reader = csv.DictReader(f, ['i', 'addr', 'size', 'alloc'])
    last_val = 0
    vals = []
    with concurrent.futures.ProcessPoolExecutor() as exe:
        for i, d in enumerate(reader):
            d = fix_types(d)
            if d['i'] != last_val:
                future = exe.submit(plot, vals, last_val, max_size)
                workqueue.put(future)  # this put() is what triggers the pickling error
                vals = []
            vals.append(d)
            last_val = d['i']
    workqueue.join()
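The only idea I've come up with so far is to keep the video writer in the main process as a thread and use a plain queue.Queue, so the futures never have to be pickled at all; since the futures go into the queue in submission order and the writer blocks on result() for each one in turn, the A-before-B ordering would be preserved. Below is only a sketch with placeholder names (fake_plot and the range(20) loop stand in for my real plot() and the CSV-driven loop above), and it assumes plot() could return the rendered frame as a numpy array instead of writing a file. Would something like this be a reasonable approach, or is there a simpler fix I'm missing?

import concurrent.futures
import queue
import threading

import cv2
import numpy as np

workqueue = queue.Queue(maxsize=50)       # plain thread queue: futures are never pickled

def fake_plot(i):
    # placeholder for my real plot(); returns a rendered frame as a numpy array
    frame = np.zeros((75, 1000, 3), dtype=np.uint8)
    cv2.putText(frame, str(i), (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 255, 255), 2)
    return frame

def writer():
    fourcc = cv2.VideoWriter_fourcc(*'XVID')
    out = cv2.VideoWriter('out.avi', fourcc, 10, (1000, 75))
    while True:
        future = workqueue.get()
        if future is None:                # sentinel: no more frames coming
            break
        out.write(future.result())        # blocks until this frame is ready, keeps order
        workqueue.task_done()
    out.release()

if __name__ == '__main__':
    t = threading.Thread(target=writer, daemon=True)
    t.start()
    with concurrent.futures.ProcessPoolExecutor() as exe:
        for i in range(20):               # stand-in for the CSV-driven loop above
            workqueue.put(exe.submit(fake_plot, i))
    workqueue.put(None)
    t.join()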