Merging results from Python processes

Asked: 2017-08-22 09:34:02

Tags: python, set, python-multiprocessing

I am currently trying to implement a class that performs intensive computations:

import random
import multiprocessing as mp


class IntensiveStuff:

    def __init__(self):
        self.N = 20
        self.nb_process = 4
        self.set_of_things = set()

    def launch_multiprocessing(self):
        processes = []
        for i in range(self.nb_process):
            processes.append(mp.Process(target=self.process_method, args=()))
        [x.start() for x in processes]
        [x.join() for x in processes]
        set_of_things = ... # I want all the sub_set of 'process_method' updated in set_of_things

    def process_method(self):
        sub_set = set()
        for _ in range(self.N):
            sub_set.add(random.randint(0, 100))  # randint needs a lower and upper bound

I want to run independent computations, put the results in each process's sub_set, and merge all the sub_sets into set_of_things (they are objects in my actual code).

I tried using a Queue but without success; any suggestions?

P.S.: I tried to reproduce the code from Can a set() be shared between Python processes?, but had no luck either.

1 answer:

Answer 0 (score: -1)

Processes cannot share memory, but they can communicate through pipes, sockets, etc. The multiprocessing module provides special objects for this (which, I believe, use pipes under the hood). multiprocessing.Queue should also work, but I often use these two objects:

multiprocessing.Manager().list() and multiprocessing.Manager().dict()

results = multiprocessing.Manager().list()
# now a bit of your code
processes = []
for i in range(self.nb_process):
    processes.append(mp.Process(target=self.process_method, args=(results,)))

def process_method(self, results):
    sub_set = set()
    for _ in range(self.N):
        sub_set.add(random.randint(0, 100))
    results.append(sub_set)  # or whatever you actually need to add to results
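Putting the pieces together, here is a complete, runnable sketch of this approach applied to the class from the question (the values of N and nb_process are just the illustrative ones from the question, and the final union step is one way to do the merge, not the only one):

```python
import random
import multiprocessing as mp


class IntensiveStuff:

    def __init__(self):
        self.N = 20
        self.nb_process = 4
        self.set_of_things = set()

    def launch_multiprocessing(self):
        # A Manager-backed list is a proxy object that child
        # processes can safely append their results to.
        results = mp.Manager().list()
        processes = [
            mp.Process(target=self.process_method, args=(results,))
            for _ in range(self.nb_process)
        ]
        for p in processes:
            p.start()
        for p in processes:
            p.join()
        # Merge every worker's sub_set into a single set.
        self.set_of_things = set().union(*results)

    def process_method(self, results):
        sub_set = set()
        for _ in range(self.N):
            sub_set.add(random.randint(0, 100))
        results.append(sub_set)
```

Usage: create an instance, call launch_multiprocessing(), then read set_of_things. Since each worker adds at most N integers and duplicates collapse, the merged set holds at most nb_process * N elements.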
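Since the question mentions trying a Queue, for reference a Queue-based variant also works: each worker puts its whole sub_set on the queue and the parent drains it. The one pitfall is ordering: drain the queue before joining, because a worker blocked on a full queue pipe will never exit. A minimal sketch (worker and merge_with_queue are illustrative names, not from the original code):

```python
import random
import multiprocessing as mp


def worker(n, q):
    # Build a local set, then ship the whole set through the queue.
    sub_set = {random.randint(0, 100) for _ in range(n)}
    q.put(sub_set)


def merge_with_queue(nb_process=4, n=20):
    q = mp.Queue()
    processes = [mp.Process(target=worker, args=(n, q))
                 for _ in range(nb_process)]
    for p in processes:
        p.start()
    # Collect one result per worker BEFORE joining, so no worker
    # can deadlock while waiting to flush its data into the pipe.
    merged = set()
    for _ in range(nb_process):
        merged |= q.get()
    for p in processes:
        p.join()
    return merged
```

This avoids the Manager process entirely, at the cost of managing the drain/join order yourself.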