Python multiprocessing: one worker per dictionary key's list of values

Asked: 2020-09-09 02:14:38

Tags: python python-multiprocessing

Normally I use Pool with starmap, if that matters:

if __name__ == '__main__':
    with multiprocessing.Pool() as p:
        temp_arr = p.starmap(process, tuple_list)

tuple_list = [(1,2), (3,4)], for example, results in process(1,2) and process(3,4) each being dispatched to a different worker.
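As a minimal runnable sketch of that baseline (process here is a hypothetical worker that just adds its two arguments):

```python
import multiprocessing

def process(a, b):
    # hypothetical worker: starmap unpacks each tuple into (a, b)
    return a + b

if __name__ == '__main__':
    tuple_list = [(1, 2), (3, 4)]
    with multiprocessing.Pool() as p:
        temp_arr = p.starmap(process, tuple_list)
    print(temp_arr)  # [3, 7]
```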

If I have

dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}

is there a way to use Pool so that all of key '0''s values go to the first worker at once (as the list of tuples [(1,1), (2,3)], for example, so that I can still handle each tuple separately inside process()), key '1''s values go to the second worker, and so on?

Thanks.

2 answers:

Answer 0 (score: 1):

Try this:

import multiprocessing as mp
import time

dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}

def process(tup):
    print(f"input tuple: {tup} -- worker_id: {mp.current_process()}\n")
    time.sleep(2)

def process_all(key):
    # handle every tuple belonging to one key inside a single worker
    for tup in dict[key]:
        process(tup)

if __name__ == '__main__':
    with mp.Pool() as p:
        # map (not starmap): each key is passed as a single argument
        temp_arr = p.map(process_all, dict.keys())

# Result
#input tuple: (1, 1) -- worker_id: <ForkProcess(ForkPoolWorker-121, started daemon)>
#input tuple: (2, 4) -- worker_id: <ForkProcess(ForkPoolWorker-123, started daemon)>
#input tuple: (4, 4) -- worker_id: <ForkProcess(ForkPoolWorker-122, started daemon)>
#input tuple: (3, 5) -- worker_id: <ForkProcess(ForkPoolWorker-123, started daemon)>
#input tuple: (2, 3) -- worker_id: <ForkProcess(ForkPoolWorker-121, started daemon)>

Is this what you wanted?

Answer 1 (score: 1):

You can use map() with dict.values():

import multiprocessing as mp

dict = {
    '0': [(1,1), (2,3)],
    '1': [(4,4)],
    '2': [(2,4), (3,5)]
}

def process(data):
    # receives the full list of tuples for one key
    print(f"process data: {data}")
    #return result

if __name__ == '__main__':
    with mp.Pool() as p:
        all_results = p.map(process, dict.values())

Result:

process data: [(1, 1), (2, 3)]
process data: [(4, 4)]
process data: [(2, 4), (3, 5)]
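If you also need to know which key each worker handled, one variation (a sketch, not part of the answers above; the summing inside process is a hypothetical placeholder for real per-tuple work) is to starmap over dict.items(), so each worker receives the key together with its list of tuples:

```python
import multiprocessing as mp

data = {'0': [(1, 1), (2, 3)], '1': [(4, 4)], '2': [(2, 4), (3, 5)]}

def process(key, tuples):
    # hypothetical reduction: sum every element of this key's tuples
    return key, sum(a + b for a, b in tuples)

if __name__ == '__main__':
    with mp.Pool() as p:
        # items() yields (key, value) pairs; starmap unpacks them
        results = p.starmap(process, data.items())
    print(dict(results))  # {'0': 7, '1': 8, '2': 14}
```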