Python: dict of lists with multiprocessing

Date: 2018-02-10 11:28:42

Tags: python python-3.x shared-memory python-multiprocessing dill

I need to do some work with multiprocessing in Python 3.6: I have to update a dictionary by adding lists of objects to it. Since these objects are not picklable, I need dill instead of pickle and multiprocess (from the pathos project) instead of multiprocessing, but this should not be the problem.

Adding a list to the dictionary requires the list to be serialized again before it is stored. This slows everything down so much that it takes as long as the single-process version. Can you suggest a workaround?

Here is my Python 3.6 code: init1 works correctly but is slow, init2 is fast but broken. The rest is for testing purposes only.

import time

def init1(d: dict):
    for i in range(1000):
        l = []
        for k in range(i):
            l.append(k)
        d[i] = l

def init2(d: dict):
    for i in range(1000):
        l = []
        d[i] = l  # stores a copy in the managed dict; later appends are lost
        for k in range(i):
            l.append(k)

def test1():
    import multiprocess as mp
    with mp.Manager() as manager:
        d = manager.dict()
        p = mp.Process(target=init1, args=(d,))
        p.start()
        p.join()
        print(d)

def test2():
    import multiprocess as mp
    with mp.Manager() as manager:
        d = manager.dict()
        p = mp.Process(target=init2, args=(d,))
        p.start()
        p.join()
        print(d)

start = time.time()
test1()
end = time.time()
print('test1: ', end - start)


start = time.time()
test2()
end = time.time()
print('test2: ', end - start)
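The reason init2 is fast but wrong is that a Manager dict keeps its data in a separate server process: `d[i] = l` pickles the list and sends a copy to that process, so appends made to the local `l` afterwards never reach the managed copy. A minimal sketch of this pitfall, using the standard-library multiprocessing so it runs without pathos (the question's multiprocess exposes the same API; `fill` and `demo` are illustrative names):

```python
import multiprocessing as mp

def fill(d):
    l = []
    d[0] = l        # a pickled copy of the (still empty) list goes to the manager
    l.append(42)    # mutates only the local list; the managed copy stays empty
    d[1] = [42]     # assigning a finished list propagates as expected

def demo():
    with mp.Manager() as manager:
        d = manager.dict()
        p = mp.Process(target=fill, args=(d,))
        p.start()
        p.join()
        return dict(d)

if __name__ == '__main__':
    print(demo())   # {0: [], 1: [42]}
```

This is why init1 must build each list locally and assign it once, paying the serialization cost the question complains about.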

1 answer:

Answer 0: (score: 0)

A possible solution uses a pipe. On my machine this takes 870 ms, compared with 1.10 s for test1 and 200 ms for test2.

def init3(child_conn):
    d = {}  # a plain local dict: no proxy round-trips while building
    for i in range(1000):
        l = []
        for k in range(i):
            l.append(k)
        d[i] = l
    child_conn.send(d)  # the whole dict is serialized and sent once

def test3():
    import multiprocess as mp
    parent_conn, child_conn = mp.Pipe(duplex=False)
    p = mp.Process(target=init3, args=(child_conn,))
    p.start()
    d = parent_conn.recv()
    p.join()
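One detail worth noting about test3 (an observation, not part of the original answer): the parent calls recv() before join(). The ordering matters, because send() on a pipe blocks once the OS buffer fills, so joining a child that is still sending a large payload can deadlock. A minimal sketch with the standard-library multiprocessing (`worker` and `run` are illustrative names):

```python
import multiprocessing as mp

def worker(conn):
    # for a large payload, send() may block until the parent starts reading
    conn.send({'big': list(range(100_000))})
    conn.close()

def run():
    parent_conn, child_conn = mp.Pipe(duplex=False)
    p = mp.Process(target=worker, args=(child_conn,))
    p.start()
    d = parent_conn.recv()   # read first...
    p.join()                 # ...then join; joining first can hang forever
    return d

if __name__ == '__main__':
    print(len(run()['big']))   # 100000
```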

In Jupyter, using the %timeit magic, I get:

In [01]: %timeit test3()
872 ms ± 11.9 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)

In [02]: %timeit test2()
199 ms ± 1.72 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)

In [03]: %timeit test1()
1.09 s ± 10.1 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
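If the per-item data is picklable, another workaround (not from the original answer) is to avoid the Manager entirely and let worker processes return finished lists, so each list crosses the process boundary exactly once. A sketch using a standard-library process pool (`build` and `init4` are illustrative names; with unpicklable objects you would substitute multiprocess/dill):

```python
import multiprocessing as mp

def build(i):
    # each list is built locally in a worker and pickled once on return
    return i, list(range(i))

def init4():
    with mp.Pool() as pool:
        # assemble the dict in the parent from (key, list) pairs
        return dict(pool.map(build, range(1000)))

if __name__ == '__main__':
    d = init4()
    print(len(d), d[5])   # 1000 [0, 1, 2, 3, 4]
```

This also spreads the list-building work across cores, which neither init1 nor init3 does.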