Multiprocessing - shared array

Date: 2016-08-24 11:37:10

Tags: python multiprocessing shared

I'm trying to implement multiprocessing in Python, and I want a pool of 4-5 processes running a method in parallel. The goal is to run 1000 Monte Carlo simulations in total (200-250 simulations per process) rather than running all 1000 sequentially. I want each process to write into a common shared array as soon as it finishes one simulation: acquire the lock, write the result, and release the lock. So it should be a three-step process:

  1. Acquire the lock
  2. Write the result
  3. Release the lock so the other processes waiting to write to the array can proceed.

Every time I pass the array to a process, each process creates its own copy of the array, which I don't want, because I want one common array. Can anyone help me by providing sample code?

2 answers:

Answer 0 (score: 0)

Untested, but something like this should work. The array and the lock are shared between the processes.

from multiprocessing import Process, Array, Lock

def f(array, lock, n):  # n is the dedicated location in the array
    lock.acquire()
    try:
        array[n] = -array[n]
    finally:
        lock.release()

if __name__ == '__main__':
    arr = Array('i', [3, -7])  # shared array with two slots
    lock = Lock()
    p = Process(target=f, args=(arr, lock, 0))
    q = Process(target=f, args=(arr, lock, 1))
    p.start()
    q.start()
    q.join()
    p.join()

    print(arr[:])  # [-3, 7]

The documentation at https://docs.python.org/3.5/library/multiprocessing.html has many examples to start from.

Answer 1 (score: 0)

Since you are only returning state from the child processes to the parent process, a shared array and explicit locks are overkill. You can use Pool.map or Pool.starmap to do exactly what you need. For example:

from multiprocessing import Pool

class Adder:
    """I'm using this class in place of a monte carlo simulator"""

    def add(self, a, b):
        return a + b

def setup(x, y, z):
    """Sets up the worker processes of the pool. 
    Here, x, y, and z would be your global settings. They are only included
    as an example of how to pass args to setup. In this program they would
    be "some arg", "another" and 2
    """
    global adder
    adder = Adder()

def job(a, b):
    """wrapper function to start the job in the child process"""
    return adder.add(a, b)

if __name__ == "__main__":   
    args = list(zip(range(10), range(10, 20)))
    # args == [(0, 10), (1, 11), ..., (8, 18), (9, 19)]

    with Pool(initializer=setup, initargs=["some arg", "another", 2]) as pool:
        # runs jobs in parallel and returns when all are complete
        results = pool.starmap(job, args)

    print(results) # prints [10, 12, ..., 26, 28]