Python multiprocessing Manager list error: [Errno 2] No such file or directory

Time: 2015-04-17 14:34:27

Tags: python multiprocessing

I wrote a multiprocessing program in Python. I use multiprocessing.Manager().list() to share a list between subprocesses. First, I add some tasks in the main process. Then I start several subprocesses to execute the tasks in the shared list, and the subprocesses also add tasks to the shared list. But I got an exception as follows:

    Traceback (most recent call last):
      File "/usr/lib64/python2.6/multiprocessing/process.py", line 232, in _bootstrap
        self.run()
      File "/usr/lib64/python2.6/multiprocessing/process.py", line 88, in run
        self._target(*self._args, **self._kwargs)
      File "gen_friendship.py", line 255, in worker
        if tmpu in nodes:
      File "<string>", line 2, in __contains__
      File "/usr/lib64/python2.6/multiprocessing/managers.py", line 722, in _callmethod
        self._connect()
      File "/usr/lib64/python2.6/multiprocessing/managers.py", line 709, in _connect
        conn = self._Client(self._token.address, authkey=self._authkey)
      File "/usr/lib64/python2.6/multiprocessing/connection.py", line 143, in Client
        c = SocketClient(address)
      File "/usr/lib64/python2.6/multiprocessing/connection.py", line 263, in SocketClient
        s.connect(address)
      File "<string>", line 1, in connect
    error: [Errno 2] No such file or directory

I found some methods on how to use a shared list in Python multiprocessing, such as this. But I still get the exception, and I don't know what it means. What is the difference between an ordinary list and manager.list()?
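(For context on that last question: an ordinary list lives in one process's memory, so each child works on its own copy, while a Manager list is a proxy object that forwards every operation to the manager process. A minimal sketch, with hypothetical names, that shows the difference:)

```python
import multiprocessing

def append_one(target_list):
    # Runs in a child process: appends to whatever list it received.
    target_list.append(1)

if __name__ == "__main__":
    plain = []                                  # ordinary list: the child gets its own copy
    managed = multiprocessing.Manager().list()  # proxy: operations go through the manager

    for lst in (plain, managed):
        p = multiprocessing.Process(target=append_one, args=(lst,))
        p.start()
        p.join()

    print(len(plain))    # still 0: the child appended to its own copy
    print(len(managed))  # 1: the append was forwarded to the manager process
```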

The code is as follows:

    import multiprocessing

    nodes = multiprocessing.Manager().list()

    lock = multiprocessing.Lock()

    AMOUNT_OF_PROCESS = 10

    def worker():
        node = {"name": "username", "group": 1}
        lock.acquire()
        nodes.append(node)
        lock.release()

    if __name__ == "__main__":

        # add some initial tasks
        for i in range(10):
            nodes.append({"name": "username", "group": 1})

        processes = [None for i in range(AMOUNT_OF_PROCESS)]

        for i in range(AMOUNT_OF_PROCESS):
            processes[i] = multiprocessing.Process(target=worker, args=())
            processes[i].start()

1 Answer:

Answer 0 (score: 18):

The problem is that your main process exits as soon as it has started all the worker processes, which shuts down the Manager. Once the Manager shuts down, none of the children can use the shared list you passed to them. You can fix it by using join to wait for all the children to finish. Make sure you actually join all the processes after calling start on them:

    for i in range(AMOUNT_OF_PROCESS):
        processes[i] = multiprocessing.Process(target=worker, args=())
        processes[i].start()
    for process in processes:
        process.join()
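Putting the fix together, here is one complete runnable version of the program. The worker body and loop bounds are assumptions, since the original snippet was incomplete; the essential part is the join loop at the end, which keeps the main process (and therefore the Manager) alive until all workers are done:

```python
import multiprocessing

def worker(nodes, lock, node):
    # Append one task to the shared list under the lock.
    lock.acquire()
    nodes.append(node)
    lock.release()

if __name__ == "__main__":
    AMOUNT_OF_PROCESS = 10
    manager = multiprocessing.Manager()
    nodes = manager.list()
    lock = multiprocessing.Lock()

    # add some initial tasks in the main process
    for i in range(10):
        nodes.append({"name": "username", "group": 1})

    processes = []
    for i in range(AMOUNT_OF_PROCESS):
        p = multiprocessing.Process(target=worker,
                                    args=(nodes, lock, {"name": "username", "group": 1}))
        processes.append(p)
        p.start()

    for p in processes:
        p.join()          # keeps the Manager alive until all workers finish

    print(len(nodes))     # 10 initial tasks + 1 per worker = 20
```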