Python Pickle throws FileNotFoundError

Date: 2014-11-18 19:49:28

Tags: python pickle

I have a situation in Python where, in a Python runnable object that runs together with other processes, the following happens: if the code is simply:

f = open(filename, "rb")
f.close()

there is no error, but when the code is changed to the following, introducing pickle in the middle, it throws FileNotFoundError:

f = open(filename, "rb")
object = pickle.load(f)
f.close()

I don't understand why pickle would throw such an error if the file exists. The full error traceback is:

task = pickle.load(f)
  File "/usr/lib/python3.4/multiprocessing/managers.py", line 852, in RebuildProxy
    return func(token, serializer, incref=incref, **kwds)
  File "/usr/lib/python3.4/multiprocessing/managers.py", line 706, in __init__
    self._incref()
  File "/usr/lib/python3.4/multiprocessing/managers.py", line 756, in _incref
    conn = self._Client(self._token.address, authkey=self._authkey)
  File "/usr/lib/python3.4/multiprocessing/connection.py", line 495, in Client
    c = SocketClient(address)
  File "/usr/lib/python3.4/multiprocessing/connection.py", line 624, in SocketClient
    s.connect(address)
FileNotFoundError: [Errno 2] No such file or directory

1 Answer:

Answer 0 (score: 2)

While I would say that @TesselatingHeckler provided the answer, this comment didn't really fit in the comments section... so it's an extension of that answer.

You can't pickle multiprocessing Queue and Pipe objects, and in fact, some pickled objects only fail on load.
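
To illustrate why a load can fail even though the dump succeeded, and where the question's FileNotFoundError likely comes from, here is a minimal sketch (assuming a Unix-like system and the stdlib multiprocessing; judging from RebuildProxy in the traceback, the question's pickle presumably contains such a proxy): a manager proxy pickles fine because only its token and address are serialized, but loading it after the manager has gone away fails while reconnecting to the manager's socket.

import multiprocessing as mp
import pickle

if __name__ == "__main__":
    manager = mp.Manager()
    proxy = manager.dict()        # proxy object backed by the manager process

    data = pickle.dumps(proxy)    # dump succeeds: only the proxy's token/address is stored

    manager.shutdown()            # the manager process and its socket go away

    try:
        pickle.loads(data)        # load rebuilds the proxy and tries to reconnect
    except OSError as e:          # typically FileNotFoundError, as in the question
        print("load failed:", e)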

I have built a fork of multiprocessing that allows most objects to be pickled. The main change was to replace the serializer, swapping pickle for the more robust dill. The fork is available as part of the pathos package (on github). I have tried serializing Pipes before and it works, and it also works for a python socket and a python Queue. However, it still fails on a multiprocessing Queue.

>>> from processing import Pipe
>>> p = Pipe()
>>> 
>>> import dill
>>> dill.loads(dill.dumps(p))
(Connection(handle=12), Connection(handle=14))
>>>
>>> from socket import socket
>>> s = socket()
>>> from Queue import Queue as que
>>> w = que()
>>> dill.loads(dill.dumps(s))
<socket._socketobject object at 0x10dae18a0>
>>> dill.loads(dill.dumps(w))
<Queue.Queue instance at 0x10db49f38>
>>>
>>> from processing import Queue
>>> q = Queue()
>>> dill.loads(dill.dumps(q))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/mmckerns/lib/python2.7/site-packages/dill-0.2.2.dev-py2.7.egg/dill/dill.py", line 180, in dumps
    dump(obj, file, protocol, byref, file_mode, safeio)
  File "/Users/mmckerns/lib/python2.7/site-packages/dill-0.2.2.dev-py2.7.egg/dill/dill.py", line 173, in dump
    pik.dump(obj)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 306, in save
    rv = reduce(self.proto)
  File "/Users/mmckerns/lib/python2.7/site-packages/processing/queue.py", line 62, in __getstate__
    assertSpawning(self)
  File "/Users/mmckerns/lib/python2.7/site-packages/processing/forking.py", line 24, in assertSpawning
    'processes through inheritance' % type(self).__name__)
RuntimeError: Queue objects should only be shared between processes through inheritance

The __getstate__ method of the multiprocessing Queue seems to be hardwired to raise this error. If you bury the Queue inside a class, the error may not be triggered until the class instance is reconstituted from the pickle.

Trying to pickle a multiprocessing Queue with plain pickle should also give the above error.
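
For completeness, a minimal sketch (stdlib multiprocessing and pickle only, no dill) showing that plain pickle hits the same assertion at dump time:

import multiprocessing
import pickle

if __name__ == "__main__":
    q = multiprocessing.Queue()
    try:
        pickle.dumps(q)   # Queue.__getstate__ calls assert_spawning() and refuses
    except RuntimeError as e:
        # "Queue objects should only be shared between processes through inheritance"
        print("pickling failed:", e)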