python multiprocessing shared counter, pickling error

Asked: 2015-02-07 17:40:38

Tags: python multiprocessing pickle python-multiprocessing

I am using multiprocessing to process some very large files.

I can count occurrences of particular strings using a collections.Counter that is shared between processes via a multiprocessing.BaseManager subclass.

Although I can share the counter, and it seemingly pickles, it does not pickle correctly. I can copy its contents into a new dict, and that copy pickles fine.

I would like to understand how to avoid having to "copy" the shared counter before pickling it.

Here is my (pseudo)code:

import multiprocessing
import pickle
from multiprocessing.managers import BaseManager
from collections import Counter

class MyManager(BaseManager):
    pass

MyManager.register('Counter', Counter)

def main(glob_pattern):
    # function that processes files
    def worker_process(files_split_to_allow_naive_parallelization, mycounterdict):
        # code that opens and loops through this worker's files (elided)
        for line in file:
            # process one line: count the pair of fields 0 and 6
            my_line_items = line.split()
            index_for_read = (my_line_items[0], my_line_items[6])
            mycounterdict.update((index_for_read,))

    manager = MyManager()
    manager.start()
    mycounterdict = manager.Counter()

    # code to get glob files, split them with unix shell split, and then chunk them (elided)

    procs = []
    for i in range(NUM_PROCS):
        p = multiprocessing.Process(target=worker_process,
                                    args=(all_index_file_tuples[chunksize * i:chunksize * (i + 1)], mycounterdict))
        procs.append(p)
        p.start()
    # Now we "join" the processes
    for p in procs:
        p.join()

    # This is the part I have trouble with
    # This yields a pickled file that fails with an error
    pickle.dump(mycounterdict, open("Combined_count_gives_error.p", "wb"))

    # This however works
    # How can I avoid doing it this way?
    mycopydict = Counter()
    mycopydict.update(mycounterdict.items())
    pickle.dump(mycopydict, open("Combined_count_that_works.p", "wb"))

The "pickled" file produced by the failing line is always the same small size, and when I try to load it I get an error that makes no sense.

How can I pickle the shared dictionary without creating a new dict, as the pseudocode above does?

>>> p = pickle.load(open("Combined_count_gives_error.p"))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 1378, in load
    return Unpickler(file).load()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 880, in load_eof
    raise EOFError
EOFError

1 Answer:

Answer 0 (score: 2)

There are a few problems with your code. First, a file handle you leave dangling is not guaranteed to be closed; open it in a with block instead. Second, mycounterdict is not an actual Counter but a proxy for it. Pickle it and you will run into all sorts of problems, because the proxy cannot be resolved outside of this process. However, you do not need to copy it with update: .copy makes a new Counter copy.

So you should use:

with open("out.p", "wb") as f:
    pickle.dump(mycounterdict.copy(), f)
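
To make the proxy point concrete, here is a minimal standalone sketch (not part of the original answer; DemoManager is just an illustrative name). It shows that the managed object is a proxy, and that .copy() runs in the manager process and hands back a plain Counter that pickles normally:

from multiprocessing.managers import BaseManager
from collections import Counter
import pickle

class DemoManager(BaseManager):
    pass

DemoManager.register('Counter', Counter)

if __name__ == '__main__':
    manager = DemoManager()
    manager.start()
    shared = manager.Counter()
    shared.update(['a', 'a', 'b'])

    print(type(shared))    # an AutoProxy, not a collections.Counter
    local = shared.copy()  # copy() executes in the manager process and the
                           # result comes back over the pipe as a real Counter
    print(type(local))     # <class 'collections.Counter'>
    pickle.dumps(local)    # succeeds, unlike pickling the proxy itself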

As to whether this is a good pattern in the first place, the answer is no. Instead of using a shared counter, count separately in each process and combine the results at the end, which gives simpler code:

from multiprocessing import Pool
from collections import Counter
import pickle

def calculate(file):
    # build a private Counter for this file; no shared state needed
    counts = Counter()
    ...
    return counts

pool = Pool(processes=NPROCS)
counts = Counter()
# merge each worker's Counter into the grand total
for result in pool.map(calculate, files):
    counts += result

with open("out.p", "wb") as f:
    pickle.dump(counts, f)
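
For completeness, here is one hypothetical way to fill in calculate, reusing the question's per-line logic (the field indices 0 and 6 come from the question; treating each work item as a file path is an assumption):

def calculate(path):
    # count the pair of the first and seventh whitespace-separated
    # fields on each line, mirroring index_for_read in the question
    counts = Counter()
    with open(path) as f:
        for line in f:
            fields = line.split()
            counts[(fields[0], fields[6])] += 1
    return counts

Because each worker builds and returns its own plain Counter, the results come back through the pool by ordinary pickling, so no manager or proxy is involved at all.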