Using multiprocessing with h5py

Time: 2015-08-10 12:46:22

Tags: python h5py numba joblib

I get an error when trying to run commands in parallel using joblib / multiprocessing:

Here is the traceback:

Process PoolWorker-263:
Traceback (most recent call last):
File "/home/marcel/anaconda/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
            self.run()
          File "/home/marcel/anaconda/lib/python2.7/multiprocessing/process.py", line 114, in run
            self._target(*self._args, **self._kwargs)
          File "/home/marcel/anaconda/lib/python2.7/multiprocessing/pool.py", line 102, in worker
            task = get()
          File "/home/marcel/.local/lib/python2.7/site-packages/joblib/pool.py", line 363, in get
          File "_objects.pyx", line 240, in h5py._objects.ObjectID.__cinit__ (h5py/_objects.c:2994)
        TypeError: __cinit__() takes exactly 1 positional argument (0 given)

As you can see from the error message, I am processing data loaded with h5py. To complicate things, the routine I want to parallelize uses numba in one of its subroutines, but I hope that is irrelevant.

Here is a runnable example you can copy and paste:

from joblib import Parallel, delayed
import numpy as np
import h5py as h5
import os

def testfunc(h5data, row):
    # some very boneheaded CPU work
    data_slice = h5data[:, row, ...]
    ma = np.mean(data_slice, axis=1)
    x = row
    return ma, x

def run():
    data = np.random.random((100, 100, 100))
    print data
    # write the array to disk, then reopen it read-only as an h5py Dataset
    f_out = h5.File('tmp.h5', 'w')
    dset = f_out.create_dataset('mydata', data=data)
    f_out.close()
    f_in = h5.File('tmp.h5', 'r')
    h5data = f_in['mydata']
    # the open h5py Dataset is handed to the worker processes here, which is
    # where the unpickling error in the traceback above occurs
    pool = Parallel(n_jobs=-1, verbose=1, pre_dispatch='all')
    results = pool(delayed(testfunc)(h5data, i) for i in range(h5data.shape[1]))
    f_in.close()
    os.remove('tmp.h5')


if __name__ == '__main__':
    run()

Any idea what I am doing wrong?

Edit: OK, at least I can rule numba out of the list of culprits...

1 answer:

Answer 0: (Score: 0)

You can try replacing `joblib` with [pathos][1], which replaces `pickle` with `dill`. This usually solves all pickling problems.
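A minimal sketch of that suggestion, assuming pathos is installed (`pip install pathos`) and using its `ProcessingPool` API; whether `dill` can actually serialize an open h5py Dataset is not guaranteed, so treat this as an illustration of the pool swap rather than a verified fix:

    # Hypothetical rewrite of the example above using pathos instead of joblib.
    # Assumes tmp.h5 with a 'mydata' dataset has been written as in the question.
    from pathos.multiprocessing import ProcessingPool
    import numpy as np
    import h5py as h5

    def testfunc(h5data, row):
        # same boneheaded CPU work as in the question
        data_slice = h5data[:, row, ...]
        return np.mean(data_slice, axis=1), row

    def run():
        with h5.File('tmp.h5', 'r') as f_in:
            h5data = f_in['mydata']
            rows = range(h5data.shape[1])
            pool = ProcessingPool(nodes=4)
            # pathos' map accepts one iterable per argument of the target function
            results = pool.map(testfunc, [h5data] * len(rows), rows)
        return results

    if __name__ == '__main__':
        run()

Even with dill, an open HDF5 handle may not transfer cleanly between processes; if this still fails, a common alternative is to pass the filename and row index instead and reopen the file inside each worker.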