Python multiprocessing struct.error

Time: 2018-01-09 14:09:37

Tags: python multithreading struct

I'm looping over a set of large files and using multiprocessing to transform and write them. I build an iterable from my dataframe and pass it to multiprocessing's map function. Processing works fine for smaller files, but when I hit the larger ones (~10 GB) I get the error:

python struct.error: 'i' format requires -2147483648 <= number <= 2147483647

Code:

    data = np.array_split(data, 10)
    with mp.Pool(processes=5, maxtasksperchild=1) as pool1:
        pool1.map(write_in_parallel, data)
        pool1.close()
        pool1.join()

Based on this answer I thought the problem was that the file I was passing to map was too large, so I tried first splitting the dataframe into 1.5 GB chunks and passing each chunk to map independently, but I still get the same error.
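The traceback shows why splitting alone doesn't help: `multiprocessing.connection` in Python 3.6 prefixes each pickled task with a 4-byte signed length header (`struct.pack("!i", n)`), so any single pickled chunk larger than 2 GiB overflows the header no matter how the dataframe was split. A minimal reproduction of the header limit:

```python
import struct

# Largest payload size the 4-byte signed header can describe: 2**31 - 1.
struct.pack("!i", 2**31 - 1)

try:
    # One byte more than the header allows -> the same struct.error.
    struct.pack("!i", 2**31)
except struct.error as e:
    print(e)  # 'i' format requires -2147483648 <= number <= 2147483647
```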

Full traceback:

Traceback (most recent call last):
  File "_FNMA_LLP_dataprep_final.py", line 51, in <module>
    write_files()
  File "_FNMA_LLP_dataprep_final.py", line 29, in write_files
    '.txt')
  File "/DATAPREP/appl/FNMA_LLP/code/FNMA_LLP_functions.py", line 116, in write_dynamic_columns_fannie
    pool1.map(write_in_parallel, first)
  File "/opt/Python364/lib/python3.6/multiprocessing/pool.py", line 266, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/opt/Python364/lib/python3.6/multiprocessing/pool.py", line 644, in get
    raise self._value
  File "/opt/Python364/lib/python3.6/multiprocessing/pool.py", line 424, in _handle_tasks
    put(task)
  File "/opt/Python364/lib/python3.6/multiprocessing/connection.py", line 206, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/opt/Python364/lib/python3.6/multiprocessing/connection.py", line 393, in _send_bytes
    header = struct.pack("!i", n)
struct.error: 'i' format requires -2147483648 <= number <= 2147483647

1 answer:

Answer 0 (score: 3)

The answer you mentioned makes another key point: the data should be loaded by the child function. In your case that is the function write_in_parallel. I suggest you change your child function:

    def write_in_parallel(path):
        """We'll make an assumption that your data is stored in a CSV file."""
        data = pd.read_csv(path)
        ...

Then your "pool code" should look like this:

    with mp.Pool(processes=(mp.cpu_count() - 1)) as pool:
        chunks = pool.map(write_in_parallel, ('/path/to/your/data',))
    df = pd.concat(chunks)
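As a runnable sketch of this pattern (using the stdlib csv module and temporary files as hypothetical stand-ins for your real data, so the file contents and paths here are illustrative):

```python
import csv
import multiprocessing as mp
import os
import tempfile

def write_in_parallel(path):
    # Only the short path string is pickled and sent to the worker;
    # the worker loads its own chunk, so the pickled task stays tiny
    # and the 2 GiB header limit is never hit.
    with open(path, newline="") as f:
        return len(list(csv.reader(f)))

def main():
    # Hypothetical stand-ins for your real chunk files.
    paths = []
    for _ in range(2):
        fd, path = tempfile.mkstemp(suffix=".csv")
        with os.fdopen(fd, "w") as f:
            f.write("a,b\n1,2\n")
        paths.append(path)
    try:
        with mp.Pool(processes=2) as pool:
            return pool.map(write_in_parallel, paths)
    finally:
        for path in paths:
            os.remove(path)

if __name__ == "__main__":
    print(main())  # [2, 2]
```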

I hope this helps.