Errno 24: Too many open files when multiprocessing in Python

Asked: 2020-06-24 14:17:59

Tags: python lambda python-multiprocessing python-multithreading

I have the following piece of code, which pulls objects from an S3 bucket and spawns a process for each one. There are 300 objects in the bucket in total.

from multiprocessing import Pipe, Process

parent_connections = []
processes = []
for key in s3.get_matching_s3_keys(bucket_name, prefix, ".json"):
    util.log(f"Found record with key {key} in bucket {bucket_name}")
    parent_conn, child_conn = Pipe(duplex=False)
    parent_connections.append(parent_conn)
    process = Process(target=check_data, args=[bucket_name, key, obj_id, s3, child_conn])
    processes.append(process)

for process in processes:
    process.start()
    process.join()

for conn in parent_connections:
    output.append(conn.recv()[0])
    conn.close()

When the code executes, I get the following error:

"errorMessage": "[Errno 24] Too many open files",   
"errorType": "OSError",   
"stackTrace": [
    " File \"/var/lang/lib/python3.7/multiprocessing/context.py\", line 62, in Pipe   return Pipe(duplex)",
    " File \"/var/lang/lib/python3.7/multiprocessing/connection.py\", line 517, in Pipe   fd1, fd2 = os.pipe()"
]

I'm not sure how to resolve this issue. Could someone help? Thanks in advance.
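For reference, each `Pipe()` call above opens two file descriptors, so creating one pipe and one process per key quickly exhausts the process's file-descriptor limit (which is low in constrained environments such as AWS Lambda). A common way to bound this is to reuse a fixed number of workers with `multiprocessing.Pool` instead of one process per key. The sketch below is not from the original post: `check_data` here is a hypothetical stand-in for the real function (which would download and validate the S3 object), and results come back via `pool.map` rather than explicit pipes.

```python
from multiprocessing import Pool

# Hypothetical stand-in for the real check_data(bucket_name, key, ...):
# here it just returns the key and its length so the sketch is runnable.
def check_data(key):
    return (key, len(key))

def run_checks(keys, workers=8):
    # A Pool of `workers` processes keeps the number of open pipes and
    # file descriptors fixed, instead of two new descriptors per key.
    with Pool(processes=workers) as pool:
        # pool.map collects each worker's return value, replacing the
        # explicit parent_conn/child_conn plumbing from the question.
        return pool.map(check_data, keys)

if __name__ == "__main__":
    keys = [f"record-{i}.json" for i in range(300)]
    results = run_checks(keys)
    print(len(results))
```

Note that under this design the S3 client would need to be created inside each worker (or inside `check_data`) rather than passed as an argument, since client objects are generally not picklable across process boundaries.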

0 Answers:

No answers yet.