Asyncio: how to handle the "too many open files" OS error

Date: 2017-08-03 21:21:39

Tags: python python-3.x operating-system subprocess python-asyncio

I am trying to run ~500 asynchronous subprocesses. I pass the files as the list p_coros to the main function below.

import sys
import asyncio
from asyncio.subprocess import PIPE, STDOUT

async def run_check(shell_command):
    # Spawn the shell command and wait for it, killing it after a 5 s timeout.
    p = await asyncio.create_subprocess_shell(shell_command,
                                              stdin=PIPE, stdout=PIPE, stderr=STDOUT)
    fut = p.communicate()
    try:
        pcap_run = await asyncio.wait_for(fut, timeout=5)
    except asyncio.TimeoutError:
        p.kill()
        await p.communicate()

def get_coros():
    coros = []
    for pcap_loc in print_dir_cointent():
        for pcap_check in get_pcap_executables():
            tmp_coro = run_check('{args}'.format(e=sys.executable, args=args))
            if tmp_coro != False:
                coros.append(tmp_coro)
    return coros

async def main():
    ## Here p_coros holds over 500 coroutines (one per file)
    p_coros = get_coros()
    for f in asyncio.as_completed(p_coros):
        res = await f


loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

I think the problem here is asyncio.as_completed, because it tries to open all the files in parallel. If I remove asyncio.as_completed and just await each coroutine in turn it works fine, but takes far too long. I want to handle the OSError(24, 'Too many open files') problem without losing too much time.
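For reference (this is not part of the original post), the limit being hit is the per-process file descriptor limit: every create_subprocess_shell call needs several descriptors (the stdin/stdout pipes plus the internal socketpair visible in the traceback), so ~500 concurrent shells easily exceed a typical soft limit of 1024. A minimal sketch to inspect and, where the platform allows, raise that limit with the standard resource module:

import resource

# Inspect the per-process file descriptor limits (Unix only).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print('file descriptor limits: soft={}, hard={}'.format(soft, hard))

# Raising the soft limit up to the hard limit only postpones the OSError;
# bounding how many subprocesses run at once is the real fix.
try:
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
except (ValueError, OSError):
    pass  # some platforms refuse to raise the limit this way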

Log:

Exception ignored when trying to write to the signal wakeup fd:
BlockingIOError: [Errno 11] Resource temporarily unavailable

ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending coro=<ClassificationCheck.run_check() running at ./regression.py:74> wait_for=<Future finished exception=RuntimeError('Event loop is closed',)> cb=[as_completed.<locals>._on_completion() at /usr/lib/python3.5/asyncio/tasks.py:478]>

Traceback:

Traceback (most recent call last):
  File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
    result = coro.send(None)
  File "./regression.py", line 74, in run_check
    stdin=PIPE, stdout=PIPE, stderr=STDOUT)
  File "/usr/lib/python3.5/asyncio/subprocess.py", line 197, in create_subprocess_shell
    stderr=stderr, **kwds)
  File "/usr/lib/python3.5/asyncio/base_events.py", line 1049, in subprocess_shell
    protocol, cmd, True, stdin, stdout, stderr, bufsize, **kwargs)
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 184, in _make_subprocess_transport
    **kwargs)
  File "/usr/lib/python3.5/asyncio/base_subprocess.py", line 40, in __init__
    stderr=stderr, bufsize=bufsize, **kwargs)
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 640, in _start
    stdin, stdin_w = self._loop._socketpair()
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 53, in _socketpair
    return socket.socketpair()
  File "/usr/lib/python3.5/socket.py", line 478, in socketpair
    a, b = _socket.socketpair(family, type, proto)
OSError: [Errno 24] Too many open files
ERROR:asyncio:Task exception was never retrieved
future: <Task finished coro=<ClassificationCheck.run_check() done, defined at ./regression.py:72> exception=OSError(24, 'Too many open files')>

1 Answer:

Answer 0 (score: 1)

The OS error was raised when I passed a large number of files to the async jobs at once. I handled it by building a list of lists, where each sublist holds a fixed number of PCAPs small enough not to trigger the OS Error, and then passing one sublist at a time (a sketch of such a chunking helper follows the code below).

So what I learned is that it is important to let the already opened files close before moving on to process more files.

def get_coros(pcap_list):
    coros = []
    for pcap_loc in pcap_list:
        for pcap_check in get_pcap_executables():
            tmp_coro = run_check('{args}'.format(e=sys.executable, args=args))
            if tmp_coro != False:
                coros.append(tmp_coro)
    return coros

async def main():
    pcap_list_gen = print_dir_cointent()  # Now returns a list of lists
    for pcap_list in pcap_list_gen:       # Process one fixed-size batch at a time
        p_coros = get_coros(pcap_list)
        for f in asyncio.as_completed(p_coros):
            res = await f
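The answer does not show how print_dir_cointent() is changed to return a list of lists. A minimal chunking sketch, assuming the PCAP paths can be globbed from a directory (the path, the chunk helper and the batch size of 100 are illustrative assumptions, not from the original post):

import glob

def chunk(items, chunk_size):
    # Split a flat list into sublists of at most chunk_size elements.
    return [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]

def print_dir_cointent():
    # Hypothetical stand-in for the original helper: collect the pcap paths
    # and group them so each batch stays well below the descriptor limit.
    pcap_locs = sorted(glob.glob('/path/to/pcaps/*.pcap'))  # placeholder path
    return chunk(pcap_locs, 100)  # 100 per batch is an assumed, safe value

An alternative that avoids the idle time at the tail of every batch would be to wrap run_check in an asyncio.Semaphore, so that at most N subprocesses are alive at any moment while asyncio.as_completed still receives the full list of coroutines; a minimal sketch, with the bound of 100 again an assumption:

sem = asyncio.Semaphore(100)  # keep the bound below the file descriptor limit

async def run_check_bounded(shell_command):
    # Acquire the semaphore before spawning and release it once the subprocess
    # finishes, so only a bounded number of pipes are open at any time.
    async with sem:
        return await run_check(shell_command)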