Python: stream subprocess stdout/stderr in real time

Date: 2015-03-04 17:16:10

Tags: python asynchronous subprocess

I want to spawn several subprocesses and run them in parallel. I have a function that looks something like this:

import shlex
import subprocess
import sys

def stream_command(command):
    proc = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    while proc.poll() is None:
        line = proc.stdout.readline()
        sys.stdout.write('[%s]: %s' % (command, line))
    return proc.poll()

I can then run several of these (roughly) in parallel with:

from threading import Thread

def stream_commands(commands):
    threads = []
    for command in commands:
        # bind the current command via a default argument; otherwise every
        # thread would see the loop variable's final value
        target = lambda command=command: stream_command(command)
        thread = Thread(target=target)
        thread.start()
        threads.append(thread)
    # busy-wait until every worker thread has finished
    while True:
        if any(t.is_alive() for t in threads):
            continue
        else:
            break

But the problem is that inside my stream_command function I block on the call to proc.stdout.readline(). That has a couple of consequences: first, if the process never writes to stdout, the function hangs forever (for example, even after the subprocess has terminated). Second, I can't react to the process's stdout and stderr independently (I'd have to block on a read from one and then the other... which is unlikely to work). What I'd like to do is something like what I would write in node.js:

def stream_command(command):
    def on_stdout(line):
        sys.stdout.write('[%s]: %s' % (command, line))
    def on_stderr(line):
        sys.stdout.write('[%s (STDERR)]: %s' % (command, line))
    proc = asyncprocess.Popen(shlex.split(command),
            on_stdout=on_stdout,
            on_stderr=on_stderr
    )
    return proc.wait()

Of course, asyncprocess is a made-up module that would let me launch a subprocess and pass in handler functions for its stdout and stderr.

So, is there anything like my asyncprocess module above? Or, failing that, is there a simple way to react asynchronously to a subprocess's events in Python?

By the way, I should note that I'm on Python 2.7. There seems to be something for Python 3 via the asyncio library, but unfortunately that won't work here, AFAIK.

1 Answer:

Answer 0 (score: 0)

You can do this with one thread per stream. Assuming you want stream_commands to block until all of the commands have finished, you can do something like this:

import shlex
import subprocess
import sys
import threading
from threading import Thread

# Serialize writes so lines from different streams never interleave mid-line.
stdout_lock = threading.Lock()

def pipe_to_stdout(preamble, pipe):
    # iter(pipe.readline, '') yields each line as it arrives; plain iteration
    # over a pipe uses a read-ahead buffer in Python 2 and can delay output.
    for line in iter(pipe.readline, ''):
        with stdout_lock:
            sys.stdout.write(preamble + line)

def stream_commands(commands):
    threads = []
    procs = []
    try:
        for command in commands:
            proc = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                                    stderr=subprocess.PIPE)
            procs.append(proc)
            # one reader thread per stream, so stdout and stderr are handled independently
            out_thread = Thread(target=pipe_to_stdout, args=('[stdout]: ', proc.stdout))
            err_thread = Thread(target=pipe_to_stdout, args=('[stderr]: ', proc.stderr))
            out_thread.start()
            err_thread.start()
            threads.append(out_thread)
            threads.append(err_thread)
    finally:
        # wait for each process to exit, then for the reader threads to drain the pipes
        for proc in procs:
            proc.wait()
        for thread in threads:
            thread.join()
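
For illustration, a minimal way to drive this might look like the sketch below. The command strings are just placeholders, not from the original post; substitute whatever you actually need to run:

    if __name__ == '__main__':
        # hypothetical example commands
        stream_commands([
            'ping -c 3 localhost',
            'ls -l /tmp',
        ])

Because every write to sys.stdout happens under stdout_lock, output lines from different processes can interleave with each other, but each individual line stays intact.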