Preventing subprocess deadlock when writing large stderr output to a file

Time: 2019-02-26 01:38:06

Tags: memory subprocess deadlock stderr python-3.7

I'm using subprocess.call to convert .zip files to XML. I also log each process's standard error by passing an open file handle to the stderr argument.

Problem: sometimes a conversion emits a huge volume of errors and the process deadlocks. Activity Monitor shows the hung process holding 8 GB of memory.

import os
import subprocess as sp

def convert(directory):
    submissions = get_submissions_to_convert(directory)
    for submission in submissions:
        outpath = get_outpath(submission)
        submission_id = os.path.splitext(os.path.basename(outpath))[0]
        logfile_path = 'logs/' + submission_id + '.txt'
        try:
            with open(logfile_path, 'w+') as logfile:
                sp.call(['latexmlc', '--timeout=240', '--dest=' + outpath, submission], stderr=logfile)
        except KeyboardInterrupt:
            # If I interrupt the conversion, remove the logfile so it can be reattempted
            print('You interrupted convert()!')
            print('Removing ' + logfile_path)
            os.remove(logfile_path)
            raise
        except Exception as e:
            print('Something went wrong in convert(): ' + str(e))
            print('Removing ' + logfile_path)
            os.remove(logfile_path)
            raise

If I understand the docs and other questions/answers correctly, passing PIPE to stderr will make the process hang once the pipe buffer fills up, because the child blocks waiting for the buffer to be drained and `call` only returns after the process finishes. I can't find any information on whether passing an open file handle causes the same deadlock.
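For completeness, when you do want the error stream in memory rather than in a file, the documented way to avoid the PIPE deadlock is to let `subprocess.run` (or `Popen.communicate()`) read the pipe while the child runs, so the pipe buffer never stays full. A minimal sketch; the `child_code` snippet is a stand-in for any command that floods stderr, producing roughly 10 MB here:

```python
import subprocess
import sys

# Stand-in child that writes far more to stderr than any OS pipe
# buffer holds (pipe buffers are typically 64 KiB).
child_code = (
    "import sys\n"
    "for _ in range(100000):\n"
    "    sys.stderr.write('x' * 100 + '\\n')\n"
)

# subprocess.run drains the pipe concurrently with the child (it uses
# communicate() internally), so the child never blocks on a full pipe.
result = subprocess.run(
    [sys.executable, '-c', child_code],
    stderr=subprocess.PIPE,
)
print(len(result.stderr))  # all ~10 MB of stderr arrive in memory
```

By contrast, calling `subprocess.call` with `stderr=subprocess.PIPE` and never reading the pipe is exactly the hang described above.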

How can I write large stderr output to a file and keep the subprocess from hanging?
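As far as I can tell, passing a real file object (anything with a usable `fileno()`) does not have the pipe-buffer problem: the child writes straight to the file descriptor at the OS level, with no fixed-size pipe in between. A minimal sketch demonstrating this; the `child_code` snippet is a hypothetical noisy child standing in for `latexmlc`:

```python
import subprocess
import sys
import tempfile

# Stand-in child that writes ~10 MB to stderr, far beyond the
# ~64 KiB capacity of a typical OS pipe buffer.
child_code = (
    "import sys\n"
    "for _ in range(100000):\n"
    "    sys.stderr.write('x' * 100 + '\\n')\n"
)

# Redirecting stderr to a real file: the kernel writes directly to the
# file descriptor, so there is no buffer to fill and no deadlock.
with tempfile.NamedTemporaryFile('w+') as logfile:
    subprocess.call([sys.executable, '-c', child_code], stderr=logfile)
    logfile.seek(0)
    size = len(logfile.read())
    print(size)  # ~10 MB of stderr captured without hanging
```

If the process still hangs with a file handle, the cause is likely inside the child itself (e.g. latexmlc consuming memory) rather than the stderr redirection.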

0 Answers:

No answers yet