Too many open files when using python multiprocessing queues

Date: 2019-03-22 12:54:15

Tags: python-3.x multiprocessing

I am running into a problem with multiprocessing and multiprocessing.Queue in Python. I have one parent process that spawns many child processes, which are supposed to do some work. However, when I run my code it crashes with an exception because it exhausts all available file descriptors (I am on Ubuntu). Since my workers do essentially nothing, I cannot see where the problem comes from.
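The symptom described above can be reproduced with a minimal sketch (assumptions: Linux, so open descriptors can be counted via /proc/self/fd; the names worker, demo and fd_count are made up for illustration). Each multiprocessing.Queue opens a pipe, and each started Process keeps a sentinel descriptor open in the parent, so descriptors accumulate for as long as children are created without being joined:

```python
import os
from multiprocessing import Process, Queue

def fd_count():
    # Number of file descriptors currently open in this process (Linux only)
    return len(os.listdir("/proc/self/fd"))

def worker(q):
    # A do-nothing worker, like in the question
    q.put(None)

def demo(n=20):
    # Start n children without joining them; return how many
    # descriptors the parent gained in the meantime.
    before = fd_count()
    procs, queues = [], []
    for _ in range(n):
        q = Queue()                      # each Queue opens a pipe
        p = Process(target=worker, args=(q,))
        p.start()                        # each start keeps a sentinel fd
        procs.append(p)
        queues.append(q)
    grown = fd_count() - before
    for p in procs:                      # clean up for the demo
        p.join()
    return grown

if __name__ == "__main__":
    print(demo())   # strictly positive; grows with every extra child
```

With a per-process limit of, say, 1024 descriptors (`ulimit -n`), a loop that keeps starting new Process/Queue pairs without reaping them will hit that ceiling after a few hundred children.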

I am using the following code:



from multiprocessing import Process, Queue

# Lists to store all results
all_results1 = []
all_results2 = []
all_results3 = []

# Queues to store analysis results
queue1 = Queue()
queue2 = Queue()
queue3 = Queue()

j = 0
processes = []
while j < 1000000000:
    for i in range(0, PROCESS_POOL_SIZE):
        if j < 1000000000:
            # Start a new process for each profile
            p = Process(target=analyze_database, args=(queue1, queue2, queue3))
            processes.append(p)
            p.start()
            j += 1

# Retrieve results from processes
running = True
while running:
    amount = sum(p.is_alive() for p in processes)
    running = amount > 0

    while not queue1.empty():
        all_results1.append(queue1.get_nowait())
    while not queue2.empty():
        all_results2.append(queue2.get_nowait())
    while not queue3.empty():
        all_results3.append(queue3.get_nowait())

    # start new processes
    if amount < PROCESS_POOL_SIZE and j < len(db_names):
        p = Process(target=analyze_database, args=(queue1, queue2, queue3))
        processes.append(p)
        p.start()
        j += 1


def analyze_database(queue1, queue2, queue3):
    queue1.put_nowait(None)
    queue2.put_nowait(None)
    queue3.put_nowait(None)

It looks like the pipes to my child processes are not being closed properly. The initial exception I get when running the code is:

Traceback (most recent call last):
  File "OpenWPMAnalyzer.py", line 262, in <module>
    main()
  File "OpenWPMAnalyzer.py", line 151, in main
    p.start()
  File "/usr/lib/python3.6/multiprocessing/process.py", line 105, in start
    self._popen = self._Popen(self)
  File "/usr/lib/python3.6/multiprocessing/context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "/usr/lib/python3.6/multiprocessing/context.py", line 277, in _Popen
    return Popen(process_obj)
  File "/usr/lib/python3.6/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/usr/lib/python3.6/multiprocessing/popen_fork.py", line 65, in _launch
    parent_r, child_w = os.pipe()
OSError: [Errno 24] Too many open files
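The traceback shows the crash happening inside os.pipe() during Process.start(), i.e. when the parent tries to open yet another pipe for a new child. For illustration, here is a hedged sketch (not the asker's code; worker, run and its parameters are hypothetical stand-ins) of the usual remedy: keep at most a fixed number of children alive at once, and join() every finished child so its pipe descriptors are released in the parent before new processes are started:

```python
import os
from multiprocessing import Process, Queue

def worker(q):
    # Each hypothetical worker reports one result and exits
    q.put(os.getpid())

def run(total_jobs=20, pool_size=4):
    # Run total_jobs workers with at most pool_size alive at once,
    # reusing a single Queue instead of creating one per child.
    q = Queue()
    results = []
    for start in range(0, total_jobs, pool_size):
        batch = [Process(target=worker, args=(q,))
                 for _ in range(min(pool_size, total_jobs - start))]
        for p in batch:
            p.start()
        for p in batch:
            results.append(q.get())   # one blocking get per worker
        for p in batch:
            p.join()                  # releases the child's sentinel pipe
    return results

if __name__ == "__main__":
    print(len(run()))  # 20
```

multiprocessing.Pool and concurrent.futures.ProcessPoolExecutor implement the same bounding and reaping automatically, so either would be a drop-in alternative to hand-rolled process management like the loop in the question.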

0 Answers:

No answers