Python multiprocessing apply_async

Time: 2020-05-09 12:25:46

Tags: python multiprocessing queue socketserver

I'm designing an application in which I collect incoming data from a socket (running in a separate thread) and store it in a multiprocessing queue. Then, in the main thread, I read objects from the queue and use multiprocessing's apply_async to process each of them.

import time
import os
import stat
import json
import struct
import socketserver
from datetime import date

import threading
from multiprocessing import cpu_count, Pool, Queue 

from decouple import config

# Global variable declarations
HOST = config("HOST", default="localhost")
WATCHER_PORT = config("WATCHER_PORT", default=9000, cast=int)
q = Queue()  # multiprocessing queue shared between the socket thread and main thread

class RecvFileDataStreamHandler(socketserver.StreamRequestHandler):
    '''Handler for receiving a request and adding its data to the queue'''

    def handle(self):
        '''
        Handle multiple requests - each a 4-byte length prefix,
        followed by payload data that is converted to a dict and
        added to the queue
        '''

        # Code to handle incoming requests:
        # unpickle the received data and store it in the multiprocessing queue
        q.put(data_dict)

class FileDataReceiver(socketserver.ThreadingTCPServer):
    daemon_threads = True

    def __init__(self, host=HOST, port=WATCHER_PORT, handler=RecvFileDataStreamHandler):
        print(f"Started listening for file_data on {(host, port)}")
        socketserver.ThreadingTCPServer.__init__(self, (host, port), handler)


def process_file(_file):
    # Main worker function
    pass


def callback_func(result):
    # Do something with the result
    pass


def error_func(error):
    print("ERROR CALLBACK", error)


if __name__ == "__main__":
    recv = FileDataReceiver()
    threading.Thread(target=recv.serve_forever, daemon=True).start()


    # Infinite main-thread loop to process items from q
    while True:
        if q.qsize():
            # A new Pool is created on every iteration
            result = Pool().apply_async(process_file, args=(q.get(False),),
                                        callback=callback_func,
                                        error_callback=error_func)

The problem here is that the data arrives continuously, so I need to monitor the queue indefinitely and can never close the Pool. Without closing it, the code keeps spawning child processes until it exceeds the process limit allowed on the Linux machine. Is there a workaround, or a better application design that I'm not seeing?

0 Answers:

There are no answers yet.