Queue and ProcessPoolExecutor in Tornado

Date: 2015-11-05 20:14:26

Tags: python concurrency tornado

I am trying to use the new Tornado Queue object together with concurrent.futures to let my web server hand CPU-intensive work off to other processes. I want access to the Future objects returned by the ProcessPoolExecutor from the concurrent.futures module so that I can query their status and show it on the front end (e.g. show that the task is currently running, or that it has finished).
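
To make the goal concrete, this is the kind of status check I have in mind (a minimal sketch, assuming `future` is the concurrent.futures.Future returned by executor.submit()):

    if future.done():
        status = "finished"   # result (or exception) is available via future.result()
    elif future.running():
        status = "running"    # currently executing in a worker process
    else:
        status = "pending"    # still waiting for a free worker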

There seem to be two obstacles with this approach:

  1. How can I submit the items pulled from q.get() to the ProcessPoolExecutor while still keeping a reference to the Future objects it returns?
  2. How can the HomeHandler access those Future objects returned by the ProcessPoolExecutor so that I can display status information on the front end?

Thanks for your help.

    import os
    import time

    import tornado.httpserver
    import tornado.ioloop
    import tornado.options
    import tornado.web
    from tornado import gen
    from tornado.ioloop import IOLoop
    from tornado.options import define, options
    from tornado.queues import Queue

    from concurrent.futures import ProcessPoolExecutor
    
    define("port", default=8888, help="run on the given port", type=int)
    q = Queue(maxsize=2)
    
    
    def expensive_function(input_dict):
        time.sleep(1)  # stand-in for CPU-bound work; gen.sleep would just return an un-awaited Future in a worker process
    
    
    @gen.coroutine
    def consumer():
        while True:
            input_dict = yield q.get()
            try:
                with ProcessPoolExecutor(max_workers=4) as executor:
                    future = executor.submit(expensive_function, input_dict)
            finally:
                q.task_done()
    
    
    @gen.coroutine
    def producer(input_dict):
        yield q.put(input_dict)
    
    
    class Application(tornado.web.Application):
        def __init__(self):
            handlers = [
                (r"/", HomeHandler),
            ]
            settings = dict(
                blog_title=u"test",
                template_path=os.path.join(os.path.dirname(__file__), "templates"),
                static_path=os.path.join(os.path.dirname(__file__), "static"),
                debug=True,
            )
            super(Application, self).__init__(handlers, **settings)
    
    
    class HomeHandler(tornado.web.RequestHandler):
        def get(self):
            self.render("home.html")
    
        def post(self, *args, **kwargs):
            input_dict = {'foo': 'bar'}
    
            producer(input_dict)
    
            self.redirect("/")
    
    
    def main():
        tornado.options.parse_command_line()
        http_server = tornado.httpserver.HTTPServer(Application())
        http_server.listen(options.port)
        tornado.ioloop.IOLoop.current().start()
    
    
    def start_consumer():
        tornado.ioloop.IOLoop.current().spawn_callback(consumer)
    
    
    if __name__ == "__main__":
        tornado.ioloop.IOLoop.current().run_sync(start_consumer)
        main()
    

1 Answer:

Answer 0 (score: 2):

What are you trying to accomplish by combining the ProcessPoolExecutor with a Queue? The executor already has its own internal queue. All you need to do is make the ProcessPoolExecutor a global (it doesn't have to be a global, but you will want to do something similar to a global even if you keep the queue; it makes no sense to create a new ProcessPoolExecutor on every pass through the consumer loop) and submit work to it directly from the handler.
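
For illustration, here is a minimal sketch of that suggestion, assuming Tornado 4.x: one global ProcessPoolExecutor shared by all requests, no Queue, and the handler keeps the returned Future so its status can be reported later. The jobs dict, SubmitHandler, StatusHandler and URL paths are hypothetical names invented for the example:

    import time
    import uuid

    import tornado.ioloop
    import tornado.web
    from concurrent.futures import ProcessPoolExecutor

    # Created once and shared by every request; never recreated per task.
    executor = ProcessPoolExecutor(max_workers=4)
    # Hypothetical bookkeeping: job id -> concurrent.futures.Future
    jobs = {}


    def expensive_function(input_dict):
        time.sleep(1)  # stand-in for CPU-bound work
        return input_dict


    class SubmitHandler(tornado.web.RequestHandler):
        def post(self):
            job_id = uuid.uuid4().hex
            # Submit directly to the executor and keep the Future so the
            # front end can ask about it later.
            jobs[job_id] = executor.submit(expensive_function, {'foo': 'bar'})
            self.redirect("/status/" + job_id)


    class StatusHandler(tornado.web.RequestHandler):
        def get(self, job_id):
            future = jobs.get(job_id)
            if future is None:
                self.write("unknown job")
            elif future.done():
                self.write("finished: %r" % (future.result(),))
            elif future.running():
                self.write("running")
            else:
                self.write("queued")


    if __name__ == "__main__":
        app = tornado.web.Application([
            (r"/submit", SubmitHandler),
            (r"/status/([0-9a-f]+)", StatusHandler),
        ])
        app.listen(8888)
        tornado.ioloop.IOLoop.current().start()

A POST to /submit returns immediately, and the page can then poll /status/<job_id> until the Future reports done().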