How to use asyncio with a ProcessPoolExecutor

Asked: 2019-05-23 13:15:11

Tags: python python-3.x python-asyncio concurrent.futures

I am scraping a large number of addresses from the web, and I want to use both asyncio and a ProcessPoolExecutor in the task to speed things up.

    async def main():
        n_jobs = 3
        addresses = [list of addresses]
        _addresses = list_splitter(data=addresses, n=n_jobs)
        with ProcessPoolExecutor(max_workers=n_jobs) as executor:
            futures_list = []
            for _address in _addresses:
                futures_list += [asyncio.get_event_loop().run_in_executor(executor, execute_parallel, _address)]

            for f in tqdm(as_completed(futures_list, loop=asyncio.get_event_loop()), total=len(_addresses)):
                results = await f

    asyncio.get_event_loop().run_until_complete(main())

Expected: I want the execute_parallel function to run in parallel.

Error:

    Traceback (most recent call last):
      File "/home/awaish/danamica/scraping/skraafoto/aerial_photos_scraper.py", line 228, in <module>
        asyncio.run(main())
      File "/usr/local/lib/python3.7/asyncio/runners.py", line 43, in run
        return loop.run_until_complete(main)
      File "/usr/local/lib/python3.7/asyncio/base_events.py", line 584, in run_until_complete
        return future.result()
      File "/home/awaish/danamica/scraping/skraafoto/aerial_photos_scraper.py", line 224, in main
        results = await f
      File "/usr/local/lib/python3.7/asyncio/tasks.py", line 533, in _wait_for_one
        return f.result()  # May raise f.exception().
    TypeError: can't pickle coroutine objects
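(For context: this TypeError is what happens when an `async def` function is handed directly to a process pool. The worker process calls it, gets back a coroutine object instead of a result, and coroutine objects cannot be pickled to send across the process boundary. A minimal sketch of the failure mode, using a hypothetical `coro_fn` as a stand-in for `execute_parallel`:)

```python
import asyncio
import pickle

async def coro_fn(x):
    # Stand-in for execute_parallel: calling an async def does no work,
    # it just returns a coroutine object that must be awaited.
    return x

c = coro_fn(1)
try:
    # This is essentially what the process pool tries to do with the
    # worker's return value, and why it raises TypeError.
    pickle.dumps(c)
except TypeError as exc:
    print(type(exc).__name__)
finally:
    c.close()  # silence the "coroutine was never awaited" warning
```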

1 Answer:

Answer 0 (score: 1)

I'm not sure I'm answering the right question, but it looks like the goal of your code is to run the execute_parallel function across several processes using asyncio. Instead of using ProcessPoolExecutor, why not try an ordinary multiprocessing pool and start a separate asyncio event loop in each process? You could set up one process per core and let asyncio work its magic inside each one.

import asyncio
import multiprocessing

def run_loop(addresses):
    # Plain (picklable) function: each worker process creates and
    # drives its own event loop.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    tasks = [loop.create_task(execute_parallel(address)) for address in addresses]
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()

def main():
    n_jobs = 3
    addresses = [list of addresses]
    _addresses = list_splitter(data=addresses, n=n_jobs)
    with multiprocessing.Pool(processes=n_jobs) as pool:
        # imap_unordered is lazy; iterate over it so the work actually runs.
        for _ in pool.imap_unordered(run_loop, _addresses):
            pass

I've had a lot of success with Pool.imap_unordered, but depending on your needs you may prefer Pool.map or something else. You can play with the chunk size or with the number of addresses in each list to get the best results (i.e., if you're getting a lot of timeouts, you may want to reduce the number of addresses processed concurrently).
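Another option, staying with the asker's ProcessPoolExecutor, is to keep the scraping logic as a coroutine but wrap it in a plain synchronous function before submitting it to the pool, so only picklable objects ever cross the process boundary. A minimal self-contained sketch, where `execute_parallel` is a dummy stand-in for the real scraper and `run_chunk` is a hypothetical wrapper:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

async def execute_parallel(addresses):
    # Dummy stand-in for the real scraping coroutine.
    await asyncio.sleep(0)
    return [f"scraped:{a}" for a in addresses]

def run_chunk(addresses):
    # Plain picklable function: each worker process starts its own
    # event loop and drives the coroutine to completion there.
    return asyncio.run(execute_parallel(addresses))

async def main():
    chunks = [["a", "b"], ["c"], ["d", "e"]]
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=3) as executor:
        futures = [loop.run_in_executor(executor, run_chunk, chunk)
                   for chunk in chunks]
        return await asyncio.gather(*futures)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

The key point is that `run_chunk`, not the coroutine object itself, is what gets pickled, so the TypeError from the question disappears; results come back to the parent as ordinary lists.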