How to run two requests in parallel in Python and join the results

Posted: 2018-01-19 20:31:41

Tags: python multithreading python-2.7 parallel-processing

I have the following code:

def do_smth(query):
    result_1 = api_request_1(query) # ['1', '2', '3']
    result_2 = api_request_2(query) # ['a', 'b', 'c']
    return result_1 + result_2      # ['1', '2', '3', 'a', 'b', 'c']

Now I want to run these requests in parallel and merge the results, so I do this:

from multiprocessing import Pool

def do_smth_parallel(query):
    pool = Pool(processes=2)

    result = []
    arg = [ query ]
    result.extend(pool.map(api_request_1, arg)[0])
    result.extend(pool.map(api_request_2, arg)[0])

    pool.close()
    pool.join()

    return result

So far so good, but map is a blocking call, so do_smth_parallel is not really parallel :) How can I do this?

P.S. In Java I would do this with an ExecutorService and a couple of Futures.

2 answers:

Answer 0 (score: 1)

You are looking for map_async (https://docs.python.org/2/library/multiprocessing.html#multiprocessing.pool.AsyncResult) rather than map. Here is your example adapted; the same pattern can be applied to any number of function calls, all of which will run asynchronously.

from multiprocessing import Pool

def do_smth_parallel(query):
    pool = Pool(processes=2)

    arg = [query]
    # map_async returns an AsyncResult immediately instead of blocking
    future_1 = pool.map_async(api_request_1, arg)
    future_2 = pool.map_async(api_request_2, arg)

    # get() blocks until the corresponding call has finished;
    # map_async returns one result per argument, hence the [0]
    result_1 = future_1.get()[0]
    result_2 = future_2.get()[0]

    pool.close()
    pool.join()

    return result_1 + result_2
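
Since each request here takes only a single argument, apply_async is an equally good fit and avoids wrapping query in a one-element list. A minimal sketch of that variant, reusing the api_request_1 / api_request_2 placeholders from the question:

from multiprocessing import Pool

def do_smth_parallel(query):
    pool = Pool(processes=2)

    # apply_async passes the argument tuple directly and returns an AsyncResult
    future_1 = pool.apply_async(api_request_1, (query,))
    future_2 = pool.apply_async(api_request_2, (query,))

    # get() blocks until each call is done and returns its value unchanged
    result = future_1.get() + future_2.get()

    pool.close()
    pool.join()

    return result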

Answer 1 (score: 1)

Another approach is to use the concurrent.futures package:

from concurrent.futures import ThreadPoolExecutor

def do_smth_parallel(query):
    # Executor is only an abstract base class; instantiate ThreadPoolExecutor
    # (or ProcessPoolExecutor) instead
    exc = ThreadPoolExecutor(max_workers=2)

    # submit() schedules the calls and returns Future objects right away
    req1 = exc.submit(api_request_1, query)
    req2 = exc.submit(api_request_2, query)

    # result() blocks until each call has finished
    return req1.result() + req2.result()
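
concurrent.futures is in the standard library from Python 3.2 onwards; for Python 2.7 (as tagged) it is available as the futures backport on PyPI. A minimal sketch, again assuming the api_request_1 / api_request_2 placeholders from the question, that uses the executor as a context manager so the worker threads are shut down automatically:

from concurrent.futures import ThreadPoolExecutor

def do_smth_parallel(query):
    # the with-block waits for both futures and shuts the executor down
    with ThreadPoolExecutor(max_workers=2) as exc:
        req1 = exc.submit(api_request_1, query)
        req2 = exc.submit(api_request_2, query)
        return req1.result() + req2.result()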