Speeding up Python URL requests (Google API) with multiprocessing

Posted: 2015-10-06 10:51:18

Tags: python multithreading python-requests google-distancematrix-api

I'm using Google's Distance Matrix API and want to reach its rate limit of 1000 requests per 10 seconds. I'm not using the matrix component itself, just taking advantage of the higher limit (the Directions API allows 10 requests per second -> this one effectively allows 100 per second).

Edit

My main goal is to precisely control the number of requests I issue (for example, exactly 800 in 10 seconds, or 1000 in however many seconds that takes). Ideally I would use something like the pseudocode below, but I'm not sure about it:

from multiprocessing import Pool
import time

pool = Pool()  # Initialize a pool with the default number of worker processes
start_time = time.time()
counter = 0
async_results = []
for one_url in URL_LIST:
    counter += 1
    # Once 1000 requests have been submitted
    if counter == 1000:
        elapsed = time.time() - start_time
        # If less than 10 seconds have elapsed, wait out the rest of the window
        if elapsed < 10:
            time.sleep(10 - elapsed)
        # Otherwise we are already over 10 seconds; continue immediately.
        # Reset counter and timer
        start_time = time.time()
        counter = 0
    # Dispatch a single URL to a worker without blocking the loop
    async_results.append(pool.apply_async(GeocodeHandler, (one_url,)))
pool.close()  # No more tasks will be added to the pool
pool.join()   # Block until the workers have finished every submitted task
RESULTS_OUT = [r.get() for r in async_results]
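One way to sketch the batching idea above is to split the URL list into fixed-size chunks, dispatch each chunk to the pool at once, and sleep away whatever remains of the 10-second window before starting the next chunk. This is only an illustrative sketch, not the asker's actual solution: `GeocodeHandler` and `URL_LIST` from the question are replaced here by a dummy worker and a synthetic URL list so the example is self-contained and does not call the Google API.

```python
import time
from multiprocessing import Pool


def geocode_handler(url):
    # Stand-in for the real per-URL work, e.g. requests.get(url).json().
    # Here it just returns the length of the URL string.
    return len(url)


def rate_limited_map(pool, func, items, batch_size=1000, window=10.0):
    """Map func over items in batches, at most one batch per `window` seconds."""
    results = []
    for i in range(0, len(items), batch_size):
        batch_start = time.time()
        # Dispatch the whole batch to the worker pool and collect its results.
        results.extend(pool.map(func, items[i:i + batch_size]))
        # If this batch finished early and more batches remain, wait out the
        # rest of the window so the next batch does not exceed the rate limit.
        if i + batch_size < len(items):
            remaining = window - (time.time() - batch_start)
            if remaining > 0:
                time.sleep(remaining)
    return results


if __name__ == "__main__":
    # Hypothetical URL list; a short window keeps the demo fast.
    urls = ["https://example.com/%d" % n for n in range(25)]
    with Pool(4) as pool:
        out = rate_limited_map(pool, geocode_handler, urls,
                               batch_size=10, window=0.1)
    print(len(out))
```

Note that `pool.map` blocks until the whole batch is done, which makes the per-window accounting simple; `apply_async` would let batches overlap but then the sleep logic no longer bounds the request rate by itself.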

0 Answers:

There are no answers yet.