How to rate-limit GET requests over a list of URLs

Posted: 2018-08-24 14:49:48

Tags: python python-3.x python-requests grequests

I have a list of ~250,000 URLs from which I need to fetch data via an API.

I created a class using the grequests library to make asynchronous calls. However, the API is limited to 100 calls per second, and grequests exceeds that limit.

Code using grequests:

import grequests

lst = ['url.com', 'url2.com']

class Test:
    def __init__(self):
        self.urls = lst

    def exception(self, request, exception):
        print("Problem: {}: {}".format(request.url, exception))

    # Renamed from `async`: that name is a reserved keyword in Python 3.7+.
    def async_requests(self):
        return grequests.map((grequests.get(u) for u in self.urls),
                             exception_handler=self.exception,
                             size=100000)

    def collate_responses(self, results):
        return [x.text for x in results]

test = Test()
# here we collect the results returned by the async method
results = test.async_requests()

Is there any way I can make 100 calls per second using the requests library?

I tried plain requests, but it timed out after roughly 100,000 calls.

In this case I am passing an ID into the URL.

import time
import requests

lst = []
L = [1, 2, 3]

for i in L:
    url = 'url.com/Id={}'.format(i)
    xml_data1 = requests.get(url).text
    lst.append(xml_data1)
    time.sleep(1)
    print(xml_data1)
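The sequential loop above makes at most one call per second, which is far below the API's 100-calls-per-second allowance. One way to use the full budget with only the standard library and requests is to release calls in batches of 100 and sleep out the remainder of each one-second window. The sketch below assumes a `fetch` callable (e.g. `lambda u: requests.get(u).text`); the helper name `rate_limited_map` is hypothetical, not part of requests.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def rate_limited_map(fetch, items, rate=100):
    """Call fetch(item) for every item, releasing at most `rate`
    calls per one-second window. Hypothetical helper, not a
    requests API."""
    results = []
    with ThreadPoolExecutor(max_workers=rate) as pool:
        for start in range(0, len(items), rate):
            batch = items[start:start + rate]
            t0 = time.monotonic()
            # Blocks until the whole batch has completed.
            results.extend(pool.map(fetch, batch))
            # Sleep out whatever is left of this one-second window
            # before releasing the next batch.
            elapsed = time.monotonic() - t0
            if elapsed < 1.0 and start + rate < len(items):
                time.sleep(1.0 - elapsed)
    return results

# Hypothetical usage with the question's ID-based URLs:
# import requests
# urls = ['url.com/Id={}'.format(i) for i in L]
# lst = rate_limited_map(lambda u: requests.get(u).text, urls)
```

Waiting for each batch to finish before starting the next is slightly stricter than the API requires, but it guarantees no one-second window ever sees more than `rate` calls.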

1 Answer:

Answer 0 (score: 0)

Use multithreading.

import time
import requests
from multiprocessing.dummy import Pool as ThreadPool

def some_fun(url):
    # The pool passes one URL per call, so there is no inner loop here.
    xml_data1 = requests.get(url).text
    print(xml_data1)
    time.sleep(1)  # each thread pauses 1s, throttling the overall rate
    return xml_data1

if __name__ == '__main__':
    lst = ['url.com', 'url2.com']
    c_pool = ThreadPool(30)  # add as many threads as you can
    results = c_pool.map(some_fun, lst)
    c_pool.close()
    c_pool.join()

Cheers!