I need to speed this up no matter what. The code inserts a modifier into an HTTP request; the modifier is a 5-digit email ID. The goal is to add each result to a JSON file.
I know getting rid of ProgressBar() would help, but I find the slowdown worth the usefulness of the feature. The try/except/else part slows it down again too, but idents has 4500 entries. Any ideas?
import urllib3
import json
import csv
from progressbar import ProgressBar
pbar = ProgressBar()
with open('blim2.csv', newline='') as csvfile:
    idents = csv.reader(csvfile, delimiter=' ', quotechar='|')
    json_arr = []
    while True:
        try:
            for x in pbar(idents):
                http = urllib3.PoolManager()
                r = http.request('GET', 'https://api.pipedrive.com/v1/mailbox/mailMessages/' + "".join(x) + '?include_body=1&api_token=token')
                mails = json.loads(r.data.decode('utf-8'))
                json_arr.append(mails)
                with open('test2.json', 'w') as outfile:
                    json.dump(json_arr, outfile)
        except:
            continue
        else:
            break
Answer 0 (score: 0)
Try moving the while True loop all the way inside, so that only the individual request is retried. Also, write the output file only once at the end. In the version below the PoolManager is also created just once and reused for every request, rather than being rebuilt on every loop iteration.
import urllib3
import json
import csv
from progressbar import ProgressBar
import time
pbar = ProgressBar()
base_url = 'https://api.pipedrive.com/v1/mailbox/mailMessages/'
fields = {"include_body": "1", "api_token": "token"}
json_arr = []
http = urllib3.PoolManager()
with open('blim2.csv', newline='') as csvfile:
    for x in pbar(csv.reader(csvfile, delimiter=' ', quotechar='|')):
        while True:
            try:
                r = http.request('GET', base_url + "".join(x), fields=fields)
                break
            except Exception:
                # wait a bit before trying again
                time.sleep(1)  # seconds
        mails = json.loads(r.data.decode('utf-8'))
        json_arr.append(mails)

with open('test2.json', 'w') as outfile:
    json.dump(json_arr, outfile)
You should also check which exception is actually raised in your case and catch only that one, rather than everything. Otherwise you risk masking other problems. I have replaced your bare except with except Exception, which at least lets the user abort the program with Ctrl+C.
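For illustration, if the failures turn out to be network-level errors from urllib3 itself, the inner retry could be narrowed to urllib3's own exception base class and capped after a few attempts. This is only a minimal sketch, not part of the answer above; the fetch_with_retry name and the attempts/delay values are made up for the example:

import time
import urllib3

http = urllib3.PoolManager()

def fetch_with_retry(url, fields, attempts=5, delay=1):
    # urllib3.exceptions.HTTPError is the base class of urllib3's own
    # errors (connection failures, timeouts, protocol errors); swap in
    # the narrower exception you actually see raised in your runs.
    for _ in range(attempts):
        try:
            return http.request('GET', url, fields=fields)
        except urllib3.exceptions.HTTPError:
            time.sleep(delay)  # back off briefly before retrying
    raise RuntimeError('giving up on ' + url + ' after ' + str(attempts) + ' attempts')

With that helper, r = fetch_with_retry(base_url + "".join(x), fields) would replace the inner while True block, and Ctrl+C still aborts the program because KeyboardInterrupt is not a subclass of HTTPError.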