I'm new to asynchronous programming. I'm writing a script that checks the status of web pages, and of course I'd like to do that asynchronously. My snippet:
import aiohttp
import asyncio

url_site = 'http://anysite.com'
fuzz_file = 'fuzz.txt'

def generate_links(file):
    with open(file) as f:
        return [str(url_site) + str(line.strip()) for line in f]

async def fetch_page(client, url):
    async with client.get(url) as response:
        return response.status

async def run():
    links = generate_links(fuzz_file)
    for f, link in asyncio.as_completed([fetch_page(client, link) for link in links]):
        print("[INFO] [{}] {}".format(f, link))

loop = asyncio.get_event_loop()
conn = aiohttp.ProxyConnector(proxy="http://10.7.0.35:8080")
client = aiohttp.ClientSession(loop=loop, connector=conn)
loop.run_until_complete(run())
client.close()
But I get the following error: Task was destroyed but it is pending!
Can anyone point out where I went wrong?
Answer 0 (score: 1)
From the documentation of as_completed:

Return an iterator whose values, when waited on, are Future instances.

So you must await each object that as_completed yields:
for f in asyncio.as_completed([fetch_page(client, link) for link in links]):
    status = await f
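To make the pattern concrete, here is a minimal, self-contained sketch of that loop with the HTTP call stubbed out (a fake `fetch_page` that always reports status 200, so no network or aiohttp is needed). The key point is the same: `as_completed` yields awaitables, and each one must be awaited to get the result. Note the stub also returns the URL alongside the status, since the coroutine's result is the only way to know which link finished:

```python
import asyncio

async def fetch_page(url):
    # Stand-in for the real aiohttp request: pretend every page returns 200.
    await asyncio.sleep(0)
    return 200, url

async def run(links):
    statuses = []
    # as_completed yields awaitables in completion order;
    # each must be awaited to obtain the coroutine's result.
    for future in asyncio.as_completed([fetch_page(link) for link in links]):
        status, url = await future
        statuses.append((status, url))
        print("[INFO] [{}] {}".format(status, url))
    return statuses

results = asyncio.run(run(['http://anysite.com/a', 'http://anysite.com/b']))
```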
You may also want to look at wait for more fine-grained control.
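For comparison, a sketch of the same job using asyncio.wait (again with the HTTP call stubbed out; in real code you would pass a timeout or return_when=asyncio.FIRST_COMPLETED for finer control):

```python
import asyncio

async def fetch_page(url):
    # Stub for the real request; always reports status 200.
    await asyncio.sleep(0)
    return 200, url

async def run(links):
    # wait() takes a collection of tasks and returns (done, pending) sets;
    # results are read from the finished tasks afterwards.
    tasks = [asyncio.create_task(fetch_page(link)) for link in links]
    done, pending = await asyncio.wait(tasks)
    return sorted(task.result() for task in done)

results = asyncio.run(run(['http://anysite.com/a', 'http://anysite.com/b']))
```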