I'm trying to request a bunch of URLs concurrently, where the URLs are built from a list. Currently I'm iterating over the list and (I think) adding the requests to a queue. It's definitely about 10x faster than requests.get, but I'm not sure I'm doing it correctly, so there may be room for optimization. I profiled it and noticed it still locks up 90% of the time after the concurrent requests complete, i.e. start -> 10+ concurrent requests -> lock up for ~5 seconds -> finish.

Also, this code produces an Unclosed client session
message at the end. Any idea why? I'm fairly sure I'm using the context manager correctly.

I've searched but haven't found this exact problem.
import signal
import sys
import asyncio
import aiohttp
import json
import requests

lists = ['eth', 'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo', 'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo']

loop = asyncio.get_event_loop()
client = aiohttp.ClientSession(loop=loop)

async def fetch(client, url):
    async with client.get(url) as resp:
        assert resp.status == 200
        return await resp.text()

async def main(loop=loop, url=None):
    async with aiohttp.ClientSession(loop=loop) as client:
        html = await fetch(client, url)
        print(html)

def signal_handler(signal, frame):
    loop.stop()
    client.close()
    sys.exit(0)

signal.signal(signal.SIGINT, signal_handler)

tasks = []
for item in lists:
    url = "{url}/{endpoint}/{coin_name}".format(
        url='https://coincap.io',
        endpoint='page',
        coin_name=item.upper()
    )
    print(url)
    tasks.append(
        asyncio.ensure_future(main(url=url))
    )

loop.run_until_complete(asyncio.gather(*tasks))
Answer 0 (score: 3)
It looks like you have something that works, but as you suspected, you're not doing it quite right:

- The ClientSession created at module level (client = aiohttp.ClientSession(loop=loop)) is never closed, which is what produces the Unclosed client session warning; main() opens its own session via a context manager, so the global one just leaks.
- Each call to main() opens a brand-new ClientSession for a single request, which defeats connection pooling; one shared session should be passed to every fetch.
- For signal handling inside an event loop, loop.add_signal_handler is generally preferred over signal.signal.
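As a sketch of what registering the handler with the loop itself might look like: the shutdown coroutine below is a hypothetical cleanup helper, not part of the original code (in the real program it would also close the ClientSession), and add_signal_handler is only available on Unix.

```python
import asyncio
import signal

# Hypothetical cleanup coroutine: cancel outstanding tasks, then stop the loop.
# In the real program this is also where the ClientSession would be closed.
async def shutdown(loop):
    for task in asyncio.all_tasks(loop):
        if task is not asyncio.current_task():
            task.cancel()
    loop.stop()

loop = asyncio.new_event_loop()

# Register SIGINT with the event loop instead of signal.signal(), so the
# callback runs inside the loop rather than interrupting it at an
# arbitrary point.
loop.add_signal_handler(signal.SIGINT,
                        lambda: loop.create_task(shutdown(loop)))
```

Unlike signal.signal, this guarantees the handler executes between loop iterations, so it is safe to schedule coroutines from it.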
Here is my simplification of your code:
import asyncio
import aiohttp

lists = ['eth', 'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo', 'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo']

async def fetch(client, item):
    url = 'https://coincap.io/{endpoint}/{coin_name}'.format(
        endpoint='page',
        coin_name=item.upper()
    )
    async with client.get(url) as resp:
        assert resp.status == 200
        html = await resp.text()
        print(html)

async def main():
    async with aiohttp.ClientSession() as client:
        await asyncio.gather(*[
            asyncio.ensure_future(fetch(client, item))
            for item in lists
        ])

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
If you want to process the html, you can do so in the fetch coroutine, or operate on all the results of gather at once.
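To illustrate the second option, here is a minimal sketch of returning values from fetch and collecting them from gather. The HTTP call is replaced with a stub (item.upper()) so the ordering behaviour is visible without network access; a real version would return await resp.text() instead.

```python
import asyncio

async def fetch(item):
    # Stand-in for the aiohttp request; a real fetch would do the
    # client.get(...) and return await resp.text().
    await asyncio.sleep(0)
    return item.upper()

async def main():
    coins = ['eth', 'btc', 'xmr']
    # gather() preserves input order, so results line up with coins.
    pages = await asyncio.gather(*(fetch(c) for c in coins))
    return dict(zip(coins, pages))

result = asyncio.run(main())
print(result)  # {'eth': 'ETH', 'btc': 'BTC', 'xmr': 'XMR'}
```

Because gather returns results in the same order as its arguments, pairing them back with the input list is a simple zip.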