Concurrent HTTP GET requests with Python 3 asyncio: connections are not being closed

Date: 2014-06-19 03:01:08

Tags: python-3.x python-requests python-asyncio aiohttp

I just started using the asyncio library in Python 3.4 and wrote a small program that tries to fetch 50 web pages at a time. After a few hundred requests the program blows up with a "Too many open files" exception.

I thought my fetch method was closing each connection with the 'response.read_and_close()' call.

Any idea what is going on here? Am I going about this problem the right way?

import asyncio
import aiohttp

@asyncio.coroutine
def fetch(url):
    response = yield from aiohttp.request('GET', url)
    response = yield from response.read_and_close()
    return response.decode('utf-8')

@asyncio.coroutine
def print_page(url):
    page = yield from fetch(url)
    # print(page)

@asyncio.coroutine
def process_batch_of_urls(round, urls):
    print("Round starting: %d" % round)
    coros = []
    for url in urls:
        coros.append(asyncio.Task(print_page(url)))
    yield from asyncio.gather(*coros)
    print("Round finished: %d" % round)

@asyncio.coroutine
def process_all():
    api_url = 'https://google.com'
    for i in range(10):
        urls = []
        for url in range(50):
            urls.append(api_url)
        yield from process_batch_of_urls(i, urls)


loop = asyncio.get_event_loop()
loop.run_until_complete(process_all())

The error I am getting is:

Traceback (most recent call last):
  File "/usr/local/lib/python3.4/site-packages/aiohttp/client.py", line 106, in request
  File "/usr/local/lib/python3.4/site-packages/aiohttp/connector.py", line 135, in connect
  File "/usr/local/lib/python3.4/site-packages/aiohttp/connector.py", line 242, in _create_connection
  File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/asyncio/base_events.py", line 424, in create_connection
  File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/asyncio/base_events.py", line 392, in create_connection
  File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/socket.py", line 123, in __init__
OSError: [Errno 24] Too many open files

During handling of the above exception, another exception occurred:

2 Answers:

Answer 0 (score: 5):

Aha, I've found your problem.

Using an explicit connector definitely solves the issue.

https://github.com/KeepSafe/aiohttp/pull/79 should also fix it for implicit connectors.

Thank you very much for finding this resource leak in aiohttp.

UPD. aiohttp 0.8.2 does not have the problem.
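Independent of the connector, another common way to keep the number of open sockets bounded is to wrap each request in an asyncio.Semaphore. Below is a minimal sketch using the same aiohttp 0.x calls that appear elsewhere in this question; the limit of 50 and the bounded_fetch name are my own:

import asyncio
import aiohttp

# Allow at most 50 requests in flight at once, so sockets are not opened
# faster than they are read and closed. The limit is just an example value.
sem = asyncio.Semaphore(50)

@asyncio.coroutine
def bounded_fetch(url):
    with (yield from sem):
        response = yield from aiohttp.request('GET', url)
        body = yield from response.read()   # read the body...
        response.close()                    # ...then release the connection
        return body.decode('utf-8', 'replace')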

Answer 1 (score: 2):

OK, I finally got it working.

It turns out I had to use a TCPConnector, which pools connections.

So I created this variable:

connector = aiohttp.TCPConnector(share_cookies=True, loop=loop)

and passed it into each GET request. My new fetch routine now looks like this:

@asyncio.coroutine
def fetch(url):
    data = ""
    try:
        yield from asyncio.sleep(1)
        response = yield from aiohttp.request('GET', url, connector=connector)
    except Exception as exc:
        print('...', url, 'has error', repr(str(exc)))
    else:
        data = (yield from response.read()).decode('utf-8', 'replace')
        response.close()

    return data
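For completeness, here is one way the connector could be wired up next to the event loop from the question, so every call to the fetch above reuses the same connection pool (the ordering here is my assumption; the individual calls are the ones already shown):

loop = asyncio.get_event_loop()

# One pooled connector, shared by every request in the run.
connector = aiohttp.TCPConnector(share_cookies=True, loop=loop)

loop.run_until_complete(process_all())
loop.close()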