TypeError: '_SessionRequestContextManager' object is not iterable

Date: 2021-01-28 02:12:27

Tags: python asynchronous python-requests python-asyncio

From this tutorial on multiple asynchronous fetch requests, I copied and ran the following code:

import asyncio  
import aiohttp

def fetch_page(url, idx):  
    url = 'https://yahoo.com'
    response = yield from aiohttp.request('GET', url)

    if response.status == 200:
        print("data fetched successfully for: %d" % idx)
    else:
        print("data fetch failed for: %d" % idx)
        print(response.content, response.status)

def main():  
    url = 'https://yahoo.com'
    urls = [url] * 100

    coros = []
    for idx, url in enumerate(urls):
        coros.append(asyncio.Task(fetch_page(url, idx)))

    yield from asyncio.gather(*coros)

if __name__ == '__main__':  
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

However, I get the following errors:

RuntimeWarning: coroutine 'ClientSession._request' was never awaited

Unclosed client session

TypeError: '_SessionRequestContextManager' object is not iterable

1 answer:

Answer 0 (score: 0)

In the tutorial you can see "Created 3 years ago", so it is already outdated.
For example, nowadays you use await instead of yield from.
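
A rough sketch of that syntax change, for comparison (not taken from the answer; the old form is shown only in the comment):

import asyncio

# Old, generator-based coroutine syntax (deprecated since Python 3.8, removed in 3.11):
#
#     @asyncio.coroutine
#     def old_style():
#         yield from asyncio.sleep(1)
#
# Current native coroutine syntax:
async def new_style():
    await asyncio.sleep(1)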

It is better to read the official aiohttp documentation.

In The aiohttp Request Lifecycle you can see a similar example with fetch() and main():

import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, 'http://python.org')
        print(html)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
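
A side note not in the original answer: on Python 3.7 and newer, asyncio.run() is the usual entry point and takes care of creating and closing the event loop, so the last two lines above can be replaced like this (reusing the main() defined in the example):

import asyncio

asyncio.run(main())  # creates the event loop, runs main(), then closes the loop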

This could be your example:

import aiohttp
import asyncio

async def fetch(url, idx):
    # Note: a new ClientSession is opened (and closed) for every single request.
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:

            if response.status == 200:
                print("data fetched successfully for:", idx)
                #print(await response.text(), response.status)
            else:
                print("data fetch failed for:", idx)
                print(await response.text(), response.status)

async def main():
    url = 'https://yahoo.com'
    urls = [url] * 10

    for idx, url in enumerate(urls):
        # The requests are awaited one after another here, i.e. sequentially.
        await fetch(url, idx)

if __name__ == '__main__':  
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

Or without a session:

import asyncio  
import aiohttp

async def fetch_page(url, idx):  
    async with aiohttp.request('GET', url) as response:
    
        if response.status == 200:
            print("data fetched successfully for:", idx)
        else:
            print("data fetch failed for:", idx)
            print(response.content, response.status)

async def main():  
    url = 'https://yahoo.com'
    urls = [url] * 100

    for idx, url in enumerate(urls):
        await fetch_page(url, idx)

if __name__ == '__main__':  
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
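
The question originally ran all requests concurrently with asyncio.gather. A minimal sketch of how that could look with current aiohttp, assuming one shared ClientSession for all requests (this variant is not part of the original answer):

import asyncio
import aiohttp

async def fetch_page(session, url, idx):
    # Reuse the shared session instead of opening a new one per request.
    async with session.get(url) as response:
        if response.status == 200:
            print("data fetched successfully for:", idx)
        else:
            print("data fetch failed for:", idx)
            print(await response.text(), response.status)

async def main():
    url = 'https://yahoo.com'
    urls = [url] * 100

    async with aiohttp.ClientSession() as session:
        # Schedule all requests concurrently and wait until every one has finished.
        await asyncio.gather(*(fetch_page(session, url, idx)
                               for idx, url in enumerate(urls)))

if __name__ == '__main__':
    asyncio.run(main())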