asyncio as_yielded from async generators

Asked: 2017-06-07 07:08:10

Tags: python async-await python-asyncio coroutine control-flow

I'd like to be able to yield from a number of async coroutines. asyncio's as_completed is somewhat close to what I'm looking for (i.e. I want any of the coroutines to be able to yield back to the caller at any time and then continue), but it only seems to allow regular coroutines that return a single time.

Here's what I have so far:

import asyncio


async def test(id_):
    print(f'{id_} sleeping')
    await asyncio.sleep(id_)
    return id_


async def test_gen(id_):
    count = 0
    while True:
        print(f'{id_} sleeping')
        await asyncio.sleep(id_)
        yield id_
        count += 1
        if count > 5:
            return


async def main():
    runs = [test(i) for i in range(3)]

    for i in asyncio.as_completed(runs):
        i = await i
        print(f'{i} yielded')


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()

Replacing runs = [test(i) for i in range(3)] with runs = [test_gen(i) for i in range(3)], and having for i in asyncio.as_completed(runs) iterate on every yield, is what I'm after.

Is this expressible in Python, and are there any third-party libraries that offer more options for coroutine control flow than the standard library?

Thanks

1 answer:

Answer 0 (score: 8):

You can use aiostream.stream.merge, from the third-party aiostream library (installable with pip install aiostream):

from aiostream import stream

async def main():
    runs = [test_gen(i) for i in range(3)]
    async for x in stream.merge(*runs):
        print(f'{x} yielded')

Or run it in a safe context to make sure the generators are cleaned up properly once the iteration is over:

async def main():
    runs = [test_gen(i) for i in range(3)]
    merged = stream.merge(*runs)
    async with merged.stream() as streamer:
        async for x in streamer:
            print(f'{x} yielded')
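
The safe context matters mainly when the iteration can stop early. As an illustrative sketch (reusing test_gen from the question; this is not code from the original answer), breaking out of the loop inside the context still lets aiostream clean up the underlying generators:

async def main():
    runs = [test_gen(i) for i in range(3)]
    merged = stream.merge(*runs)
    async with merged.stream() as streamer:
        async for x in streamer:
            print(f'{x} yielded')
            if x == 2:
                # Stop early; leaving the async with block still closes the sources.
                break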

Or use pipes to make it more compact:

from aiostream import stream, pipe

async def main():
    runs = [test_gen(i) for i in range(3)]
    await (stream.merge(*runs) | pipe.print('{} yielded'))

More examples can be found in the documentation.

Edit, addressing a comment from @nirvana-msu:

It is possible to identify which generator yielded a given value by preparing the sources accordingly:

async def main():
    runs = [test_gen(i) for i in range(3)]
    # Bind i as a default argument; a bare `lambda x: (i, x)` would capture the
    # last value of i by the time the lambda is called (late binding).
    sources = [stream.map(xs, lambda x, i=i: (i, x)) for i, xs in enumerate(runs)]
    async for i, x in stream.merge(*sources):
        print(f'ID {i}: {x}')
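
As for whether this can be expressed with the standard library alone: a merge over async generators can also be hand-rolled with plain asyncio by racing one __anext__() task per generator. The following is only an illustrative sketch (the helper name merge_async_gens is made up here, and test_gen comes from the question), not part of the original answer:

import asyncio


async def merge_async_gens(*gens):
    # Keep one pending __anext__ task per generator, mapped back to its source.
    tasks = {asyncio.ensure_future(g.__anext__()): g for g in gens}
    while tasks:
        done, _ = await asyncio.wait(set(tasks), return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            gen = tasks.pop(task)
            try:
                value = task.result()
            except StopAsyncIteration:
                continue  # this generator is exhausted
            # Schedule the next item from the same generator, then yield this one.
            tasks[asyncio.ensure_future(gen.__anext__())] = gen
            yield value


async def main():
    runs = [test_gen(i) for i in range(3)]
    async for x in merge_async_gens(*runs):
        print(f'{x} yielded')

Unlike this sketch, stream.merge also takes care of cancelling the pending tasks and closing the source generators when the iteration stops early, which is what the safe context above is for.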