I want to listen for events from multiple instances of the same object and merge this event stream into a single stream. For example, if I use an async generator:
import asyncio

class PeriodicYielder:
    def __init__(self, period: int) -> None:
        self.period = period

    async def updates(self):
        while True:
            await asyncio.sleep(self.period)
            yield self.period
I can successfully listen for events from one instance:
async def get_updates_from_one():
    each_1 = PeriodicYielder(1)
    async for n in each_1.updates():
        print(n)
# 1
# 1
# 1
# ...
But how can I get events from multiple async generators? In other words: how can I iterate over multiple async generators in the order in which they are ready to produce their next value?
async def get_updates_from_multiple():
    each_1 = PeriodicYielder(1)
    each_2 = PeriodicYielder(2)
    async for n in magic_async_join_function(each_1.updates(), each_2.updates()):
        print(n)
# 1
# 1
# 2
# 1
# 1
# 2
# ...
Is there such a magic_async_join_function in the stdlib or in a 3rd-party module?
Answer 0 (score: 5)
You can use the wonderful aiostream library. It looks like this:
import asyncio
from aiostream import stream


async def test1():
    for _ in range(5):
        await asyncio.sleep(0.1)
        yield 1


async def test2():
    for _ in range(5):
        await asyncio.sleep(0.2)
        yield 2


async def main():
    combine = stream.merge(test1(), test2())

    async with combine.stream() as streamer:
        async for item in streamer:
            print(item)


asyncio.run(main())
Result:
1
1
2
1
1
2
1
2
2
2
Answer 1 (score: 2)
If you want to avoid a dependency on an external library (or as a learning exercise), you can merge the async iterators using a queue:
import asyncio

def merge_async_iters(*aiters):
    # merge async iterators, proof of concept
    queue = asyncio.Queue(1)
    async def drain(aiter):
        async for item in aiter:
            await queue.put(item)
    async def merged():
        while not all(task.done() for task in tasks):
            yield await queue.get()
    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()
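As a quick sanity check, the proof of concept can be exercised like this. The function is repeated so the snippet runs standalone; the gen and main helpers are illustrative names, not part of the answer. Note that the interleaving order is timing-dependent, so only the multiset of results is stable:

```python
import asyncio

def merge_async_iters(*aiters):
    # merge async iterators, proof of concept
    queue = asyncio.Queue(1)
    async def drain(aiter):
        async for item in aiter:
            await queue.put(item)
    async def merged():
        while not all(task.done() for task in tasks):
            yield await queue.get()
    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()

async def gen(val, delay, count):
    # a small finite stand-in for PeriodicYielder.updates()
    for _ in range(count):
        await asyncio.sleep(delay)
        yield val

async def main():
    # collect everything the merged iterator produces
    return [item async for item in merge_async_iters(gen(1, 0.01, 3),
                                                     gen(2, 0.02, 3))]

items = asyncio.run(main())
print(items)  # three 1s and three 2s, interleaved by readiness
```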
This passes the test from Mikhail's answer, but it's not perfect: it doesn't propagate the exception if one of the async iterators raises. Also, if the task that exhausts the merged generator returned by merge_async_iters() gets cancelled, or if that same generator is never exhausted, the individual drain tasks are left hanging.
A more complete version could handle the first issue by detecting the exception and passing it through the queue. The second issue can be resolved by the merged generator cancelling the drain tasks as soon as iteration is abandoned. With those changes, the resulting code looks like this:
import asyncio

def merge_async_iters(*aiters):
    queue = asyncio.Queue(1)
    run_count = len(aiters)
    cancelling = False

    async def drain(aiter):
        nonlocal run_count
        try:
            async for item in aiter:
                await queue.put((False, item))
        except Exception as e:
            if not cancelling:
                await queue.put((True, e))
            else:
                raise
        finally:
            run_count -= 1

    async def merged():
        try:
            while run_count:
                raised, next_item = await queue.get()
                if raised:
                    cancel_tasks()
                    raise next_item
                yield next_item
        finally:
            cancel_tasks()

    def cancel_tasks():
        nonlocal cancelling
        cancelling = True
        for t in tasks:
            t.cancel()

    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()
A different approach to merging async iterators can be found in this answer, and also in this one, where the latter allows adding new streams mid-stride. The complexity and subtlety of these implementations shows that, while it is useful to know how to write one, actually doing so is best left to well-tested external libraries such as aiostream that cover all the edge cases.