Is there a way to access the original tasks passed to asyncio.as_completed?

Time: 2019-04-04 16:54:13

Tags: python python-asyncio

I'm trying to pull tasks off an asyncio queue and call a given error handler if an exception occurs. Queued items are given as dicts (enqueued by enqueue_task) containing the task, an optional error handler, and any args/kwargs the error handler may need. Since I want to handle errors as the tasks complete, I map each task to its dequeued dict and try to look it up when an exception occurs.

async def _check_tasks(self):
    try:
        while self._check_tasks_task or not self._check_task_queue.empty():
            tasks = []
            details = {}
            try:
                while len(tasks) < self._CHECK_TASKS_MAX_COUNT:
                    detail = self._check_task_queue.get_nowait()
                    task = detail['task']
                    tasks.append(task)
                    details[task] = detail
            except asyncio.QueueEmpty:
                pass

            if tasks:
                for task in asyncio.as_completed(tasks):
                    try:
                        await task
                    except Exception as e:
                        logger.exception('')
                        detail = details[task]
                        error_handler = detail.get('error_handler')
                        error_handler_args = detail.get('error_handler_args', [])
                        error_handler_kwargs = detail.get('error_handler_kwargs', {})

                        if error_handler:
                            logger.info('calling error handler')
                            if inspect.iscoroutinefunction(error_handler):
                                self.enqueue_task(
                                    task=error_handler(
                                        e,
                                        *error_handler_args,
                                        **error_handler_kwargs
                                    )
                                )
                            else:
                                error_handler(e, *error_handler_args, **error_handler_kwargs)
                        else:
                            logger.exception(f'Exception encountered while handling task: {str(e)}')
            else:
                await asyncio.sleep(self._QUEUE_EMPTY_SLEEP_TIME)
    except:
        logger.exception('')


def enqueue_task(self, task, error_handler=None, error_handler_args=[],
                 error_handler_kwargs={}):
    if not asyncio.isfuture(task):
        task = asyncio.ensure_future(task)

    self._app.gateway._check_task_queue.put_nowait({
        'task': task,
        'error_handler': error_handler,
        'error_handler_args': error_handler_args,
        'error_handler_kwargs': error_handler_kwargs,
    })

However, when an exception occurs, the task I use as a key doesn't seem to be found in the details dict, and I get the following error:

KeyError: <generator object as_completed.<locals>._wait_for_one at 0x7fc2d1cea308>
Exception encountered while handling task: <generator object as_completed.<locals>._wait_for_one at 0x7fc2d1cea308>
Traceback (most recent call last):
  File "/app/app/gateway/gateway.py", line 64, in _check_tasks
    detail = details[task]
KeyError: <generator object as_completed.<locals>._wait_for_one at 0x7fc2d1cea308>

When task is yielded by asyncio.as_completed, it appears to be a generator

<generator object as_completed.<locals>._wait_for_one at 0x7fc2d1cea308>

when I would expect it to be a Task

<Task pending coro=<GatewayL1Component._save_tick_to_stream() running at /app/app/gateway/l1.py:320> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7fc2d4380d98>()]>>

Why is task a generator rather than the original task when it's yielded by asyncio.as_completed? Is there a way to access the original task?
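Here is a stripped-down reproduction (plain sleep tasks standing in for my real coroutines):

import asyncio

async def work(delay):
    await asyncio.sleep(delay)
    return delay

async def main():
    tasks = [asyncio.ensure_future(work(d)) for d in (0.2, 0.1)]
    details = {t: {'delay': d} for t, d in zip(tasks, (0.2, 0.1))}
    for task in asyncio.as_completed(tasks):
        print(task in details)   # False: not one of the original tasks
        print(await task)        # the task's result, not the task itself

asyncio.get_event_loop().run_until_complete(main())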

2 Answers:

Answer 0 (score: 1)

  

Why is task a generator rather than the original task when it's yielded by asyncio.as_completed?

The problem is that as_completed is not an async iterator (one you would exhaust with async for), but an ordinary iterator. Where an async iterator's __anext__ can suspend while awaiting an async event, an ordinary iterator's __next__ must provide a result immediately. It obviously can't yield a completed task, since no tasks have had time to complete yet, so it yields an awaitable object that actually waits for a task to complete. That is the object that looks like a generator.

Another consequence of this implementation is that, unlike the original concurrent.futures.as_completed, awaiting the yielded object gives you the result of the original task rather than a reference to the task object. This makes asyncio.as_completed less intuitive and harder to use, and there is a bug report arguing that as_completed should also be usable as an async iterator, with the correct semantics. (This could be done in a backward-compatible way.)
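For comparison, a small sketch (using a ThreadPoolExecutor, not anything from the question) showing that concurrent.futures.as_completed yields the original Future objects, so a dict keyed by the futures can be looked up directly:

import concurrent.futures
import time

def work(delay):
    time.sleep(delay)
    return delay

with concurrent.futures.ThreadPoolExecutor() as pool:
    # map each original future to some per-task detail
    futures = {pool.submit(work, d): {'delay': d} for d in (0.2, 0.1)}
    for fut in concurrent.futures.as_completed(futures):
        # fut is one of the keys of `futures`, so the lookup succeeds
        print(futures[fut], fut.result())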

  

Is there a way to access the original task?

As a workaround, you can create an async version of as_completed by wrapping each original future in one that completes when the original does and whose result is the original future object:

async def as_completed_async(futures):
    loop = asyncio.get_event_loop()
    wrappers = []
    for fut in futures:
        assert isinstance(fut, asyncio.Future)  # we need Future or Task
        # Wrap the future in one that completes when the original does,
        # and whose result is the original future object.
        wrapper = loop.create_future()
        fut.add_done_callback(wrapper.set_result)
        wrappers.append(wrapper)

    for next_completed in asyncio.as_completed(wrappers):
        # awaiting next_completed will dereference the wrapper and get
        # the original future (which we know has completed), so we can
        # just yield that
        yield await next_completed

This should let you get at the original tasks; here's a simple test case:

async def main():
    loop = asyncio.get_event_loop()
    fut1 = loop.create_task(asyncio.sleep(.2))
    fut1.t = .2
    fut2 = loop.create_task(asyncio.sleep(.3))
    fut2.t = .3
    fut3 = loop.create_task(asyncio.sleep(.1))
    fut3.t = .1
    async for fut in as_completed_async([fut1, fut2, fut3]):
        # using the `.t` attribute shows that we've got the original tasks
        print('completed', fut.t)

asyncio.get_event_loop().run_until_complete(main())
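Applied to the mapping pattern from the question, here is a minimal sketch (with made-up task names, independent of the enqueue_task machinery) showing that the details lookup now succeeds because the original tasks are what gets yielded:

async def demo():
    loop = asyncio.get_event_loop()

    async def boom():
        raise ValueError('boom')

    tasks = [loop.create_task(asyncio.sleep(.1)), loop.create_task(boom())]
    details = {t: {'name': name} for t, name in zip(tasks, ('sleeper', 'boomer'))}

    async for task in as_completed_async(tasks):
        detail = details[task]      # the original task is the key, so this works
        try:
            task.result()           # the task has already completed at this point
            print(detail['name'], 'ok')
        except Exception as e:
            print(detail['name'], 'failed:', e)

asyncio.get_event_loop().run_until_complete(demo())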

Answer 1 (score: 0)

Use asyncio.gather instead:

async def _check_tasks(self):
    while self._check_tasks_task or not self._check_task_queue.empty():
        tasks = []
        details = []
        try:
            while len(tasks) < self._CHECK_TASKS_MAX_COUNT:
                detail = self._check_task_queue.get_nowait()
                task = detail['task']
                tasks.append(task)
                details.append(detail)
        except asyncio.QueueEmpty:
            pass

        if tasks:
            results = await asyncio.gather(*tasks, return_exceptions=True)
            for i, result in enumerate(results):
                if isinstance(result, Exception):
                    detail = details[i]
                    error_handler = detail.get('error_handler')
                    error_handler_args = detail.get('error_handler_args', [])
                    error_handler_kwargs = detail.get('error_handler_kwargs', {})

                    if error_handler:
                        logger.info('calling error handler')
                        if inspect.iscoroutinefunction(error_handler):
                            self.enqueue_task(
                                task=error_handler(
                                    result,
                                    *error_handler_args,
                                    **error_handler_kwargs
                                )
                            )
                        else:
                            error_handler(
                                result, *error_handler_args, **error_handler_kwargs
                            )
                    else:
                        msg = f'Exception encountered while handling task: {str(result)}'
                        logger.exception(msg)
        else:
            await asyncio.sleep(self._QUEUE_EMPTY_SLEEP_TIME)
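For reference, a minimal sketch (separate from the code above) of how return_exceptions=True behaves: results come back in the same order the awaitables were passed in, with exceptions returned as values rather than raised, which is what lets details[i] line up with results[i]:

import asyncio

async def ok():
    return 'ok'

async def boom():
    raise ValueError('boom')

async def demo():
    results = await asyncio.gather(ok(), boom(), return_exceptions=True)
    print(results)  # ['ok', ValueError('boom')]

asyncio.get_event_loop().run_until_complete(demo())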