I'm running Airflow on Kubernetes with two nodes, using the CeleryExecutor.
I have two queues, one per node.
However, when I set queue='new_queue' in an Operator's arguments, the task gets stuck and never executes.
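For reference, this is roughly how the task is pinned to the queue. This is a minimal sketch, not my exact DAG: the DAG id, start date, schedule, and bash command are placeholders, and it assumes the Airflow 1.x-style API that matches the `airflow worker` CLI I'm using.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id="example_dag",          # placeholder name
    start_date=datetime(2019, 1, 1),
    schedule_interval=None,
)

# The task is routed to the Celery queue "new_queue"; it should only be
# picked up by a worker started with: airflow worker -q new_queue
task = BashOperator(
    task_id="run_on_new_queue",
    bash_command="echo hello",     # placeholder command
    queue="new_queue",
    dag=dag,
)
```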
The error message is:
All dependencies are met but the task instance is not running. In most cases this just means that the task will probably be scheduled soon unless:
- The scheduler is down or under heavy load
- The following configuration values may be limiting the number of queueable processes: parallelism, dag_concurrency, max_active_dag_runs_per_dag, non_pooled_task_slot_count
I'm not sure what's wrong. I checked Flower, and it shows workers on both nodes.
My worker startup command is: airflow worker -q new_queue