My Airflow server setup is not running any tasks, not even the example DAGs. Whenever I do a manual run, a DagRun object is created with its state set to "running", but it stays in that state forever. This happens with all the DAGs, not just one particular DAG.
Whenever I trigger a DAG, I can see it appear in the scheduler log, but nothing shows up in the Celery log.
I am able to run the tasks inside a DAG using the airflow test command; it's airflow trigger_dag and manual triggers from the UI that don't work.
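For concreteness, this is roughly what I mean (the DAG and task IDs here are just one of the bundled examples, and the date is arbitrary):

```
# This works: the task runs and succeeds
airflow test example_bash_operator runme_0 2017-01-01

# This creates a DagRun that stays in "running" forever
airflow trigger_dag example_bash_operator
```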
I've ensured that all three of these processes (webserver, scheduler, and Celery worker) are running; I've also put them under supervisor now.
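The supervisor config looks roughly like this (the paths and user are placeholders, and I've trimmed the logging options):

```
[program:airflow-webserver]
command=/usr/local/bin/airflow webserver
user=airflow
autorestart=true

[program:airflow-scheduler]
command=/usr/local/bin/airflow scheduler
user=airflow
autorestart=true

[program:airflow-worker]
command=/usr/local/bin/airflow worker
user=airflow
autorestart=true
```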
Things that I've tried:

- Switching to the LocalExecutor instead of the CeleryExecutor; that didn't help.
- Changing the broker and result backend settings. Right now they are broker_url = redis://myhostname.com:6379/10 and celery_result_backend = amqp://guest:guest@localhost:5672 (the exact airflow.cfg excerpt is below). I've tried various combinations of RabbitMQ and Redis for these two settings, but none of them helped.
- Using both the amqp:// and pyamqp:// schemes for specifying the broker URL.
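For reference, the relevant section of airflow.cfg currently looks like this (one of the mixed Redis/RabbitMQ combinations mentioned above; the hostname is a placeholder):

```
[core]
executor = CeleryExecutor

[celery]
broker_url = redis://myhostname.com:6379/10
celery_result_backend = amqp://guest:guest@localhost:5672
```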
This is a setup running on Ubuntu 14.04.5 LTS. I've been able to run a local instance of Airflow successfully on my Mac.
I've been stuck on this for weeks; can someone help me figure out / debug this problem?