I have deployed an Airflow instance on Kubernetes using the stable/airflow Helm chart. I made minor modifications to the puckel/docker-airflow image so that I could install the Kubernetes executor. All tasks now execute successfully on the Kubernetes cluster, but the logs for these tasks cannot be found.
I want the logs to be uploaded to our Azure Blob Storage account, and I have configured the environment variables like this:
AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER="wasb-airflow"
AIRFLOW__CORE__REMOTE_LOG_CONN_ID="wasb_default"
AIRFLOW__CORE__REMOTE_LOGGING="True"
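For reference, this is roughly how I sanity-checked that the variables are picked up inside a worker pod (a minimal sketch, assuming Airflow 1.10 where these keys live in the [core] section; conf reads the merged configuration, including environment-variable overrides):

# Verify the AIRFLOW__CORE__* overrides are visible to Airflow inside the pod.
from airflow.configuration import conf

print(conf.getboolean("core", "remote_logging"))    # expect: True
print(conf.get("core", "remote_base_log_folder"))   # expect: wasb-airflow
print(conf.get("core", "remote_log_conn_id"))       # expect: wasb_default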
The wasb_default connection contains the login and password for the Azure Blob Storage account. I have tested this connection with the WasbHook and successfully deleted a dummy file.
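The test looked roughly like the sketch below (the container and blob names are placeholders, not the real ones; the import is the Airflow 1.10 contrib-style path):

# Upload, check, and delete a dummy blob to verify the wasb_default
# connection works end to end (placeholder container/blob names).
from airflow.contrib.hooks.wasb_hook import WasbHook

hook = WasbHook(wasb_conn_id="wasb_default")
hook.load_string("hello", container_name="wasb-airflow", blob_name="dummy.txt")
print(hook.check_for_blob(container_name="wasb-airflow", blob_name="dummy.txt"))  # True
hook.delete_file(container_name="wasb-airflow", blob_name="dummy.txt")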
When I try to view the logs, the following message is shown:
*** Log file does not exist: /usr/local/airflow/logs/example_python_operator/print_the_context/2019-11-29T15:42:25+00:00/1.log
*** Fetching from: http://examplepythonoperatorprintthecontext-4a6e6a1f11fd431f8c2a1dc081:8793/log/example_python_operator/print_the_context/2019-11-29T15:42:25+00:00/1.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='examplepythonoperatorprintthecontext-4a6e6a1f11fd431f8c2a1dc081', port=8793): Max retries exceeded with url: /log/example_python_operator/print_the_context/2019-11-29T15:42:25+00:00/1.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f34ecdbe990>: Failed to establish a new connection: [Errno -2] Name or service not known'))
Any ideas on how to fix this?