Airflow 1.10.10 [core] vs. 1.10.15 [logging] AWS S3 remote logging

Date: 2021-04-28 15:57:59

Tags: amazon-web-services amazon-s3 airflow

After moving the AWS S3 logging settings from the [core] section to the new [logging] section, I can't get remote logging to work.

Here is what I moved:

[logging]
# The folder where airflow should store its log files
# This path must be absolute
base_log_folder = /usr/local/airflow/logs

# Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
# Users must supply an Airflow connection id that provides access to the storage
# location. If remote_logging is set to true, see UPDATING.md for additional
# configuration requirements.
remote_logging = True
remote_log_conn_id = MyS3Conn
remote_base_log_folder = s3://bucket/tst/
encrypt_s3_logs = False

# Logging level
logging_level = INFO
fab_logging_level = WARN

# Logging class
# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
logging_config_class =

# Log format
# we need to escape the curly braces by adding an additional curly brace
log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s

# Log filename format
# we need to escape the curly braces by adding an additional curly brace
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
# Name of handler to read task instance logs.
# Default to use task handler.
task_log_reader = task

I only moved the properties. airflow upgrade_check reports the "Logging configuration has been moved to new section" check as OK.

I have apache-airflow[crypto,postgres,ssh,s3,log]==1.10.15, and with all of the properties now listed under [logging] kept under [core] instead, remote logging works fine.

I haven't found any information on how to set this up. I only found this, but it only says that the following configurations have been moved from [core] into the new [logging] section.

1 answer:

Answer 0 (score: 2):

You should keep using [core] for logging in 1.10.15; only once you upgrade to Airflow >= 2.0.0 should you use the [logging] section.

The upgrade_check command is only telling you that the configuration has moved to the [logging] section in >= 2.0.0. Keeping it under [core] will continue to work there; it just raises a deprecation warning.
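
For reference, a minimal sketch of how the remote logging keys stay under [core] on 1.10.x (the connection id and bucket path below are simply the ones from the question, substitute your own):

[core]
# On Airflow 1.10.x the remote logging options are still read from [core]
remote_logging = True
remote_log_conn_id = MyS3Conn
remote_base_log_folder = s3://bucket/tst/
encrypt_s3_logs = False

The same section name applies if you set these options through environment variables: on 1.10.x they should be read as AIRFLOW__CORE__REMOTE_LOGGING (and so on), and only on 2.0.0+ as AIRFLOW__LOGGING__REMOTE_LOGGING.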