`docker-compose up`: Airflow web UI is unreachable

Time: 2020-05-22 21:56:46

Tags: python postgresql docker docker-compose

I'm working on a data pipeline that uses Airflow with AWS S3 and Redshift, and I start Airflow with Docker. After running `docker build -t my-airflow . && docker-compose up`, I open http://localhost:8080/ in Chrome and get a "This site can't be reached" message. The `docker-compose up` output is below. It contains an error about the Airflow slot_pool table, which I believe is one of Airflow's metadata tables; I tried to fix that by adding `airflow upgradedb` to entrypoint.sh. In the GitHub repo linked below, the airflow directory contains the Dockerfile and docker-compose.yml, and entrypoint.sh is under config/.

Github:https://github.com/marshall7m/data_engineering_capstone/tree/master/airflow
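
For context, the entrypoint is supposed to initialize/upgrade the metadata database before the webserver starts. A simplified sketch of that flow is below (just the general pattern, not the exact contents of config/entrypoint.sh; the postgres host name and the pg_isready wait loop are assumptions):

    #!/usr/bin/env bash
    # Simplified sketch of the intended entrypoint flow -- not the exact file in the repo.
    set -e

    # Wait until the "postgres" compose service accepts connections (service name assumed)
    until pg_isready -h postgres -p 5432 > /dev/null 2>&1; do
      echo "Waiting for Postgres..."
      sleep 2
    done

    airflow initdb       # create the metadata tables, including slot_pool
    airflow upgradedb    # apply any pending Alembic migrations

    airflow scheduler &              # run the scheduler in the background
    exec airflow webserver -p 8080   # run the webserver in the foreground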

Attaching to airflow_postgres_1, airflow_webserver_1
postgres_1   | The files belonging to this database system will be owned by user "postgres".
postgres_1   | This user must also own the server process.
postgres_1   | 
postgres_1   | The database cluster will be initialized with locale "en_US.utf8".
postgres_1   | The default database encoding has accordingly been set to "UTF8".
postgres_1   | The default text search configuration will be set to "english".
postgres_1   | 
postgres_1   | Data page checksums are disabled.
postgres_1   | 
postgres_1   | fixing permissions on existing directory /var/lib/postgresql/data ... ok
postgres_1   | creating subdirectories ... ok
postgres_1   | selecting dynamic shared memory implementation ... posix
postgres_1   | selecting default max_connections ... 100
postgres_1   | selecting default shared_buffers ... 128MB
postgres_1   | selecting default time zone ... Etc/UTC
postgres_1   | creating configuration files ... ok
postgres_1   | running bootstrap script ... ok
webserver_1  | init db
postgres_1   | performing post-bootstrap initialization ... ok
postgres_1   | syncing data to disk ... ok
postgres_1   | 
postgres_1   | initdb: warning: enabling "trust" authentication for local connections
postgres_1   | You can change this by editing pg_hba.conf or using the option -A, or
postgres_1   | --auth-local and --auth-host, the next time you run initdb.
postgres_1   | 
postgres_1   | Success. You can now start the database server using:
postgres_1   | 
postgres_1   |     pg_ctl -D /var/lib/postgresql/data -l logfile start
postgres_1   | 
postgres_1   | waiting for server to start....2020-05-22 18:58:20.765 UTC [47] LOG:  starting PostgreSQL 12.3 (Debian 12.3-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
postgres_1   | 2020-05-22 18:58:20.769 UTC [47] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1   | 2020-05-22 18:58:20.815 UTC [48] LOG:  database system was shut down at 2020-05-22 18:58:20 UTC
postgres_1   | 2020-05-22 18:58:20.831 UTC [47] LOG:  database system is ready to accept connections
postgres_1   |  done
postgres_1   | server started
postgres_1   | CREATE DATABASE
postgres_1   | 
postgres_1   | 
postgres_1   | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
postgres_1   | 
postgres_1   | 2020-05-22 18:58:21.451 UTC [47] LOG:  received fast shutdown request
postgres_1   | waiting for server to shut down....2020-05-22 18:58:21.459 UTC [47] LOG:  aborting any active transactions
postgres_1   | 2020-05-22 18:58:21.469 UTC [47] LOG:  background worker "logical replication launcher" (PID 54) exited with exit code 1
postgres_1   | 2020-05-22 18:58:21.470 UTC [49] LOG:  shutting down
postgres_1   | 2020-05-22 18:58:21.520 UTC [47] LOG:  database system is shut down
postgres_1   |  done
postgres_1   | server stopped
postgres_1   | 
postgres_1   | PostgreSQL init process complete; ready for start up.
postgres_1   | 
postgres_1   | 2020-05-22 18:58:21.590 UTC [1] LOG:  starting PostgreSQL 12.3 (Debian 12.3-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
postgres_1   | 2020-05-22 18:58:21.592 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
postgres_1   | 2020-05-22 18:58:21.593 UTC [1] LOG:  listening on IPv6 address "::", port 5432
postgres_1   | 2020-05-22 18:58:21.606 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1   | 2020-05-22 18:58:21.644 UTC [65] LOG:  database system was shut down at 2020-05-22 18:58:21 UTC
postgres_1   | 2020-05-22 18:58:21.664 UTC [1] LOG:  database system is ready to accept connections
webserver_1  | DB: postgresql://airflow_user:***@postgres:5432/airflow
webserver_1  | [2020-05-22 18:58:24,312] {db.py:378} INFO - Creating tables
webserver_1  | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
webserver_1  | INFO  [alembic.runtime.migration] Will assume transactional DDL.
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade  -> e3a246e0dc1, current schema
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 1507a7289a2f, create is_encrypted
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 1507a7289a2f -> 13eb55f81627, maintain history for compatibility with earlier migrations
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 13eb55f81627 -> 338e90f54d61, More logging into task_instance
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 338e90f54d61 -> 52d714495f0, job_id indices
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 52d714495f0 -> 502898887f84, Adding extra to Log
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 502898887f84 -> 1b38cef5b76e, add dagrun
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 1b38cef5b76e -> 2e541a1dcfed, task_duration
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 2e541a1dcfed -> 40e67319e3a9, dagrun_config
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 40e67319e3a9 -> 561833c1c74b, add password column to user
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 561833c1c74b -> 4446e08588, dagrun start end
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 4446e08588 -> bbc73705a13e, Add notification_sent column to sla_miss
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade bbc73705a13e -> bba5a7cfc896, Add a column to track the encryption state of the 'Extra' field in connection
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade bba5a7cfc896 -> 1968acfc09e3, add is_encrypted column to variable table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 1968acfc09e3 -> 2e82aab8ef20, rename user table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 2e82aab8ef20 -> 211e584da130, add TI state index
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 211e584da130 -> 64de9cddf6c9, add task fails journal table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 64de9cddf6c9 -> f2ca10b85618, add dag_stats table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 4addfa1236f1, Add fractional seconds to mysql tables
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 4addfa1236f1 -> 8504051e801b, xcom dag task indices
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 8504051e801b -> 5e7d17757c7a, add pid field to TaskInstance
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 5e7d17757c7a -> 127d2bf2dfa7, Add dag_id/state index on dag_run table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 127d2bf2dfa7 -> cc1e65623dc7, add max tries column to task instance
postgres_1   | 2020-05-22 18:58:25.603 UTC [73] ERROR:  relation "slot_pool" does not exist at character 161
postgres_1   | 2020-05-22 18:58:25.603 UTC [73] STATEMENT:  SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, slot_pool.slots AS slot_pool_slots, slot_pool.description AS slot_pool_description 
postgres_1   |  FROM slot_pool 
postgres_1   |  WHERE slot_pool.slots = 1 AND slot_pool.pool = 'default_pool' 
postgres_1   |   LIMIT 1
webserver_1  | ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/airflow/dags/main_dag.py
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1284, in _execute_context
webserver_1  |     cursor, statement, parameters, context
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 590, in do_execute
webserver_1  |     cursor.execute(statement, parameters)
webserver_1  | psycopg2.errors.UndefinedTable: relation "slot_pool" does not exist
webserver_1  | LINE 2: FROM slot_pool 
webserver_1  |              ^
webserver_1  | 
webserver_1  | 
webserver_1  | The above exception was the direct cause of the following exception:
webserver_1  | 
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 236, in process_file
webserver_1  |     m = imp.load_source(mod_name, filepath)
webserver_1  |   File "/usr/local/lib/python3.7/imp.py", line 171, in load_source
webserver_1  |     module = _load(spec)
webserver_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
webserver_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
webserver_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
webserver_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
webserver_1  |   File "/usr/local/airflow/dags/main_dag.py", line 84, in <module>
webserver_1  |     dag=main_dag
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", line 74, in wrapper
webserver_1  |     return func(*args, **kwargs)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/utils/decorators.py", line 98, in wrapper
webserver_1  |     result = func(*args, **kwargs)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/operators/subdag_operator.py", line 77, in __init__
webserver_1  |     .filter(Pool.pool == self.pool)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3375, in first
webserver_1  |     ret = list(self[0:1])
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3149, in __getitem__
webserver_1  |     return list(res)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3481, in __iter__
webserver_1  |     return self._execute_and_instances(context)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3506, in _execute_and_instances
webserver_1  |     result = conn.execute(querycontext.statement, self._params)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1020, in execute
webserver_1  |     return meth(self, multiparams, params)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
webserver_1  |     return connection._execute_clauseelement(self, multiparams, params)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1139, in _execute_clauseelement
webserver_1  |     distilled_params,
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1324, in _execute_context
webserver_1  |     e, statement, parameters, cursor, context
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1518, in _handle_dbapi_exception
webserver_1  |     sqlalchemy_exception, with_traceback=exc_info[2], from_=e
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 178, in raise_
webserver_1  |     raise exception
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1284, in _execute_context
webserver_1  |     cursor, statement, parameters, context
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 590, in do_execute
webserver_1  |     cursor.execute(statement, parameters)
webserver_1  | sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedTable) relation "slot_pool" does not exist
webserver_1  | LINE 2: FROM slot_pool 
webserver_1  |              ^
webserver_1  | 
webserver_1  | [SQL: SELECT slot_pool.id AS slot_pool_id, slot_pool.pool AS slot_pool_pool, slot_pool.slots AS slot_pool_slots, slot_pool.description AS slot_pool_description 
webserver_1  | FROM slot_pool 
webserver_1  | WHERE slot_pool.slots = %(slots_1)s AND slot_pool.pool = %(pool_1)s 
webserver_1  |  LIMIT %(param_1)s]
webserver_1  | [parameters: {'slots_1': 1, 'pool_1': 'default_pool', 'param_1': 1}]
webserver_1  | (Background on this error at: http://sqlalche.me/e/f405)
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade cc1e65623dc7 -> bdaa763e6c56, Make xcom value column a large binary
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade bdaa763e6c56 -> 947454bf1dff, add ti job_id index
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 947454bf1dff -> d2ae31099d61, Increase text size for MySQL (not relevant for other DBs' text types)
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 0e2a74e0fc9f, Add time zone awareness
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 33ae817a1ff4, kubernetes_resource_checkpointing
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 33ae817a1ff4 -> 27c6a30d7c24, kubernetes_resource_checkpointing
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 27c6a30d7c24 -> 86770d1215c0, add kubernetes scheduler uniqueness
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 86770d1215c0, 0e2a74e0fc9f -> 05f30312d566, merge heads
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 05f30312d566 -> f23433877c24, fix mysql not null constraint
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade f23433877c24 -> 856955da8476, fix sqlite foreign key
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 856955da8476 -> 9635ae0956e7, index-faskfail
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> dd25f486b8ea, add idx_log_dag
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade dd25f486b8ea -> bf00311e1990, add index to taskinstance
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> 0a2a5b66e19d, add task_reschedule table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 0a2a5b66e19d, bf00311e1990 -> 03bc53e68815, merge_heads_2
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 03bc53e68815 -> 41f5f12752f8, add superuser field
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 41f5f12752f8 -> c8ffec048a3b, add fields to dag
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade c8ffec048a3b -> dd4ecb8fbee3, Add schedule interval to dag
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade dd4ecb8fbee3 -> 939bb1e647c8, task reschedule fk on cascade delete
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 6e96a59344a4, Make TaskInstance.pool not nullable
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> d38e04c12aa2, add serialized_dag table
webserver_1  | Revision ID: d38e04c12aa2
webserver_1  | Revises: 6e96a59344a4
webserver_1  | Create Date: 2019-08-01 14:39:35.616417
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade d38e04c12aa2 -> b3b105409875, add root_dag_id to DAG
webserver_1  | Revision ID: b3b105409875
webserver_1  | Revises: d38e04c12aa2
webserver_1  | Create Date: 2019-09-28 23:20:01.744775
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> 74effc47d867, change datetime to datetime2(6) on MSSQL tables
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 004c1210f153, increase queue name size limit
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade c8ffec048a3b -> a56c9515abdc, Remove dag_stat table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade a56c9515abdc, 004c1210f153, 74effc47d867, b3b105409875 -> 08364691d074, Merge the four heads back together
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 08364691d074 -> fe461863935f, increase_length_for_connection_password
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade fe461863935f -> 7939bcff74ba, Add DagTags table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 7939bcff74ba -> a4c2fd67d16b, add pool_slots field to task_instance
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade a4c2fd67d16b -> 852ae6c715af, Add RenderedTaskInstanceFields table
webserver_1  | INFO  [alembic.runtime.migration] Running upgrade 852ae6c715af -> 952da73b5eff, add dag_code table
webserver_1  | Done.
webserver_1  | upgrade db
webserver_1  | DB: postgresql://airflow_user:***@postgres:5432/airflow
webserver_1  | [2020-05-22 18:58:31,607] {db.py:378} INFO - Creating tables
webserver_1  | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
webserver_1  | INFO  [alembic.runtime.migration] Will assume transactional DDL.
webserver_1  |   ____________       _____________
webserver_1  |  ____    |__( )_________  __/__  /________      __
webserver_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
webserver_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
webserver_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
webserver_1  |   ____________       _____________
webserver_1  |  ____    |__( )_________  __/__  /________      __
webserver_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
webserver_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
webserver_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
webserver_1  | [2020-05-22 18:58:36,563] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:58:36,578] {scheduler_job.py:1346} INFO - Starting the scheduler
webserver_1  | [2020-05-22 18:58:36,579] {scheduler_job.py:1354} INFO - Running execute loop for -1 seconds
webserver_1  | [2020-05-22 18:58:36,581] {scheduler_job.py:1355} INFO - Processing each file at most -1 times
webserver_1  | [2020-05-22 18:58:36,584] {scheduler_job.py:1358} INFO - Searching for files in /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:58:36,705] {scheduler_job.py:1360} INFO - There are 1 files in /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:58:36,908] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:58:36,913] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:58:37,455] {scheduler_job.py:1411} INFO - Resetting orphaned tasks for active dag runs
webserver_1  | [2020-05-22 18:58:37,514] {dag_processing.py:556} INFO - Launched DagFileProcessorManager with pid: 145
webserver_1  | [2020-05-22 18:58:37,587] {settings.py:54} INFO - Configured default timezone <Timezone [UTC]>
webserver_1  | Running the Gunicorn Server with:
webserver_1  | Workers: 4 sync
webserver_1  | Host: 0.0.0.0:8080
webserver_1  | Timeout: 120
webserver_1  | Logfiles: - -
webserver_1  | =================================================================            
webserver_1  | [2020-05-22 18:58:42 +0000] [165] [INFO] Starting gunicorn 19.10.0
webserver_1  | [2020-05-22 18:58:42 +0000] [165] [INFO] Listening at: http://0.0.0.0:8080 (165)
webserver_1  | [2020-05-22 18:58:42 +0000] [165] [INFO] Using worker: sync
webserver_1  | [2020-05-22 18:58:42 +0000] [186] [INFO] Booting worker with pid: 186
webserver_1  | [2020-05-22 18:58:42 +0000] [187] [INFO] Booting worker with pid: 187
webserver_1  | [2020-05-22 18:58:42 +0000] [188] [INFO] Booting worker with pid: 188
webserver_1  | [2020-05-22 18:58:42 +0000] [189] [INFO] Booting worker with pid: 189
webserver_1  | [2020-05-22 18:58:43,986] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:58:43,992] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:58:44,311] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:58:44,321] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:58:44,642] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:58:44,647] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:58:44,814] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:58:44,823] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:59:17 +0000] [165] [INFO] Handling signal: ttin
webserver_1  | [2020-05-22 18:59:17 +0000] [334] [INFO] Booting worker with pid: 334
webserver_1  | [2020-05-22 18:59:18,396] {__init__.py:51} INFO - Using executor LocalExecutor
webserver_1  | [2020-05-22 18:59:18,397] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags
webserver_1  | [2020-05-22 18:59:20 +0000] [165] [INFO] Handling signal: ttou
webserver_1  | [2020-05-22 18:59:20 +0000] [186] [INFO] Worker exiting (pid: 186)
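
The tail of the log shows gunicorn listening on 0.0.0.0:8080 inside the container, so part of the question is whether that container port is actually published to the host. Standard Compose commands for checking this (assuming only that the service is named webserver, as in the log prefix):

    # list services and their host:container port mappings
    docker-compose ps

    # show which host port, if any, is mapped to the webserver's container port 8080
    docker-compose port webserver 8080

    # probe the mapping from the host
    curl -I http://localhost:8080/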

1 Answer:

Answer 0 (score: 1)

Update your docker-compose.yaml to bind the Airflow webserver to host port 8080.

This does not bind to your localhost:8080; with only the container port listed, Docker publishes it on a random ephemeral host port instead:

ports: 
  - "8080"

It should look like this instead:

ports: 
  - "8080:8080"