Unable to execute commands in an Ubuntu container from an Airflow container using DockerOperator

Date: 2021-01-09 06:14:17

Tags: docker ubuntu docker-compose airflow dockeroperator

I'm very new to Docker + Airflow. Here's what I'm trying to accomplish.

I have 4 services, as shown in the compose file below: 3 are Airflow-related, and one is a test Ubuntu instance. The Airflow containers (airflow-database, airflow-webserver, airflow-scheduler) can talk to each other, and I'm able to run the example DAGs. Now I've added a fourth service (ubuntu), and I'm trying to send it a simple command, "/bin/sleep 10", from a DAG using the DockerOperator (DAG file below). But for some reason I get a Permission Denied message (DAG error log attached as well).

It works if I run Airflow on localhost instead of inside a docker container. I can't figure out what I'm missing. Here are some of the things I've tried:

  • In docker_url (see the socat sketch after this list):
  1. Replaced unix://var/run/docker.sock with tcp://172.20.0.1, thinking it could be resolved through the docker host IP

  2. Used gateway.host.internal

  3. Even removed the docker_url option from the operator altogether, but realized it still defaults to unix://var/run/docker.sock

  4. Tried several combinations: tcp://172.20.0.1:2376, tcp://172.20.0.1:2375

  5. Mapped host ports to Ubuntu, e.g. 8085:8085, etc.

  • Maybe the airflow user from the Airflow webserver is being rejected by the Ubuntu container
  • So I created a group in the Ubuntu container and added the airflow user to it -- no luck
  • api_version: the 'auto' option didn't work either and kept giving a version-not-found error, so I had to hardcode 1.41, which I found via the docker version command. Not sure whether that's how it's supposed to be.
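For completeness, my understanding is that for any of the tcp:// variants above to work, something actually has to serve the Docker API over TCP at that address; just mounting the socket doesn't provide that. A minimal sketch of what that could look like as an extra compose service (the docker-proxy name and the bobrik/socat image are my assumptions, not part of my current setup):

  docker-proxy:
    image: bobrik/socat
    # socat forwards TCP port 2375 to the mounted Docker socket
    command: "TCP4-LISTEN:2375,fork,reuseaddr UNIX-CONNECT:/var/run/docker.sock"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      - mynetwork

The operator would then use docker_url='tcp://docker-proxy:2375'. I haven't verified that route, though, which is why I kept digging into the unix socket error below.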

Thanks in advance for any help on what else I could try :)

docker-compose.yml

version: '3.2'

services:
# Ubuntu Container  
  ubuntu:
    image: ubuntu
    networks:
      - mynetwork

# Airflow Database
  airflow-database:
    image: postgres:12
    env_file:
      - .env
    ports:
      - 5432:5432
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./airflow/database/data:/var/lib/postgresql/data/pgdata
      - ./airflow/database/logs:/var/lib/postgresql/data/log
    command: >
     postgres
       -c listen_addresses=*
       -c logging_collector=on
       -c log_destination=stderr
       -c max_connections=200
    networks:
      - mynetwork

# Airflow DB Init
  initdb:
      image: apache/airflow:2.0.0-python3.8
      env_file:
        - .env
      depends_on:
        - airflow-database
      volumes:
        - /var/run/docker.sock:/var/run/docker.sock
        - ./airflow/metadata-airflow/dags:/opt/airflow/dags
        - ./airflow/logs:/opt/airflow/logs
      entrypoint: /bin/bash
      command: -c "airflow db init && airflow users create --firstname admin --lastname admin --email admin@admin.com --password admin --username admin --role Admin"
      networks:
        - mynetwork

# Airflow Webserver
  airflow-webserver:
    image: apache/airflow:2.0.0-python3.8
    env_file:
      - .env
    depends_on:
      - airflow-database
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./airflow/metadata-airflow/dags:/opt/airflow/dags
      - ./airflow/logs:/opt/airflow/logs
    ports:
      - 8080:8080
    deploy:
      restart_policy:
        condition: on-failure
        delay: 8s
        max_attempts: 3
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - mynetwork

# Airflow Scheduler
  airflow-scheduler:
    image: apache/airflow:2.0.0-python3.8
    env_file:
      - .env
    depends_on:
      - airflow-database
      - airflow-webserver
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./airflow/metadata-airflow/dags:/opt/airflow/dags
      - ./airflow/logs:/opt/airflow/logs
    deploy:
      restart_policy:
        condition: on-failure
        delay: 8s
        max_attempts: 3
    command: scheduler
    networks:
      - mynetwork

networks:
  mynetwork:

DAG file

from datetime import timedelta
from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator
from airflow.utils.dates import days_ago


default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
}

dag = DAG(
    'docker_sample',
    default_args=default_args,
    schedule_interval=None,
    start_date=days_ago(2),
)

t1 = DockerOperator(
    task_id='docker_op_tester',
    api_version='auto', 
    image='ubuntu',
    docker_url='unix://var/run/docker.sock',
    auto_remove=True,
    command=[
        "/bin/bash",
        "-c",
        "/bin/sleep 30; "],
    network_mode='bridge',
    dag=dag,
)


t1

DAG error log

*** Reading local file: /opt/airflow/logs/docker_sample/docker_op_tester/2021-01-09T05:16:17.174981+00:00/1.log
[2021-01-09 05:16:26,726] {taskinstance.py:826} INFO - Dependencies all met for <TaskInstance: docker_sample.docker_op_tester 2021-01-09T05:16:17.174981+00:00 [queued]>
[2021-01-09 05:16:26,774] {taskinstance.py:826} INFO - Dependencies all met for <TaskInstance: docker_sample.docker_op_tester 2021-01-09T05:16:17.174981+00:00 [queued]>
[2021-01-09 05:16:26,775] {taskinstance.py:1017} INFO - 
--------------------------------------------------------------------------------
[2021-01-09 05:16:26,776] {taskinstance.py:1018} INFO - Starting attempt 1 of 1
[2021-01-09 05:16:26,776] {taskinstance.py:1019} INFO - 
--------------------------------------------------------------------------------
[2021-01-09 05:16:26,790] {taskinstance.py:1038} INFO - Executing <Task(DockerOperator): docker_op_tester> on 2021-01-09T05:16:17.174981+00:00
[2021-01-09 05:16:26,794] {standard_task_runner.py:51} INFO - Started process 1057 to run task
[2021-01-09 05:16:26,817] {standard_task_runner.py:75} INFO - Running: ['airflow', 'tasks', 'run', 'docker_sample', 'docker_op_tester', '2021-01-09T05:16:17.174981+00:00', '--job-id', '360', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/example_docker.py', '--cfg-path', '/tmp/tmp4phq52dv']
[2021-01-09 05:16:26,821] {standard_task_runner.py:76} INFO - Job 360: Subtask docker_op_tester
[2021-01-09 05:16:26,932] {logging_mixin.py:103} INFO - Running <TaskInstance: docker_sample.docker_op_tester 2021-01-09T05:16:17.174981+00:00 [running]> on host 367f0fc7d092
[2021-01-09 05:16:27,036] {taskinstance.py:1230} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_EMAIL=airflow@example.com
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=docker_sample
AIRFLOW_CTX_TASK_ID=docker_op_tester
AIRFLOW_CTX_EXECUTION_DATE=2021-01-09T05:16:17.174981+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-01-09T05:16:17.174981+00:00
[2021-01-09 05:16:27,054] {taskinstance.py:1396} ERROR - ('Connection aborted.', PermissionError(13, 'Permission denied'))
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 670, in urlopen
    httplib_response = self._make_request(
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 392, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/local/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/local/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/home/airflow/.local/lib/python3.8/site-packages/docker/transport/unixconn.py", line 43, in connect
    sock.connect(self.unix_socket)
PermissionError: [Errno 13] Permission denied

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 726, in urlopen
    retries = retries.increment(
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/util/retry.py", line 410, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/packages/six.py", line 734, in reraise
    raise value.with_traceback(tb)
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 670, in urlopen
    httplib_response = self._make_request(
  File "/home/airflow/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 392, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/local/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/local/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/home/airflow/.local/lib/python3.8/site-packages/docker/transport/unixconn.py", line 43, in connect
    sock.connect(self.unix_socket)
urllib3.exceptions.ProtocolError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1086, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1260, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1300, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/docker/operators/docker.py", line 286, in execute
    if self.force_pull or not self.cli.images(name=self.image):
  File "/home/airflow/.local/lib/python3.8/site-packages/docker/api/image.py", line 89, in images
    res = self._result(self._get(self._url("/images/json"), params=params),
  File "/home/airflow/.local/lib/python3.8/site-packages/docker/utils/decorators.py", line 46, in inner
    return f(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/docker/api/client.py", line 230, in _get
    return self.get(url, **self._set_request_timeout(kwargs))
  File "/home/airflow/.local/lib/python3.8/site-packages/requests/sessions.py", line 543, in get
    return self.request('GET', url, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/requests/sessions.py", line 530, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/requests/sessions.py", line 643, in send
    r = adapter.send(request, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/requests/adapters.py", line 498, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', PermissionError(13, 'Permission denied'))
[2021-01-09 05:16:27,073] {taskinstance.py:1433} INFO - Marking task as FAILED. dag_id=docker_sample, task_id=docker_op_tester, execution_date=20210109T051617, start_date=20210109T051626, end_date=20210109T051627
[2021-01-09 05:16:27,136] {local_task_job.py:118} INFO - Task exited with return code 1

Specs: Docker version 20.10.2, API version 1.41

Airflow image: apache/airflow:2.0.0-python3.8

Host system: macOS Big Sur

1 answer:

Answer 0: (score: 1)

I think I figured it out - source: https://tomgregory.com/running-docker-in-docker-on-windows

  1. Check the group ID of the Docker socket:

    docker run --rm -v /var/run/docker.sock:/var/run/docker.sock debian:buster-slim stat -c %g /var/run/docker.sock

which returned "1001" for me.

  2. Add a group_add entry referencing this group ID to your docker-compose.yml:

     image: apache/airflow:2.0.0-python3.8
     group_add:
       - 1001
    

I added it to the webserver and the scheduler (not sure whether both need it), and it seems to work for me now (at least it now crashes at a later point ;-)
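For context, a sketch of how this lands in the airflow-webserver service from the question's compose file (only the group_add lines are new; substitute whatever group ID the stat command returned for you):

  airflow-webserver:
    image: apache/airflow:2.0.0-python3.8
    group_add:
      - 1001   # group ID of /var/run/docker.sock, from step 1
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    # ...rest of the service unchanged

You can check that the container user actually picked up the extra group with something like docker-compose exec airflow-webserver id, which should list 1001 among the groups.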

Edit:

You also need to add

AIRFLOW__CORE__ENABLE_XCOM_PICKLING=True

as an environment variable in Airflow, otherwise your task will crash when the container exits (https://github.com/apache/airflow/issues/13487).
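In the compose setup above, that variable could go into the shared .env file (both Airflow services already load it via env_file), or directly into each service; a sketch:

  airflow-webserver:
    environment:
      - AIRFLOW__CORE__ENABLE_XCOM_PICKLING=True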