I am trying to test the Airflow KubernetesPodOperator, following this guide https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/ and the official repository https://github.com/apache/incubator-airflow.git.
I was able to deploy the Kubernetes cluster (`./scripts/ci/kubernetes/kube/deploy.sh -d`, persistence mode), but there seems to be a problem between the scheduler and postgres containers. Judging from the logs, the scheduler cannot connect to postgres successfully:
$ kubectl logs airflow-698ff6b8cd-gdr7f scheduler
```
[2019-02-24 21:06:20,529] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1
[2019-02-24 21:06:20,830] {__init__.py:51} INFO - Using executor LocalExecutor
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
_/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
[2019-02-24 21:06:21,317] {jobs.py:1490} INFO - Starting the scheduler
[2019-02-24 21:06:21,317] {jobs.py:1498} INFO - Processing each file at most -1 times
[2019-02-24 21:06:21,317] {jobs.py:1501} INFO - Searching for files in /root/airflow/dags
[2019-02-24 21:06:21,547] {jobs.py:1503} INFO - There are 22 files in /root/airflow/dags
[2019-02-24 21:06:21,688] {jobs.py:1548} INFO - Resetting orphaned tasks for active dag runs
[2019-02-24 21:06:22,059] {dag_processing.py:514} INFO - Launched DagFileProcessorManager with pid: 39
[2019-02-24 21:06:22,183] {settings.py:51} INFO - Configured default timezone <Timezone [UTC]>
[2019-02-24 21:06:22,200] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=39
[2019-02-24 21:06:53,375] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:04,396] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:15,418] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:26,448] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:37,458] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:48,472] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
```
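The repeated "DB connection invalidated" warnings usually mean the scheduler either cannot reach the database at all or its connections are being dropped. A minimal diagnostic sketch for checking raw TCP reachability from inside the scheduler pod (run it via `kubectl exec`; the host and port come from the postgres.yaml Service below, everything else is a hypothetical helper, not part of the tutorial):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Inside the scheduler pod, check the Postgres Service defined in postgres.yaml:
# can_reach("postgres-airflow", 5432)
```

If this returns False the problem is networking/DNS; if True, the issue is more likely credentials or the connection string.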
The yaml files are below. airflow.yaml:
```yaml
apiVersion: rbac.authorization.k8s.io/v1beta1
kind: ClusterRoleBinding
metadata:
  name: admin-rbac
subjects:
  - kind: ServiceAccount
    # Reference to upper's `metadata.name`
    name: default
    # Reference to upper's `metadata.namespace`
    namespace: default
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io
---
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: airflow
spec:
  replicas: 1
  template:
    metadata:
      labels:
        name: airflow
    spec:
      initContainers:
      - name: "init"
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: test-volume
          mountPath: /root/test_volume
        env:
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        command:
        - "bash"
        args:
        - "-cx"
        - "./tmp/airflow-test-env-init.sh"
      containers:
      - name: webserver
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        ports:
        - name: webserver
          containerPort: 8080
        args: ["webserver"]
        env:
        - name: AIRFLOW_KUBE_NAMESPACE
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      - name: scheduler
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        args: ["scheduler"]
        env:
        - name: AIRFLOW_KUBE_NAMESPACE
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      volumes:
      - name: airflow-dags
        persistentVolumeClaim:
          claimName: airflow-dags
      - name: airflow-dags-fake
        emptyDir: {}
      - name: airflow-dags-git
        emptyDir: {}
      - name: test-volume
        persistentVolumeClaim:
          claimName: test-volume
      - name: airflow-logs
        persistentVolumeClaim:
          claimName: airflow-logs
      - name: airflow-configmap
        configMap:
          name: airflow-configmap
---
apiVersion: v1
kind: Service
metadata:
  name: airflow
spec:
  type: NodePort
  ports:
  - port: 8080
    nodePort: 30809
  selector:
    name: airflow
```
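The scheduler's database URI comes from the `sql_alchemy_conn` key of the `airflow-secrets` Secret referenced above, and its host/port must match the Postgres Service. A small sanity-check sketch (the URI value is a hypothetical example with placeholder credentials, not the actual secret):

```python
from urllib.parse import urlsplit

# Hypothetical decoded secret value; credentials are placeholders.
conn = "postgresql+psycopg2://root:XXXX@postgres-airflow:5432/airflow"

parts = urlsplit(conn)
print(parts.hostname)  # must equal the Service name: postgres-airflow
print(parts.port)      # must equal the Service port: 5432
```

A mismatch here (e.g. a stale host or port in the Secret) would produce exactly the kind of reconnect loop shown in the scheduler log.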
postgres.yaml:
```yaml
kind: Deployment
apiVersion: extensions/v1beta1
metadata:
  name: postgres-airflow
spec:
  replicas: 1
  template:
    metadata:
      labels:
        name: postgres-airflow
    spec:
      restartPolicy: Always
      containers:
      - name: postgres
        image: postgres
        imagePullPolicy: IfNotPresent
        ports:
        - containerPort: 5432
          protocol: TCP
        volumeMounts:
        - name: dbvol
          mountPath: /var/lib/postgresql/data/pgdata
          subPath: pgdata
        env:
        - name: POSTGRES_USER
          value: root
        - name: POSTGRES_PASSWORD
          value: XXXX
        - name: POSTGRES_DB
          value: airflow
        - name: PGDATA
          value: /var/lib/postgresql/data/pgdata
        - name: POD_IP
          valueFrom: { fieldRef: { fieldPath: status.podIP } }
        livenessProbe:
          initialDelaySeconds: 60
          timeoutSeconds: 5
          failureThreshold: 5
          exec:
            command:
            - /bin/sh
            - -c
            - exec pg_isready --host $POD_IP || if [ "$(psql -qtA --host $POD_IP -c 'SELECT pg_is_in_recovery()')" != "f" ]; then exit 0; else exit 1; fi
        readinessProbe:
          initialDelaySeconds: 5
          timeoutSeconds: 5
          periodSeconds: 5
          exec:
            command:
            - /bin/sh
            - -c
            - exec pg_isready --host $POD_IP
        resources:
          requests:
            memory: .5Gi
            cpu: .5
      volumes:
      - name: dbvol
        emptyDir: {}
---
apiVersion: v1
kind: Service
metadata:
  name: postgres-airflow
spec:
  clusterIP: None
  ports:
  - port: 5432
    targetPort: 5432
  selector:
    name: postgres-airflow
```
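Note that the Service above is headless (`clusterIP: None`), so its DNS name resolves directly to the Postgres pod IP rather than to a virtual cluster IP. A small resolution helper for checking that from inside another pod (a diagnostic sketch; the hostname in the comment assumes both pods are in the `default` namespace):

```python
import socket

def resolve(host: str):
    """Return the sorted IPv4 addresses a hostname resolves to."""
    infos = socket.getaddrinfo(host, None, socket.AF_INET)
    return sorted({info[4][0] for info in infos})

# Inside a pod in the same namespace this should return the Postgres pod IP:
# resolve("postgres-airflow")
```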
I'm new to Kubernetes, so any help would be much appreciated!
Edit: I was able to check the initContainer logs, and it looks like the connection between the pods is established there. Also, when I open the Airflow web UI, I can't see "Recent Tasks" or "DAG Runs", and neither the graph view nor the tree view of any DAG renders; I only see the loading spinner.
Edit: Thanks a lot for your help. I found some error responses from the webserver, e.g. requests under static/dist/ failing with net::ERR_ABORTED 404 (NOT FOUND), so I think the Docker image build did not complete successfully. Instead of building with `python setup.py compile_assets sdist -q` in ./scripts/ci/kubernetes/docker/compile.sh, I added `RUN pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org apache-airflow[celery,kubernetes,postgres,rabbitmq,ssh]` to the Dockerfile.
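For reference, a minimal sketch of that Dockerfile change (only the `RUN pip install` line comes from the fix described above; the base image and its placement are assumptions, not taken from the repository):

```dockerfile
# Hypothetical base image; the repository's actual Dockerfile differs.
FROM python:3.6-slim

# Install Airflow from PyPI instead of building assets from source,
# so the webserver's static/dist/ files ship pre-built with the package.
RUN pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org \
    --trusted-host pypi.python.org \
    apache-airflow[celery,kubernetes,postgres,rabbitmq,ssh]
```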
Answer 0 (score: 1):
I think this is because you are using the LocalExecutor instead of the KubernetesExecutor.
I just followed the same tutorial and could not reproduce your issue:
```
[2019-03-01 16:07:22,053] {__init__.py:51} INFO - Using executor KubernetesExecutor
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
_/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
[2019-03-01 16:07:29,564] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:07:30,317] {security.py:196} INFO - Existing permissions for the role:Viewer within the database will persist.
[2019-03-01 16:07:31,281] {security.py:196} INFO - Existing permissions for the role:User within the database will persist.
[2019-03-01 16:07:31,709] {security.py:196} INFO - Existing permissions for the role:Op within the database will persist.
[2019-03-01 16:07:31,717] {security.py:374} INFO - Fetching a set of all permission, view_menu from FAB meta-table
[2019-03-01 16:07:33,357] {security.py:324} INFO - Cleaning faulty perms
[2019-03-01 16:07:37,878] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=10
[2019-03-01 16:07:38 +0000] [10] [INFO] Starting gunicorn 19.9.0
[2019-03-01 16:07:38 +0000] [10] [INFO] Listening at: http://0.0.0.0:8080 (10)
[2019-03-01 16:07:38 +0000] [10] [INFO] Using worker: sync
[2019-03-01 16:07:38 +0000] [15] [INFO] Booting worker with pid: 15
[2019-03-01 16:07:38 +0000] [16] [INFO] Booting worker with pid: 16
[2019-03-01 16:07:38 +0000] [17] [INFO] Booting worker with pid: 17
[2019-03-01 16:07:38 +0000] [18] [INFO] Booting worker with pid: 18
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
=================================================================
[2019-03-01 16:07:57,078] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,479] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,490] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,853] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:08:02,411] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,423] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,420] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,424] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:08,480] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:08:08,525] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:08:09,250] {security.py:196} INFO - Existing permissions for the role:Viewer within the database will persist.
```
My setup:
minikube: v0.32.0
VirtualBox: 5.2.22
Apache Airflow built from: b51712c
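If switching executors is the fix, the setting lives in the `airflow.cfg` that the `airflow-configmap` mounts into each container; a minimal fragment (assuming Airflow 1.10-era configuration keys):

```ini
[core]
executor = KubernetesExecutor
```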