While running a DAG that uses a docker image to run a jar, with xcom_push=True provided, an additional container is created alongside the Docker image's container inside a single pod.
DAG:
jar_task = KubernetesPodOperator(
    namespace='test',
    image="path to image",
    image_pull_secrets="secret",
    image_pull_policy="Always",
    node_selectors={"d-type": "na-node-group"},
    cmds=["sh", "-c", ..~running jar here~..],
    secrets=[secret_file],
    env_vars=environment_vars,
    labels={"k8s-app": "airflow"},
    name="airflow-pod",
    config_file=k8s_config_file,
    resources=pod.Resources(request_cpu=0.2, limit_cpu=0.5,
                            request_memory='512Mi', limit_memory='1536Mi'),
    in_cluster=False,
    task_id="run_jar",
    is_delete_operator_pod=True,
    get_logs=True,
    xcom_push=True,
    dag=dag)
This is the error, even though the JAR executes successfully:
[2018-11-27 11:37:21,605] {{logging_mixin.py:95}} INFO - [2018-11-27 11:37:21,605] {{pod_launcher.py:166}} INFO - Running command... cat /airflow/xcom/return.json
[2018-11-27 11:37:21,605] {{logging_mixin.py:95}} INFO -
[2018-11-27 11:37:21,647] {{logging_mixin.py:95}} INFO - [2018-11-27 11:37:21,646] {{pod_launcher.py:173}} INFO - cat: can't open '/airflow/xcom/return.json': No such file or directory
[2018-11-27 11:37:21,647] {{logging_mixin.py:95}} INFO -
[2018-11-27 11:37:21,647] {{logging_mixin.py:95}} INFO - [2018-11-27 11:37:21,647] {{pod_launcher.py:166}} INFO - Running command... kill -s SIGINT 1
[2018-11-27 11:37:21,647] {{logging_mixin.py:95}} INFO -
[2018-11-27 11:37:21,702] {{models.py:1760}} ERROR - Pod Launching failed: Failed to extract xcom from pod: airflow-pod-hippogriff-a4628b12
Traceback (most recent call last):
File "/usr/local/airflow/operators/kubernetes_pod_operator.py", line 126, in execute
get_logs=self.get_logs)
File "/usr/local/airflow/operators/pod_launcher.py", line 90, in run_pod
return self._monitor_pod(pod, get_logs)
File "/usr/local/airflow/operators/pod_launcher.py", line 110, in _monitor_pod
result = self._extract_xcom(pod)
File "/usr/local/airflow/operators/pod_launcher.py", line 161, in _extract_xcom
raise AirflowException('Failed to extract xcom from pod: {}'.format(pod.name))
airflow.exceptions.AirflowException: Failed to extract xcom from pod: airflow-pod-hippogriff-a4628b12
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 1659, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/airflow/operators/kubernetes_pod_operator.py", line 138, in execute
raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
airflow.exceptions.AirflowException: Pod Launching failed: Failed to extract xcom from pod: airflow-pod-hippogriff-a4628b12
[2018-11-27 11:37:21,704] {{models.py:1789}} INFO - All retries failed; marking task as FAILED
Answer 0 (score: 2)
If xcom_push is True, KubernetesPodOperator creates an extra sidecar container (airflow-xcom-sidecar) in the pod alongside the base container (the actual worker container). This sidecar container reads data from /airflow/xcom/return.json and returns it as the xcom value. So, in the base container, you need to write the data you want to return into the /airflow/xcom/return.json file.
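As a minimal sketch of the fix (the helper name and payload are my own, not part of Airflow), the last step of the base container just has to leave a JSON document at the exact path the sidecar reads:

```python
import json

# Path the airflow-xcom-sidecar container reads via a shared volume.
XCOM_PATH = "/airflow/xcom/return.json"

def write_xcom(result, path=XCOM_PATH):
    """Serialize the task's result as JSON so the sidecar can
    `cat` the file and push its contents as the XCom value."""
    with open(path, "w") as f:
        json.dump(result, f)

# Hypothetical usage at the end of the base container's work:
# write_xcom({"rows_processed": 1234, "status": "ok"})
```

Equivalently, for a shell-based command like the one in the question, appending something like `&& echo '{"status": "ok"}' > /airflow/xcom/return.json` to the cmds string achieves the same thing.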
Answer 1 (score: 1)
This happens because the result of the task execution is not pushed to the path expected by the KubernetesPodOperator plugin. Check the following unit test from the Airflow repository to see how it should be implemented (the source snippet is reproduced below for convenience):
def test_xcom_push(self):
    return_value = '{"foo": "bar"\n, "buzz": 2}'
    k = KubernetesPodOperator(
        namespace='default',
        image="ubuntu:16.04",
        cmds=["bash", "-cx"],
        arguments=['echo \'{}\' > /airflow/xcom/return.json'.format(return_value)],
        labels={"foo": "bar"},
        name="test",
        task_id="task",
        xcom_push=True
    )
    self.assertEqual(k.execute(None), json.loads(return_value))
Edit: it is worth mentioning that the result pushed to xcom must be JSON.
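A quick illustration of that constraint (the payloads here are made up): the sidecar's file contents are parsed as JSON before being pushed, so anything that does not parse fails.

```python
import json

# A JSON document round-trips cleanly to a Python value...
assert json.loads('{"foo": "bar", "buzz": 2}') == {"foo": "bar", "buzz": 2}

# ...while a bare, unquoted string in return.json is rejected,
# which is what breaks the xcom extraction.
try:
    json.loads("plain text result")
except json.JSONDecodeError:
    print("not valid JSON")
```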
Answer 2 (score: 1)
I'd like to point out another pitfall with xcom and KubernetesPodOperator, although its cause differs from the OP's, in case anyone stumbles upon this question, since it is the only one about the KPO and XCom. I am using Google Cloud Platform (GCP) Cloud Composer, which runs a slightly older version than the latest Airflow release, so when I consulted the official GitHub it mentioned using the arg do_xcom_push, whereas the older Airflow version uses the arg xcom_push!
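A defensive sketch for that version gap (the helper name is mine, not part of Airflow): instead of hardcoding a version number, inspect which parameter the installed operator's __init__ actually accepts. Note this check can miss the flag if the operator forwards it through **kwargs.

```python
import inspect

def xcom_kwarg(operator_cls):
    """Return {'do_xcom_push': True} for newer Airflow versions and
    {'xcom_push': True} for older ones, by checking which parameter
    the operator's __init__ declares."""
    params = inspect.signature(operator_cls.__init__).parameters
    if "do_xcom_push" in params:
        return {"do_xcom_push": True}
    return {"xcom_push": True}

# Hypothetical usage:
# jar_task = KubernetesPodOperator(..., **xcom_kwarg(KubernetesPodOperator))
```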