Airflow SparkSubmitOperator fails with java.lang.ClassNotFoundException: Class org.apache.spark.examples.SparkPi

Asked: 2018-08-01 15:23:30

Tags: python apache-spark airflow

I am trying to use Airflow's SparkSubmitOperator to trigger spark-examples.jar on a local Spark standalone server, but I keep getting an exception. When I submit the same job manually from the terminal, it works:

spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://<HOSTNAME>:7077 \
  --deploy-mode cluster \
  --executor-memory 1G \
  --total-executor-cores 1 \
  /path/to/spark-examples_2.11-2.3.1.jar \
  1000

I suspect I am doing something wrong on the Airflow side, but I haven't been able to figure it out. Here is the stack trace:

airflow.exceptions.AirflowException: Cannot execute: ['spark-submit', '--master', 'local', '--conf', 'master=spark://<HOSTNAME>:7077', '--num-executors', '1', '--total-executor-cores', '1', '--executor-cores', '1', '--executor-memory', '2g', '--driver-memory', '1g', '--name', u'airflow-spark-example', '--class', 'class org.apache.spark.examples.SparkPi', '--queue', u'root.default', 'path/to/spark-examples_2.11-2.3.1.jar', u'1000']. Error code is: 101.

What I have done

  1. I copied spark_submit_operator.py to $SPARK_HOME/plugins
  2. I edited the spark_default connection:
    • host: local
    • Extra: {"queue": "root.default", "deploy_mode": "cluster", "spark_home": "", "spark_binary": "spark-submit", "namespace": "default"}
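Incidentally, the stack trace above shows `--master local`, which suggests the hook builds `--master` from the connection's host and port rather than from `conf`. A connection along these lines may be what the operator expects (the values here are assumptions inferred from the manual spark-submit call; note also that the contrib hook reads hyphenated extra keys such as `deploy-mode`, not underscored ones):

```
Conn Id:   spark_default
Conn Type: Spark
Host:      spark://<HOSTNAME>
Port:      7077
Extra:     {"queue": "root.default", "deploy-mode": "cluster"}
```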

My DAG

from airflow import DAG

from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator
from datetime import datetime, timedelta


args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 7, 31)
}
dag = DAG('spark_example_new', default_args=args, schedule_interval="*/10 * * * *")

operator = SparkSubmitOperator(
    task_id='spark_submit_job',
    conn_id='spark_default',
    java_class='class org.apache.spark.examples.SparkPi',
    application='/path/to/spark-examples_2.11-2.3.1.jar',
    total_executor_cores='1',
    executor_cores='1',
    executor_memory='2g',
    num_executors='1',
    name='airflow-spark-example',
    verbose=False,
    driver_memory='1g',
    application_args=["1000"],
    conf={'master':'spark://<HOSTNAME>:7077'},
    dag=dag,
)

1 Answer:

Answer 0 (score: 1)

You have a typo:

java_class='class org.apache.spark.examples.SparkPi'

It should be java_class='org.apache.spark.examples.SparkPi'
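To see why the prefix breaks things, note that the operator passes the `java_class` string verbatim after `--class`, so Spark tries to load the literal class name `class org.apache.spark.examples.SparkPi`, which is exactly the ClassNotFoundException in the title. A minimal sketch (using a hypothetical helper, not Airflow's actual hook code) of how the command line is assembled:

```python
def build_spark_submit_cmd(java_class, application, application_args):
    """Assemble a spark-submit command list the way the operator does:
    the java_class value is inserted verbatim after --class."""
    cmd = ["spark-submit", "--class", java_class, application]
    cmd.extend(application_args)
    return cmd

# Wrong: Spark is asked to load the literal class "class org.apache.spark.examples.SparkPi"
bad = build_spark_submit_cmd("class org.apache.spark.examples.SparkPi",
                             "/path/to/spark-examples_2.11-2.3.1.jar", ["1000"])

# Right: pass only the fully qualified class name
good = build_spark_submit_cmd("org.apache.spark.examples.SparkPi",
                              "/path/to/spark-examples_2.11-2.3.1.jar", ["1000"])
```

This matches the stack trace in the question, where the failing command contains `'--class', 'class org.apache.spark.examples.SparkPi'`.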