I am trying to do a spark-submit from Airflow in "cluster" mode, and I want to specify the log4j properties in the SparkSubmitOperator:
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

spark_submit_task = SparkSubmitOperator(
    task_id='spark_submit_job',
    conn_id='spark_default',
    files='/usr/hdp/current/spark-client/conf/hive-site.xml',
    jars='/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar',
    java_class='com.xxx.eim.job.SubmitSparkJob',
    application='/root/airflow/code/eimdataprocessor.jar',
    total_executor_cores='4',
    executor_cores='4',
    executor_memory='5g',
    num_executors='4',
    name='airflow-spark-example',
    verbose=False,
    driver_memory='10g',
    application_args=["XXX"],
    conf={'master': 'yarn',
          'spark.yarn.queue': 'priority',
          'spark.app.name': 'XXX',
          'spark.dynamicAllocation.enabled': 'true',
          'spark.local.dir': '/opt/eim',
          'spark.shuffle.service.enabled': 'true',
          'spark.hadoop.mapreduce.fileoutputcommitter.cleanup-failures.ignored': 'true',
          'spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version': '2'},
    dag=dag)
Answer 0 (score: 0)
I can think of two possible ways.

Option 1: pass the Log4J properties as Spark configuration. In a plain spark-submit command you would try to achieve the same effect as this:

--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties"

The equivalent here is to pass that setting through the conf parameter of SparkSubmitOperator, as in the sketch below.
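A minimal sketch of this approach, assuming the Airflow 1.x contrib import path and a hypothetical location for log4j.properties; the files parameter makes spark-submit ship the file into the working directory of the YARN containers, so a relative name is enough in the JVM options:

from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

# `dag` is the DAG object defined elsewhere in the file, as in the question.
log4j_conf_task = SparkSubmitOperator(
    task_id='spark_submit_job',
    conn_id='spark_default',
    java_class='com.xxx.eim.job.SubmitSparkJob',
    application='/root/airflow/code/eimdataprocessor.jar',
    # Hypothetical path: --files ships the file into the working directory
    # of the driver and executor containers in YARN cluster mode.
    files='/root/airflow/code/log4j.properties',
    conf={
        # The relative file name works because --files placed a copy in
        # each container's working directory.
        'spark.driver.extraJavaOptions': '-Dlog4j.configuration=log4j.properties',
        'spark.executor.extraJavaOptions': '-Dlog4j.configuration=log4j.properties',
    },
    dag=dag)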
Option 2: pass them as application args. If you are using the operator to perform spark-submit on a remote system, shipping a Log4J configuration file could be challenging. Application args are appended to the end of the spark-submit command, so make sure to pass them together with any --prefix(es) your main class expects, via the application_args parameter of SparkSubmitOperator, as in the sketch below.
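A minimal sketch of the application-args route; the flag name --log4j-file and its value are hypothetical, since whatever you put in application_args is appended verbatim after the application jar and your main class must parse it itself:

log4j_args_task = SparkSubmitOperator(
    task_id='spark_submit_job_args',
    conn_id='spark_default',
    java_class='com.xxx.eim.job.SubmitSparkJob',
    application='/root/airflow/code/eimdataprocessor.jar',
    # Everything below lands after the jar on the spark-submit command
    # line; '--log4j-file' is a made-up flag that SubmitSparkJob would
    # have to recognise and handle itself.
    application_args=['--log4j-file', '/opt/eim/log4j.properties'],
    dag=dag)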