I'm trying to change logging to the StdErr console in Spark 2.2 under Mesos, in cluster mode, using Scala, sbt, and spark-submit.
Background
We recently reinstalled our Spark/Mesos cluster with a new Spark version, 2.2.0, replacing the previous 2.0.1. All of my code works except for logging: the src/main/resources/log4j.properties file that sets the log level no longer seems to be picked up at startup. It worked fine with Spark 2.0.1.
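For reference, the bundled file is a standard log4j 1.x properties file along these lines (the exact contents here are illustrative, not a verbatim copy):

```properties
# src/main/resources/log4j.properties (illustrative)
# Route all logging to StdErr via the console appender
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet down Spark's own chatter, keep application logs visible
log4j.logger.org.apache.spark=WARN
```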
I added this option to the spark-submit script to check what log4j is doing:
--conf "spark.driver.extraJavaOptions=-Dlog4j.debug"
This is what I get with Spark 2.2:
log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@50134894.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@50134894 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@50134894.
log4j: Using URL [file:/usr/local/spark/conf/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/usr/local/spark/conf/log4j.properties
And this is what I got with the old Spark 2.0.1:
log4j: Trying to find [log4j.xml] using context classloader org.apache.spark.util.MutableURLClassLoader@10b48321.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@61e717c2 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader org.apache.spark.util.MutableURLClassLoader@10b48321.
log4j: Using URL [jar:file:/var/lib/mesos/...path.../myProject-assembly-0.1.0-SNAPSHOT.jar!/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL jar:file:/var/lib/mesos/...path.../myProject-assembly-0.1.0-SNAPSHOT.jar!/log4j.properties
Question
Update: here is the sample spark-submit command I'm using:
spark-submit \
--class myScript \
--master mesos://masterIP:7077 \
--total-executor-cores 30 \
--driver-memory 30g \
--deploy-mode cluster \
--name myScript \
--conf "spark.driver.extraJavaOptions=-Dlog4j.debug" \
--verbose \
http://192.168.75.41/~xxx/myProject-assembly-0.1.0-SNAPSHOT.jar
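One workaround I'm considering (not yet verified on this cluster) is forcing log4j to load the configuration explicitly with the standard -Dlog4j.configuration system property, which log4j resolves as a classpath resource when given a bare name, e.g.:

```shell
spark-submit \
--class myScript \
--master mesos://masterIP:7077 \
--deploy-mode cluster \
--conf "spark.driver.extraJavaOptions=-Dlog4j.debug -Dlog4j.configuration=log4j.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
http://192.168.75.41/~xxx/myProject-assembly-0.1.0-SNAPSHOT.jar
```

Whether that resolves to the copy inside my assembly jar rather than /usr/local/spark/conf/log4j.properties under Spark 2.2 is exactly what I'm unsure about, so I'd rather understand why the classloader behavior changed between versions.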