Unable to log from Spark Streaming

Asked: 2016-10-18 13:41:28

Tags: apache-spark foreach log4j spark-streaming

I am trying to log the output of a Spark Streaming job, as shown in the code below:

dStream.foreachRDD { rdd =>

  if (rdd.count() > 0) {
    @transient lazy val log = Logger.getLogger(getClass.getName)
    log.info("Process Starting")

    rdd.foreach { item =>
      log.info("Output :: " + item._1 + "," + item._2 + "," + System.currentTimeMillis())
    }
  }
}

The code is executed on a YARN cluster with the following command:

./bin/spark-submit --class "StreamingApp" --files file:/home/user/log4j.properties  --conf  "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/home/user/log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/home/user/log4j.properties" --master yarn --deploy-mode cluster --driver-memory 4g --executor-memory 2g --executor-cores 1 /home/user/Abc.jar

When I look at the logs from the YARN cluster, I can find the log written before the foreach, i.e. log.info("Process Starting"), but the logs inside the foreach are not printed.

I have also tried creating a separate serializable object, as shown below,

object LoggerObj extends Serializable {

  @transient lazy val log = Logger.getLogger(getClass.getName)
}

and using it inside the foreach in the same way:

dStream.foreachRDD { rdd =>

  if (rdd.count() > 0) {

    LoggerObj.log.info("Process Starting")

    rdd.foreach { item =>
      LoggerObj.log.info("Output :: " + item._1 + "," + item._2 + "," + System.currentTimeMillis())
    }
  }
}

But the problem remains the same: only the logs outside the foreach are printed.

The log4j.properties file is given below:

log4j.rootLogger=INFO,stdout,FILE
log4j.rootCategory=INFO,FILE
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/tmp/Rt.log
log4j.appender.FILE.ImmediateFlush=true
log4j.appender.FILE.Threshold=debug
log4j.appender.FILE.Append=true
log4j.appender.FILE.MaxFileSize=500MB
log4j.appender.FILE.MaxBackupIndex=10
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.logger.Holder=INFO,FILE

1 Answer:

Answer 0 (score: 0):

I was able to fix it by placing the "log4j.properties" file on each worker node.
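
For reference, a variant that is often suggested (not part of the original answer) avoids copying the file to every node by hand: ship it with --files so YARN localizes it into each container's working directory, and then reference it by its bare file name in the JVM options. The sketch below reuses the paths and options from the question and is only illustrative:

    ./bin/spark-submit --class "StreamingApp" --files /home/user/log4j.properties --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --master yarn --deploy-mode cluster --driver-memory 4g --executor-memory 2g --executor-cores 1 /home/user/Abc.jar

Note also that the statements inside rdd.foreach run on the executors, so their output goes to the executor containers' logs rather than to the driver log; on YARN those can be collected with yarn logs -applicationId <application id> after the application finishes.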