How to turn on TRACE logging in Spark

Date: 2018-10-12 15:28:13

Tags: apache-spark logging log4j

I noticed that Spark's RuleExecutor emits a TRACE log entry every time Catalyst changes a plan:

https://github.com/apache/spark/blob/78801881c405de47f7e53eea3e0420dd69593dbd/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/rules/RuleExecutor.scala#L93

What I'd like to know is how to configure Spark so that TRACE logging is turned on. I'm using log4j, and I came across this documentation: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/spark-logging.html

I've been digging through the code for a while, and I see that you can set 'log4j.threshold=TRACE' to put part of the logging into trace mode, but I can't seem to get the Catalyst logger to pick up the setting.

What am I doing wrong?
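For reference, one common way to target just the Catalyst logger is through the log4j.properties file that Spark loads on startup. The following is a minimal sketch, assuming the log4j 1.x properties format that Spark 2.x ships with in conf/; the logger name is derived from the RuleExecutor package shown above:

```properties
# conf/log4j.properties -- a sketch, assuming Spark's bundled log4j 1.x
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Enable TRACE only for the Catalyst rule executor's package,
# instead of turning the whole JVM up to TRACE
log4j.logger.org.apache.spark.sql.catalyst.rules=TRACE
```

Scoping the level to the org.apache.spark.sql.catalyst.rules package keeps the rest of the application's logging at INFO, which avoids the flood of output that a global TRACE level produces.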

1 Answer:

Answer 0: (score: 0)

I just tried a simple Structured Streaming program that reads from Kafka in IntelliJ, and the following statement worked for me, i.e. it gave me trace-level logs:

SparkSession.builder().getOrCreate().sparkContext().setLogLevel("TRACE");

Here is part of the output, showing some of the trace logs:

...
...
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
+-----+----+-----+-----------------------+
|topic|key |value|timestamp              |
+-----+----+-----+-----------------------+
|test |null|hi345|2018-10-12 23:56:00.099|
+-----+----+-----+-----------------------+
18/10/12 23:56:02 DEBUG GenerateUnsafeProjection: code for input[0, string, true],input[1, string, true],input[2, string, true],input[3, string, true]:
/* 001 */ public java.lang.Object generate(Object[] references) {
/* 002 */   return new SpecificUnsafeProjection(references);
/* 003 */ }
/* 004 */
/* 005 */ class SpecificUnsafeProjection extends org.apache.spark.sql.catalyst.expressions.UnsafeProjection {
/* 006 */
/* 007 */   private Object[] references;
...
...
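Note that setLogLevel changes the level for the whole application, which can get noisy. If you would rather point the driver and executors at a custom log4j.properties without changing code, one option is the extraJavaOptions settings in spark-defaults.conf. This is a sketch; the file path is a placeholder you would replace with your own:

```properties
# conf/spark-defaults.conf -- a sketch; /path/to/log4j.properties is hypothetical
spark.driver.extraJavaOptions    -Dlog4j.configuration=file:/path/to/log4j.properties
spark.executor.extraJavaOptions  -Dlog4j.configuration=file:/path/to/log4j.properties
```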

Hope this helps!