IntelliJ IDEA - Disable INFO messages when running a Spark application

Time: 2016-08-01 09:48:23

Tags: scala hadoop apache-spark hbase

When I run an application that uses the Apache Spark HBase/Hadoop libraries, I get a lot of messages. For example:

0 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory  - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])

How can I disable these messages so that I see only the output of my own println(varABC) calls?

3 answers:

Answer 0 (score: 0)

What you are seeing is the logging that Spark produces via log4j; by default it prints a large amount of log output to stderr. You can configure it the same way you would normally configure log4j behavior, for example through a log4j.properties configuration file. See http://spark.apache.org/docs/latest/configuration.html#configuring-logging
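If you prefer not to edit any configuration files while running from IntelliJ IDEA, here is a minimal sketch that raises the log threshold programmatically before the SparkContext starts, using the log4j 1.x API that Spark bundles (the object name and the toy job are illustrative):

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

object QuietSparkApp {
  def main(args: Array[String]): Unit = {
    // Silence the chatty org.apache.* and akka loggers before Spark initializes.
    Logger.getLogger("org").setLevel(Level.ERROR)
    Logger.getLogger("akka").setLevel(Level.ERROR)

    val conf = new SparkConf().setAppName("QuietSparkApp").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Only your own println output should remain visible now.
    val varABC = sc.parallelize(1 to 10).sum()
    println(varABC)

    sc.stop()
  }
}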

Answer 1 (score: 0)

Modify the log4j.properties file in the $SPARK_HOME/conf directory, changing the value INFO to ERROR as shown below:

log4j.rootLogger=${root.logger}
root.logger=ERROR,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
log4j.logger.org.apache.spark.repl.Main=WARN
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

This disables all INFO log messages and prints only ERROR and FATAL log messages. You can change these values according to your requirements.
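Note that when the application is launched from IntelliJ IDEA rather than via spark-submit, $SPARK_HOME/conf is typically not on the classpath, so the file above may not be picked up automatically. A minimal sketch that loads it explicitly with the log4j 1.x API (the path is a hypothetical example; adjust it to your project):

import org.apache.log4j.PropertyConfigurator

// Hypothetical location of the edited properties file.
PropertyConfigurator.configure("conf/log4j.properties")

Alternatively, placing a copy of log4j.properties under src/main/resources puts it on the classpath, where log4j finds it on its own.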

Answer 2 (score: 0)

In the /spark-2.0.0-bin-hadoop2.6/conf folder you will find a file named log4j.properties.template.

Rename log4j.properties.template to log4j.properties

and make the following change in log4j.properties:

from: log4j.rootCategory=INFO, console
to: log4j.rootCategory=ERROR, console
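Since this answer targets Spark 2.0.0, it is also worth noting that the log level can be overridden from the application itself at runtime, without renaming any files; a minimal sketch (the app name is illustrative):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("LogLevelDemo")
  .master("local[*]")
  .getOrCreate()

// Overrides the rootCategory threshold for this application only.
spark.sparkContext.setLogLevel("ERROR")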

Hope this helps!