Failed to load class "org.slf4j.impl.StaticLoggerBinder" when using SparkContext.setLogLevel

Asked: 2016-08-23 07:31:38

Tags: scala apache-spark slf4j

I am changing the log level of my SparkContext with sparkContext.setLogLevel("ERROR"), and when I run the program I get:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details

Everything still runs as usual, but I would like to know where this comes from, since I never added any SLF4J dependency manually. The dependencies section of my POM looks like this (where scala.version is 2.11.8 and scala.binary.version is 2.11):

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.scalatest</groupId>
        <artifactId>scalatest_${scala.binary.version}</artifactId>
        <version>2.2.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk</artifactId>
        <version>1.10.11</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>joda-time</groupId>
        <artifactId>joda-time</artifactId>
        <version>2.9.4</version>
    </dependency>
    <dependency>
        <groupId>org.scalaj</groupId>
        <artifactId>scalaj-http_${scala.binary.version}</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>net.liftweb</groupId>
        <artifactId>lift-json_${scala.binary.version}</artifactId>
        <version>3.0-M8</version>
    </dependency>
    <dependency>
        <groupId>commons-codec</groupId>
        <artifactId>commons-codec</artifactId>
        <version>1.9</version>
    </dependency>
</dependencies>

Any idea what is causing this, and how I can fix it?

1 Answer:

Answer 0 (score: 1)

My guess is that one of your dependencies pulls in a specific version of slf4j that overrides the one Spark uses, and that is what causes the error.
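To confirm which dependency is responsible, you can print the Maven dependency tree filtered to slf4j artifacts:

mvn dependency:tree -Dincludes=org.slf4j

Each match is printed with the chain of dependencies that pulls it in, so the offending artifact is easy to spot.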

I suggest you declare the slf4j dependencies explicitly in your pom.xml.

For example:

<properties>
    <org.slf4j.version>1.7.5</org.slf4j.version>
</properties>

<dependencies>
    <!-- Logging -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>${org.slf4j.version}</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>jcl-over-slf4j</artifactId>
        <version>${org.slf4j.version}</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>${org.slf4j.version}</version>
        <scope>runtime</scope>
    </dependency> 
</dependencies>
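If the dependency tree shows a specific library dragging in a conflicting binding, another option is to exclude it at the source. A sketch (com.example/some-library is a hypothetical stand-in for whichever dependency turns out to be the offender):

<dependency>
    <groupId>com.example</groupId>            <!-- hypothetical offending dependency -->
    <artifactId>some-library</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>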

Also, make sure you have a valid log4j.xml file on your classpath, and try setting Spark's log level in the XML instead of in code.

For example:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">

<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <appender name="console" class="org.apache.log4j.ConsoleAppender"> 
    <param name="Target" value="System.out"/> 
    <layout class="org.apache.log4j.PatternLayout"> 
      <param name="ConversionPattern" value="%-5p %c{1} - %m%n"/> 
    </layout> 
  </appender> 

  <logger name="org.apache.spark">
    <level value="error"/>
  </logger>

  <logger name="org.spark-project">
    <level value="error"/>
  </logger>

  <root> 
    <priority value ="debug" /> 
    <appender-ref ref="console" /> 
  </root>

</log4j:configuration>
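With that file stored under src/main/resources, Maven copies it to the root of the classpath, where log4j 1.x picks it up automatically, and the setLogLevel call in the driver becomes unnecessary. A minimal sketch of the driver under that assumption (the app name and master are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // Log levels now come from log4j.xml on the classpath,
    // so there is no sc.setLogLevel("ERROR") call here.
    val conf = new SparkConf().setAppName("my-app").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}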