Spark 2.1.1: Log4jLoggerFactory cannot be cast to LoggerContext

Date: 2017-12-18 20:03:01

Tags: java maven apache-spark logback

I am trying to use logback as the logger in Spark Streaming. When I submit the job through spark-submit, I get the exception shown below.

Exception in thread "main" java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory cannot be cast to ch.qos.logback.classic.LoggerContext
    at consumer.spark.LogBackConfigLoader.<init>(LogBackConfigLoader.java:18)
    at consumer.spark.Sample.main(Sample.java:18)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
    at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:167)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My pom.xml is:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <slf4j.version>1.6.1</slf4j.version>
    <logback.version>1.2.3</logback.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>${slf4j.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
</dependencies>

My logback configuration code is:

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import org.slf4j.LoggerFactory;

// LogBackConfigLoader.java:18 -- this cast is where the exception is thrown
LoggerContext lc = (LoggerContext) LoggerFactory.getILoggerFactory();
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext(lc);
configurator.doConfigure(externalConfigFileLocation); // throws JoranException

My spark-submit command is:

~/spark-2.1.1-bin-hadoop2.6/bin/spark-submit --master yarn --deploy-mode client --driver-memory 4g --executor-memory 2g --executor-cores 4 --class consumer.spark.Sample ~/SparkStreamingJob/log_testing.jar ~/SparkStreamingJob/spark-jobs/config/conf/logback.xml

1 Answer:

Answer 0 (score: 0):

There seem to be two problems here:

First, SLF4J is a facade over logging implementations, which essentially means you can switch between logging frameworks without changing your code. It also means you should not program against the core classes of the underlying implementation. SLF4J resolves the logging implementation itself at runtime, and the logger and factory objects it hands out are bound to whichever implementation it found (logback, in your intent). The consequence is that you cannot safely cast an SLF4J-provided logger or factory to a logback API type: which concrete class sits behind the facade is decided by the binding present on the classpath, not by your code.
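To make the facade point concrete, here is a minimal, runnable sketch of the pattern SLF4J implements. All names here (`LogFacade`, `ConsoleBackend`, `SilentBackend`) are illustrative stand-ins, not real SLF4J types: application code depends only on the facade interface and never casts back to a concrete backend.

```java
// Minimal facade sketch: the caller codes against the interface only,
// so the backing implementation can be swapped without code changes.
interface LogFacade {
    void info(String msg);
}

// Stand-in for one binding (think logback): writes to the console.
class ConsoleBackend implements LogFacade {
    public void info(String msg) { System.out.println("[console] " + msg); }
}

// Stand-in for another binding (think log4j): discards messages.
class SilentBackend implements LogFacade {
    public void info(String msg) { /* discard */ }
}

public class FacadeDemo {
    // Application code never mentions ConsoleBackend or SilentBackend,
    // and never casts the facade object to either of them.
    public static void run(LogFacade log) {
        log.info("job started");
    }

    public static void main(String[] args) {
        run(new ConsoleBackend()); // console output
        run(new SilentBackend());  // same code, different backend, no cast
    }
}
```

Casting a `LogFacade` to `ConsoleBackend` would compile, but would throw `ClassCastException` at runtime whenever a `SilentBackend` is actually in play; that is exactly the shape of the Spark failure above.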

Second, SLF4J resolved Log4jLoggerFactory rather than logback's factory, so the SLF4J-to-logback binding never took effect. This is most likely because Spark's runtime classpath ships its own SLF4J binding for log4j (slf4j-log4j12), which was picked up ahead of the logback-classic binding from your jar. SLF4J binds to whichever implementation it finds first, so the usual remedy is to make sure only one binding is present on the classpath.
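One defensive option, sketched below with plain stdlib stand-ins (here `StringBuilder` plays the role of `LoggerContext` and a bare `Object` the role of whatever factory SLF4J actually returned), is to guard the cast with an `instanceof` check so a wrong binding produces a clear diagnostic instead of a `ClassCastException`:

```java
// Runnable sketch of a guarded cast. The types are stand-ins only;
// in the real loader the check would be on the object returned by
// LoggerFactory.getILoggerFactory().
public class CastGuardDemo {
    public static String describe(Object factory) {
        if (factory instanceof StringBuilder) {         // the type we hoped for
            StringBuilder lc = (StringBuilder) factory; // safe: cast is guarded
            return "bound to expected type";
        }
        // A wrong binding now yields a readable message, not a crash.
        return "unexpected binding: " + factory.getClass().getName();
    }

    public static void main(String[] args) {
        System.out.println(describe(new StringBuilder())); // expected case
        System.out.println(describe(new Object()));        // the Spark case
    }
}
```

In the real loader, checking `LoggerFactory.getILoggerFactory() instanceof LoggerContext` before casting turns the crash into an actionable error naming the binding that actually won; the underlying fix is still to remove the competing log4j binding from the classpath.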