slf4j-log4j12.jar and log4j-over-slf4j.jar on the same classpath, with dependencies resolved in the Maven POM

Date: 2017-11-21 12:29:07

Tags: apache-spark slf4j apache-drill log4j

I am trying to access Apache Drill from Spark 2.1.0. My project's POM file is shown below. When I run the code I get the error shown further down; if I remove the Drill dependency, everything works fine.

I understand that Spark already ships slf4j-log4j12.jar and that adding the Drill dependency brings in log4j-over-slf4j.jar, but if I remove either of them from the classpath the code stops working.
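
For reference, which artifact pulls in each of the two jars can be listed with the Maven dependency plugin (the org.slf4j filter below is just one way to narrow the output, since both jars use that groupId):

    mvn dependency:tree -Dincludes=org.slf4j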

Any help is greatly appreciated.

POM file:

 <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.eric</groupId>
    <artifactId>testingFrameWork</artifactId>
    <version>1.0.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.target>1.8</maven.compiler.target>
        <maven.compiler.source>1.8</maven.compiler.source>

        <fasterxml.jackson>2.2.3</fasterxml.jackson>
        <jersey>1.9</jersey>
        <joda.version>1.8</joda.version>
        <surefire.version>2.17</surefire.version>
        <scalatest.version>1.0</scalatest.version>
        <shade.version>2.2</shade.version>
        <junit.version>4.11</junit.version>

        <spark.version>1.6.0</spark.version>
        <hbase.version>1.0.0-cdh5.5.2</hbase.version>
        <hadoop.version>2.6.0-cdh5.5.2</hadoop.version>
        <avro.version>1.7.6-cdh5.5.2</avro.version>
        <kafka.version>0.8.2.0-kafka-1.4.0</kafka.version>
        <flume.version>1.6.0-cdh5.5.2</flume.version>
        <parquet.version>1.5.0-cdh5.5.2</parquet.version>
        <solr.version>4.10.3-cdh5.5.2</solr.version>
        <maven-scala-plugin.version>2.15.2</maven-scala-plugin.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-compiler</artifactId>
            <version>2.11.8</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.8-dmr</version>
        </dependency>


        <!-- https://mvnrepository.com/artifact/org.apache.drill.exec/drill-jdbc -->
      <dependency>
            <groupId>org.apache.drill.exec</groupId>
            <artifactId>drill-jdbc</artifactId>
            <version>1.9.0</version>
      </dependency>


    </dependencies>

    <build>
        <plugins>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>${maven.compiler.source}</source>
                    <target>${maven.compiler.target}</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>${surefire.version}</version>
            </plugin>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <version>${maven-scala-plugin.version}</version>
                <executions>
                    <execution>
                        <id>compile</id>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                        <phase>compile</phase>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                        <phase>test-compile</phase>
                    </execution>
                    <execution>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${shade.version}</version>
            </plugin>
            <plugin>
                <groupId>external.atlassian.jgitflow</groupId>
                <artifactId>jgitflow-maven-plugin</artifactId>
                <version>1.0-m5.1</version>
                <configuration>
                    <noDeploy>true</noDeploy>
                    <noReleaseBuild>true</noReleaseBuild>
                    <noFeatureBuild>true</noFeatureBuild>
                    <noHotfixBuild>true</noHotfixBuild>
                    <enableFeatureVersions>false</enableFeatureVersions>
                    <releaseBranchVersionSuffix>RC</releaseBranchVersionSuffix>
                    <allowSnapshots>false</allowSnapshots>
                    <pushReleases>true</pushReleases>
                    <pushHotfixes>true</pushHotfixes>
                    <pushFeatures>true</pushFeatures>
                </configuration>
            </plugin>
        </plugins>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.scalatest</groupId>
                    <artifactId>scalatest-maven-plugin</artifactId>
                    <version>${scalatest.version}</version>
                    <configuration>
                        <junitxml>.</junitxml>
                    </configuration>
                    <executions>
                        <execution>
                            <id>test</id>
                            <goals>
                                <goal>test</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>



</project>

Error:

SLF4J: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError.
SLF4J: See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.
Exception in thread "main" java.lang.ExceptionInInitializerError
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.slf4j.impl.Log4jLoggerFactory.<clinit>(Log4jLoggerFactory.java:48)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:111)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
    at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:73)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
    at org.apache.spark.SparkContext.log(SparkContext.scala:73)
    at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
    at org.apache.spark.SparkContext.logInfo(SparkContext.scala:73)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:184)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at SimpleApp$.delayedEndpoint$SimpleApp$1(SarojCode.scala:20)
    at SimpleApp$delayedInit$body.apply(SarojCode.scala:10)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.App$class.main(App.scala:76)
    at SimpleApp$.main(SarojCode.scala:10)
    at SimpleApp.main(SarojCode.scala)
Caused by: java.lang.IllegalStateException: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError. See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.
    at org.apache.log4j.Log4jLoggerFactory.<clinit>(Log4jLoggerFactory.java:51)
    ... 29 more

Process finished with exit code 1

1 Answer:

Answer 0 (score: 1)

Exclude the log4j-over-slf4j dependency in your POM by updating
<dependency>
    <groupId>org.apache.drill.exec</groupId>
    <artifactId>drill-jdbc</artifactId>
    <version>1.9.0</version>
</dependency>

to:

<dependency>
    <groupId>org.apache.drill.exec</groupId>
    <artifactId>drill-jdbc</artifactId>
    <version>1.9.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>log4j-over-slf4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
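
Alternatively, the conflict can be resolved from the other side: keep log4j-over-slf4j (which the Drill dependency brings in to route log4j calls to SLF4J) and exclude the slf4j-log4j12 binding that Spark pulls in. This is only a sketch based on the POM above; if slf4j-log4j12 also arrives through spark-sql or spark-hive, the same exclusion is needed there, and some other SLF4J binding (for example logback-classic) must then be on the classpath, because log4j-over-slf4j is a bridge, not a binding.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <exclusions>
        <!-- drop Spark's log4j binding so it cannot loop with log4j-over-slf4j -->
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Whichever side is excluded, only one of the two jars may remain on the classpath; re-running mvn dependency:tree after the change shows whether the delegation loop is gone.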