I am trying to resolve conflicts between libraries in my Maven project. I added the following plugin to the plugins section:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.4.1</version>
    <configuration>
      <rules>
        <dependencyConvergence/>
      </rules>
    </configuration>
  </plugin>
</plugins>
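(For context, this is how the plugin could also be wired into the build itself, so that a convergence failure breaks `mvn verify` instead of only the manual goal invocation. This is a sketch based on standard enforcer usage, not part of my actual POM; the execution id `enforce-convergence` is an arbitrary name.)

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <!-- Bind the enforce goal to the build lifecycle so that
         `mvn verify` fails when dependency versions diverge. -->
    <execution>
      <id>enforce-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```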
When I run mvn enforcer:enforce, I get a series of dependency convergence errors like this one:
Dependency convergence error for org.codehaus.jackson:jackson-mapper-asl:1.9.13 paths to dependency are:
+-org.test:service:1.0-SNAPSHOT
  +-org.apache.spark:spark-sql_2.11:2.2.0
    +-org.apache.spark:spark-core_2.11:2.2.0
      +-org.apache.avro:avro:1.7.7
        +-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
  +-org.apache.spark:spark-sql_2.11:2.2.0
    +-org.apache.spark:spark-core_2.11:2.2.0
      +-org.apache.avro:avro-mapred:1.7.7
        +-org.apache.avro:avro-ipc:1.7.7
          +-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
  +-org.apache.spark:spark-sql_2.11:2.2.0
    +-org.apache.spark:spark-core_2.11:2.2.0
      +-org.apache.avro:avro-mapred:1.7.7
        +-org.apache.avro:avro-ipc:1.7.7
          +-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
  +-org.apache.spark:spark-sql_2.11:2.2.0
    +-org.apache.spark:spark-core_2.11:2.2.0
      +-org.apache.avro:avro-mapred:1.7.7
        +-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
  +-org.apache.spark:spark-sql_2.11:2.2.0
    +-org.apache.parquet:parquet-hadoop:1.8.2
      +-org.codehaus.jackson:jackson-mapper-asl:1.9.11
So, how can I resolve these errors so that the JAR packages cleanly? In SBT this kind of conflict is easier to handle, but I am stuck with Maven.
Answer 0 (score: 0)
This means that org.codehaus.jackson:jackson-mapper-asl appears in your dependency:tree in different versions. You need to decide which version you want, and you typically pin it with <dependencyManagement>, like this:
<dependencyManagement>
  <dependencies>
    ...
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.9.13</version>
    </dependency>
    ...
  </dependencies>
</dependencyManagement>
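If you would rather remove the stray copy at its source instead of (or in addition to) pinning the version, the other common fix is an exclusion on the dependency that drags it in, combined with a direct declaration of the version you want. A sketch under the assumption that spark-sql_2.11 is a direct dependency of your POM and that you want to keep 1.9.13:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.2.0</version>
  <exclusions>
    <!-- Drop every transitive copy of jackson-mapper-asl that
         arrives through the Spark/Avro/Parquet paths. -->
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Declare the single version you want to keep directly. -->
<dependency>
  <groupId>org.codehaus.jackson</groupId>
  <artifactId>jackson-mapper-asl</artifactId>
  <version>1.9.13</version>
</dependency>
```

After either change, rerunning `mvn enforcer:enforce` should confirm that the convergence error for this artifact is gone.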