I want to build a jar file using Spark 2.4.5. I defined a build.sbt file that includes Spark 2.4.5 and Breeze as follows:
name := "trialLibrary"
version := "1.0"
scalaVersion := "2.11.12"
val sparkVersion = "2.4.5"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion,
"org.scalanlp" %% "breeze" % "1.0",
"org.scalanlp" %% "breeze-natives" % "1.0",
)
However, after running sbt package in the sbt console, evicted reports a lot of conflicts in the library dependencies during compilation:
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn] +- org.apache.spark:spark-core_2.11:2.4.5 (depends on 3.9.9.Final)
[warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final)
[warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over {1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9}
[warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2)
[warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2)
[warn] +- org.apache.spark:spark-unsafe_2.11:2.4.5 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-network-common_2.11:2.4.5 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-core_2.11:2.4.5 (depends on 1.3.9)
[warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9)
[warn] * com.google.guava:guava:16.0.1 is selected over {11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1}
[warn] +- org.apache.curator:curator-framework:2.6.0 (depends on 16.0.1)
[warn] +- org.apache.curator:curator-recipes:2.6.0 (depends on 16.0.1)
[warn] +- org.apache.curator:curator-client:2.6.0 (depends on 16.0.1)
[warn] +- org.htrace:htrace-core:3.0.4 (depends on 12.0.1)
[warn] +- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-server-common:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-common:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-client:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-api:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 11.0.2)
[info] Here are other dependency conflicts that were resolved:
[info] * log4j:log4j:1.2.17 is selected over 1.2.16
[info] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 1.2.17)
...
I have omitted most of the [info] messages that are printed after the "Here are other dependency conflicts that were resolved" line.
I also tried changing scalaVersion to 2.12.10, without success.
I also tried changing build.sbt to use
"org.apache.spark" % "spark-core_2.11" % "2.4.5",
"org.apache.spark" % "spark-sql_2.11" % "2.4.5"
but nothing changed.
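I also wondered whether explicitly pinning the contested versions with sbt's dependencyOverrides setting would silence the warnings. A sketch of what I mean (the versions below are copied from the [warn] output above; I am not sure they are the right choices):

```scala
// build.sbt fragment (sketch, not a confirmed fix): force the versions that
// the eviction warnings say were selected anyway, so the choice is explicit
// and the resolver stops flagging a conflict.
dependencyOverrides ++= Seq(
  "io.netty" % "netty" % "3.9.9.Final",           // selected over 3.6.2.Final / 3.7.0.Final
  "com.google.code.findbugs" % "jsr305" % "3.0.2", // selected over 1.3.9
  "com.google.guava" % "guava" % "16.0.1"          // selected over 11.0.2 / 12.0.1
)
```

But I don't know whether this actually addresses the suspected binary incompatibility or just hides the warning.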
Is there any way to resolve this?