Why does sbt assembly of a Spark application fail with "Modules were resolved with conflicting cross-version suffixes"?

Asked: 2017-10-16 05:38:43

Tags: scala apache-spark sbt sbt-assembly

I am on a CDH cluster with Spark 2.1 and Scala 2.11.8.

I use sbt 1.0.2.

When I execute assembly, I get this error:

[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators

I tried to override the version mismatch with dependencyOverrides and force(), but neither had any effect.
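What I tried looked roughly like the following (a sketch; the 1.0.4 versions are the same ones as in my plugins.sbt below, and force() is sbt's ModuleID method for pinning a version):

// attempted in build.sbt -- neither removed the conflict
dependencyOverrides += "org.scala-lang.modules" %% "scala-xml" % "1.0.4"
libraryDependencies += ("org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4").force()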

The full error message from sbt assembly:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/D:/Tools/scala_ide/test_workspace/test/NewSparkTest/}newsparktest:
[error]    org.scala-lang.modules:scala-xml _2.11, _2.12
[error]    org.scala-lang.modules:scala-parser-combinators _2.11, _2.12
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error]         at scala.sys.package$.error(package.scala:27)
[error]         at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
[error]         at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
[error]         at sbt.Classpaths$.$anonfun$ivyBaseSettings$64(Defaults.scala:1971)
[error]         at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error]         at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
[error]         at sbt.std.Transform$$anon$4.work(System.scala:64)
[error]         at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
[error]         at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error]         at sbt.Execute.work(Execute.scala:266)
[error]         at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
[error]         at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
[error]         at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error]         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error]         at java.lang.Thread.run(Thread.java:748)
[error] (*:update) Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] Total time: 413 s, completed Oct 12, 2017 3:28:02 AM

build.sbt

name := "newtest"
version := "0.0.2"

scalaVersion := "2.11.8" 

sbtPlugin := true

val sparkVersion = "2.1.0"

mainClass in (Compile, run) := Some("com.testpackage.sq.newsparktest")

assemblyJarName in assembly := "newtest.jar"


libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.0" % "provided",
  "com.databricks" % "spark-avro_2.11" % "3.2.0",
  "org.apache.spark" % "spark-hive_2.11" % "2.1.0" % "provided")


libraryDependencies +=
     "log4j" % "log4j" % "1.2.15" excludeAll(
       ExclusionRule(organization = "com.sun.jdmk"),
       ExclusionRule(organization = "com.sun.jmx"),
       ExclusionRule(organization = "javax.jms")
     )

resolvers += "SparkPackages" at "https://dl.bintray.com/spark-packages/maven/"
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

plugins.sbt

dependencyOverrides += ("org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4")
dependencyOverrides += ("org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)

1 Answer:

Answer 0 (score: 1):

tl;dr Remove sbtPlugin := true from build.sbt; it is meant for sbt plugins, not for applications. Since sbt 1.x itself runs on Scala 2.12, declaring the project an sbt plugin pulls _2.12 artifacts into the dependency graph alongside your _2.11 Spark dependencies, which is exactly the conflict the error reports.
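After the removal, the top of build.sbt keeps only the application settings (a sketch; everything else from the question stays as it is):

name := "newtest"
version := "0.0.2"

scalaVersion := "2.11.8"

// sbtPlugin := true  -- removed: it made sbt treat this project as an sbt plugin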

You should also remove the dependencyOverrides lines from plugins.sbt.
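With the overrides gone, plugins.sbt shrinks to just the assembly plugin and its resolver (a sketch reusing the coordinates already in the question):

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)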

You should also change spark-core_2.11 and the other Spark dependencies in libraryDependencies as follows:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

The change is to use %% (double percent) and to drop the Scala version suffix from the artifact name, as with spark-core above; sbt then appends the suffix that matches scalaVersion automatically.
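Applied to the question's build, the whole dependency block might look like this (a sketch; the versions are carried over unchanged from the question):

val sparkVersion = "2.1.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "com.databricks"   %% "spark-avro" % "3.2.0"
)

With scalaVersion := "2.11.8", sbt resolves each of these to the _2.11 artifact, so only one cross-version suffix can ever appear in the graph.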