Conflicting cross-version suffixes: Spark job

Date: 2018-08-07 21:28:17

Tags: scala apache-spark sbt

Below is my build.sbt file. I searched other similar questions, but none of them helped me find an answer. I have tried several ways to explicitly use Scala 2.11, but for some reason I keep getting this error:

[error] (*:ssExtractDependencies) Conflicting cross-version suffixes: org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-core, org.apache.spark:spark-network-common

[error] (*:update) Conflicting cross-version suffixes: org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-core, org.apache.spark:spark-network-common

name := "ubi-journeyData-validation"

version := "2.0"

scalaVersion := "2.11.11"
dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value

//updateOptions := updateOptions.value.withCachedResolution(false)

libraryDependencies ++= {
  val sparkVersion = "2.3.0"
  //val sparkVersion = "1.6.3"
     Seq("org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-hive" % sparkVersion,
    "org.elasticsearch" %% "elasticsearch-spark-20" % "5.6.9",
    //"org.elasticsearch" %% "elasticsearch-spark-13" % "5.6.9",
    //"org.elasticsearch" %% "elasticsearch-spark-13" % "5.2.0",
    "org.spark-project.hive" % "hive-cli" % "1.2.1.spark2",
    "org.spark-project.hive" % "hive-metastore" % "1.2.1.spark2",
    "org.spark-project.hive" % "hive-exec" % "1.2.1.spark2"
    //"org.json4s" %% "json4s-jackson" % "3.2.11",
    //"org.apache.calcite" % "calcite-core" % "1.2.0-incubating",
    //"org.pentaho" % "pentaho-aggdesigner" % "5.1.5-jhyde" pomOnly(),
    //"org.pentaho" % "pentaho-aggdesigner-algorithm" % "5.1.5-jhyde" % Test
   )
}

resolvers += Resolver.mavenLocal
resolvers += "Cascading repo" at "http://conjars.org/repo"


assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
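For context (not part of the original question): "Conflicting cross-version suffixes" means sbt resolved the same modules with two different Scala binary suffixes, i.e. both _2.10 and _2.11 artifacts of the Spark, json4s, chill and jackson modules ended up on one classpath. A minimal sketch of how this can happen; the dependency below is only illustrative:

// With scalaVersion := "2.11.11", the %% operator appends "_2.11" to the artifact name:
"org.apache.spark" %% "spark-core" % "2.3.0"   // resolves to spark-core_2.11
// If some other (possibly transitive) dependency was published only against Scala 2.10,
// it can pull in spark-core_2.10 as well. Having both suffixes of the same module
// on one classpath triggers the error shown above.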

1 Answer:

Answer 0 (score: 0)

This is because of a dependency of one of your dependencies, e.g.:

"org.spark-project.hive" % "hive-exec" % "1.2.1.spark2"

If you check, for example, here, it depends on:

org.apache.spark » spark-core_2.10 » 1.3.1 » (optional)

So I'm afraid you can't update Spark to 2.3.0 and Scala to 2.11 unless you drop those org.spark-project.hive modules.
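Not from the original answer, but one possible direction, sketched under the assumption that hive-exec has to stay on the classpath: strip its Scala 2.10 Spark transitives with sbt exclusion rules (excludeAll and ExclusionRule are standard sbt API); whether the resulting classpath is actually binary compatible still has to be verified in your build:

// Sketch only: drop the Scala-2.10 Spark artifacts that hive-exec pulls in transitively.
// Other organizations listed in the error (org.json4s, com.twitter, ...) may need the
// same treatment, depending on what the dependency report shows.
"org.spark-project.hive" % "hive-exec" % "1.2.1.spark2" excludeAll ExclusionRule(organization = "org.apache.spark")

An alternative, since spark-hive 2.3.0 is already in libraryDependencies, is to drop the org.spark-project.hive artifacts entirely and rely on the Hive support that ships with Spark itself.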