build.sbt breaks when adding GraphFrames built with Scala 2.11

Date: 2017-10-22 19:29:37

Tags: scala sbt graphframes

I am trying to add GraphFrames to my Scala Spark application. This works fine when I add the version built for Scala 2.10, but as soon as I try to build with the GraphFrames artifact built for Scala 2.11, the build breaks.

The problem is a conflict between the Scala versions in use (2.10 and 2.11). I get the following error:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/E:/Documents/School/LSDE/hadoopcryptoledger/examples/scala-spark-graphx-bitcointransaction/}root:
[error]    org.apache.spark:spark-launcher _2.10, _2.11
[error]    org.json4s:json4s-ast _2.10, _2.11
[error]    org.apache.spark:spark-network-shuffle _2.10, _2.11
[error]    com.twitter:chill _2.10, _2.11
[error]    org.json4s:json4s-jackson _2.10, _2.11
[error]    com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error]    org.json4s:json4s-core _2.10, _2.11
[error]    org.apache.spark:spark-unsafe _2.10, _2.11
[error]    org.apache.spark:spark-core _2.10, _2.11
[error]    org.apache.spark:spark-network-common _2.10, _2.11
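
For reference, the _2.10 / _2.11 suffixes in this report are the Scala binary versions that sbt appends to cross-built artifact names. A minimal sketch of how the two dependency notations expand (the versions below are only illustrative, not taken from this build):

// "%%" appends the project's Scala binary suffix, so with scalaVersion 2.11.x
// this resolves the artifact spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

// "%" uses the artifact name exactly as written, suffix included
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

The conflict is reported as soon as one dependency pulls in the _2.10 variant of a module while another pulls in its _2.11 variant.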

However, I cannot figure out what is causing this. Here is my complete build.sbt:

import sbt._
import Keys._
import scala._


lazy val root = (project in file("."))
.settings(
    name := "example-hcl-spark-scala-graphx-bitcointransaction",
    version := "0.1"
)
 .configs( IntegrationTest )
  .settings( Defaults.itSettings : _*)

scalacOptions += "-target:jvm-1.7"

crossScalaVersions := Seq("2.11.8")

resolvers += Resolver.mavenLocal

fork  := true

jacoco.settings

itJacoco.settings



assemblyJarName in assembly := "example-hcl-spark-scala-graphx-bitcointransaction.jar"

libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.0.7" % "compile"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.5.0" % "provided"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"


libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"

libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"

libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

Can anyone identify which dependency is based on Scala 2.10 and is causing the build to fail?

1 Answer:

Answer 0 (score: 0):

I found the problem. Apparently, if you use:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

the artifact is resolved against the project's Scala version, which here defaults to 2.10 (the build only sets crossScalaVersions, never scalaVersion). Everything worked once I changed the spark-core and spark-graphx dependencies to:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.2.0" % "provided"
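
For completeness, a minimal sketch of a dependency section pinned entirely to Scala 2.11 and Spark 2.2.0 (a reconstruction assuming the rest of the build stays as above; not the asker's verified final file):

scalaVersion := "2.11.8"

val sparkVersion = "2.2.0"

// every Spark module uses "%%" so they all resolve against the same _2.11 suffix
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"

// GraphFrames encodes the Scala suffix in its own version string, so the plain "%" form is kept
libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

Setting scalaVersion explicitly (rather than only crossScalaVersions) is what keeps the "%%" expansions consistent with any hand-written _2.11 artifact names.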