Apache Spark MLlib 2.1.0 and Scala sbt error

Asked: 2017-03-10 23:02:59

Tags: scala sbt apache-spark-mllib

My small Apache Spark project in Scala was working fine until I added MLlib.

My sbt build file is shown below the error output, but the build now fails during dependency resolution with conflicting cross-version suffixes. Can I not build against Apache Spark MLlib with Scala 2.11.x? Any pointers would be helpful.

[error] Modules were resolved with conflicting cross-version suffixes in {file::
[error]    org.apache.spark:spark-launcher _2.11, _2.10
[error]    org.apache.spark:spark-sketch _2.11, _2.10
[error]    org.json4s:json4s-ast _2.11, _2.10
[error]    org.apache.spark:spark-catalyst _2.11, _2.10
[error]    org.apache.spark:spark-network-shuffle _2.11, _2.10
[error]    org.scalatest:scalatest _2.11, _2.10
[error]    com.twitter:chill _2.11, _2.10
[error]    org.apache.spark:spark-sql _2.11, _2.10
[error]    org.json4s:json4s-jackson _2.11, _2.10
[error]    com.fasterxml.jackson.module:jackson-module-scala _2.11, _2.10
[error]    org.json4s:json4s-core _2.11, _2.10
[error]    org.apache.spark:spark-unsafe _2.11, _2.10
[error]    org.apache.spark:spark-tags _2.11, _2.10
[error]    org.apache.spark:spark-core _2.11, _2.10
[error]    org.apache.spark:spark-network-common _2.11, _2.10
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.apache.spark:spark-sketch, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, org.scalatest:scalatest, com.twitter:chill, org.apache.spark:spark-sql, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-tags, org.apache.spark:spark-core, org.apache.spark:spark-network-common
[error] Total time: 18 s, completed 10-Mar-2017 20:41:51


version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
//libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.5.0"
//libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.7"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.10
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.1.0"
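// NOTE: the line above pins the Scala 2.10 build of MLlib while scalaVersion
// above is 2.11.8 -- this mismatch is what produces the conflicting
// cross-version suffixes in the error output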

1 Answer:

Answer 0 (score: 1):

You can definitely build against Apache Spark MLlib with Scala 2.11.x. To do so, change the Spark MLlib library dependency from:

libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.1.0"

to:

libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"

Or, better still, use the %% operator, which makes sbt append the Scala binary version suffix derived from your scalaVersion setting automatically:

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.0"
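For reference, here is a minimal sketch of the complete build.sbt with every Spark dependency resolved through %%. It is reconstructed from the build file in the question; the name setting is an assumption, so keep whatever your project already uses:

name := "spark-mllib-sample" // assumed project name, not from the question
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
// %% resolves each artifact with the _2.11 suffix taken from scalaVersion,
// so all Spark modules share a single Scala binary version
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.0"

After editing the file, running sbt clean update forces a fresh resolution, and the conflicting _2.10 artifacts should disappear from the error output.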