My build.sbt file contains the following:
import AssemblyKeys._
assemblySettings
name := "acme-get-flight-delays"
version := "0.0.1"
scalaVersion := "2.10.5"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-hive" % "1.6.0",
"org.scalanlp" %% "breeze" % "0.11.2",
"net.liftweb" %% "lift-json" % "2.5+",
"com.github.seratch" %% "awscala" % "0.3.+"
)
and it builds fine with sbt assembly.
However, if I try to add
"org.apache.spark" % "spark-mllib_2.11" % "1.6.2"
as a dependency, I get the following error:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/Users/paulreiners/dev/tv-insight-airport-delays/}tv-insight-airport-delays:
[error] org.apache.spark:spark-launcher _2.10, _2.11
[error] org.json4s:json4s-ast _2.10, _2.11
[error] org.apache.spark:spark-catalyst _2.10, _2.11
[error] org.apache.spark:spark-network-shuffle _2.10, _2.11
[error] com.typesafe.akka:akka-actor _2.10, _2.11
[error] com.twitter:chill _2.10, _2.11
[error] org.apache.spark:spark-sql _2.10, _2.11
[error] org.json4s:json4s-jackson _2.10, _2.11
[error] com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error] org.scalanlp:breeze-macros _2.10, _2.11
[error] org.json4s:json4s-core _2.10, _2.11
[error] org.apache.spark:spark-unsafe _2.10, _2.11
[error] org.spire-math:spire _2.10, _2.11
[error] org.scalanlp:breeze _2.10, _2.11
[error] com.typesafe.akka:akka-remote _2.10, _2.11
[error] com.typesafe.akka:akka-slf4j _2.10, _2.11
[error] org.spire-math:spire-macros _2.10, _2.11
[error] org.apache.spark:spark-core _2.10, _2.11
[error] org.apache.spark:spark-network-common _2.10, _2.11
java.lang.RuntimeException: Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, com.typesafe.akka:akka-actor, com.twitter:chill, org.apache.spark:spark-sql, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.scalanlp:breeze-macros, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.spire-math:spire, org.scalanlp:breeze, com.typesafe.akka:akka-remote, com.typesafe.akka:akka-slf4j, org.spire-math:spire-macros, org.apache.spark:spark-core, org.apache.spark:spark-network-common
at scala.sys.package$.error(package.scala:27)
at sbt.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:46)
at sbt.ConflictWarning$.apply(ConflictWarning.scala:32)
at sbt.Classpaths$$anonfun$69.apply(Defaults.scala:1219)
at sbt.Classpaths$$anonfun$69.apply(Defaults.scala:1216)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, com.typesafe.akka:akka-actor, com.twitter:chill, org.apache.spark:spark-sql, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.scalanlp:breeze-macros, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.spire-math:spire, org.scalanlp:breeze, com.typesafe.akka:akka-remote, com.typesafe.akka:akka-slf4j, org.spire-math:spire-macros, org.apache.spark:spark-core, org.apache.spark:spark-network-common
What am I doing wrong here?
Answer (score: 2):
This looks like a Scala cross-version compatibility problem: your scalaVersion is 2.10.5, but spark-mllib_2.11 is built for Scala 2.11, so it pulls in _2.11 builds of Spark's transitive dependencies alongside the _2.10 ones already on the classpath. Use the _2.10 artifact instead:
"org.apache.spark" % "spark-mllib_2.10" % "1.6.2"
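Equivalently, you can let sbt append the suffix for you with %% instead of hard-coding it. Below is a minimal sketch of the dependency block under that assumption (staying on Scala 2.10.5 and reusing the versions from the question); it is an illustration, not the asker's exact build:

libraryDependencies ++= Seq(
  // %% appends the project's Scala binary suffix (_2.10 here), so every
  // artifact resolves against the same Scala version.
  "org.apache.spark"   %% "spark-hive"  % "1.6.0",
  "org.apache.spark"   %% "spark-mllib" % "1.6.2",
  "org.scalanlp"       %% "breeze"      % "0.11.2",
  "net.liftweb"        %% "lift-json"   % "2.5+",
  "com.github.seratch" %% "awscala"     % "0.3.+"
)

Keeping the Spark artifacts on the same version (for example, both on 1.6.2 or both on 1.6.0) is also generally safer, although it is the mixed _2.10/_2.11 suffixes that trigger the error above.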