New to Scala and sbt, and not sure how to proceed. Am I missing more dependencies?
Steps to reproduce:
gensort.scala: gensort source
Build definition file in ~/spark-1.3.0/project/build.sbt:
lazy val root = (project in file(".")).
  settings(
    name := "gensort",
    version := "1.0",
    scalaVersion := "2.11.6"
  )

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-examples_2.10" % "1.1.1",
  "org.apache.spark" % "spark-core_2.11" % "1.3.0",
  "org.apache.spark" % "spark-streaming-mqtt_2.11" % "1.3.0",
  "org.apache.spark" % "spark-streaming_2.11" % "1.3.0",
  "org.apache.spark" % "spark-network-common_2.10" % "1.2.0",
  "org.apache.spark" % "spark-network-shuffle_2.10" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
Would greatly appreciate any insight on how to move forward. Thanks! Dennis
Answer 0 (score: 0)
You shouldn't mix 2.10 and 2.11; they are not binary compatible. Your libraryDependencies should look like this:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-examples" % "1.1.1",
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.spark" %% "spark-streaming-mqtt" % "1.3.0",
  "org.apache.spark" %% "spark-streaming" % "1.3.0",
  "org.apache.spark" %% "spark-network-common" % "1.2.0",
  "org.apache.spark" %% "spark-network-shuffle" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
%% means that the Scala version is added as a suffix to the library id. After this change I got an error because one dependency could not be found. It resides here:
resolvers += "poho" at "https://repo.eclipse.org/content/repositories/paho-releases"
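To make the %% expansion concrete, here is a minimal sketch (assuming scalaVersion := "2.10.5", as recommended below); the two declarations resolve to the same artifact:

// With scalaVersion := "2.10.5", the binary Scala version is 2.10,
// so sbt expands %% by appending "_2.10" to the artifact name:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
// ...which fetches the same artifact as the fully spelled-out form:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"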
That said, spark-examples does not appear to be published for 2.11. Changing scalaVersion to

scalaVersion := "2.10.5"

resolved all the dependency problems, and compilation succeeded.
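Putting all of the above together, here is a sketch of the full corrected build.sbt (my own assembly of the fixes described in this answer, not a verified configuration):

lazy val root = (project in file(".")).
  settings(
    name := "gensort",
    version := "1.0",
    // 2.10.x, because spark-examples is not published for 2.11
    scalaVersion := "2.10.5"
  )

// needed so the MQTT (Paho) dependency can be resolved
resolvers += "poho" at "https://repo.eclipse.org/content/repositories/paho-releases"

// %% appends _2.10 to every artifact name, keeping all Spark
// dependencies on the same Scala binary version
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-examples" % "1.1.1",
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.spark" %% "spark-streaming-mqtt" % "1.3.0",
  "org.apache.spark" %% "spark-streaming" % "1.3.0",
  "org.apache.spark" %% "spark-network-common" % "1.2.0",
  "org.apache.spark" %% "spark-network-shuffle" % "1.3.0",
  "org.apache.hadoop" % "hadoop-core" % "1.2.1"
)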