build.sbt: how to add Spark dependencies

Asked: 2016-06-22 03:41:23

Tags: scala apache-spark sbt spark-streaming

Hi, I am trying to pull in spark-core, spark-streaming, twitter4j, and spark-streaming-twitter in my build.sbt file:

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

libraryDependencies ++= Seq(
  "org.twitter4j" % "twitter4j-core" % "3.0.3",
  "org.twitter4j" % "twitter4j-stream" % "3.0.3"
)

libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"

I simply copied these libraryDependencies entries from the internet, so I am not sure which versions to use, etc.

Can someone explain how I should fix this .sbt file? I have spent hours trying to figure it out, but none of the suggestions worked. I installed Scala through Homebrew and I am on version 2.11.8.

All of my errors are about:

Modules were resolved with conflicting cross-version suffixes.

2 Answers

Answer 0 (score: 42)

The problem is that you are mixing Scala 2.11 and 2.10 artifacts. You have:

scalaVersion := "2.11.8"

and then:

libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

which explicitly requests the 2.10 artifact. You are also mixing Spark versions instead of using a consistent one:

// spark 1.6.1
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

// spark 1.4.1
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

// spark 0.9.0-incubating
libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"

Here is a build.sbt that fixes both problems:

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)

You also do not need to add the twitter4j dependencies manually, since they are pulled in transitively by spark-streaming-twitter.
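
Once the build resolves, you can exercise the dependencies with a short streaming job. This is only a minimal sketch (not part of the original answer), assuming your Twitter OAuth credentials are supplied as twitter4j.oauth.* system properties and Spark 1.6.1 as above:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils

object Hello {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("hello").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Passing None makes twitter4j fall back to the twitter4j.oauth.* system properties
    val tweets = TwitterUtils.createStream(ssc, None)
    tweets.map(_.getText).print()

    ssc.start()
    ssc.awaitTermination()
  }
}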

Answer 1 (score: 3)

This worked for me:

name := "spark_local"

version := "0.1"

scalaVersion := "2.11.8"


libraryDependencies ++= Seq(
  "org.twitter4j" % "twitter4j-core" % "3.0.5",
  "org.twitter4j" % "twitter4j-stream" % "3.0.5",
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-sql" % "2.0.0",
  "org.apache.spark" %% "spark-mllib" % "2.0.0",
  "org.apache.spark" %% "spark-streaming" % "2.0.0"
)
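
Note that this build does not include a Twitter connector: in Spark 2.x, spark-streaming-twitter was moved out of the main Spark distribution into Apache Bahir. If you still need it with the versions above, a dependency along these lines should work (the exact Bahir coordinates and version are an assumption; check Maven Central for the release matching your Spark version):

// Twitter DStream connector for Spark 2.x is published by Apache Bahir (assumed version)
libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.0"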