Resolving Spark dependencies with sbt

Time: 2017-06-04 07:57:44

Tags: java scala apache-spark dependencies sbt

I am trying to build a very basic Scala script with a Spark dependency, but I am not able to build a jar out of it.

The build error:

    sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;1.6.0-SNAPSHOT: not found

My build.sbt:

    import Dependencies._

    lazy val root = (project in file(".")).
     settings(
               inThisBuild(List(
                                 organization := "com.example",
                                 scalaVersion := "2.12.1",
                                 version      := "0.1.0-SNAPSHOT"
                              )),
               name := "Hello",
               libraryDependencies +=  "org.apache.spark" %% "spark-core" % "1.6.0-SNAPSHOT",
               resolvers += Resolver.mavenLocal
                )

My Hello.scala:

    package example

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.SparkConf

    object Hello {
      def main(args: Array[String]) {
        val logFile = "/Users/dhruvsha/Applications/spark/README.md"
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println(s"Lines with a: $numAs, Lines with b: $numBs")
        sc.stop()
      }
    }

My Scala source is located at:

    /exampleapp/main/scala/example/Hello.scala

The project name is exampleapp.

Scala version: 2.12.2

Spark version: 1.6.0

sbt version: 0.13.13

Any kind of help would be appreciated, as would resources for learning about sbt and Spark dependencies.

Please note that I am new to Scala, Spark, and sbt.

1 Answer:

Answer 0 (score: 1)

The libraryDependencies line in the build.sbt seems to be wrong.

The correct form should be:

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0"
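
For context, Spark 1.6.0 was only published for Scala 2.10 and 2.11, so a build with scalaVersion 2.12.x and the %% operator looks for spark-core_2.12, which does not exist; the SNAPSHOT suffix also cannot be resolved from Maven Central. Below is a minimal build.sbt sketch (not the asker's exact build; the 2.10.6 patch version is just an example) in which the Scala version and the Spark artifact match:

    lazy val root = (project in file(".")).
      settings(
        inThisBuild(List(
          organization := "com.example",
          // Spark 1.6.0 artifacts exist for Scala 2.10 and 2.11 only,
          // so the Scala version must match the artifact suffix.
          scalaVersion := "2.10.6",
          version      := "0.1.0-SNAPSHOT"
        )),
        name := "Hello",
        // With a 2.10.x scalaVersion, %% resolves this to spark-core_2.10,
        // equivalent to the explicit artifact name used above.
        libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
      )

With a matching Scala version in place, sbt package should then be able to produce the jar. Using Scala 2.11 together with spark-core_2.11 would work just as well; only the 2.12 artifact is unavailable for Spark 1.6.0.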