Error when trying to import Spark with sbt

Date: 2018-04-12 09:49:25

Tags: scala apache-spark sbt

I downloaded the Play framework starter project (a simple project). I tried to import Spark 2.2.0 via sbt, but I get the following error:

sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.0: not found

The build.sbt file:

name := """play-scala-starter-example"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
resolvers += Resolver.sonatypeRepo("snapshots")
scalaVersion := "2.11.5"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % Test
libraryDependencies += "com.h2database" % "h2" % "1.4.196"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

Every line in the build.sbt file is marked in red with the same error:

expression type must conform to setting in sbt file

The plugin.sbt file:

// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.6.13")

To clarify, I have two problems:

  1. "表达式类型必须符合"在每行的build.sbt文件中。
  2. 无法通过sbt
  3. 导入火花库

1 Answer:

Answer 0 (score: 1):

Spark 2.2.0 is built and distributed to work with Scala 2.11 by default; no spark-core_2.12 artifact exists for that release. To write Spark applications in Scala, you need to use a compatible Scala version (e.g. 2.11.x). The `%%` operator appends your project's Scala binary version to the artifact name, and the unresolved dependency in your error is spark-core_2.12, which means sbt is resolving against Scala 2.12.x. That is why it throws the exception.
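
A minimal sketch of a build.sbt that aligns the Scala version with the Spark 2.2.0 artifacts (assuming you stay on Spark 2.2.0; the exact 2.11 patch version, 2.11.12 here, is illustrative):

name := """play-scala-starter-example"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
resolvers += Resolver.sonatypeRepo("snapshots")
// Spark 2.2.0 is only published for Scala 2.11, so pin the project to a 2.11.x release
scalaVersion := "2.11.12"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % Test
libraryDependencies += "com.h2database" % "h2" % "1.4.196"
// %% now expands to spark-core_2.11, which exists on Maven Central
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

Alternatively, if the project must stay on Scala 2.12, you would need a Spark release that publishes Scala 2.12 artifacts (2.4.0 was the first to do so), e.g. "org.apache.spark" %% "spark-core" % "2.4.0".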