Scala and Spark version compatibility

Date: 2018-03-22 06:54:57

Tags: scala apache-spark

I am trying to set up Scala in the IntelliJ IDE.

The Scala and Spark versions on my machine:

Welcome to Scala 2.12.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121).

apache-spark/2.2.1

My sbt build file:

scalaVersion := "2.12.5"
resolvers  += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
    Seq( "org.apache.spark" %% "spark-core" % sparkVersion)
}

The error I get:

Error: Error while importing SBT project:
...
[info] Resolving jline#jline;2.14.5 ...
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.1: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;1.4.0: not found
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.1: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;1.4.0: not found

2 Answers:

Answer 0 (score: 3)

The spark-core version you defined in your sbt project is not available for download. You can check the Maven dependency listing for more information about the available versions.

As you can see, for spark-core version 2.2.1, the latest published artifact is compiled against Scala 2.11 (info here).

So either change your sbt build file to

scalaVersion := "2.11.8"
resolvers  += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
    Seq( "org.apache.spark" %% "spark-core" % sparkVersion)
}

or define the Scala build version explicitly in the dependency, as in

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
    Seq("org.apache.spark" % "spark-core_2.11" % sparkVersion)
}
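
Both options request the same artifact: with scalaVersion set to 2.11.x, the %% operator appends the Scala binary version suffix for you, while the plain % form spells the suffix out by hand. A small illustrative sketch of that equivalence (not part of the original answer):

// With scalaVersion := "2.11.8", both lines resolve to spark-core_2.11-2.2.1
libraryDependencies += "org.apache.spark" %% "spark-core"      % "2.2.1"
libraryDependencies += "org.apache.spark" %  "spark-core_2.11" % "2.2.1"

After editing the build file, re-import the project in IntelliJ (or run sbt update) so the dependency is resolved again.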

I hope the answer is helpful.

Answer 1 (score: 0)

Spark 2.2.1 does not support Scala 2.12. You have to do the following:

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
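
For completeness, a minimal build.sbt along the lines of this answer might look as follows; sparkVersion is not defined in the snippet above, so it is declared here as an assumption, and an extra resolver is optional because spark-core is published to Maven Central:

// Minimal sketch assuming Spark 2.2.1, which is built for Scala 2.11
scalaVersion := "2.11.8"

val sparkVersion = "2.2.1"

// %% appends the Scala binary version, so this resolves spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion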

Thanks.