SBT dependency error

Date: 2017-10-28 19:41:12

Tags: scala apache-spark sbt

Here is the SBT build file I wrote:

name := "HelloSpark"

version := "1.0"

scalaVersion := "2.12.2"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"

When I run sbt package, I get the following error:

[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11;2.1.1: not found

[error] Total time: 2 s, completed Oct 29, 2017 1:01:13 AM

1 Answer:

Answer 0 (score: 2)

Use Scala 2.11:

scalaVersion := "2.11.11"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"

As for using Scala 2.12 with Spark 2.1: Spark does not support Scala 2.12, so there is no spark-core_2.12 artifact to resolve.

Just in case, you can find the spark-core binaries here: https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
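
To see why %% fixes the resolution: sbt expands it to the artifact name with the project's Scala binary version appended, so with scalaVersion := "2.11.11" the two declarations below resolve to the same artifact listed at that URL:

// Cross-versioned form: sbt rewrites this to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"

// Equivalent explicit form: the suffix must match scalaVersion by hand
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"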