This is the SBT build file I wrote:
name := "HelloSpark"
version := "1.0"
scalaVersion := "2.12.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"
When I run sbt package, I get this error message:
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11;2.1.1: not found
[error] Total time: 2 s, completed Oct 29, 2017 1:01:13 AM
Answer (score: 2)
Use Scala 2.11:
scalaVersion := "2.11.11"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
You are using Scala 2.12 with Spark 2.1, but Spark 2.1 does not support Scala 2.12; no Spark 2.1 artifacts are published for that Scala version.
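Putting it together, a complete minimal build.sbt for these versions might look like the sketch below (name and version are carried over from the question; the exact 2.11.x patch release is a free choice):

name := "HelloSpark"

version := "1.0"

// Must be a 2.11.x release: Spark 2.1.1 only ships _2.10 and _2.11 builds.
scalaVersion := "2.11.11"

// %% keeps the artifact suffix in sync with scalaVersion.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"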
Just in case, you can find the published spark-core binaries here:
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
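After editing build.sbt, the quickest sanity check is to re-run dependency resolution on its own before packaging:

sbt update
sbt package

If sbt update succeeds, the unresolved-dependency error is gone and packaging should proceed.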