Error compiling a Spark program with SBT

Asked: 2018-10-14 22:54:36

Tags: scala apache-spark sbt

Versions used:

Spark version: 2.3.2

Scala version: 2.11.8

SBT version: 1.2.4

build.sbt:

name := "spark_demo"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.2",
  "org.apache.spark" %% "spark-sql"  % "2.3.2",
  "org.apache.spark" %% "spark-hive"  % "2.3.2"
)
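The build.sbt above is self-consistent: Spark 2.3.2 artifacts are published for Scala 2.11, so `scalaVersion := "2.11.8"` matches the `%%` (binary-version-suffixed) dependency resolution. Since the actual compile error is only visible in the screenshot below, no specific fix can be inferred; as a hedged sketch only, a common variant of this build when the jar is later submitted to a cluster with spark-submit is to scope the Spark dependencies as "provided", so they are on the compile classpath but excluded from the packaged jar:

```scala
// Hypothetical alternative build.sbt (same versions as the question's).
// "provided" keeps Spark available at compile time while leaving it out of
// the assembled jar, since the cluster supplies Spark at runtime.
name := "spark_demo"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-hive" % "2.3.2" % "provided"
)
```

Note that with "provided" scope, `sbt run` will fail with NoClassDefFoundError unless the Spark classes are supplied some other way; this sketch only applies to the spark-submit packaging workflow.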

Error:

[error screenshot; text not recoverable]

0 Answers:

There are no answers yet.