I installed sbt-1.3.4.msi and tried to build the sample SparkPi.scala application, but I get the following errors:
C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("Spark Pi")
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error] val spark = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
The SparkPi.scala file is located in C:\myapps\sbt\sparksample\project\src\main\scala (as shown in the error messages above).

What am I missing here?
The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
Answer 0 (score: 1)
You have the SparkPi.scala file in the C:\myapps\sbt\sparksample\project\src\main\scala directory.

That is the problem. You have placed your Scala source files under the project directory, which is reserved for sbt itself (it holds the build definition, not the sources of the sbt-managed Scala project). That is why the compiler resolves them against sbt's own classpath, where spark-core is not available.

Move SparkPi.scala and any other Scala source files to C:\myapps\sbt\sparksample\src\main\scala.
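For reference, the resulting tree should follow sbt's standard layout (a sketch; only the paths mentioned in the question are taken from it, the annotations are mine):

```text
C:\myapps\sbt\sparksample\
├── sparksample.sbt        <- build definition (name, scalaVersion, libraryDependencies)
├── project\               <- sbt's own build code and settings; no application sources here
└── src\
    └── main\
        └── scala\
            └── SparkPi.scala   <- application sources go here
```

With the file in that location, the spark-core dependency declared in sparksample.sbt is on the compile classpath and the import org.apache.spark._ resolves.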