I'm writing a program to get Cassandra and Spark working together, but I can't even get it to compile. I'm using SBT as the build tool and I have declared all the dependencies the program needs. The first time I ran sbt run it downloaded the dependencies, but when it started compiling the Scala code shown below I got these errors:
[info] Compiling 1 Scala source to /home/vagrant/ScalaTest/target/scala-2.10/classes...
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:6: not found: type SparkConf
[error] val conf = new SparkConf(true)
[error] ^
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:9: not found: type SparkContext
[error] val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Jun 5, 2015 2:40:09 PM
Here is the SBT build file:
lazy val root = (project in file(".")).
settings(
name := "ScalaTest",
version := "1.0"
)
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M1"
Here is the actual Scala program:
import com.datastax.spark.connector._

object ScalaTest {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
  }
}
Here is my directory structure:
- ScalaTest
  - build.sbt
  - project
  - src
    - main
      - scala
        - ScalaTest.scala
  - target
Answer (score: 2)
I don't know if this is the only problem, but you are not importing the SparkConf and SparkContext class definitions, so the compiler cannot resolve those types. Try adding the following to your Scala file:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
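For reference, here is a minimal sketch of what ScalaTest.scala could look like with those imports added; the rest of the program is left exactly as in the question:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import com.datastax.spark.connector._

object ScalaTest {
  def main(args: Array[String]) {
    // Spark configuration, pointing the Cassandra connector at the local node
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    // Connect to the standalone Spark master given in the question
    val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
  }
}

If the Spark types still cannot be resolved after adding the imports, spark-core may not be on the compile classpath; whether the spark-cassandra-connector artifact pulls it in transitively depends on the connector version, so declaring it explicitly in build.sbt with a version that matches your cluster (for example, libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1") is worth trying.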