I have the following test class:
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterAll, ConfigMap, FunSuite}

class MyTrainingSuiteIT extends FunSuite with BeforeAndAfterAll {

  private[this] var _sc: SparkContext = null
  private[this] val defaultCoresNumber = 1
  private[this] val defaultMaster = s"local[$defaultCoresNumber]"
  private[this] val defaultName = "some-spark-integration-test"

  // Builds the SparkContext according to the -Dmode=... entry in the config map.
  override def beforeAll(configMap: ConfigMap): Unit = {
    super.beforeAll()
    val mode = configMap.get("mode").get
    mode match {
      case "local" =>
        val coresNumber = configMap.get("cores").get
        _sc = new SparkContext(s"local[$coresNumber]", defaultName)
      case "docker" =>
        println("Docker was chosen")
        _sc = new SparkContext(defaultMaster, defaultName)
      case "cluster" =>
        val clusterType = configMap.get("clusterType").get
        println(s"Cluster of type [$clusterType] was chosen.")
        _sc = new SparkContext(defaultMaster, defaultName)
      case _ =>
        println("Unknown mode was chosen")
        _sc = new SparkContext(defaultMaster, defaultName)
    }
  }

  override def afterAll(): Unit = {
    _sc.stop()
    _sc = null
    super.afterAll()
  }

  test("Context testing") {
    assert(defaultMaster == s"local[$defaultCoresNumber]")
  }

  test("Fail test") {
    assert(3 === 2) // fails on purpose
  }
}
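(As a side note on the suite itself: configMap.get("mode").get throws a bare NoSuchElementException if the -Dmode=... argument is forgotten. Below is a minimal defensive sketch; the requiredEntry helper is hypothetical, not part of ScalaTest:)

import org.scalatest.ConfigMap

object ConfigMapHelpers {
  // Hypothetical helper (not part of ScalaTest): fail with a readable message
  // instead of the bare NoSuchElementException thrown by Option.get.
  def requiredEntry(configMap: ConfigMap, key: String): String =
    configMap.get(key) match {
      case Some(value) => value.toString
      case None        => sys.error(s"Missing required config entry: -D$key=...")
    }
}

Inside beforeAll this would be used as: val mode = ConfigMapHelpers.requiredEntry(configMap, "mode")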
First I compile it in IntelliJ IDEA, and then I try to run it from the terminal with a command like this:
scala -classpath /home/Downloads/scalatest_2.10.jar org.scalatest.tools.Runner -R /home/hspark/datasource-tests.jar -s package.name.MyTrainingSuiteIT -Dmode=local -Dcores=2
After ScalaTest starts, a window opens with the following information:

Event: Run Aborted
Message: A needed class was not found. This could be due to an error in your runpath. Missing class: org/apache/spark/SparkContext
Summary: Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
Exception: java.lang.NoClassDefFoundError

How can I fix this?
Answer (score: 1)
Here is a working version of the scala command:
scala -classpath /home/Downloads/scalatest_2.10.jar:/home/spark/core-1.2.19.jar org.scalatest.tools.Runner -R /home/hspark/datasource-tests.jar -s package.name.MyTrainingSuiteIT -Dmode=local -Dcores=2
The exception came from the Spark library missing from the classpath. As @Ben suggested, a build tool such as SBT makes running the tests easier.
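For reference, here is a minimal build.sbt sketch for this kind of suite (the artifact versions are illustrative assumptions, not taken from the question):

// build.sbt -- illustrative versions; adjust to your environment
name := "datasource-tests"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.1",
  "org.scalatest"    %% "scalatest"  % "2.2.4" % "test"
)

ScalaTest config map entries can then be passed through SBT after the -- separator of testOnly:

sbt "testOnly package.name.MyTrainingSuiteIT -- -Dmode=local -Dcores=2"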