Class not found in a simple Spark application

Date: 2015-07-29 20:17:45

Tags: scala apache-spark

I am new to Spark and have written a very simple Spark application in Scala, as follows:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._


object test2object {

  def main(args: Array[String]) {
    val logFile = "src/data/sample.txt"
    val sc = new SparkContext("local", "Simple App", "/path/to/spark-0.9.1-incubating",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    val numTHEs = logData.filter(line => line.contains("the")).count()
    println("Lines with the: %s".format(numTHEs))
  }
}

I wrote the code in Scala IDE and added spark-assembly.jar to my build path. I built a jar file from my project and submitted it to my local Spark cluster with the command `spark-submit --class test2object --master local[2] ./file.jar`, but I get this error message:

  Exception in thread "main" java.lang.NoSuchMethodException: test2object.main([Ljava.lang.String;)
    at java.lang.Class.getMethod(Class.java:1665)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:649)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What is wrong here?

P.S. My source code is located under the project root directory (project/test2object.scala).

1 Answer:

Answer 0 (score: 0)

I haven't used Spark 0.9.1 before, but I believe the problem comes from this line of code:

val sc = new SparkContext("local", "Simple App", "/path/to/spark-0.9.1-incubating", List("target/scala-2.10/simple-project_2.10-1.0.jar"))

If you change it to:

import org.apache.spark.SparkConf

val conf = new SparkConf().setAppName("Simple App")
val sc = new SparkContext(conf)

then it should work.
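For completeness, the whole application rewritten around `SparkConf` might look like the sketch below. The master URL `local[2]` and the input path are assumptions for local testing; in practice the master is usually passed via `--master` on the `spark-submit` command line rather than hard-coded:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object test2object {
  def main(args: Array[String]): Unit = {
    // Configure the application via SparkConf instead of the old
    // multi-argument SparkContext constructor. setMaster here is only
    // for local testing; spark-submit's --master flag normally wins.
    val conf = new SparkConf()
      .setAppName("Simple App")
      .setMaster("local[2]")
    val sc = new SparkContext(conf)

    val logFile = "src/data/sample.txt" // path from the original question

    // Read the file into an RDD with 2 partitions and cache it
    val logData = sc.textFile(logFile, 2).cache()

    // Count the lines that contain the substring "the"
    val numTHEs = logData.filter(line => line.contains("the")).count()
    println("Lines with the: %s".format(numTHEs))

    sc.stop() // release cluster resources cleanly
  }
}
```

Since the jar no longer hard-codes the Spark home or its own jar path, `spark-submit --class test2object --master local[2] ./file.jar` remains the right way to run it.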