Spark Scala sbt standalone application

Time: 2016-11-22 20:29:11

Tags: scala hadoop apache-spark sbt-assembly

I am working on a Spark Scala project (a recommender system) and I have a problem running my application with spark-submit: it throws an exception. My source file is at projectFilms/src/main/scala/AppFilms.scala. I compiled it with `sbt package`, which worked fine and produced a jar file:
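For context, a minimal build.sbt for a project like this might look as follows. This is a hypothetical sketch: the project name is taken from the jar name in the log below, but the Scala and Spark versions are assumptions, since the post does not show the actual build file.

```scala
// build.sbt -- hypothetical sketch; Spark version is an assumption
name := "system of recommandation"

version := "1.0"

scalaVersion := "2.11.8"

// Marked "provided" because spark-submit supplies Spark at runtime,
// so sbt package does not need to bundle it into the jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.6.2" % "provided"
```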

[info] Set current project to system of recommandation (in build file:/root/projectFilms/)
[info] Compiling 1 Scala source to /root/projectFilms/target/scala-2.11/classes...
[info] Packaging /root/projectFilms/target/scala-2.11/system-of-recommandation_2.11-1.0.jar ...
[info] Done packaging.
[success] Total time: 35 s, completed Nov 22, 2016 7:31:59 PM

But when I run my application with spark-submit:

spark-submit --class "AppFilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

it throws an exception:

java.lang.ClassNotFoundException: AppFilms
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:634)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/22 19:32:33 INFO Utils: Shutdown hook called
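This ClassNotFoundException usually means the name passed to `--class` does not match the fully qualified name of the object inside the jar. The real AppFilms.scala is not shown in the post, but its skeleton presumably looks something like the following sketch (the package name `recommend` here is purely hypothetical):

```scala
// AppFilms.scala -- hypothetical skeleton; the actual source is not shown.
// If the object is declared inside a package, e.g.:
//   package recommend
// then spark-submit must be given the fully qualified name:
//   spark-submit --class "recommend.AppFilms" ...
// A bare --class "AppFilms" only works when the object has no package.
import org.apache.spark.{SparkConf, SparkContext}

object AppFilms {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("AppFilms")
    val sc = new SparkContext(conf)
    // ... recommendation logic ...
    sc.stop()
  }
}
```

The packaged class names can be checked with `jar tf target/scala-2.11/system-of-recommandation_2.11-1.0.jar`, which lists every .class file (and thus the package path) actually inside the jar.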

Can anyone help me with this?

0 Answers