Spark with Java - Error: Cannot load main class from JAR

Date: 2017-06-22 18:46:29

Tags: java apache-spark intellij-idea apache-spark-mllib spark-submit

I am trying out a simple movie-recommendation machine learning program in Spark. Spark version: 2.1.1. Java version: Java 8. Scala version: Scala code runner version 2.11.7. Environment: Windows 7.

I run these commands to start the master and a worker:

//start master
spark-class org.apache.spark.deploy.master.Master

//start worker
spark-class org.apache.spark.deploy.worker.Worker spark://valid ip:7077

I am trying a very simple movie recommendation example: http://blogs.quovantis.com/recommendation-engine-using-apache-spark/

I have updated the code to:

SparkConf conf = new SparkConf().setAppName("Collaborative Filtering Example").setMaster("spark://valid ip:7077");
conf.setJars(new String[] {"C:\\Spark2.1.1\\spark-2.1.1-bin-hadoop2.7\\jars\\spark-mllib_2.11-2.1.1.jar"});

I cannot run this through IntelliJ. Running mvn clean install and copying the jar to the folder does not work either. The command I ran:

bin\spark-submit --verbose –-jars jars\spark-mllib_2.11-2.1.1.jar –-class “com.abc.enterprise.RecommendationEngine” –-master spark://valid ip:7077 C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\spark-poc-1.0-SNAPSHOT.jar C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\ratings.csv C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\movies.csv 10
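Note that the command above contains en dashes (`–-jars`, `–-class`, `–-master`) and curly quotes (`“…”`) rather than ASCII double hyphens and straight quotes, which commonly happens when a command is copied through a rich-text editor. A small check (class name `DashCheck` is just for illustration) shows these characters are not what spark-submit parses as option flags:

```java
public class DashCheck {
    public static void main(String[] args) {
        // "–-class" as typed above: U+2013 EN DASH followed by one ASCII hyphen
        String typed = "\u2013-class";
        String expected = "--class";

        // The two strings are not equal, so spark-submit does not
        // recognize the typed token as the --class option.
        System.out.println(typed.equals(expected));           // false
        System.out.println((int) typed.charAt(0));            // 8211 (U+2013)
        System.out.println((int) expected.charAt(0));         // 45 (ASCII '-')
    }
}
```

Because the token does not start with `--`, spark-submit treats it as a positional argument (the application jar path) instead of an option.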

The error I see is:

C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7>bin\spark-submit --verbose --class "com.sandc.enterprise.RecommendationEngine" --master spark://10.64.98.101:7077 C:\Spark2.1.1\spark-2.1.1-
bin-hadoop2.7\spark-mllib-example\spark-poc-1.0-SNAPSHOT.jar C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\ratings.csv C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-m
llib-example\movies.csv 10
Using properties file: C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\bin\..\conf\spark-defaults.conf
Adding default property: spark.serializer=org.apache.spark.serializer.KryoSerializer
Adding default property: spark.executor.extraJavaOptions=-XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
Adding default property: spark.eventLog.enabled=true
Adding default property: spark.driver.memory=5g
Adding default property: spark.master=spark://valid ip:7077
Error: Cannot load main class from JAR file:/C:/Spark2.1.1/spark-2.1.1-bin-hadoop2.7/û-class
Run with --help for usage help or --verbose for debug output

If I add the --jars option, it gives this error:

Error: Cannot load main class from JAR file:/C:/Spark2.1.1/spark-2.1.1-bin-hadoop2.7/û-jars
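The `û-class` and `û-jars` fragments in these paths are the giveaway: the en dash (0x96 in Windows-1252) renders as `û` under the OEM console code page, so spark-submit received `–-class` as a positional argument and tried to load it as the application jar. Retyping the command with plain ASCII double hyphens and without quotes (paths and the master URL left exactly as in the question) should resolve it:

```
bin\spark-submit --verbose --jars jars\spark-mllib_2.11-2.1.1.jar --class com.abc.enterprise.RecommendationEngine --master spark://valid ip:7077 C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\spark-poc-1.0-SNAPSHOT.jar C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\ratings.csv C:\Spark2.1.1\spark-2.1.1-bin-hadoop2.7\spark-mllib-example\movies.csv 10
```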

Any ideas on how I can submit this job to Spark?

1 Answer:

Answer 0 (score: 0):

Is your jar built correctly? Also, you do not need to wrap the --class option value in double quotes.