Ways I have tried to add the jars in my code:
1)
SparkConf conf = new SparkConf().setAppName("myApp")
        .setMaster("local[*]")
        .setJars(new String[]{"jar1.jar", "jar2.jar"});
2)
SparkConf conf = new SparkConf().setAppName("myApp")
        .setMaster("local[*]")
        .set("spark.jars", "path/to/jars");
3)
SparkConf conf = new SparkConf().setAppName("myApp")
        .setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
sc.addJar("path/to/jars");
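
For completeness, here is a minimal self-contained version of the driver I am running with attempt 1). The jar paths are placeholders, and the map call just stands in for the place where my code actually uses the third-party library:

package com.test.spark;

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class App {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("myApp")
                .setMaster("local[*]")
                .setJars(new String[]{"jar1.jar", "jar2.jar"}); // placeholder jar paths

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Trivial job; in my real code the map function calls into the third-party jar.
        sc.parallelize(Arrays.asList(1, 2, 3))
          .map(x -> x * 2)
          .collect();

        sc.stop();
    }
}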
But I still get a java.lang.ClassNotFoundException for classes from the third-party library.
The same code runs fine when I submit it with the --jars option:
./spark-submit --class com.test.spark.App --jars jar1.jar,jar2.jar main.jar
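
In case it helps, this is just a debugging sketch of how I can print what the conf and the created context end up holding for spark.jars (the paths are placeholders):

SparkConf conf = new SparkConf()
        .setAppName("myApp")
        .setMaster("local[*]")
        .set("spark.jars", "jar1.jar,jar2.jar"); // placeholder, comma-separated paths

// What the conf itself contains before the context is created.
System.out.println("conf spark.jars = " + conf.get("spark.jars", "<not set>"));

JavaSparkContext sc = new JavaSparkContext(conf);

// What the running context reports after it has been created.
System.out.println("sc   spark.jars = " + sc.getConf().get("spark.jars", "<not set>"));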
Spark version: spark-2.0.0-bin-hadoop2.7