I am trying to run a standalone Spark Java example program, but it seems some dependency libraries are missing...
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.SparkSession;

SparkConf conf = new SparkConf().setAppName("Test").setMaster("local[1]");
SparkSession spark = SparkSession.builder().config(conf).getOrCreate();
JavaSparkContext context = new JavaSparkContext(spark.sparkContext());
SQLContext sc = new SQLContext(spark);
However, I get the following error:
Exception in thread "main" java.lang.VerifyError: class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at sparkwrapper.SparkTest.main(SparkTest.java:19)
17/11/20 12:19:48 INFO SparkContext: Invoking stop() from shutdown hook
Answer 0 (score: 0)
The problem turned out to be a configuration issue in Eclipse (Neon 3).
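For reference, a `java.lang.VerifyError` complaining that a class "overrides final method sameResult" is a classic symptom of mixed Spark versions on the classpath (for example, a Spark 1.x jar loaded alongside Spark 2.x classes). A minimal sketch of a consistent Maven dependency block is shown below; the version and Scala suffix are illustrative assumptions, not taken from the question, and should match whatever Spark release you actually target:

```xml
<!-- Keep spark-core and spark-sql at the SAME version and the same
     Scala suffix (_2.11 here); mixing releases can trigger VerifyError
     on final methods such as QueryPlan.sameResult at class-load time. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.2.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.2.0</version>
</dependency>
```

In an IDE such as Eclipse, also check the project's Build Path for stale or duplicate Spark jars left over from an earlier configuration, since those take part in class loading even when the Maven/Gradle dependencies are consistent.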