I compiled a Java program and tried to run it with Spark, but it throws a ClassNotFoundException even though the class file exists.
package org.apache.spark.examples;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class JavaHelloWorld
{
    public static void main(String[] args) throws Exception
    {
        SparkConf sparkConf = new SparkConf().setAppName("JavaSparkPi");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        System.out.println("Hello World... Niyat From Apache Spark");
        jsc.stop(); // release the SparkContext before exiting
    }
}
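For context, a program like this is normally packaged into a jar and launched with spark-submit, and the value passed to --class is where a ClassNotFoundException like this usually originates. The jar path and master URL below are hypothetical placeholders; the point is that --class must match the declared class name exactly, including case:

```shell
# Hypothetical invocation; substitute your own jar path and master.
# --class must name the class exactly as declared:
# JavaHelloWorld, not javaHelloWorld.
spark-submit \
  --class org.apache.spark.examples.JavaHelloWorld \
  --master local[2] \
  target/java-hello-world.jar
```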
Answer 0 (score 2):
You have to write the exact name of the class. Your initial 'j' is lowercase: the class is declared as JavaHelloWorld, so submitting it as javaHelloWorld will not be found, because class lookup on the JVM is case-sensitive.
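The case-sensitivity behind this error can be reproduced without Spark at all. The sketch below (class name and strings are my own, not from the original post) uses Class.forName to show that the JVM rejects a class name that differs only in case:

```java
// Demonstrates that JVM class lookup is case-sensitive:
// a name that differs only in case raises ClassNotFoundException.
public class CaseSensitiveLookup {
    public static void main(String[] args) {
        try {
            // Wrong: lowercase 's' in "string".
            Class.forName("java.lang.string");
            System.out.println("found java.lang.string");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException for java.lang.string");
        }
        try {
            // Correct: exact declared name.
            Class.forName("java.lang.String");
            System.out.println("found java.lang.String");
        } catch (ClassNotFoundException e) {
            System.out.println("not found");
        }
    }
}
```

The same rule applies to the fully qualified name given to spark-submit: it must match the declared class character for character.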