Spark word count example showing an error?

Time: 2018-11-13 06:06:32

Tags: java apache-spark word-count

I tried the Spark word count code below. The program compiles without errors, but it fails at runtime. Can anyone help me solve this? Thanks in advance!

import org.apache.spark.api.java.*;
import org.apache.spark.SparkConf;
import scala.Tuple2;
import java.util.Arrays;

public class Wordcount {

 public static void main(String[] args) {

  SparkConf conf = new SparkConf().setMaster("local").setAppName("wordCount");
  JavaSparkContext sc = new JavaSparkContext(conf);

  // Load our input data.
  String inputFile = "Path\\test.txt";

  JavaRDD<String> input = sc.textFile(inputFile);

  // Split each line into a list of words
  JavaRDD<String> words = input.flatMap(l -> Arrays.asList(l.split(" ")).iterator());

  // Transform into (word, 1) pairs and count.
  JavaPairRDD<String, Integer> pairs = words.mapToPair(w -> new Tuple2<String, Integer>(w, 1));

  JavaPairRDD<String, Integer> counts = pairs.reduceByKey((x, y) -> x + y);

  System.out.println(counts.collect());

  sc.stop();
 }
}

But when I run the program, it shows the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:75)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:70)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:57)
    at Wordcount.main(Wordcount.java:10)
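For context: a `NoClassDefFoundError` for `org/apache/hadoop/fs/FSDataInputStream` usually means the Hadoop client libraries are missing from the runtime classpath (Spark depends on them even for local runs, e.g. when Spark jars are marked `provided` or the program is launched from an IDE without them). A minimal Maven dependency sketch — the artifact versions below are assumptions and should be matched to your actual Spark build:

```xml
<!-- Hypothetical pom.xml fragment: version numbers are assumptions,
     pick the ones matching your Spark/Hadoop installation -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.2</version>
  </dependency>
  <!-- Provides org.apache.hadoop.fs.FSDataInputStream -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
  </dependency>
</dependencies>
```

If the project is not Maven-based, the equivalent fix is to add the Hadoop jars to the classpath when launching the program.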

0 Answers:

No answers yet