I'm trying to run this locally. It works for the first few iterations of the loop, then stops processing partway through the last chunk. I can't see why, or what I'm doing wrong.
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf sparkConf = new SparkConf()
        .setAppName("simpleSparkGetsStuck")
        .setMaster("local[*]")
        .set("spark.executor.memory", "4g");

while (true) {
    // a new context is created on every iteration
    JavaSparkContext sc = new JavaSparkContext(sparkConf);

    // read a list into an RDD
    List<String> data = new ArrayList<>();
    getData(data); // generates a lot of data
    JavaRDD<String> items = sc.parallelize(data, 20);

    // apply a function to each element of the RDD
    items.foreach(item -> {
        ... Do some job ...
    });

    sc.close();
}