I ran this simple Java program from the Spark quick start guide:
import java.util.Arrays;
import java.util.List;

import org.apache.spark.Accumulator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]");
    JavaSparkContext sc = new JavaSparkContext(conf);
    Accumulator<Integer> counter = sc.accumulator(0);
    List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
    JavaRDD<Integer> rdd = sc.parallelize(data);
    rdd.foreach(counter::add);   // accumulator is updated on the executors, read on the driver
    System.out.println("Counter value " + counter);
}
It prints "Counter value 15" as expected.
I have the same logic written in Scala:
import org.apache.spark.{SparkConf, SparkContext}

object Counter extends App {
  val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
  val sc = new SparkContext(conf)
  val counter = sc.accumulator(0)
  val data = Array(1, 2, 3, 4, 5)
  val rdd = sc.parallelize(data)
  rdd.foreach(x => counter += x)
  println(s"Counter value: $counter")
}
But it prints a wrong result (< 15) every time. What is wrong with my Scala code?
Java Spark dependency: "org.apache.spark:spark-core_2.10:1.6.1"
Scala Spark dependency (sbt): "org.apache.spark" %% "spark-core" % "1.6.1"
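For reference, the sbt side looks roughly like this (a minimal sketch; the project name and exact Scala patch version are assumptions, and %% simply appends the Scala binary version, so it resolves to the same spark-core_2.10 artifact the Java build uses):

// minimal build.sbt sketch; name and Scala patch version are assumptions
name := "simple-application"

scalaVersion := "2.10.6"

// %% appends the Scala binary version, i.e. this resolves to spark-core_2.10:1.6.1
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"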
Answer 0 (score: 6)
The quick-start documentation advises:
Note that applications should define a main() method instead of extending scala.App. Subclasses of scala.App may not work correctly.
Maybe that's the problem? (extends App defers the initialization of the object body, which can interact badly with the closures Spark ships to the executors.)
Try:
import org.apache.spark.{SparkConf, SparkContext}

object Counter {
  // define main() instead of extending App, as the quick-start recommends
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val counter = sc.accumulator(0)
    val data = Array(1, 2, 3, 4, 5)
    val rdd = sc.parallelize(data)
    rdd.foreach(x => counter += x)
    println(s"Counter value: $counter")
  }
}
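With this change, the driver should print "Counter value: 15" again. If you want the plain Int rather than relying on the accumulator's toString, counter.value returns it on the driver.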