Spark anonymous functions

Date: 2014-09-19 16:30:26

Tags: scala mapreduce apache-spark

I have the following Spark code:

    val result: org.apache.spark.rdd.RDD[(Int, (Int, Int))] =
      a.join(b, paralelism)
       .map((input: (Int, (Int, (Int, Int)))) => (input._1, input._2._2))

result.saveAsTextFile(my_hdfs_address_goes_here) 
System.exit(0)
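
For reference, here is a minimal, self-contained sketch of the same join-and-project step; the contents of `a`, `b`, `paralelism`, and the HDFS output path are hypothetical stand-ins, since they are not shown in the question. Writing the projection with `mapValues` keeps the lambda's parameter type simple:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD

    object JoinProjection {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("join-projection"))

        // Hypothetical stand-ins for the question's a, b and paralelism,
        // which are not shown in the snippet.
        val a: RDD[(Int, Int)] = sc.parallelize(Seq((1, 10), (2, 20)))
        val b: RDD[(Int, (Int, Int))] = sc.parallelize(Seq((1, (100, 1000)), (2, (200, 2000))))
        val paralelism = 4

        // Same projection as the map in the question: keep the key and the
        // value that came from b. mapValues only sees the value side of each
        // pair, so there is no need to spell out the full joined tuple type.
        val result: RDD[(Int, (Int, Int))] =
          a.join(b, paralelism).mapValues { case (_, fromB) => fromB }

        result.saveAsTextFile("hdfs:///tmp/join-projection-output") // placeholder path
        sc.stop()
      }
    }

Because mapValues only transforms the value side of each pair, it also preserves the partitioner produced by the join.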

Whenever the map function is the last operation, I get the following error:

14/09/19 18:03:04 WARN TaskSetManager: Loss was due to java.lang.ClassCastException

java.lang.ClassCastException: cannot assign instance of netflix$$anonfun$2 to field org.apache.spark.rdd.MappedRDD.f of type scala.Function1 in instance of org.apache.spark.rdd.MappedRDD

Any ideas on how to get rid of it?

0 Answers:

There are no answers.