reduceByKey does not work with a generic type parameter

Asked: 2016-04-19 13:35:33

Tags: scala generics apache-spark

I want to reduce generic values in key/value pairs with a provided function, but it gives this error:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, T)]
val reduce = (rDD:RDD[(String,T)]) => rDD.reduceByKey((x,y) => y)
                                          ^ 

reduceByKey does not work when the value type is a generic T.
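For context, reduceByKey is not defined on RDD itself; it comes from PairRDDFunctions via an implicit conversion that needs ClassTag evidence for the key and value types, so an unconstrained T is not enough. A minimal sketch of a variant that should compile, assuming a ClassTag context bound is acceptable (the name reduceLast is illustrative):

import scala.reflect.ClassTag
import org.apache.spark.rdd.RDD

// Sketch: the ClassTag bound lets RDD[(String, T)] pick up PairRDDFunctions,
// so reduceByKey resolves; the reduce function simply keeps the last value.
def reduceLast[T: ClassTag](rdd: RDD[(String, T)]): RDD[(String, T)] =
  rdd.reduceByKey((x, y) => y)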

Also,

val map = rdd.map()
val reduce = (rDD:RDD[(String,T)]) => rDD.reduceByKey((x,y) => y)

When I try to compose (curry) them using

map andThen reduce

it again says the symbol andThen cannot be found.
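Note that andThen is a method on Function1, so it is only available when both steps are plain function values; rdd.map(...) returns an RDD, not a function, which is one reason the symbol may not be found. A sketch assuming both steps are kept as functions from RDD to RDD:

import scala.reflect.ClassTag
import org.apache.spark.rdd.RDD

// Sketch: both stages are Function1 values, so Function1.andThen composes them.
def pipeline[T: ClassTag](f: String => T): RDD[String] => RDD[(String, T)] = {
  val map: RDD[String] => RDD[(String, T)] = _.map(x => (x, f(x)))
  val reduce: RDD[(String, T)] => RDD[(String, T)] = _.reduceByKey((x, y) => y)
  map andThen reduce
}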

Update

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

class Aggragator[T](val rdd: RDD[String], function: Function[String, T]) {

  val mapped = rdd.map(x => (x, function.apply(x)))
  val reduce = mapped.reduceByKey((x, y) => y)
}

object Aggragator {

  val conf = new SparkConf()
    .setMaster("local[2]")
    .setAppName("xx")

  val sc = new SparkContext(conf)
  val rdd = sc.parallelize(List("1", "2"))
  val reduced = new Aggragator[Int](rdd, (x: String) => x.toInt).reduce.collect()

  def main(args: Array[String]) {
    println(reduced)
  }
}
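For comparison, a sketch of the same class with only a ClassTag context bound added to T; whether this fits the intended design is an assumption, but it is the usual way to make reduceByKey resolve for a generic value type:

import scala.reflect.ClassTag
import org.apache.spark.rdd.RDD

// Sketch: identical to the class above except for the ClassTag bound on T.
class Aggragator[T: ClassTag](val rdd: RDD[String], function: Function[String, T]) {
  val mapped = rdd.map(x => (x, function.apply(x)))
  val reduce = mapped.reduceByKey((x, y) => y)
}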

0 Answers:

No answers yet.