value _2 is not a member of Double spark-shell

Time: 2017-01-28 06:06:41

Tags: scala apache-spark

I am running into an error while implementing aggregateByKey in the spark-scala-shell.

The code I am trying to execute in the Scala shell is this:

val orderItemsMapJoinOrdersMapMapAgg = orderItemsMapJoinOrdersMapMap.aggregateByKey( 0.0,0)( (a,b) => (a._1 + b , a._2 +1), (a,b) => (a._1 + b._1 , a._2 + b._2 ))

scala> orderItemsMapJoinOrdersMapMap
res8: org.apache.spark.rdd.RDD[(String, Float)] = MapPartitionsRDD[16] at map at <console>:37

But I am getting the following error:

<console>:39: error: value _1 is not a member of Double

Can someone help me understand the Double and Float value logic here and how to fix it?

1 Answer:

Answer 0: (score: 1)

The problem is that you are supplying the first curried argument in the wrong way. It should be like this:

import org.apache.spark.rdd.RDD

val orderItemsMapJoinOrdersMapMap: RDD[(String, Float)] = ...

// so the elements of your orderItemsMapJoinOrdersMapMap are (String, Float),
// and your accumulator looks like (Double, Int);
// thus I believe you just want to accumulate the total number of elements
// and the sum of the Float values in them

val orderItemsMapJoinOrdersMapMapAgg = orderItemsMapJoinOrdersMapMap
  .aggregateByKey((0.0, 0))(
    (acc, elem) => (acc._1 + elem, acc._2 + 1),
    (acc1, acc2) => (acc1._1 + acc2._1, acc1._2 + acc2._2)
  )
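
For context on why the original call fails: PairRDDFunctions overloads aggregateByKey as aggregateByKey(zeroValue)(seqOp, combOp) and aggregateByKey(zeroValue, numPartitions)(seqOp, combOp). Written as aggregateByKey( 0.0,0), the zero value is inferred as the Double 0.0 and the 0 is taken as numPartitions, so the accumulator is a plain Double and ._1 / ._2 do not exist on it. Wrapping the pair as ((0.0, 0)) makes the accumulator a (Double, Int) tuple. Below is a minimal sketch of the fixed call that can be pasted into spark-shell; the sample keys and prices are made up for illustration and are not the asker's actual dataset.

// Minimal sketch for spark-shell (uses the shell's predefined SparkContext `sc`);
// the data below is hypothetical sample data.
val orderItemsMapJoinOrdersMapMap = sc.parallelize(Seq(
  ("2013-07-25", 199.99f),
  ("2013-07-25", 129.99f),
  ("2013-07-26", 299.98f)
))

// The zero value is the tuple (0.0, 0), so the accumulator type is (Double, Int):
// _1 accumulates the sum of the Float values, _2 counts the elements per key.
val agg = orderItemsMapJoinOrdersMapMap.aggregateByKey((0.0, 0))(
  (acc, price) => (acc._1 + price, acc._2 + 1),
  (acc1, acc2) => (acc1._1 + acc2._1, acc1._2 + acc2._2)
)

agg.collect()
// each key now maps to (sum of its Float values, number of elements)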