reduceByKey type mismatch

Time: 2015-04-22 18:27:28

Tags: scala apache-spark

I have this collection:

res22: Array[(String, List[(String, Int)])] = Array((Door_182,List((IN,1), (IN,1))), (Door_89,List((IN,1), (IN,1), (IN,1))), (Door_180,List((IN,1), (IN,1), (IN,1), (IN,1))), (Door_83,List((IN,1), (IN,1), (IN,1))), (Door_177,List((IN,1), (IN,1))), (Door_23,List((IN,1), (IN,1))), (Door_128,List((IN,1), (IN,1))), (Door_34,List((IN,1), (IN,1))), (Door_18,List((IN,1), (IN,1))), (Door_32,List((IN,1))), (Door_76,List((IN,1), (IN,1), (IN,1))), (Door_87,List((IN,1), (IN,1), (IN,1))), (Door_197,List((IN,1), (IN,1))), (Door_133,List((IN,1), (IN,1))), (Door_119,List((IN,1), (IN,1))), (Door_113,List((IN,1), (IN,1), (IN,1), (IN,1), (IN,1))), (Door_155,List((IN,1), (IN,1), (IN,1), (IN,1), (IN,1))), (Door_168,List((IN,1), (IN,1), (IN,1))), (Door_115,List((IN,1), (IN,1))), (Door_9,List((IN,1), (IN,1))),...

I tried to count the number of INs per door with this:

scala> reduced.map(n => (n._1, n._2)).reduceByKey((v1,v2) => v1 + v2.toString).collect

I got this error:

<console>:32: error: type mismatch;
found   : List[(String, Int)]
required: String
          reduced.map(n => (n._1, n._2)).reduceByKey((v1,v2) => v1 + v2).collect
                                                                     ^

How can I fix this?

1 Answer:

Answer 0 (score: 1)

You can do it in two steps: first concatenate all of the lists for each key, then sum the values inside each list:

val x = sc.parallelize(List(("Door_182",List(("IN",1), ("IN",1))), ("Door_89",List(("IN",1), ("IN",1), ("IN",1))), ("Door_180",List(("IN",1), ("IN",1), ("IN",1), ("IN",1))), ("Door_83",List(("IN",1), ("IN",1), ("IN",1))), ("Door_177",List(("IN",1), ("IN",1)))))
x.reduceByKey(_ ::: _)
  .map {
    case (door, list) => (door, list.foldLeft(0) {
      case (count1, (in2, count2)) => count1 + count2
    })
  }.collect()
res3: Array[(String, Int)] = Array((Door_180,4), (Door_83,3), (Door_177,2), (Door_182,2), (Door_89,3))
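The two-step logic above can be checked without a Spark context by mirroring it on plain Scala collections: `groupBy` plus `flatMap` stands in for `reduceByKey(_ ::: _)`, and the `foldLeft` is identical. This is a sketch using a subset of the question's sample data; the object name is illustrative.

```scala
// Pure-Scala sketch of the answer's two-step logic (no Spark required).
object DoorCountSketch {
  def main(args: Array[String]): Unit = {
    val data = List(
      ("Door_182", List(("IN", 1), ("IN", 1))),
      ("Door_89",  List(("IN", 1), ("IN", 1), ("IN", 1)))
    )
    // Step 1: concatenate all lists belonging to the same key.
    val merged = data.groupBy(_._1).map { case (k, vs) => (k, vs.flatMap(_._2)) }
    // Step 2: fold each list down to the sum of its Int counts.
    val counts = merged.map { case (door, list) =>
      (door, list.foldLeft(0) { case (acc, (_, c)) => acc + c })
    }
    println(counts) // per-door totals, e.g. Door_182 -> 2 and Door_89 -> 3
  }
}
```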

Or avoid the extra intermediate-list allocation by doing it in a single aggregateByKey operation:

x.aggregateByKey(0)(
  {
    case (count, list) => count + list.foldLeft(0) {
      case (count1, (in2, count2)) => count1 + count2
    }
  },
  _ + _
).collect()
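aggregateByKey takes three pieces: a zero value, a seqOp that folds each value into a per-partition accumulator, and a combOp that merges accumulators across partitions. A minimal pure-Scala sketch of that contract, applied by hand (no Spark needed; the two-partition split of Door_180's records is a made-up assumption for illustration):

```scala
// Sketch of aggregateByKey's zero / seqOp / combOp contract, applied manually.
object AggregateByKeySketch {
  def main(args: Array[String]): Unit = {
    val zero = 0
    // seqOp: fold one List value into the running count for a key.
    val seqOp: (Int, List[(String, Int)]) => Int =
      (count, list) => count + list.foldLeft(0) { case (c, (_, n)) => c + n }
    // combOp: merge partial counts computed on different partitions.
    val combOp: (Int, Int) => Int = _ + _
    // Pretend Door_180's four records were split across two partitions:
    val part1 = seqOp(zero, List(("IN", 1), ("IN", 1)))
    val part2 = seqOp(zero, List(("IN", 1), ("IN", 1)))
    println(combOp(part1, part2)) // 4, matching (Door_180,4) in res3 above
  }
}
```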