How do I convert an RDD of Tuple2 (Key, Value) pairs with duplicate keys into a Map[K, List[V]]?
Example input:
val list = List((1,"a"),(1,"b"),(2,"c"),(2,"d"))
val rdd = sparkContext.parallelize(list)
Expected output:
Map((1,List(a,b)),(2,List(c,d)))
Answer 0 (score: 1)
Just use groupByKey followed by collectAsMap:
val rdd = sc.parallelize(List((1,"a"),(1,"b"),(2,"c"),(2,"d")))
rdd.groupByKey.collectAsMap
// res1: scala.collection.Map[Int,Iterable[String]] =
// Map(2 -> CompactBuffer(c, d), 1 -> CompactBuffer(a, b))
Alternatively, wrap each value in a Seq with map, concatenate with reduceByKey, then collectAsMap:
rdd.map { case (k, v) => (k, Seq(v)) }
  .reduceByKey(_ ++ _)
  .collectAsMap
// res2: scala.collection.Map[Int,Seq[String]] =
// Map(2 -> List(c, d), 1 -> List(a, b))
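As a side note, the same grouping can be reproduced in plain Scala collections (no SparkContext needed), which is handy for sanity-checking the expected output. This is a minimal sketch, not part of either answer; the variable names are illustrative:

```scala
// Pure-Scala equivalent of the groupByKey approach above:
// group pairs by key, then strip the key from each group's values.
val list = List((1, "a"), (1, "b"), (2, "c"), (2, "d"))
val grouped: Map[Int, List[String]] =
  list.groupBy(_._1).map { case (k, pairs) => (k, pairs.map(_._2)) }
// grouped: Map(1 -> List(a, b), 2 -> List(c, d))
```

`groupBy` on a `List` preserves the relative order of elements within each group, so the value lists come out in input order, matching Spark's per-partition ordering for this small example.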
Answer 1 (score: 0)
You can achieve this with groupByKey, collectAsMap, and map, as follows:
val rdd = sc.parallelize(List((1,"a"),(1,"b"),(2,"c"),(2,"d")))
val map = rdd.groupByKey.collectAsMap.map(x => (x._1, x._2.toList))
Sample output:
Map(2 -> List(c, d), 1 -> List(a, b))