Spark / Scala: creating a nested structure with reduceByKey using RDDs only
I want to create a nested structure using only RDDs. I was able to do this with the groupBy function, which performs poorly on large data, so I would like to do it with reduceByKey instead, but I cannot get the result I want. Any help would be appreciated.
Input data:
val sales = sc.parallelize(List(
("West", "Apple", 2.0, 10),
("West", "Apple", 3.0, 15),
("West", "Orange", 5.0, 15),
("South", "Orange", 3.0, 9),
("South", "Orange", 6.0, 18),
("East", "Milk", 5.0, 5)))
The required output is a list of structs per key. I can do this using groupBy as shown below:
sales.map(value => (value._1, (value._2, value._3, value._4)))
.groupBy(_._1)
.map { case(k,v) => (k, v.map(_._2)) }
.collect()
.foreach(println)
// (South,List((Orange,3.0,9), (Orange,6.0,18)))
// (East,List((Milk,5.0,5)))
// (West,List((Apple,2.0,10), (Apple,3.0,15), (Orange,5.0,15)))
But I want to achieve the same result using reduceByKey. I cannot get a List[Struct]; instead I get a List[List]. Is there a way to get a List[Struct]?
sales.map(value => (value._1, List(value._2, value._3, value._4)))
.reduceByKey((a, b) => a ++ b)
.collect()
.foreach(println)
// (South,List(Orange, 3.0, 9, Orange, 6.0, 18))
// (East,List(Milk, 5.0, 5))
// (West,List(Apple, 2.0, 10, Apple, 3.0, 15, Orange, 5.0, 15))
sales.map(value => (value._1, List(value._2, value._3, value._4)))
.reduceByKey((a, b) => List(a) ++ List(b))
.collect()
.foreach(println)
// (South,List(List(Orange, 3.0, 9), List(Orange, 6.0, 18)))
// (East,List(Milk, 5.0, 5))
// (West,List(List(List(Apple, 2.0, 10), List(Apple, 3.0, 15)), List(Orange, 5.0, 15)))
Answer (score 3):
reduceByKey requires a function of type (V, V) ⇒ V, so it cannot change the value type. See for example Can reduceBykey be used to change type and combine values - Scala Spark?. You could use aggregateByKey or combineByKey instead (a sketch of the aggregateByKey variant follows the snippet below), but that will not improve performance, because your process does not reduce the amount of data. See for example Spark groupByKey alternative. You can gain a little, though, by avoiding the temporary objects that the extra map and groupBy(_._1) passes create:
sales.map(value => (value._1, (value._2, value._3, value._4)))
.groupByKey()
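For illustration, here is a minimal sketch of the aggregateByKey variant mentioned above (my own example, not part of the original answer): aggregateByKey lets the result type (a List) differ from the input value type (a single tuple), which reduceByKey cannot do.

// zeroValue: the empty list each key starts from
// seqOp: folds one record into a partition-local list
// combOp: concatenates the lists built on different partitions
sales.map(value => (value._1, (value._2, value._3, value._4)))
.aggregateByKey(List.empty[(String, Double, Int)])(
  (acc, v) => v :: acc,
  (left, right) => left ++ right
)
.collect()
.foreach(println)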
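Finally, to answer the literal question: you can get a List of structs with reduceByKey itself by keeping the value type fixed. Wrap each record in a one-element List of tuples, so both arguments and the result of the reduce function have type List[(String, Double, Int)]. A sketch (element order within each list may vary with partitioning):

// The attempts above built List(value._2, value._3, value._4), a flat
// three-element List[Any]; wrapping the whole tuple in a one-element
// list instead keeps each record intact when the lists are concatenated.
sales.map(value => (value._1, List((value._2, value._3, value._4))))
.reduceByKey(_ ++ _)
.collect()
.foreach(println)
// expected (key order and within-list order may vary):
// (South,List((Orange,3.0,9), (Orange,6.0,18)))
// (East,List((Milk,5.0,5)))
// (West,List((Apple,2.0,10), (Apple,3.0,15), (Orange,5.0,15)))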