Apache Spark - Scala - How to turn (k,{v1,v2,v3,...}) into ((k,v1),(k,v2),(k,v3),...) with flatMap

Date: 2016-07-19 15:12:47

Tags: scala apache-spark rdd flatmap

I have:

val vector: RDD[(String, Array[String])] = [("a", {v1,v2,..}),("b", {u1,u2,..})]

and want to convert it to:

RDD[(String, String)] = [("a",v1), ("a",v2), ..., ("b",u1), ("b",u2), ...]

Any ideas how to do this using flatMap?

3 Answers:

Answer 0 (score: 4)

This:

vector.flatMap { case (x, arr) => arr.map((x, _)) }

will give you:

scala> val vector = sc.parallelize(Vector(("a", Array("b", "c")), ("b", Array("d", "f"))))
vector: org.apache.spark.rdd.RDD[(String, Array[String])] =
               ParallelCollectionRDD[3] at parallelize at <console>:27


scala> vector.flatMap { case (x, arr) => arr.map((x, _)) }.collect
res4: Array[(String, String)] = Array((a,b), (a,c), (b,d), (b,f))
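If you just want to try the pattern without a Spark cluster, the same `flatMap`/`map` combination works on plain Scala collections (a minimal sketch; the values here are made up to mirror the example above):

```scala
// The same key-fanout pattern on an ordinary Scala collection, no Spark needed.
object FlatMapDemo extends App {
  val vector = Vector(("a", Array("b", "c")), ("b", Array("d", "f")))

  // For each (key, values) pair, emit one (key, value) tuple per value.
  val pairs: Vector[(String, String)] =
    vector.flatMap { case (x, arr) => arr.map((x, _)) }

  println(pairs) // Vector((a,b), (a,c), (b,d), (b,f))
}
```

Since `RDD.flatMap` mirrors the collections API here, the exact same function literal works in both contexts.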

Answer 1 (score: 2)

You definitely need to use flatMap as you mentioned, but in addition you also need to use a Scala map inside it.

For example:

val idToVectorValue: RDD[(String, String)] = vector.flatMap { case (id, values) => values.map(value => (id, value)) }

Answer 2 (score: 0)

Using a single-parameter function:

vector.flatMap(data => data._2.map((data._1, _)))
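This accessor-based style avoids pattern matching by reading the tuple fields directly (`_1` is the key, `_2` the value array). A small sketch on plain collections, with hypothetical sample data, to show it produces the same result as the `case` version:

```scala
// Tuple-accessor version of the same flatten: data._1 is the key, data._2 the values.
object AccessorDemo extends App {
  val data = Vector(("a", Array("v1", "v2")), ("b", Array("u1")))

  val flattened: Vector[(String, String)] =
    data.flatMap(pair => pair._2.map((pair._1, _)))

  println(flattened) // Vector((a,v1), (a,v2), (b,u1))
}
```

The `case (x, arr)` form in the accepted answer is generally considered more readable, but both compile to the same transformation.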