Convert a map RDD to a DataFrame

Date: 2018-01-10 06:04:03

Tags: scala apache-spark apache-spark-sql

I'm using Spark 1.6.0. I have an RDD of map (key, value) pairs and want to convert it to a DataFrame.

Input RDD:

((1, A, ABC), List(pz,A1))
((2, B, PQR), List(az,B1))
((3, C, MNR), List(cs,c1))

Expected output:

+----+----+-----+----+----+
| c1 | c2 | c3  | c4 | c5 |
+----+----+-----+----+----+
| 1  | A  | ABC | pz | A1 |
+----+----+-----+----+----+
| 2  | B  | PQR | az | B1 |
+----+----+-----+----+----+
| 3  | C  | MNR | cs | C1 |
+----+----+-----+----+----+

Could someone help me with this?

2 Answers:

Answer 0 (score: 1)

  val a = Seq(((1, "A", "ABC"), List("pz", "A1")),
              ((2, "B", "PQR"), List("az", "B1")),
              ((3, "C", "MNR"), List("cs", "c1")))
  val a1 = sc.parallelize(a)
  val a2 = a1.map(rec => (rec._1._1, rec._1._2, rec._1._3, rec._2(0), rec._2(1))).toDF()
  a2.show()
    +---+---+---+---+---+
    | _1| _2| _3| _4| _5|
    +---+---+---+---+---+
    |  1|  A|ABC| pz| A1|
    |  2|  B|PQR| az| B1|
    |  3|  C|MNR| cs| c1|
    +---+---+---+---+---+
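The `map` above only reshapes tuples, so the flattening logic can be checked in plain Scala without a SparkContext; here is a minimal sketch (the variable names `input` and `flattened` are mine, not from the answer). To get the `c1`..`c5` headers from the question instead of the default `_1`..`_5`, names can be passed to `toDF`, e.g. `toDF("c1", "c2", "c3", "c4", "c5")`.

```scala
// Plain-Scala version of the reshaping done inside map() above:
// ((Int, String, String), List[String]) -> (Int, String, String, String, String)
val input = Seq(
  ((1, "A", "ABC"), List("pz", "A1")),
  ((2, "B", "PQR"), List("az", "B1")),
  ((3, "C", "MNR"), List("cs", "c1"))
)

// Pattern matching avoids the hard-to-read rec._1._1 style accessors.
val flattened = input.map { case ((k1, k2, k3), vs) =>
  (k1, k2, k3, vs(0), vs(1))
}
```

The same pattern-match can be used directly inside the RDD's `map` before calling `toDF()`.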

Answer 1 (score: 1)

I'd suggest you use Datasets; they are optimized, type-safe DataFrames.

First, you need to create a case class:

case class table(c1: Int, c2: String, c3: String, c4:String, c5:String)

Then you only need a map function to parse the data into the case class and call .toDS:

rdd.map(x => table(x._1._1, x._1._2, x._1._3, x._2(0), x._2(1))).toDS().show()

You should get the following output:

+---+---+---+---+---+
| c1| c2| c3| c4| c5|
+---+---+---+---+---+
|  1|  A|ABC| pz| A1|
|  2|  B|PQR| az| B1|
|  3|  C|MNR| cs| c1|
+---+---+---+---+---+

You could also work with a DataFrame instead, simply by calling .toDF() in place of .toDS().
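One caveat with both answers: `x._2(0)` and `x._2(1)` throw an `IndexOutOfBoundsException` if any value list has fewer than two elements. A defensive variant, as a sketch (the `parse` helper, the capitalized `Table` name, and the empty-string default are my own choices, not from the answers), can use `List#lift`:

```scala
case class Table(c1: Int, c2: String, c3: String, c4: String, c5: String)

// lift returns an Option, so a short list yields a default value
// instead of throwing IndexOutOfBoundsException.
def parse(rec: ((Int, String, String), List[String])): Table = {
  val ((a, b, c), vs) = rec
  Table(a, b, c, vs.lift(0).getOrElse(""), vs.lift(1).getOrElse(""))
}
```

`rdd.map(parse).toDS()` would then tolerate ragged input lists rather than failing at runtime.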