How do I preserve types when converting a Scala Spark DataFrame to an RDD?

Asked: 2017-04-26 19:30:10

Tags: scala apache-spark

I am trying to convert a DataFrame to an RDD. My DataFrame has typed columns, as shown below:

df.printSchema
root
 |-- _c0: integer (nullable = true)
 |-- num_hits: integer (nullable = true)
 |-- session_name: string (nullable = true)
 |-- user_id: string (nullable = true)

When I convert it to an RDD using df.rdd, I get back an RDD of type Array[org.apache.spark.sql.Row], but when I access each entry, e.g. with rdd(0)(0) or rdd(0)(1), all I get back is type Any. How can I keep the same typing from the DataFrame when I convert it to an RDD? In other words: how can I get the columns in my RDD to have the types Int, Int, String, String so that they match the DataFrame?
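For reference, this is roughly what the untyped access looks like (a minimal sketch; the variable names are illustrative):

val rowRdd: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row] = df.rdd

// Row.apply(i) is declared to return Any, so the static column type is lost:
val first: Any = rowRdd.first()(0)

// Recovering a type requires a manual cast or the typed getters on Row:
val c0: Int = rowRdd.first().getInt(0)
val name: String = rowRdd.first().getString(2)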

1 Answer:

Answer 0 (score: 3)

You can simply convert the DataFrame to a Dataset[(Int, Int, String, String)], for example:

scala> val df = Seq((1, 2, "a", "b")).toDF("_c0", "num_hits", "session_name", "user_id")
df: org.apache.spark.sql.DataFrame = [_c0: int, num_hits: int ... 2 more fields]

scala> df.printSchema
root
 |-- _c0: integer (nullable = false)
 |-- num_hits: integer (nullable = false)
 |-- session_name: string (nullable = true)
 |-- user_id: string (nullable = true)


scala> val rdd = df.as[(Int, Int, String, String)].rdd
rdd: org.apache.spark.rdd.RDD[(Int, Int, String, String)] = MapPartitionsRDD[3] at rdd at <console>:25
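Once the RDD is typed, the tuple fields can be used directly without casting. A small illustrative sketch (the aggregation itself is just an example, not part of the question):

// Each element is a (Int, Int, String, String), so pattern matching
// recovers the column types statically:
val hitsPerUser = rdd
  .map { case (_, numHits, _, userId) => (userId, numHits) }
  .reduceByKey(_ + _)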

If _c0 or num_hits can be null, just change Int to java.lang.Integer.
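A sketch of that nullable variant, assuming only the two integer columns can be null (and that spark.implicits._ is in scope, as it is in the shell):

// java.lang.Integer can hold null, unlike the primitive-backed Scala Int:
val nullableRdd: org.apache.spark.rdd.RDD[(java.lang.Integer, java.lang.Integer, String, String)] =
  df.as[(java.lang.Integer, java.lang.Integer, String, String)].rdd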