How to merge 2 DataFrames in Spark (Scala)?

Asked: 2019-02-05 13:55:12

Tags: java scala apache-spark dataframe

I am new to the Spark framework and need help!

Suppose the first DataFrame (df1) stores the times when users contacted a call center.

+---------+-------------------+
|USER_NAME|       REQUEST_DATE|
+---------+-------------------+
|     Mark|2018-02-20 00:00:00|
|     Alex|2018-03-01 00:00:00|
|      Bob|2018-03-01 00:00:00|
|     Mark|2018-07-01 00:00:00|
|     Kate|2018-07-01 00:00:00|
+---------+-------------------+

The second DataFrame stores information about whether a person is a member of the organization. OUT means the user has left the organization; IN means the user has joined it. START_DATE and END_DATE mark the start and end of the corresponding process.

For example, you can see that Alex started leaving the organization at 2018-01-01 00:00:00 and that this process finished at 2018-02-01 00:00:00. Note that one user can leave the organization at different times, as Mark does.

+---------+---------------------+---------------------+--------+
|NAME     | START_DATE          | END_DATE            | STATUS |
+---------+---------------------+---------------------+--------+
|     Alex| 2018-01-01 00:00:00 | 2018-02-01 00:00:00 | OUT    |
|      Bob| 2018-02-01 00:00:00 | 2018-02-05 00:00:00 | IN     |
|     Mark| 2018-02-01 00:00:00 | 2018-03-01 00:00:00 | IN     |
|     Mark| 2018-05-01 00:00:00 | 2018-08-01 00:00:00 | OUT    |
|    Meggy| 2018-02-01 00:00:00 | 2018-02-01 00:00:00 | OUT    |
+---------+---------------------+---------------------+--------+

In the end I am trying to get a DataFrame like the one below. It must contain every record from the first DataFrame plus a column that indicates whether the person was a member of the organization at the time of the request (REQUEST_DATE).

+---------+-------------------+----------------+
|USER_NAME|       REQUEST_DATE| USER_STATUS    |
+---------+-------------------+----------------+
|     Mark|2018-02-20 00:00:00| Our user       |
|     Alex|2018-03-01 00:00:00| Not our user   |
|      Bob|2018-03-01 00:00:00| Our user       |
|     Mark|2018-07-01 00:00:00| Our user       |
|     Kate|2018-07-01 00:00:00| No Information |
+---------+-------------------+----------------+

I tried the following code, but on finalDF I get the error:

org.apache.spark.SparkException: Task not serializable

I also need the date and time in the final result. Right now lastRowByRequestId only gives me the date, without the time.

Code

val df1 = Seq(
    ("Mark", "2018-02-20 00:00:00"),
    ("Alex", "2018-03-01 00:00:00"),
    ("Bob", "2018-03-01 00:00:00"),
    ("Mark", "2018-07-01 00:00:00"),
    ("Kate", "2018-07-01 00:00:00")
).toDF("USER_NAME", "REQUEST_DATE")

df1.show()

val df2 = Seq(
    ("Alex", "2018-01-01 00:00:00", "2018-02-01 00:00:00", "OUT"),
    ("Bob", "2018-02-01 00:00:00", "2018-02-05 00:00:00", "IN"),
    ("Mark", "2018-02-01 00:00:00", "2018-03-01 00:00:00", "IN"),
    ("Mark", "2018-05-01 00:00:00", "2018-08-01 00:00:00", "OUT"),
    ("Meggy", "2018-02-01 00:00:00", "2018-02-01 00:00:00", "OUT")
).toDF("NAME", "START_DATE", "END_DATE", "STATUS")

df2.show()

import org.apache.spark.sql.Dataset
import org.apache.spark.sql.functions._

case class UserAndRequest(
                           USER_NAME:String,
                           REQUEST_DATE:java.sql.Date,
                           START_DATE:java.sql.Date,
                           END_DATE:java.sql.Date,
                           STATUS:String,
                           REQUEST_ID:Long
                         )

val joined : Dataset[UserAndRequest] = df1.withColumn("REQUEST_ID", monotonically_increasing_id).
  join(df2,$"USER_NAME" === $"NAME", "left").
  as[UserAndRequest]

val lastRowByRequestId = joined.
  groupByKey(_.REQUEST_ID).
  reduceGroups( (x,y) =>
    if (x.REQUEST_DATE.getTime > x.END_DATE.getTime && x.END_DATE.getTime > y.END_DATE.getTime) x else y
  ).map(_._2)

def logic(status: String): String = {
  if (status == "IN") "Our user"
  else if (status == "OUT") "not our user"
  else "No Information"
}

val logicUDF = udf(logic _)

val finalDF = lastRowByRequestId.withColumn("USER_STATUS",logicUDF($"REQUEST_DATE"))

1 answer:

Answer 0 (score: 5)

I checked your code and ran it. It works with a minor update: I replaced REQUEST_DATE with STATUS in the UDF call. Also note that, most of the time, Spark throws Task not serializable when a non-serializable class is referenced inside a Spark task; case classes are encoded automatically in Spark 2.x.

val finalDF = lastRowByRequestId.withColumn("USER_STATUS",logicUDF($"STATUS"))

Below is the output:

+---------+------------+----------+----------+------+----------+--------------+
|USER_NAME|REQUEST_DATE|START_DATE|  END_DATE|STATUS|REQUEST_ID|   USER_STATUS|
+---------+------------+----------+----------+------+----------+--------------+
|     Mark|  2018-02-20|2018-02-01|2018-03-01|    IN|         0|      Our user|
|     Alex|  2018-03-01|2018-01-01|2018-02-01|   OUT|         1|  not our user|
|     Mark|  2018-07-01|2018-02-01|2018-03-01|    IN|         3|      Our user|
|      Bob|  2018-03-01|2018-02-01|2018-02-05|    IN|         2|      Our user|
|     Kate|  2018-07-01|      null|      null|  null|         4|No Information|
+---------+------------+----------+----------+------+----------+--------------+
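For what it is worth, the same result can also be reached with the plain DataFrame API, without a case class, an encoder, or a UDF, which sidesteps the serialization question entirely and keeps the time component the asker needs. This is only a sketch: the join condition encodes one possible reading of the expected output (a process that has finished by the request time always counts; an IN process already counts once it has started), and the names `requests`, `membership`, and `applies` are mine, not from the original post.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").appName("merge-demo").getOrCreate()
import spark.implicits._

val df1 = Seq(
  ("Mark", "2018-02-20 00:00:00"), ("Alex", "2018-03-01 00:00:00"),
  ("Bob",  "2018-03-01 00:00:00"), ("Mark", "2018-07-01 00:00:00"),
  ("Kate", "2018-07-01 00:00:00")
).toDF("USER_NAME", "REQUEST_DATE")

val df2 = Seq(
  ("Alex",  "2018-01-01 00:00:00", "2018-02-01 00:00:00", "OUT"),
  ("Bob",   "2018-02-01 00:00:00", "2018-02-05 00:00:00", "IN"),
  ("Mark",  "2018-02-01 00:00:00", "2018-03-01 00:00:00", "IN"),
  ("Mark",  "2018-05-01 00:00:00", "2018-08-01 00:00:00", "OUT"),
  ("Meggy", "2018-02-01 00:00:00", "2018-02-01 00:00:00", "OUT")
).toDF("NAME", "START_DATE", "END_DATE", "STATUS")

// Casting to timestamp (not java.sql.Date) keeps the time-of-day component.
val requests   = df1.withColumn("REQUEST_TS", to_timestamp($"REQUEST_DATE"))
val membership = df2
  .withColumn("START_TS", to_timestamp($"START_DATE"))
  .withColumn("END_TS",   to_timestamp($"END_DATE"))

// A membership row "applies" to a request if the process had finished by the
// request time, or if it is an IN process that had already started.
val applies =
  $"END_TS" <= $"REQUEST_TS" || ($"STATUS" === "IN" && $"START_TS" <= $"REQUEST_TS")

// Left join so users with no matching row (Kate) are kept, then take the
// STATUS of the applying row with the latest END_TS per request.
val finalDF = requests
  .join(membership, $"USER_NAME" === $"NAME" && applies, "left")
  .groupBy($"USER_NAME", $"REQUEST_TS")
  .agg(max(struct($"END_TS", $"STATUS")).getField("STATUS").as("LAST_STATUS"))
  .withColumn("USER_STATUS",
    when($"LAST_STATUS" === "IN", "Our user")
      .when($"LAST_STATUS" === "OUT", "Not our user")
      .otherwise("No Information"))

finalDF.orderBy($"USER_NAME", $"REQUEST_TS").show(false)
```

Because each request row survives the left join even with no match, Kate ends up with a null LAST_STATUS and falls through to "No Information"; Mark's 2018-07-01 request ignores the OUT process still in flight (it ends 2018-08-01) and resolves to "Our user", matching the expected table above.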