I need to update the row-number column of a DataFrame when delta data arrives. I have already implemented row numbers for the base load, as shown below:
Input data:
import spark.implicits._   // required for toDF

val base = List(List("001", "a", "abc"), List("001", "a", "123"), List("003", "c", "456"), List("002", "b", "dfr"), List("003", "c", "ytr"))
  .map(row => (row(0), row(1), row(2)))
val DS1 = base.toDF("KEY1", "KEY2", "VAL")
DS1.show()
+----+----+---+
|KEY1|KEY2|VAL|
+----+----+---+
| 001| a|abc|
| 001| a|123|
| 003| c|456|
| 002| b|dfr|
| 003| c|ytr|
+----+----+---+
Now I have added row numbers using a window function, as shown below:
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.Window

val baseDF = DS1.select(
  col("KEY1"), col("KEY2"), col("VAL"),
  row_number().over(
    Window.partitionBy(col("KEY1"), col("KEY2")).orderBy(col("KEY1"), col("KEY2").asc)
  ).alias("Row_Num")
)
baseDF.show()
+----+----+---+-------+
|KEY1|KEY2|VAL|Row_Num|
+----+----+---+-------+
|001 |a |abc|1 |
|001 |a |123|2 |
|002 |b |dfr|1 |
|003 |c |456|1 |
|003 |c |ytr|2 |
+----+----+---+-------+
Now the delta load is as follows:
val delta = List(List("001", "a", "y45") ,List("002", "b", "444"))
.map(row => (row(0), row(1), row(2)))
val DS2 = delta.toDF("KEY1", "KEY2" ,"VAL")
DS2.show()
+----+----+---+
|KEY1|KEY2|VAL|
+----+----+---+
| 001| a|y45|
| 002| b|444|
+----+----+---+
So the expected updated result should be:
baseDF.show()
+----+----+---+-------+
|KEY1|KEY2|VAL|Row_Num|
+----+----+---+-------+
|001 |a |abc|1 |
|001 |a |123|2 |
| 001| a|y45|3 | -----> Delta record
|002 |b |dfr|1 |
| 002| b|444|2 | -----> Delta record
|003 |c |456|1 |
|003 |c |ytr|2 |
+----+----+---+-------+
Any suggestions on how to implement this using DataFrames/Datasets? Could we use Spark RDD's zipWithIndex?
Answer 0 (score: 6)
One way to append the delta records with updated row numbers is to: 1) add a Row_Num column with a large value to DS2, 2) union it with baseDF, and 3) recompute the row numbers, as shown below:
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.Window
val combinedDF = baseDF.union(
  DS2.withColumn("Row_Num", lit(Long.MaxValue))
)

val resultDF = combinedDF.select(
  col("KEY1"), col("KEY2"), col("VAL"),
  row_number().over(
    Window.partitionBy(col("KEY1"), col("KEY2")).orderBy(col("Row_Num"))
  ).alias("New_Row_Num")
)
resultDF.show
+----+----+---+-----------+
|KEY1|KEY2|VAL|New_Row_Num|
+----+----+---+-----------+
| 003| c|456| 1|
| 003| c|ytr| 2|
| 002| b|dfr| 1|
| 002| b|444| 2|
| 001| a|abc| 1|
| 001| a|123| 2|
| 001| a|y45| 3|
+----+----+---+-----------+
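As for the zipWithIndex question: RDD.zipWithIndex assigns a single global index across the whole RDD, not a per-(KEY1, KEY2) number, so it is not a direct replacement for the window function here. For intuition, though, the same "sentinel value, then renumber per group" idea can be sketched with plain Scala collections (no Spark; the `Rec` class and all names below are illustrative, not from the original post):

```scala
// Sketch of the answer's logic with Scala collections:
// delta rows carry a sentinel row number larger than any existing one,
// then each (key1, key2) group is sorted and renumbered from 1.
case class Rec(key1: String, key2: String, value: String, rowNum: Long)

val base = List(
  Rec("001", "a", "abc", 1), Rec("001", "a", "123", 2),
  Rec("002", "b", "dfr", 1),
  Rec("003", "c", "456", 1), Rec("003", "c", "ytr", 2)
)
// Long.MaxValue plays the role of lit(Long.MaxValue) in the Spark version
val delta = List(
  Rec("001", "a", "y45", Long.MaxValue),
  Rec("002", "b", "444", Long.MaxValue)
)

val renumbered = (base ++ delta)
  .groupBy(r => (r.key1, r.key2))          // partitionBy(KEY1, KEY2)
  .toList
  .flatMap { case (_, rows) =>
    rows.sortBy(_.rowNum)                  // orderBy(Row_Num): delta sorts last
      .zipWithIndex
      .map { case (r, i) => r.copy(rowNum = i + 1) }  // row_number()
  }
  .sortBy(r => (r.key1, r.key2, r.rowNum))

renumbered.foreach(println)
```

Here the delta record ("001", "a", "y45") ends up with rowNum 3, matching the expected output above; the Spark window function does the same per-partition renumbering in a distributed way.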