Spark Hive: Filter by a column of another DataFrame

Date: 2017-03-15 13:37:50

Tags: dataframe spark-dataframe hiveql spark-hive

I have the following two DataFrames:

DataFrame "dfPromotion":
date        | store
===================
2017-01-01  | 1    
2017-01-02  | 1


DataFrame "dfOther":
date        | store
===================
2017-01-01  | 1    
2017-01-03  | 1    

Later I need to union both of these DataFrames. But before that, I have to remove from dfOther all rows whose date value is also contained in dfPromotion.

The result of this filtering step should look like this:

DataFrame "dfPromotion" (this stays always the same, must not be changed in this step!)
date        | store
===================
2017-01-01  | 1    
2017-01-02  | 1


DataFrame "dfOther" (first row is removed as dfPromotion contains the date 2017-01-01 in the "date" column)
date        | store
===================
2017-01-03  | 1 

Is there a way to do this in Java? So far I have only found the DataFrame.except method, but it compares all columns of the DataFrames. I need to filter the second DataFrame by the date column only, because other columns, which may contain different values, could be added later...

调用dfOther.filter(dfOther.col("date").isin(dfPromotion.col("date")))抛出异常:

Exception in thread "main" org.apache.spark.sql.AnalysisException: resolved attribute(s) date#64 missing from date#0,store#13 in operator !Filter date#0 IN (date#64);

2 Answers:

Answer 0 (score: 2):

You can use the except (set subtraction) function:

dfOther.select("date").except(dfPromotion.select("date")).join(dfOther,'date').show()

Answer 1 (score: 1):

Since you mentioned Spark Hive, can you try the Spark SQL approach below?

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc);
val dfpromotion = sqlContext.sql("select * from dfpromotion");

dfpromotion.show
+----------+-----+
|        dt|store|
+----------+-----+
|2017-01-01|    1|
|2017-01-02|    1|
+----------+-----+

val dfother = sqlContext.sql("select * from dfother");

dfother.show
+----------+-----+
|        dt|store|
+----------+-----+
|2017-01-01|    1|
|2017-01-03|    1|
+----------+-----+


val dfdiff = sqlContext.sql("select o.dt, o.store from dfpromotion p right         outer join dfother o on p.dt = o.dt where p.dt is null");
val dfunion = dfpromotion.union(dfdiff);


scala> dfunion.show
+----------+-----+
|        dt|store|
+----------+-----+
|2017-01-01|    1|
|2017-01-02|    1|
|2017-01-03|    1|
+----------+-----+
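
If this SQL-based approach needs to be driven from Java rather than the Scala shell, the same statements can be submitted through a SparkSession. A minimal sketch, assuming Spark 2.x with Hive support and the same dfpromotion/dfother tables (imports as in the earlier Java sketch, plus org.apache.spark.sql.SparkSession):

// Assumed setup, analogous to the HiveContext above.
SparkSession spark = SparkSession.builder().enableHiveSupport().getOrCreate();

Dataset<Row> dfPromotion = spark.sql("select * from dfpromotion");

// Rows of dfother whose dt has no match in dfpromotion (p.dt stays null after the right outer join).
Dataset<Row> dfDiff = spark.sql(
        "select o.dt, o.store from dfpromotion p "
      + "right outer join dfother o on p.dt = o.dt where p.dt is null");

// dfpromotion stays untouched; only the remaining dfother rows are appended.
Dataset<Row> dfUnion = dfPromotion.union(dfDiff);
dfUnion.show();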