Horizontally joining multiple DataFrames

Time: 2019-05-29 04:48:21

Tags: scala apache-spark hadoop apache-spark-sql

I have the following DataFrames:

val count: DataFrame = spark.sql(s"select 1, '$database_name', '$table_name', count(*) from $table_name")

Output:

1,stock,T076p,4332

val dist_count: DataFrame = spark.sql(s"select 1, count(*) from (select distinct * from $table_name) t")

Output:

4112 or 4332 (it can be the same)

val truecount: DataFrame = spark.sql(s"select 1, count(*) from $table_name where flag = true")

Output:

4330

val Falsecount: DataFrame = spark.sql(s"select 1, count(*) from $table_name where flag = false")

Output:

4332

Question: how can I join the DataFrames above so that the resulting DataFrame gives me the output below?

stock,T076p,4332,4332,4330

The commas here are column separators.

P.S. - I added a 1 to each DataFrame so that I could join them (so the 1 itself is not required in the output).
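
Roughly, the join I have in mind looks like the sketch below (the column names key/db/tbl/cnt and so on are only illustrative renames, not the names Spark derives from the raw SQL above):

// rename the columns so the dummy key (the literal 1) lines up across all four DataFrames
val c = count.toDF("key", "db", "tbl", "cnt")
val d = dist_count.toDF("key", "dist_cnt")
val t = truecount.toDF("key", "true_cnt")
val f = Falsecount.toDF("key", "false_cnt")

val combined = c
  .join(d, Seq("key"))
  .join(t, Seq("key"))
  .join(f, Seq("key"))
  .drop("key") // the 1 was only added as a join key, so it can be dropped at the end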

1 answer:

Answer 0: (score: 1)

Question: how do I join the DataFrames above to get a result DataFrame that gives me the o/p below?

stock,T076p,4332,4332,4330 - the commas here are column separators

Just check this example. I've simulated your requirement with the dummy DataFrames below.


package com.examples

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

object MultiDFJoin {
  def main(args: Array[String]) {
    import org.apache.spark.sql.functions._
    Logger.getLogger("org").setLevel(Level.OFF)

    val spark = SparkSession.builder
      .master("local")
      .appName(this.getClass.getName)
      .getOrCreate()
    import spark.implicits._
    val columns = Array("column1", "column2", "column3", "column4")
    val df1 = Seq((1, "stock", "T076p", 4332)).toDF(columns: _*).as("first")
    df1.show()
    val df2 = Seq((1, 4332)).toDF(columns.slice(0, 2): _*).as("second")
    df2.show()
    val df3 = Seq((1, 4330)).toDF(columns.slice(0, 2): _*).as("third")
    df3.show()
    val df4 = Seq((1, 4332)).toDF(columns.slice(0, 2): _*).as("four")
    df4.show()
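    // join all four DataFrames on the dummy key (column1), keeping only the columns wanted in the final output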
    val finalcsv = df1.join(df2, col("first.column1") === col("second.column1")).selectExpr("first.*", "second.column2")
      .join(df3, Seq("column1")).selectExpr("first.*", "third.column2")
      .join(df4, Seq("column1"))
      .selectExpr("first.*", "third.column2", "four.column2")
      .drop("column1").collect.mkString(",") // this column used for just joining hence dropping
    print(finalcsv)
  }
}

Result:

+-------+-------+-------+-------+
|column1|column2|column3|column4|
+-------+-------+-------+-------+
|      1|  stock|  T076p|   4332|
+-------+-------+-------+-------+

+-------+-------+
|column1|column2|
+-------+-------+
|      1|   4332|
+-------+-------+

+-------+-------+
|column1|column2|
+-------+-------+
|      1|   4330|
+-------+-------+

+-------+-------+
|column1|column2|
+-------+-------+
|      1|   4332|
+-------+-------+

[stock,T076p,4332,4330,4332]
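
As a side note on the design choice: since every DataFrame here holds exactly one row, a crossJoin would line them up without the dummy column1 key at all. A sketch against the same df1..df4 (not part of the answer above):

val crossJoined = df1.drop("column1")
  .crossJoin(df2.drop("column1").withColumnRenamed("column2", "distinct_cnt"))
  .crossJoin(df3.drop("column1").withColumnRenamed("column2", "true_cnt"))
  .crossJoin(df4.drop("column1").withColumnRenamed("column2", "false_cnt"))
crossJoined.show()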