I have a very large Spark SQL statement that I am trying to break into smaller chunks for better code readability. I do not want to join them, just combine the results.
The currently working SQL statement -
val dfs = x.map(field => spark.sql(s"""
  select 'test' as Table_Name,
  '$field' as Column_Name,
  min($field) as Min_Value,
  max($field) as Max_Value,
  approx_count_distinct($field) as Unique_Value_Count,
  (
    SELECT 100 * approx_count_distinct($field)/count(1)
    from tempdftable
  ) as perc
  from tempdftable
"""))
I am trying to pull the following query out of the SQL above
(SELECT 100 * approx_count_distinct($field)/count(1) from tempdftable) as perc
with this logic -
val Perce = x.map(field => spark.sql(s"(SELECT 100 * approx_count_distinct($field)/count(1) from parquetDFTable)"))
and then merge this val Perce back into the first big SQL statement as shown below, but it does not work -
val dfs = x.map(field => spark.sql(s"""
  select 'test' as Table_Name,
  '$field' as Column_Name,
  min($field) as Min_Value,
  max($field) as Max_Value,
  approx_count_distinct($field) as Unique_Value_Count,
  '"+Perce+ "'
  from tempdftable
"""))
How do we write this?
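(For reference: Perce above is a collection of DataFrames, so interpolating it into the SQL string only embeds its textual representation, which is presumably why the combined statement fails. A minimal sketch of how the query could instead be split while staying in SQL, by keeping the percentage subquery as a plain string fragment and interpolating it into the larger statement; this assumes, as in the question, that x holds the column names and tempdftable is a registered temp view:)

// Sketch: build the subquery as a SQL string fragment, not a DataFrame,
// so it can be interpolated back into the larger statement.
val dfs = x.map { field =>
  val percSql = s"(SELECT 100 * approx_count_distinct($field) / count(1) FROM tempdftable)"
  spark.sql(
    s"""SELECT 'test' AS Table_Name,
       |       '$field' AS Column_Name,
       |       min($field) AS Min_Value,
       |       max($field) AS Max_Value,
       |       approx_count_distinct($field) AS Unique_Value_Count,
       |       $percSql AS perc
       |FROM tempdftable""".stripMargin)
}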
Answer (score: 2)
Why not go all the way and convert the whole expression into Spark DataFrame code?
import spark.implicits._
import org.apache.spark.sql.functions._

val fraction = udf((approxCount: Double, totalCount: Double) => 100 * approxCount / totalCount)

val fields = Seq("colA", "colB", "colC")

val dfs = fields.map(field => {
  tempdftable
    .select(
      min(field) as "Min_Value",
      max(field) as "Max_Value",
      approx_count_distinct(field) as "Unique_Value_Count",
      count(field) as "Total_Count")
    .withColumn("Table_Name", lit("test"))
    .withColumn("Column_Name", lit(field))
    .withColumn("Perc", fraction('Unique_Value_Count, 'Total_Count))
    .select('Table_Name, 'Column_Name, 'Min_Value, 'Max_Value, 'Unique_Value_Count, 'Perc)
})

val df = dfs.reduce(_ union _)
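(As a side note, the UDF is not strictly required here; the same percentage can be computed with built-in column arithmetic, which Spark can optimize natively. A minimal sketch of that variant, keeping the same fields/union pattern and changing only the Perc column; dfsNoUdf/dfNoUdf are illustrative names:)

val dfsNoUdf = fields.map { field =>
  tempdftable
    .select(
      min(field) as "Min_Value",
      max(field) as "Max_Value",
      approx_count_distinct(field) as "Unique_Value_Count",
      count(field) as "Total_Count")
    .withColumn("Table_Name", lit("test"))
    .withColumn("Column_Name", lit(field))
    // Plain column arithmetic instead of a UDF; Spark's `/` on columns yields a double.
    .withColumn("Perc", lit(100) * col("Unique_Value_Count") / col("Total_Count"))
    .select("Table_Name", "Column_Name", "Min_Value", "Max_Value", "Unique_Value_Count", "Perc")
}
val dfNoUdf = dfsNoUdf.reduce(_ union _)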
On a test example like this:
val tempdftable = spark.sparkContext.parallelize(List((3.0, 7.0, 2.0), (1.0, 4.0, 10.0), (3.0, 7.0, 2.0), (5.0, 0.0, 2.0))).toDF("colA", "colB", "colC")
tempdftable.show
+----+----+----+
|colA|colB|colC|
+----+----+----+
| 3.0| 7.0| 2.0|
| 1.0| 4.0|10.0|
| 3.0| 7.0| 2.0|
| 5.0| 0.0| 2.0|
+----+----+----+
we get
df.show
+----------+-----------+---------+---------+------------------+----+
|Table_Name|Column_Name|Min_Value|Max_Value|Unique_Value_Count|Perc|
+----------+-----------+---------+---------+------------------+----+
| test| colA| 1.0| 5.0| 3|75.0|
| test| colB| 0.0| 7.0| 3|75.0|
| test| colC| 2.0| 10.0| 2|50.0|
+----------+-----------+---------+---------+------------------+----+