SparkSQL groupBy creating nested records

Posted: 2019-03-13 03:30:49

Tags: sql scala apache-spark apache-spark-sql

I have data of the following form (clearly fabricated, but it serves the purpose):

| User | Country |
|------|---------|
| A    | Sweden  |
| A    | Sweden  |
| A    | London  |
| B    | Spain   |
| B    | Denmark |
| B    | Brazil  |
| C    | India   |

This is available as a DataFrame in Spark. I want to use Spark (and perhaps SparkSQL) to compute a frequency map for each user:

(A => Map((Sweden, 2), (London, 1)))
(B => Map((Spain, 1), (Brazil, 1), (Denmark, 1)))
(C => Map((India, 1)))

So far, I have managed to produce:

(A => (Sweden, 2))
(A => (London, 1))
(B => (Spain, 1))
(B => (Brazil, 1))
(B => (Denmark, 1))
(C => (India, 1))

by using the following query:

SELECT user, country, COUNT(country) as frequency
FROM information
GROUP BY user, country

The problem is that this leaves me with 6 rows instead of 3, and I am not sure where to go from here.

2 Answers:

Answer 0 (score: 2):

You can use groupBy/agg twice: once to compute the per-country counts, and once more to aggregate the struct(Country, Frequency) pairs with collect_list, as shown below:

import org.apache.spark.sql.functions._
import spark.implicits._

val df = Seq(
  ("A", "Sweden"), ("A", "Sweden"), ("A", "London"),
  ("B", "Spain"), ("B", "Denmark"), ("B", "Brazil"),
  ("C", "India")
).toDF("User", "Country")

df.
  groupBy("User", "Country").agg(count("Country").as("Frequency")).
  groupBy("User").agg(collect_list(struct("Country", "Frequency")).as("Country_Counts")).
  show(false)
// +----+------------------------------------+
// |User|Country_Counts                      |
// +----+------------------------------------+
// |B   |[[Denmark,1], [Brazil,1], [Spain,1]]|
// |C   |[[India,1]]                         |
// |A   |[[London,1], [Sweden,2]]            |
// +----+------------------------------------+

Note that the first groupBy/agg transformation is equivalent to your SQL query.
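
If you would rather end up with a true map column than an array of structs, the built-in map_from_entries function can fold the collected structs into a map. A minimal sketch, assuming Spark 2.4+ (where map_from_entries was introduced) and the same imports as above:

import org.apache.spark.sql.functions._

df.
  groupBy("User", "Country").agg(count("Country").as("Frequency")).
  groupBy("User").
  // map_from_entries (Spark 2.4+) turns an array of (key, value) structs into a map,
  // with the struct's first field as the key and its second field as the value
  agg(map_from_entries(collect_list(struct("Country", "Frequency"))).as("Country_Counts")).
  show(false)
// Country_Counts is now a map<string,bigint> column, e.g. [Sweden -> 2, London -> 1] for user A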

Answer 1 (score: 0):

After your query, you need to group by user and collect a map from country to frequency. The code below should help.

// Create test data (toDF requires the SparkSession's implicits)
import spark.implicits._

val df = Seq(("A", "Sweden"), ("A", "Sweden"), ("A", "London"), ("B", "Spain"), ("B", "Denmark"), ("B", "Brazil"), ("C", "India"))
  .toDF("user", "country")

df.show(false)
+----+-------+
|user|country|
+----+-------+
|A   |Sweden |
|A   |Sweden |
|A   |London |
|B   |Spain  |
|B   |Denmark|
|B   |Brazil |
|C   |India  |
+----+-------+

// Register the DataFrame as a view so it can be queried with SQL
// (registerTempTable is deprecated since Spark 2.0)
df.createOrReplaceTempView("information")

// UDF that merges a sequence of single-entry maps into one map
val joinMap = spark.udf.register("joinMap", (values: Seq[Map[String, Long]]) => values.flatten.toMap)

val resultDF = spark.sql("""SELECT user, joinMap(collect_list(map(country, frequency))) as frequencyMap
                           |From ( SELECT user, country, COUNT(country) as frequency
                           |FROM information
                           |GROUP BY user, country ) A
                           |GROUP BY user""".stripMargin)

resultDF.show(false)
+----+------------------------------------------+
|user|frequencyMap                              |
+----+------------------------------------------+
|A   |Map(Sweden -> 2, London -> 1)             |
|B   |Map(Spain -> 1, Denmark -> 1, Brazil -> 1)|
|C   |Map(India -> 1)                           |
+----+------------------------------------------+

The UDF is needed if you want the final result as a single Map; without it, you would get a list of maps instead.
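
For the same result without a UDF, map_from_entries can also be used directly in SQL. A sketch, assuming Spark 2.4+ where the function is available:

// Same aggregation as above, but with the built-in map_from_entries (Spark 2.4+)
val resultDF2 = spark.sql("""SELECT user, map_from_entries(collect_list(struct(country, frequency))) as frequencyMap
                            |FROM ( SELECT user, country, COUNT(country) as frequency
                            |       FROM information
                            |       GROUP BY user, country ) A
                            |GROUP BY user""".stripMargin)

The resulting frequencyMap column has the same map<string,bigint> shape as the UDF version.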