Window functions not recognized in IntelliJ

Time: 2019-06-26 15:48:39

Tags: scala apache-spark

IntelliJ cannot resolve the `avg` and `over` functions; it reports "Cannot resolve symbol avg". Can someone tell me which library to import?

object DomainSpecificLanguage {
  def main(args: Array[String]): Unit = {
    System.setProperty("hadoop.home.dir", "C:/winutils")
    val spark = SparkSession.builder().appName("DomainSpecificLanguage").config("spark.master", "local").getOrCreate()
    spark.sparkContext.setLogLevel("ERROR")
    import spark.implicits._
    val empSalary = Seq(
      Salary("sales", 1, 5000),
      Salary("personnel", 2, 3900),
      Salary("sales", 3, 4800),
      Salary("sales", 4, 4800),
      Salary("personnel", 5, 3500),
      Salary("develop", 7, 4200),
      Salary("develop", 8, 6000),
      Salary("develop", 9, 4500),
      Salary("develop", 10, 5200),
      Salary("develop", 11, 5200)).toDS
    val byDepName = Window.partitionBy('depName)
    empSalary.withColumn("avg", avg('salary) over byDepName)
  }
}

2 answers:

Answer 0 (score: 1)

> Question: IntelliJ cannot resolve the `avg` and `over` functions. It says "cannot resolve symbol" for `avg` and `over`. Can someone tell me which library to import?

import org.apache.spark.sql.functions._

is what you are missing; it provides the `avg` function, and

import org.apache.spark.sql.expressions.Window 

imports `Window`.

See the complete working example below...

package com.examples

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window

object DomainSpecificLanguage {
  def main(args: Array[String]): Unit = {
    System.setProperty("hadoop.home.dir", "C:/winutils")
    val spark = SparkSession.builder().appName("DomainSpecificLanguage").config("spark.master", "local").getOrCreate()
    spark.sparkContext.setLogLevel("ERROR")
    import spark.implicits._
    val empSalary = Seq(
      Salary("sales", 1, 5000),
      Salary("personnel", 2, 3900),
      Salary("sales", 3, 4800),
      Salary("sales", 4, 4800),
      Salary("personnel", 5, 3500),
      Salary("develop", 7, 4200),
      Salary("develop", 8, 6000),
      Salary("develop", 9, 4500),
      Salary("develop", 10, 5200),
      Salary("develop", 11, 5200)).toDS()
    val byDepName = Window.partitionBy('depName)
    import org.apache.spark.sql.functions._
    empSalary.withColumn("avg", avg('salary) over byDepName).show
  }

}

case class Salary(depName: String, deptnumber: Int, salary: Int)

Result:

+---------+----------+------+-----------------+
|  depName|deptnumber|salary|              avg|
+---------+----------+------+-----------------+
|  develop|         7|  4200|           5020.0|
|  develop|         8|  6000|           5020.0|
|  develop|         9|  4500|           5020.0|
|  develop|        10|  5200|           5020.0|
|  develop|        11|  5200|           5020.0|
|    sales|         1|  5000|4866.666666666667|
|    sales|         3|  4800|4866.666666666667|
|    sales|         4|  4800|4866.666666666667|
|personnel|         2|  3900|           3700.0|
|personnel|         5|  3500|           3700.0|
+---------+----------+------+-----------------+
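As a quick sanity check on that output, the same per-department averages can be reproduced with plain Scala collections, no Spark required. This is a minimal sketch (the `AvgCheck` object and its tuple encoding of the `Salary` rows are ad-hoc, not part of the answer above):

```scala
object AvgCheck {
  // (depName, deptnumber, salary) tuples mirroring the empSalary dataset above
  val empSalary = Seq(
    ("sales", 1, 5000), ("personnel", 2, 3900), ("sales", 3, 4800),
    ("sales", 4, 4800), ("personnel", 5, 3500), ("develop", 7, 4200),
    ("develop", 8, 6000), ("develop", 9, 4500), ("develop", 10, 5200),
    ("develop", 11, 5200))

  // Group by department and average the salaries, analogous to
  // Window.partitionBy('depName) combined with avg('salary)
  val avgByDep: Map[String, Double] =
    empSalary.groupBy(_._1).map { case (dep, rows) =>
      dep -> rows.map(_._3).sum.toDouble / rows.size
    }

  def main(args: Array[String]): Unit =
    // prints develop: 5020.0, personnel: 3700.0, sales: 4866.666666666667
    avgByDep.toSeq.sortBy(_._1).foreach { case (dep, a) => println(s"$dep: $a") }
}
```

The figures match the `avg` column in the Spark output: 5020.0 for develop, 4866.666666666667 for sales, and 3700.0 for personnel. The difference is that the window function keeps every input row and attaches the partition average to each, whereas a plain `groupBy` collapses each department to a single row.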


Answer 1 (score: 0)

Follow the instructions on this site: https://jaceklaskowski.gitbooks.io/mastering-spark-sql/spark-sql-functions-windows.html

import org.apache.spark.sql.expressions.Window

The URL above also has many useful guides on window functions.