Cannot find the "window" function in Spark Structured Streaming

Asked: 2018-03-16 19:18:44

Tags: spark-streaming spark-structured-streaming

I wrote a small example in Spark Structured Streaming that tries to process the output of the netstat command, but I cannot figure out how to call the window function.

These are the relevant lines of my build.sbt:

scalaVersion := "2.11.4"
scalacOptions += "-target:jvm-1.8"

libraryDependencies ++= {

  val sparkVer = "2.3.0"
  Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVer % "provided",
    "org.apache.spark" %% "spark-streaming-kafka-0-8" % sparkVer % "provided",
    "org.apache.spark" %% "spark-core" % sparkVer % "provided" withSources(),
    "org.apache.spark" %% "spark-hive" % sparkVer % "provided"
  )
}

The code:

case class NetEntry(val timeStamp: java.sql.Timestamp, val sourceHost: String, val targetHost: String, val status: String)

def convertToNetEntry(x: String): NetEntry = {
    // tcp        0      0 eselivpi14:icl-twobase1 eselivpi149.int.e:48442 TIME_WAIT
   val array = x.replaceAll("\\s+"," ").split(" ").slice(3,6)
   NetEntry(java.sql.Timestamp.valueOf(LocalDateTime.now()), array(0),array(1),array(2))
}

def main(args: Array[String]) {

    // Initialize spark context
    val spark: SparkSession = SparkSession.builder.appName("StructuredNetworkWordCount").getOrCreate()
    spark.sparkContext.setLogLevel("ERROR")

    val lines = spark.readStream
    .format("socket")
    .option("host", args(0))
    .option("port", args(1).toInt)
    .load()

    import spark.implicits._
    val df = lines.as[String].map(x => convertToNetEntry(x))

    val wordsArr: Dataset[NetEntry] = df.as[NetEntry]
    wordsArr.printSchema()

    // Never get past this point
    val windowColumn = window($"timestamp", "10 minutes", "5 minutes")

    val windowedCounts = wordsArr.groupBy( windowColumn, $"targetHost").count()

    val query = windowedCounts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()
}

I tried Spark 2.1, 2.2 and 2.3 with the same result. The really strange thing is that I have a Spark cluster, and when I log into the Spark shell and paste in all of these lines... it works! Any idea what I am doing wrong?

The error at compile time:

[error] C:\code_legacy\edos-dp-mediation-spark-consumer\src\main\scala\com\ericsson\streaming\structured\StructuredStreamingMain.scala:39: not found: value window
[error]     val windowColumn = window($"timestamp", "10 minutes", "5 minutes")
[error]                        ^
[warn] 5 warnings found
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 19 s, completed 16-mar-2018 20:13:40

Update: To make things even stranger, I checked the API docs and could not find a valid reference there either: https://spark.apache.org/docs/2.3.0/api/scala/index.html#org.apache.spark.sql.SparkSession$$implicits$

1 Answer:

Answer 0 (score: 2)

You need to import the window function for your code to compile; in spark-shell it is already in scope.

Add this import statement:

import org.apache.spark.sql.functions.window
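For reference, a minimal sketch of how the import fits into the question's program (object and app names here are illustrative, not from the original). spark-shell effectively runs `import org.apache.spark.sql.functions._` for you, which is why the same lines compile there; a standalone build must add the import itself:

```scala
import org.apache.spark.sql.SparkSession
// window (like the other SQL functions) lives in this object;
// spark-shell imports it automatically, compiled code must not forget it
import org.apache.spark.sql.functions.window

object WindowImportSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("WindowImportSketch").getOrCreate()
    import spark.implicits._ // enables the $"col" column syntax

    // ... build the wordsArr Dataset[NetEntry] as in the question ...

    // with the import in scope, this line now compiles:
    // val windowColumn = window($"timeStamp", "10 minutes", "5 minutes")
  }
}
```

Alternatively, `import org.apache.spark.sql.functions._` brings all SQL functions into scope at once, at the cost of a wider wildcard import.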