Spark Structured Streaming on Databricks: Event Hubs schema definition problem

Asked: 2019-06-26 20:50:38

Tags: scala apache-spark databricks spark-structured-streaming

I am having trouble defining the schema for a JSON document. (screenshot of the document omitted)

Right now I am trying to apply that same schema on the stream read.

val jsonSchema = StructType([ StructField("associatedEntities", struct<driver:StringType,truck:StringType>, True), 
                          StructField("heading", StringType, True), 
                          StructField("location", struct<accuracyType:StringType,captureDateTime:StringType,cityStateCode:StringType,description:StringType,latitude:DoubleType,longitude:DoubleType,quality:StringType,transmitDateTime:StringType>, True), 
                          StructField("measurements", array<struct<type:StringType,uom:StringType,value:StringType>>, True), 
                          StructField("source", struct<entityType:StringType,key:StringType,vendor:StringType>, True), 
                          StructField("speed", DoubleType, True)])

val df = spark
 .readStream
 .format("eventhubs")
 //.schema(jsonSchema) 
 .options(ehConf.toMap)
 .load()

When I run this cell in the notebook I get: `:15: error: illegal start of simple expression val jsonSchema = StructType([StructField("associatedEntities", struct, True),`

Edit: The goal is to get the data into a dataframe. I can get the JSON string out of the body of the Event Hubs message, but if I can't get the schema to work I'm not sure what to do from there.

1 Answer:

Answer 0 (score: 0)

You are getting the error message because of the schema definition: it mixes Python list syntax and SQL type strings into Scala code. The schema definition should look like this:

import org.apache.spark.sql.types._

val jsonSchema = StructType(
  Seq(
    StructField("associatedEntities",
      StructType(Seq(
        StructField("driver", StringType),
        StructField("truck", StringType)
      ))),
    StructField("heading", StringType),
    StructField("measurements", ArrayType(StructType(Seq(
      StructField("type", StringType),
      StructField("uom", StringType),
      StructField("value", StringType)
    ))))
  )
)

You can double-check the schema with:

jsonSchema.printTreeString

which returns the schema:

root
 |-- associatedEntities: struct (nullable = true)
 |    |-- driver: string (nullable = true)
 |    |-- truck: string (nullable = true)
 |-- heading: string (nullable = true)
 |-- measurements: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- type: string (nullable = true)
 |    |    |-- uom: string (nullable = true)
 |    |    |-- value: string (nullable = true)
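The snippet above covers only part of the document; the question's attempt also listed `location`, `source`, and `speed` fields. A sketch of the full schema built the same way (field names and types copied from the question's attempt, so treat them as assumptions about the actual payload):

```scala
import org.apache.spark.sql.types._

// Full schema including the fields from the question's attempt:
// location, source, and speed, defined with the same StructType/Seq pattern.
val fullSchema = StructType(Seq(
  StructField("associatedEntities", StructType(Seq(
    StructField("driver", StringType),
    StructField("truck", StringType)))),
  StructField("heading", StringType),
  StructField("location", StructType(Seq(
    StructField("accuracyType", StringType),
    StructField("captureDateTime", StringType),
    StructField("cityStateCode", StringType),
    StructField("description", StringType),
    StructField("latitude", DoubleType),
    StructField("longitude", DoubleType),
    StructField("quality", StringType),
    StructField("transmitDateTime", StringType)))),
  StructField("measurements", ArrayType(StructType(Seq(
    StructField("type", StringType),
    StructField("uom", StringType),
    StructField("value", StringType))))),
  StructField("source", StructType(Seq(
    StructField("entityType", StringType),
    StructField("key", StringType),
    StructField("vendor", StringType)))),
  StructField("speed", DoubleType)))

fullSchema.printTreeString
```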

As mentioned in the comments, you get binary data. First you get the raw dataframe:

val rawData = spark.readStream
  .format("eventhubs")
  .option(...)
  .load()

You have to:

  • cast the data to a string
  • parse the nested JSON
  • and flatten it

Then define a dataframe with the parsed data:

import org.apache.spark.sql.functions.from_json

val parsedData = rawData
   .selectExpr("cast (Body as string) as json")
   .select(from_json($"json", jsonSchema).as("data"))
   .select("data.*")
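The same cast-parse-flatten chain can be tried out without an Event Hubs connection by running it on a batch dataframe with a hardcoded JSON payload. A minimal sketch with a local SparkSession; the schema subset, function name, and sample payload are illustrative, not part of the original answer:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types._

// Illustrative subset of the question's schema: two scalar fields
val demoSchema = StructType(Seq(
  StructField("heading", StringType),
  StructField("speed", DoubleType)))

// Same select chain as the streaming code, as a reusable function
def parseJsonColumn(raw: DataFrame): DataFrame =
  raw.select(from_json(col("json"), demoSchema).as("data"))
    .select("data.*") // flatten the top-level struct into columns

val spark = SparkSession.builder()
  .master("local[1]")
  .appName("parse-demo")
  .getOrCreate()
import spark.implicits._

// Stand-in for `cast(Body as string)`: an already-decoded JSON string
val raw = Seq("""{"heading":"N","speed":55.0}""").toDF("json")
parseJsonColumn(raw).show()
```

Once the batch version produces the expected columns, the identical `select` chain can be applied to the streaming `rawData` dataframe.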