Convert a string variable in nested JSON to datetime using Spark Scala

Asked: 2017-03-14 21:33:13

Tags: json scala apache-spark

I have a nested JSON DataFrame in Spark that looks like this:

root
 |-- data: struct (nullable = true)
 |    |-- average: long (nullable = true)
 |    |-- sum: long (nullable = true)
 |    |-- time: string (nullable = true)
 |-- password: string (nullable = true)
 |-- url: string (nullable = true)
 |-- username: string (nullable = true)

I need to convert the time variable under the data struct to a timestamp data type. Below is the code I tried, but it does not give me the result I want.

val jsonStr = """{
"url": "imap.yahoo.com",
"username": "myusername",
"password": "mypassword",
"data": {
"time":"2017-1-29 0-54-32",
"average": 234,
"sum": 123}}"""


val json: JsValue = Json.parse(jsonStr)

import sqlContext.implicits._
val rdd = sc.parallelize(jsonStr :: Nil)
var df = sqlContext.read.json(rdd)
df.printSchema()
val dfRes = df.withColumn("data",
  makeTimeStamp(unix_timestamp(df("data.time"), "yyyy-MM-dd hh-mm-ss").cast("timestamp")))
dfRes.printSchema()

case class Convert(time: java.sql.Timestamp)
val makeTimeStamp = udf((time: java.sql.Timestamp) => Convert(time))

The result of my code:

root
 |-- data: struct (nullable = true)
 |    |-- time: timestamp (nullable = true)
 |-- password: string (nullable = true)
 |-- url: string (nullable = true)
 |-- username: string (nullable = true)

My code actually drops the other elements inside the data struct (average and sum) instead of just converting the time string to a timestamp data type. For basic data-manipulation operations on a JSON DataFrame, do we need to write a UDF for every piece of functionality we need, or is there a library available for JSON data manipulation? I am currently using the Play framework to work with JSON objects in Spark. Thanks in advance.

2 Answers:

Answer 0 (score: 0)

You can try this:

val jsonStr = """{
"url": "imap.yahoo.com",
"username": "myusername",
"password": "mypassword",
"data": {
"time":"2017-1-29 0-54-32",
"average": 234,
"sum": 123}}"""


val json: JsValue = Json.parse(jsonStr)

import sqlContext.implicits._
val rdd = sc.parallelize(jsonStr :: Nil)
var df = sqlContext.read.json(rdd)
df.printSchema()

// Define the case class and UDF before using them.
case class Convert(time: java.sql.Timestamp, average: Long, sum: Long)
val makeTimeStamp = udf((time: java.sql.Timestamp, average: Long, sum: Long) =>
  Convert(time, average, sum))

val dfRes = df.withColumn("data", makeTimeStamp(
  unix_timestamp(df("data.time"), "yyyy-MM-dd hh-mm-ss").cast("timestamp"),
  df("data.average"),
  df("data.sum")))

This gives the result:

root
 |-- url: string (nullable = true)
 |-- username: string (nullable = true)
 |-- password: string (nullable = true)
 |-- data: struct (nullable = true)
 |    |-- time: timestamp (nullable = true)
 |    |-- average: long (nullable = false)
 |    |-- sum: long (nullable = false)

The only changes are the Convert case class and the makeTimeStamp UDF.
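One detail worth double-checking in both snippets is the format string passed to unix_timestamp. The sample value "2017-1-29 0-54-32" has hour 0, and the 12-hour pattern letter hh covers 1-12, so the 24-hour letter HH is the safer choice. A standalone sketch of the pattern using plain java.text.SimpleDateFormat (which shares this pattern syntax):

```scala
import java.text.SimpleDateFormat

// "HH" is the 24-hour-clock field, so hour 0 parses cleanly;
// lenient parsing also accepts the single-digit month.
val fmt = new SimpleDateFormat("yyyy-MM-dd HH-mm-ss")
val ts = new java.sql.Timestamp(fmt.parse("2017-1-29 0-54-32").getTime)
println(ts)  // 2017-01-29 00:54:32.0
```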

Answer 1 (score: 0)

Assuming you can specify the Spark schema up front, the automatic string-to-timestamp type coercion should handle the conversion.

import org.apache.spark.sql.types._

val dschema = (new StructType)
  .add("url", StringType)
  .add("username", StringType)
  .add("data", (new StructType)
    .add("sum", LongType)
    .add("time", TimestampType))
val df = spark.read.schema(dschema).json("/your/json/on/hdfs")
df.printSchema
df.show
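One caveat with the schema-based approach, stated here as an assumption about the sample data: "2017-1-29 0-54-32" does not match the default colon-separated timestamp layout (yyyy-MM-dd HH:mm:ss), so the coerced time field may come back null; newer Spark versions let the JSON reader take a custom pattern via the timestampFormat option. The mismatch itself can be checked with plain SimpleDateFormat:

```scala
import java.text.{ParseException, SimpleDateFormat}

// Strict parsing with the default colon-separated pattern
// rejects the dash-separated sample value.
val default = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
default.setLenient(false)
val parsed =
  try Some(default.parse("2017-1-29 0-54-32"))
  catch { case _: ParseException => None }
println(parsed)  // None
```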

This article outlines some techniques for dealing with bad data; it is worth a read for this use case.