How can I create a Spark DataFrame with a timestamp-typed column in a single step? Below is my current two-step approach, using Spark 2.4.

First, create a DataFrame with the timestamps as strings:
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._
import org.apache.spark.sql.functions.to_timestamp
import spark.implicits._  // for the $"colName" column syntax

val eventData = Seq(
  Row(1, "2014/01/01 23:00:01"),
  Row(1, "2014/11/30 12:40:32"),
  Row(2, "2016/12/29 09:54:00"),
  Row(2, "2016/05/09 10:12:43")
)

val schema = StructType(List(
  StructField("typeId", IntegerType, false),
  StructField("eventTimeString", StringType, false)
))

val eventDF = spark.createDataFrame(
  sc.parallelize(eventData),
  schema
)
eventDF.show()
+------+-------------------+
|typeId| eventTimeString|
+------+-------------------+
| 1|2014/01/01 23:00:01|
| 1|2014/11/30 12:40:32|
| 2|2016/12/29 09:54:00|
| 2|2016/05/09 10:12:43|
+------+-------------------+
Then convert the string to a timestamp and drop the string column:

val eventTimestampsDF = eventDF
  // HH is the 0-23 hour pattern; the original k (1-24) would fail on hour "00"
  .withColumn("eventTime", to_timestamp($"eventTimeString", "yyyy/MM/dd HH:mm:ss"))
  .drop($"eventTimeString")
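
A quick way to check the result of a conversion like this is to inspect the schema. This is a sketch of what Spark's `printSchema` typically reports here (the `typeId` nullability comes from the explicit schema above; `to_timestamp` produces a nullable column):

```scala
// Verify that eventTime is now a timestamp column
eventTimestampsDF.printSchema()
// root
//  |-- typeId: integer (nullable = false)
//  |-- eventTime: timestamp (nullable = true)
```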
How can I eliminate the second step and create the timestamps directly?

Answer 0 (score: 2)

You can do it like this:
import java.sql.Timestamp
import spark.implicits._

val df = Seq(
  (1, Timestamp.valueOf("2014-01-01 23:00:01")),
  (1, Timestamp.valueOf("2014-11-30 12:40:32")),
  (2, Timestamp.valueOf("2016-12-29 09:54:00")),
  (2, Timestamp.valueOf("2016-05-09 10:12:43"))
).toDF("typeId", "eventTime")
There is no need for Row objects or a custom schema.
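
Note that `Timestamp.valueOf` expects the JDBC format `yyyy-mm-dd hh:mm:ss`, so it only works when you control the literal. If the timestamps arrive as strings in another format (e.g. read from a file), an alternative sketch is to fold the parsing into a single `select`, so the string column is never materialized in the result. The column name `raw` below is just an illustrative placeholder; this assumes the same `spark` session and implicits as in the question:

```scala
import org.apache.spark.sql.functions.to_timestamp
import spark.implicits._

// Parse the strings inline; only typeId and the parsed eventTime
// appear in the resulting DataFrame
val df = Seq(
  (1, "2014/01/01 23:00:01"),
  (1, "2014/11/30 12:40:32"),
  (2, "2016/12/29 09:54:00"),
  (2, "2016/05/09 10:12:43")
).toDF("typeId", "raw")
 .select($"typeId", to_timestamp($"raw", "yyyy/MM/dd HH:mm:ss").as("eventTime"))
```

This keeps the logic to one expression while still letting you control the parse pattern, which `Timestamp.valueOf` does not.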