How to convert a string column with milliseconds to a timestamp with milliseconds in Spark 2.1 using Scala?

Asked: 2017-07-03 13:25:35

Tags: scala datetime apache-spark

I use Spark 2.1 with Scala.

How do I convert a string column with milliseconds to a timestamp with milliseconds?

I tried the following code from the question Better way to convert a string field into timestamp in Spark:

import org.apache.spark.sql.functions.unix_timestamp
import spark.implicits._ // for $ and toDF; already in scope in spark-shell
val tdf = Seq((1L, "05/26/2016 01:01:01.601"), (2L, "#$@#@#")).toDF("id", "dts")
val tts = unix_timestamp($"dts", "MM/dd/yyyy HH:mm:ss.SSS").cast("timestamp")
tdf.withColumn("ts", tts).show(2, false)

But the result I get has no milliseconds:

+---+-----------------------+---------------------+
|id |dts                    |ts                   |
+---+-----------------------+---------------------+
|1  |05/26/2016 01:01:01.601|2016-05-26 01:01:01.0|
|2  |#$@#@#                 |null                 |
+---+-----------------------+---------------------+

3 Answers:

Answer 0 (score: 5)

A UDF using SimpleDateFormat works. The idea comes from Ram Ghadiyaram's link to the UDF logic.

import java.text.SimpleDateFormat
import java.sql.Timestamp
import org.apache.spark.sql.functions.udf
import scala.util.{Try, Success, Failure}

val getTimestamp: String => Option[Timestamp] = {
  case "" => None
  case s =>
    // SimpleDateFormat is not thread-safe, so create one per call
    val format = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss.SSS")
    Try(new Timestamp(format.parse(s).getTime)) match {
      case Success(t) => Some(t)
      case Failure(_) => None
    }
}

val getTimestampUDF = udf(getTimestamp)
val tdf = Seq((1L, "05/26/2016 01:01:01.601"), (2L, "#$@#@#")).toDF("id", "dts")
val tts = getTimestampUDF($"dts")
tdf.withColumn("ts", tts).show(2, false)

Output:

+---+-----------------------+-----------------------+
|id |dts                    |ts                     |
+---+-----------------------+-----------------------+
|1  |05/26/2016 01:01:01.601|2016-05-26 01:01:01.601|
|2  |#$@#@#                 |null                   |
+---+-----------------------+-----------------------+
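The parsing logic inside the UDF is plain JVM code, so it can be sanity-checked without a SparkSession. A small sketch (not part of the original answer) exercising the same function directly:

```scala
import java.text.SimpleDateFormat
import java.sql.Timestamp
import scala.util.Try

// Same parsing logic as the UDF above, exercised outside Spark.
val getTimestamp: String => Option[Timestamp] = {
  case "" => None
  case s =>
    val format = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss.SSS")
    Try(new Timestamp(format.parse(s).getTime)).toOption
}

println(getTimestamp("05/26/2016 01:01:01.601")) // milliseconds preserved
println(getTimestamp("#$@#@#"))                  // None for unparseable input
```

Because parsing and Timestamp.toString both use the JVM's local timezone, the round trip keeps the ".601" regardless of where it runs.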

Answer 1 (score: 3)

There is an easier way than making a UDF: just parse the millisecond part yourself and add it to the unix timestamp (the following code works with pyspark and should be very close to the Scala equivalent):

from pyspark.sql.functions import substring, unix_timestamp

timeFmt = "yyyy/MM/dd HH:mm:ss.SSS"
df = df.withColumn(
    'ux_t',
    unix_timestamp(df.t, format=timeFmt)
    + substring(df.t, -3, 3).cast('float') / 1000)

Result: '2017/03/05 14:02:41.865' is converted to 1488722561.865
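The arithmetic behind this answer can be checked on the JVM without Spark. A plain-Scala sketch (an illustration only; the Spark column functions above are not involved): whole seconds from a parse that ignores the ".SSS" tail, plus the last three digits as a fractional second.

```scala
import java.text.SimpleDateFormat

val t = "2017/03/05 14:02:41.865"
// DateFormat.parse need not consume the whole string: it stops before ".865"
val fmt = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss")
val wholeSeconds = fmt.parse(t).getTime / 1000
// Last three characters are the milliseconds, added back as a fraction
val fraction = t.takeRight(3).toDouble / 1000 // 0.865
val ux = wholeSeconds + fraction
println(ux) // unix time with millisecond fraction; absolute value depends on the JVM timezone
```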

Answer 2 (score: -1)

import org.apache.spark.sql.functions;
import org.apache.spark.sql.types.DataTypes;


dataFrame.withColumn(
    "time_stamp", 
    dataFrame.col("milliseconds_in_string")
        .cast(DataTypes.LongType)
        .cast(DataTypes.TimestampType)
)

The code is in Java and is easy to convert to Scala.
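One thing to watch when translating: Spark's cast from a long to a timestamp interprets the number as seconds since the epoch, while `java.sql.Timestamp` on the JVM takes milliseconds, so a milliseconds value needs dividing by 1000 before such a cast. A plain-Scala check of the JVM side (the epoch value below is just a sample, not from the question):

```scala
import java.sql.Timestamp

// java.sql.Timestamp's constructor takes epoch *milliseconds*, so the
// sub-second part survives; Spark's long-to-timestamp cast expects *seconds*.
val millis = 1464224461601L // sample epoch-millisecond value
val ts = new Timestamp(millis)
println(ts.getNanos) // 601000000: the 601 ms are preserved as nanos
```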