I have a PySpark dataframe with a string column in the format YYYYMMDD, and I am trying to convert it into a date column (the final date should be in ISO 8601 format). The field is named deadline and looks like this:
deadline
20190530
I tried the following solutions:
from pyspark.sql.functions import to_date, unix_timestamp, from_unixtime, col
from pyspark.sql.types import TimestampType, StringType, DateType
df.select(to_date(df.deadline).alias('dt')).show()
df.withColumn('new_date',to_date(unix_timestamp(df.deadline, 'YYYYMMDD').cast('timestamp'))).show()
orders_concat.select(unix_timestamp(orders_concat.deadline, 'YYYYMMDD')).show()
df.select(unix_timestamp(df.ts_string, 'yyyy/MM/dd HH:mm:ss').cast(TimestampType()).alias("timestamp")).show()
df.select(unix_timestamp(df.deadline, 'yyyy/MM/dd HH:mm:ss').cast(TimestampType()).alias("timestamp")).show()
df.select(to_date(unix_timestamp('deadline', 'YYYYMMDD').cast('timestamp')).alias('timestamp')).show()
ndf = df.withColumn('_1', df['deadline'].cast(DateType()))
df2 = df.select('deadline', from_unixtime(unix_timestamp('deadline', 'YYYYMMDD')).alias('date'))
I always get null values.
Any suggestions?
Answer 0 (score: 0)
Use the correct format, yyyyMMdd, and it works. Spark's date patterns are case-sensitive (Java SimpleDateFormat style): lowercase yyyy is the calendar year and lowercase dd is the day of month, while uppercase YYYY means week-based year and DD means day of year, which is why your attempts with 'YYYYMMDD' produce nulls:
from pyspark.sql import functions as F
df.withColumn('new_date',F.to_date(F.unix_timestamp(df.deadline, 'yyyyMMdd').cast('timestamp'))).show()
+--------+----------+
|deadline| new_date|
+--------+----------+
|20190530|2019-05-30|
+--------+----------+
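The same case-sensitivity exists in plain Python's strptime codes (%Y year, %m month, %d day of month). As a sanity check outside Spark, here is a minimal pure-Python sketch of the same conversion, using the sample value from the question (the helper name deadline_to_iso is my own, not from the original post):

```python
from datetime import datetime

def deadline_to_iso(s: str) -> str:
    # Parse a YYYYMMDD string (%Y = 4-digit year, %m = month, %d = day of month)
    # and return an ISO 8601 date string, matching what to_date yields in Spark.
    return datetime.strptime(s, "%Y%m%d").date().isoformat()

print(deadline_to_iso("20190530"))  # 2019-05-30
```

Note also that on Spark 2.2+, to_date accepts a format directly, so F.to_date(df.deadline, 'yyyyMMdd') should work without the unix_timestamp round-trip.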