Wrong date format when reading from an Elasticsearch index

Time: 2020-06-22 18:26:18

Tags: elasticsearch pyspark

I am reading data from an Elasticsearch index. I have a date format problem: the field comes back as 1501545600000, but it should be in yyyy/mm/dd format.
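
For context, 1501545600000 is an epoch timestamp in milliseconds; a minimal standalone Python sketch of the conversion (illustrative only, not part of the code below) is:

from datetime import datetime, timezone

millis = 1501545600000
# the value arrives in milliseconds, so divide by 1000 before formatting
print(datetime.fromtimestamp(millis / 1000.0, tz=timezone.utc).strftime('%Y/%m/%d'))  # 2017/08/01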

from pyspark import SparkConf
from pyspark.sql import SQLContext

q ="""{
  "query": {
    "match_all": {}
  }  
}"""

es_read_conf = {
    "es.nodes" : "localhost",
    "es.port" : "9200",
    "es.resource" : "sub1",
    "es.query" : q
}

# `sc` is the SparkContext (pre-defined in the pyspark shell, otherwise build one from SparkConf)
es_rdd = sc.newAPIHadoopRDD(
    inputFormatClass="org.elasticsearch.hadoop.mr.EsInputFormat",
    keyClass="org.apache.hadoop.io.NullWritable",
    valueClass="org.elasticsearch.hadoop.mr.LinkedMapWritable",
    conf=es_read_conf)

df2 = [convert_ts(doc) for doc in df]

This used to work; it no longer does because I changed the code:

from datetime import datetime as dt

def convert_ts(hit):
    try:
        ts_from_doc = hit.get('Refill_Bar_End_Date_and_Time', None)

        if not ts_from_doc:
            raise ValueError('`Refill_Bar_End_Date_and_Time` not found')

        # incoming as millisec so convert to sec
        as_date = dt.fromtimestamp(
            int(ts_from_doc / 1000.0)
        ).strftime('%Y-%m-%d %H:%M:%S')

        hit['Refill_Bar_End_Date_and_Time'] = as_date

    except Exception as e:
        print(e)

    return hit
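
Assuming convert_ts is meant to rewrite each document returned by the RDD above, a minimal sketch of the call site could look like the following (the mapValues/values/collect chain is an assumption added for illustration, not the original code):

# hedged sketch: apply convert_ts to the value side of each (doc_id, document)
# pair produced by newAPIHadoopRDD; the values arrive in PySpark as dicts
converted_docs = es_rdd.mapValues(convert_ts).values().collect()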

What changes do I need to make to get this working?

0 Answers:

No answers yet.