Parsing dates in dd.MM.yyyy format in Kafka Connect with the kafka-connect-spooldir connector

Asked: 2019-02-09 08:31:56

Tags: apache-kafka apache-kafka-connect

I am trying to use the SpoolDirCsvSourceConnector from https://github.com/jcustenborder/kafka-connect-spooldir

I have the following connector configuration in Kafka Connect:

connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector
csv.first.row.as.header=true
finished.path=/csv/finished
tasks.max=1
parser.timestamp.date.formats=[dd.MM.yyyy, yyyy-MM-dd'T'HH:mm:ss, yyyy-MM-dd' 'HH:mm:ss]
key.schema={"name":"com.github.jcustenborder.kafka.connect.model.Key","type":"STRUCT","isOptional":false,"fieldSchemas":{}}
csv.separator.char=59
input.file.pattern=umsaetze_.*.csv
topic=test-csv
error.path=/csv/error
input.path=/csv/input
value.schema={"name":"com.github.jcustenborder.kafka.connect.model.Value","type":"STRUCT","isOptional":false,"fieldSchemas":{"Buchungstag":{"name":"org.apache.kafka.connect.data.Timestamp","type":"INT64","version":1,"isOptional":true},"Wertstellung":{"name":"org.apache.kafka.connect.data.Timestamp","type":"INT64","version":1,"isOptional":true},"Vorgang":{"type":"STRING","isOptional":false},"Buchungstext":{"type":"STRING","isOptional":false},"Umsatz":{"name":"org.apache.kafka.connect.data.Decimal","type":"BYTES","version":1,"parameters":{"scale":"2"},"isOptional":true}}}

The value schema looks like this:

{
  "name": "com.github.jcustenborder.kafka.connect.model.Value",
  "type": "STRUCT",
  "isOptional": false,
  "fieldSchemas": {
    "Buchungstag": {
      "name": "org.apache.kafka.connect.data.Date",
      "type": "INT32",
      "version": 1,
      "isOptional": true
    },
    "Wertstellung": {
      "name": "org.apache.kafka.connect.data.Timestamp",
      "type": "INT64",
      "version": 1,
      "isOptional": true
    },
    "Vorgang": {
      "type": "STRING",
      "isOptional": false
    },
    "Buchungstext": {
      "type": "STRING",
      "isOptional": false
    },
    "Umsatz": {
      "name": "org.apache.kafka.connect.data.Decimal",
      "type": "BYTES",
      "version": 1,
      "parameters": {
        "scale": "2"
      },
      "isOptional": true
    }
  }
}

I also tried using Date instead of Timestamp for the date columns:

{
  "name" : "org.apache.kafka.connect.data.Date",
  "type" : "INT32",
  "version" : 1,
  "isOptional" : true
}

Neither Timestamp nor Date works for me; the result is the same for both the Buchungstag and Wertstellung fields. I tried to solve it with the parser.timestamp.date.formats option, but that did not help.
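For context, my understanding is that both logical types wrap a java.util.Date: Date is stored as INT32 days since the Unix epoch, Timestamp as INT64 milliseconds. Below is a minimal standalone sketch (plain Java against the Kafka Connect data API, not connector code; the dd.MM.yyyy pattern and the UTC time zone are my own assumptions) showing what each type holds for one of my CSV values:

import java.text.SimpleDateFormat;
import java.util.TimeZone;

import org.apache.kafka.connect.data.Date;
import org.apache.kafka.connect.data.Timestamp;

public class LogicalTypeCheck {
    public static void main(String[] args) throws Exception {
        // Parse the CSV value the way a dd.MM.yyyy pattern would.
        SimpleDateFormat fmt = new SimpleDateFormat("dd.MM.yyyy");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        java.util.Date parsed = fmt.parse("08.02.2019");

        // Date logical type: INT32, days since the Unix epoch.
        int days = Date.fromLogical(Date.SCHEMA, parsed);
        // Timestamp logical type: INT64, milliseconds since the Unix epoch.
        long millis = Timestamp.fromLogical(Timestamp.SCHEMA, parsed);

        System.out.println(days);   // 17935
        System.out.println(millis); // 1549584000000
    }
}

So either schema should be able to represent the column, provided the connector manages to parse the string in the first place.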

Here is a sample of the CSV I want to import into Kafka:

Buchungstag;Wertstellung;Vorgang;Buchungstext;Umsatz;
08.02.2019;08.02.2019;Lastschrift / Belastung;Auftraggeber: BlablaBuchungstext: Fahrschein XXXXXX Ref. U3436346/8423;-55,60;
08.02.2019;08.02.2019;Lastschrift / Belastung;Auftraggeber: Bank AGBuchungstext: 01.02.209:189,34 Ref. ZMKDVSDVS/5620;-189,34;

I get the following exception in Kafka Connect:

org.apache.kafka.connect.errors.ConnectException: org.apache.kafka.connect.errors.DataException: Exception thrown while parsing data for 'Buchungstag'. linenumber=2
    at com.github.jcustenborder.kafka.connect.spooldir.AbstractSourceTask.read(AbstractSourceTask.java:277)
    at com.github.jcustenborder.kafka.connect.spooldir.AbstractSourceTask.poll(AbstractSourceTask.java:144)
    ... 10 more
Caused by: org.apache.kafka.connect.errors.DataException: Could not parse '08.02.2019' to 'Date'
    at com.github.jcustenborder.kafka.connect.utils.data.Parser.parseString(Parser.java:113)
    ... 11 more
Caused by: java.lang.IllegalStateException: Could not parse '08.02.2019' to java.util.Date
    at com.google.common.base.Preconditions.checkState(Preconditions.java:588)
    ... 12 more

Do you know what the value schema should look like in order to parse dates such as 01.01.2001?

1 Answer:

Answer 0 (score: 1):

I think the problem is with your parser.timestamp.date.formats value. You passed [dd.MM.yyyy, yyyy-MM-dd'T'HH:mm:ss, yyyy-MM-dd' 'HH:mm:ss]

In the connector configuration, the parser.timestamp.date.formats property is defined as a List. A list has to be passed as a single string with a comma (,) as the delimiter. In your case it should be: dd.MM.yyyy,yyyy-MM-dd'T'HH:mm:ss,yyyy-MM-dd' 'HH:mm:ss. The whitespace may also be part of the problem, because of how the entries are trimmed when the list is parsed.
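As a minimal sketch of the fix, only this one line of the configuration from the question would change; everything else stays the same. With the square brackets and the spaces after the commas removed, the first and last entries become plain date patterns rather than strings starting with [ or ending with ], which is presumably why a value like 08.02.2019 could not be matched before:

parser.timestamp.date.formats=dd.MM.yyyy,yyyy-MM-dd'T'HH:mm:ss,yyyy-MM-dd' 'HH:mm:ss

After restarting the connector with this value, the dd.MM.yyyy pattern should apply to the Buchungstag and Wertstellung columns, whether they are declared as Date or as Timestamp in the value schema.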