Kafka Connect: string to json in PostgreSQL

Date: 2018-07-09 09:25:03

Tags: json postgresql apache-kafka apache-kafka-connect confluent

I have a topic containing JSON strings. For example, a message could be:

'{"id":"foo", "datetime":1}'

In this topic, everything is treated as a string.

I want to send the messages into a PostgreSQL table with kafka-connect. My goal is for PostgreSQL to understand that the messages are JSON; indeed, PostgreSQL handles JSON pretty well.

How can I tell kafka-connect or PostgreSQL that the messages are actually JSON?
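As background for this question: Kafka Connect's `JsonConverter` can carry a schema inline when `schemas.enable=true`; each message is then a `schema`/`payload` envelope rather than a bare JSON object. A minimal sketch in Python of what such an enveloped message looks like (the field types are assumptions based on the sample message above):

```python
import json

def envelope(record):
    """Wrap a plain JSON record in the schema/payload envelope that
    Kafka Connect's JsonConverter expects when schemas.enable=true."""
    # Assumed field types, inferred from the sample message {"id":"foo","datetime":1}.
    fields = [
        {"field": "id", "type": "string"},
        {"field": "datetime", "type": "int64"},
    ]
    return json.dumps({
        "schema": {"type": "struct", "fields": fields, "optional": False},
        "payload": record,
    })

msg = envelope({"id": "foo", "datetime": 1})
```

A connector configured with `value.converter.schemas.enable=true` can then recover named, typed fields from each message instead of seeing an opaque string.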

Thanks

EDIT:

For now, I use ./bin/connect-standalone config/connect-standalone.properties config/sink-sql-rules.properties

with:

  • connect-standalone.properties

    bootstrap.servers=localhost:9092
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=false
    value.converter.schemas.enable=false
    internal.key.converter=org.apache.kafka.connect.json.JsonConverter
    internal.value.converter=org.apache.kafka.connect.json.JsonConverter
    internal.key.converter.schemas.enable=false
    internal.value.converter.schemas.enable=false
    offset.storage.file.filename=/tmp/connect.offsets
    offset.flush.interval.ms=10000
    rest.port=8084
    plugin.path=share/java
    
  • sink-sql-rules.properties

    name=mysink
    connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
    tasks.max=1
    
    # The topics to consume from - required for sink connectors like this one
    topics=mytopic
    
    # Configuration specific to the JDBC sink connector.
    connection.url=***
    connection.user=***
    connection.password=***
    
    mode=timestamp+incrementing
    auto.create=true
    auto.evolve=true
    table.name.format=mytable
    batch.size=500
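
One detail worth noting (not part of the original question): `mode=timestamp+incrementing` is a JDBC *source* connector option; the JDBC sink connector does not use it. A sketch of the sink-side equivalents, assuming no primary key is needed (`pk.mode=none` is an assumption here):

```properties
# JDBC sink options; "mode" belongs to the JDBC source connector instead.
insert.mode=insert
pk.mode=none
```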
    

EDIT2:

With those configs, I get this error: org.apache.kafka.connect.errors.ConnectException: No fields found using key and value schemas for table
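This error usually means the sink connector received records without any schema: with `value.converter.schemas.enable=false`, the JDBC sink has no field names or types to map to table columns. A hedged sketch of the converter settings that would address it, assuming the messages in the topic are (re)published with the `schema`/`payload` envelope described above:

```properties
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
```

An alternative, if re-publishing is not an option, is to use a converter backed by a schema registry (e.g. Avro with Confluent Schema Registry) so the schema travels out of band.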

0 Answers