Kafka Connect - SpoolDir source connector not showing up

Date: 2020-09-14 20:36:51

Tags: apache-kafka apache-kafka-connect

I am trying to create a Kafka Connect SpoolDir source connector with a REST API call. After starting ZooKeeper and the Kafka server, and starting the worker with kafka/bin/connect-distributed.sh dir-distributed.properties, I made the following API call from Postman:

POST http://localhost:8083/connectors
{
    "name": "csv-source-orders",
    "config": {
        "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
        "tasks.max": "1",
        "topic": "orders",
        "input.file.pattern":"^orders.*.csv$",
        "input.path":"/Users/ivij/temp/source",
        "finished.path":"/Users/ivij/temp/finished",
        "error.path":"/Users/ivij/temp/error",
        "halt.on.error": "false",
        "csv.separator.char":"01",
        "value.schema":"{\"name\":\"com.github.jcustenborder.kafka.connect.model.Value\",\"type\":\"STRUCT\",\"isOptional\":false,\"fieldSchemas\":{\"order_id\":{\"type\":\"INT64\",\"isOptional\":false},\"customer_id\":{\"type\":\"INT64\",\"isOptional\":false},\"order_ts\":{\"type\":\"STRING\",\"isOptional\":false},\"product\":{\"type\":\"STRING\",\"isOptional\":false},\"order_total_usd\":{\"type\":\"STRING\",\"isOptional\":false}}}",
        "key.schema":"{\"name\":\"com.github.jcustenborder.kafka.connect.model.Key\",\"type\":\"STRUCT\",\"isOptional\":false,\"fieldSchemas\":{\"order_id\":{\"type\":\"INT64\",\"isOptional\":false}}}",
        "csv.first.row.as.header": "true",
        "flush.size": "100",
        "rotate.interval.ms": "1000"
    }
}
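The trickiest part of this payload is that value.schema and key.schema are themselves JSON documents embedded as escaped strings, which is easy to get wrong by hand. A minimal sketch (not from the original post) of building the same payload programmatically, so that json.dumps handles the escaping:

```python
import json

# Schema for the record value, written as an ordinary Python dict.
value_schema = {
    "name": "com.github.jcustenborder.kafka.connect.model.Value",
    "type": "STRUCT",
    "isOptional": False,
    "fieldSchemas": {
        "order_id": {"type": "INT64", "isOptional": False},
        "customer_id": {"type": "INT64", "isOptional": False},
        "order_ts": {"type": "STRING", "isOptional": False},
        "product": {"type": "STRING", "isOptional": False},
        "order_total_usd": {"type": "STRING", "isOptional": False},
    },
}

connector = {
    "name": "csv-source-orders",
    "config": {
        "connector.class": "com.github.jcustenborder.kafka.connect.spooldir."
                           "SpoolDirCsvSourceConnector",
        "tasks.max": "1",
        "topic": "orders",
        # json.dumps turns the dict into the escaped string form the REST
        # API expects for this property.
        "value.schema": json.dumps(value_schema),
    },
}

payload = json.dumps(connector)
# Round-trip: the embedded schema string parses back to the original dict.
assert json.loads(json.loads(payload)["config"]["value.schema"]) == value_schema
```

The resulting payload string is what would be sent as the POST body to http://localhost:8083/connectors.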

The response status is 201 Created, with a message indicating that a new resource was created. However, the tasks field in the response body is empty:

    "tasks": [],
    "type": "source"
  • When I try to check the connector's status, or list the connectors with GET localhost:8083/connectors/, the response is [].
  • I also tried listing the topics being created on port 8083 (to check whether the "orders" topic from the API JSON had been created), but got an OutOfMemory error.
    Is the connector actually being created? How can I fix this?
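The symptoms above are easiest to probe through the Connect worker's REST API, which exposes a status sub-resource per connector. A sketch (assuming the worker's REST interface is at localhost:8083, as in the question) of the URLs involved:

```python
# Base URL of the Connect worker's REST interface; assumed from the
# rest.port=8083 setting in the question.
BASE = "http://localhost:8083"

def connector_url(name: str, suffix: str = "") -> str:
    """URL for a connector resource, optionally a sub-resource like 'status'."""
    url = f"{BASE}/connectors/{name}"
    return f"{url}/{suffix}" if suffix else url

# GET {BASE}/connectors                      -> list of connector names
# GET {BASE}/connectors/<name>/status       -> connector + task states
#                                              (e.g. RUNNING, FAILED)
# A call would need a live worker, so it is only sketched here:
# import urllib.request, json
# with urllib.request.urlopen(connector_url("csv-source-orders", "status")) as r:
#     print(json.load(r))

print(connector_url("csv-source-orders", "status"))
```

The status endpoint is usually more informative than the bare connector list, since a connector that was accepted (201 Created) but whose task crashed will show a FAILED task state with a stack trace.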

EDIT: Here is the "dir-distributed.properties" file:

#connect-distributed.properties

bootstrap.servers=localhost:9092
group.id=connect-cluster-a

rest.port=8083

schema.generation.enabled=true
schema.generation.value.name=schemavalue
schema.generation.key.name=schemakey

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter

key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.topic=connect-offsets
offset.storage.replication.factor=1

config.storage.topic=connect-configs
config.storage.replication.factor=1

status.storage.topic=connect-status
status.storage.replication.factor=1

offset.flush.interval.ms=10000
plugin.path=.../kafka-connect-spooldir/target/kafka-connect-target/usr/share/kafka-connect/
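A distributed-mode worker will not come up cleanly unless the storage-topic and group settings above are present. As a quick sanity check (a sketch, not an exhaustive validator; the parser below handles only simple key=value lines), one can parse a worker properties file and verify the required keys:

```python
# Keys a distributed Connect worker needs to store its state.
REQUIRED_KEYS = {
    "bootstrap.servers",
    "group.id",              # workers sharing a group.id form one Connect cluster
    "offset.storage.topic",
    "config.storage.topic",
    "status.storage.topic",
}

def parse_properties(text: str) -> dict:
    """Parse a minimal Java-style .properties file (no line continuations)."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Excerpt modeled on the dir-distributed.properties file above.
sample = """\
# connect-distributed.properties
bootstrap.servers=localhost:9092
group.id=connect-cluster-a
offset.storage.topic=connect-offsets
config.storage.topic=connect-configs
status.storage.topic=connect-status
"""

props = parse_properties(sample)
missing = REQUIRED_KEYS - props.keys()
print(sorted(missing))  # empty list when every required key is present
```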

0 Answers:

There are no answers yet.