I set up a Confluent S3 sink connector that stores .avro files in S3. When I dump these files, I find they contain only the message values themselves — I can't find the message keys anywhere. Any ideas?
The configuration is as follows:
{
  "name": "s3-sink-test",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "book",
    "s3.region": "eu-central-1",
    "s3.bucket.name": "kafka",
    "s3.part.size": "5242880",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.avro.AvroFormat",
    "schema.generator.class": "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
    "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
    "path.format": "'year'=YYYY/'month'=MM/'day'=dd/'hour'=HH",
    "locale": "US",
    "timezone": "UTC",
    "partition.duration.ms": "3600000",
    "timestamp.extractor": "RecordField",
    "timestamp.field": "local_timestamp",
    "flush.size": "2",
    "schema.compatibility": "NONE"
  }
}
Answer 0 (score: 2)
None of the Kafka Connect storage connectors preserve the Kafka message key out of the box.
Try building and installing the Archive transform; it can be enabled with these properties in the connector configuration:
"transforms" : "tran",
"transforms.tran.type" : "com.github.jcustenborder.kafka.connect.archive.Archive"
For more information about SMTs (Single Message Transforms) in Kafka Connect, see this blog post.