I have not found an example of how to produce and consume Kafka Avro messages with a Camel route. My current Camel route is shown below. What changes are needed so that it works with a schema registry and properties like the following, using a camel-kafka Avro consumer and producer?
props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
public void configure() {
PropertiesComponent pc = getContext().getComponent("properties", PropertiesComponent.class);
pc.setLocation("classpath:application.properties");
log.info("About to start route: Kafka Server -> Log ");
from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
+ "&maxPollRecords={{consumer.maxPollRecords}}"
+ "&consumersCount={{consumer.consumersCount}}"
+ "&seekTo={{consumer.seekTo}}"
+ "&groupId={{consumer.group}}"
+ "&valueDeserializer=" + KafkaAvroDeserializer.class
+ "&keyDeserializer=" + StringDeserializer.class
)
.routeId("FromKafka")
.log("${body}");
Answer 0 (score: 1)
I am answering my own question because I was stuck on this for several days, and I hope this answer helps someone else.
I tried to use the io.confluent.kafka.serializers.KafkaAvroDeserializer deserializer, but got a Kafka exception. So I had to write my own deserializer, which does the following: it sets up the schema registry client and enables useSpecificAvroReader, assigning those fields on AbstractKafkaAvroDeserializer (io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer).
Here is the solution...
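The route below refers to a CustomKafkaAvroDeserializer. A minimal sketch of such a class might look like this; note that hard-coding the registry URL is an assumption made here for brevity, and in real code it should come from configuration:

```java
import java.util.Collections;
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer;

// A custom deserializer extending AbstractKafkaAvroDeserializer that wires up
// the schema registry client and the useSpecificAvroReader flag by hand.
public class CustomKafkaAvroDeserializer extends AbstractKafkaAvroDeserializer
        implements Deserializer<Object> {

    // Assumption: registry URL hard-coded for illustration only.
    private static final String SCHEMA_REGISTRY_URL = "http://localhost:8081";

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // These are protected fields inherited from the Confluent base class.
        this.schemaRegistry = new CachedSchemaRegistryClient(
                Collections.singletonList(SCHEMA_REGISTRY_URL), Integer.MAX_VALUE);
        this.useSpecificAvroReader = true;
    }

    @Override
    public Object deserialize(String topic, byte[] bytes) {
        // Delegate to the inherited Confluent deserialization logic.
        return deserialize(bytes);
    }

    @Override
    public void close() {
    }
}
```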
public static void main(String[] args) throws Exception {
LOG.info("About to run Kafka-camel integration...");
CamelContext camelContext = new DefaultCamelContext();
// Add route to send messages to Kafka
camelContext.addRoutes(new RouteBuilder() {
public void configure() throws Exception {
PropertiesComponent pc = getContext().getComponent("properties",
PropertiesComponent.class);
pc.setLocation("classpath:application.properties");
log.info("About to start route: Kafka Server -> Log ");
from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
+ "&maxPollRecords={{consumer.maxPollRecords}}"
+ "&consumersCount={{consumer.consumersCount}}"
+ "&seekTo={{consumer.seekTo}}"
+ "&groupId={{consumer.group}}"
+ "&keyDeserializer=" + StringDeserializer.class.getName()
+ "&valueDeserializer=" + CustomKafkaAvroDeserializer.class.getName()
)
.routeId("FromKafka")
.log("${body}");
}
});
camelContext.start();
// let it run for 5 minutes before shutting down
Thread.sleep(5 * 60 * 1000);
camelContext.stop();
}
Answer 1 (score: 1)
With camel-kafka-starter (for Spring Boot) version 3.6.0, you do not need to define a CustomKafkaAvroDeserializer. Instead, add the following configuration to your application.yaml or application.properties file, and the camel-kafka component (both producer and consumer) will apply the appropriate SerDes to the objects/bytes being processed.
camel:
  springboot:
    main-run-controller: true # to keep the JVM running
  component:
    kafka:
      brokers: http://localhost:9092
      schema-registry-u-r-l: http://localhost:8081
      # Consumer
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      # Producer
      key-serializer-class: org.apache.kafka.common.serialization.StringSerializer
      serializer-class: io.confluent.kafka.serializers.KafkaAvroSerializer
      specific-avro-reader: true
Before running the producer/consumer, you also need to make sure the corresponding Avro schema JSON files have been uploaded to the schema registry server (for example, the Confluent Schema Registry).
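As a rough sketch, a schema can be registered through the Confluent Schema Registry REST API with a request like the one below; the subject name `my-topic-value` and the example record schema are assumptions, not values from this question:

```shell
# Register an Avro schema under the subject "my-topic-value".
# The Avro schema itself is passed as an escaped JSON string.
curl -X POST http://localhost:8081/subjects/my-topic-value/versions \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{"schema": "{\"type\":\"record\",\"name\":\"User\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}"}'
```

The registry responds with the numeric ID assigned to the schema, which the Confluent Avro serializers embed in each message.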
Answer 2 (score: 0)
I struggled with the same problem for a while. I put together a complete example using camel-quarkus and Confluent's schema registry: https://github.com/tstuber/camel-quarkus-kafka-schema-registry