What could cause a spring-kafka consumer to keep re-reading messages for a long time?

Time: 2020-06-12 06:03:08

Tags: spring-boot apache-kafka kafka-consumer-api spring-kafka

I am using spring-kafka (2.2.3.RELEASE) to consume messages from Kafka. It had been running fine for many days, but yesterday I noticed the consumer re-reading the same batch of messages over and over for roughly 6 to 7 hours, after which it went back to working normally. Can anyone help me find the underlying issue? Here is the code snippet:

class Test {
    private String name;
    private int age;
    // getters, setters, and toString()
}

@KafkaListener(topics = "testTopic", containerFactory = "kafkaListenerContainerFactory")
public void onTestEvent(@Valid @Payload Test message) {
    logger.info("Found message {}", message.toString());
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Test> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Test> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(eventConsumerFactory());
    factory.setRecordFilterStrategy(consumerRecord -> {
        if (consumerRecord.serializedValueSize() < 5) {
            logger.error("Invalid Message");
            return true;  // discard records that are too small
        }
        return false;     // keep the record so the listener processes it
    });
    return factory;
}

@Bean
public ConsumerFactory<String, Test> eventConsumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "bootstrapAddress");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "consumerId");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, FluturaJsonDeserializer.class);
    config.put(FluturaJsonDeserializer.VALUE_DEFAULT_TYPE, Test.class);
    return new DefaultKafkaConsumerFactory<>(config);
}
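
For reference, the re-delivery pattern described above is often seen when processing a polled batch takes longer than max.poll.interval.ms before offsets are committed: the group coordinator evicts the consumer, a rebalance happens, and the same uncommitted batch is fetched again. Below is a minimal sketch of the consumer properties usually tuned in that situation; the values are illustrative assumptions and are not taken from the setup above:

Map<String, Object> config = new HashMap<>();
config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "bootstrapAddress");
config.put(ConsumerConfig.GROUP_ID_CONFIG, "consumerId");
// If handling one poll() batch takes longer than max.poll.interval.ms,
// the consumer is removed from the group and the uncommitted batch is
// redelivered after the rebalance.
config.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, 300_000); // default is 5 minutes
config.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 100);         // smaller batches finish within the interval
config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);     // let the listener container commit offsets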

0 Answers:

No answers yet