Spring Kafka: polling with a Consumer

Time: 2020-03-16 17:30:47

Tags: java apache-kafka spring-kafka

I am using Consumer.poll(Duration d) to fetch records. For testing purposes, the Kafka topic holds only 10 records, spread across 6 partitions. I have disabled auto-commit and do not commit manually either (again, only for testing). When I poll, it does not fetch data from all partitions; I have to run poll in a loop to get all the data. I have not overridden parameters such as max.poll.size or max.fetch.bytes from their defaults. What could be the reason? Note that this is the only consumer for the given topic and group ID, so I expect all partitions to be assigned to it.

private Consumer<String, Object> createConsumer() {
    ConsumerFactory<String, Object> consumerFactory = deadLetterConsumerFactory();
    Consumer<String, Object> consumer = consumerFactory.createConsumer();
    consumer.subscribe(Collections.singletonList(kafkaConfigProperties.getDeadLetterTopic()));
    return consumer;
}

try {
    consumer = createConsumer();
    ConsumerRecords<String, Object> records = consumer.poll(Duration.ofMillis(5000));
    processMessages(records, ...);
} catch (Exception e) {
    ...
} finally {
    if (consumer != null) {
        consumer.unsubscribe();
        consumer.close();
    }
}
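For context on why looping is needed: a single poll() only returns records that have already been fetched from some subset of the assigned partitions, and the very first poll after subscribe() may return little or nothing while the group rebalance and partition assignment complete. A minimal sketch of the poll-until-drained pattern, using a hypothetical fetchBatch() stub in place of consumer.poll() so it runs without a broker (the loop shape is the same with a real KafkaConsumer):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class PollLoopSketch {

    // Simulated broker: records from 6 partitions arriving in partial batches,
    // mimicking how successive poll() calls each return only part of the data.
    private static final Deque<List<String>> BATCHES = new ArrayDeque<>(List.of(
            List.of("p0-r1", "p1-r1"),           // first poll: only 2 of 6 partitions ready
            List.of("p2-r1", "p3-r1", "p4-r1"),  // next poll: 3 more partitions
            List.of("p5-r1")));                  // last partition

    // Hypothetical stand-in for consumer.poll(timeout): returns whatever
    // is ready, or an empty batch once everything has been consumed.
    static List<String> fetchBatch() {
        return BATCHES.isEmpty() ? List.of() : BATCHES.poll();
    }

    public static void main(String[] args) {
        List<String> all = new ArrayList<>();
        List<String> batch;
        // Keep polling until an empty batch signals the topic is drained.
        while (!(batch = fetchBatch()).isEmpty()) {
            all.addAll(batch);
        }
        System.out.println(all.size());
    }
}
```

With a real consumer you would replace fetchBatch() with consumer.poll(Duration.ofMillis(...)) and stop on an empty ConsumerRecords (or run the loop indefinitely for a long-lived consumer).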

EDIT: Here are the details

ConsumerFactory<String, Object> deadLetterConsumerFactory() {
    Properties properties = new Properties();
    // Use ConsumerConfig (not ProducerConfig) constants for a consumer factory.
    properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, server);
    properties.put(SCHEMA_REGISTRY_URL, url);
    properties.put(ConsumerConfig.CLIENT_ID_CONFIG,
            "myid" + "-" + CONSUMER_CLIENT_ID_SEQUENCE.getAndIncrement());
    properties.put(SSL_ENDPOINT_IDFN_ALGM, alg);
    properties.put(SaslConfigs.SASL_MECHANISM, saslmech);
    properties.put(REQUEST_TIMEOUT, timeout);
    properties.put(SaslConfigs.SASL_JAAS_CONFIG, config);
    properties.put(SECURITY_PROTOCOL, protocol);
    properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
    properties.put(ConsumerConfig.GROUP_ID_CONFIG, "groupid");
    properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

    // DefaultKafkaConsumerFactory takes a Map<String, Object>, so copy the
    // Properties over (the original mixed up "properties" and "consumerProperties").
    Map<String, Object> map = new HashMap<>();
    properties.forEach((key, value) -> map.put((String) key, value));
    return new DefaultKafkaConsumerFactory<>(map);
}

0 Answers:

No answers