I am using the Kafka high-level consumer. When I start the consumer, it finds all the new messages. New messages are produced with the Java Kafka producer. But after about a minute, the consumer keeps looping without finding any new messages. When I pause execution in the debugger, the consumer suddenly starts finding messages to consume again. I am using version 0.8.0 in Java. Note that the process consuming the messages also produces messages to a separate error topic when an error occurs. When I stopped producing those error messages, I no longer ran into this problem.
Answer 0 (score: 0)
The problem appears to be a Kafka bug whose cause I cannot see. If you create multiple ConsumerIterators from the same ConsumerConnector (by passing it a map containing multiple topics), the topics frequently get switched between the ConsumerIterators. If you try to inspect the ConsumerIterators by pausing in the debugger, they switch back.
Here is my old code that created the buggy ConsumerIterators:
    /**
     * @param zookeeperAddresses (includes the port number)
     * @param topics all topics to be consumed.
     * @return A list of ConsumerIterators, one per topic.
     */
    public List<ConsumerIterator> getConsumers(String zookeeperAddresses, List<String> topics) {
        String groupId = "client_" + topics.get(0);
        LOGGER.info("Zookeeper address = " + zookeeperAddresses + ", group id = " + groupId);

        // One connector shared across all topics -- this is what triggers the bug.
        ConsumerConnector consumer = kafka.consumer.Consumer.createJavaConsumerConnector(
                createConsumerConfig(zookeeperAddresses, groupId));
        consumers.add(consumer); // 'consumers' is a member field (not shown) tracking connectors for shutdown.

        // Request one stream per topic from the shared connector.
        Map<String, Integer> topicCountMap = new HashMap<>();
        for (String topic : topics) {
            topicCountMap.put(topic, Integer.valueOf(1));
        }
        Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumer.createMessageStreams(topicCountMap);

        List<ConsumerIterator> topicConsumers = new LinkedList<>();
        for (String topic : topics) {
            List<KafkaStream<byte[], byte[]>> streams = consumerMap.get(topic);
            assert(streams.size() == 1);
            ConsumerIterator<byte[], byte[]> consumerIterator = streams.get(0).iterator();
            topicConsumers.add(consumerIterator);
        }
        return topicConsumers;
    }
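Both versions call a createConsumerConfig helper that the answer does not show. A minimal sketch of what it might look like, assuming the standard 0.8 high-level consumer properties; the timeout and interval values are illustrative placeholders, not the answer's actual settings:

    import java.util.Properties;
    import kafka.consumer.ConsumerConfig;

    // Hypothetical helper: builds a 0.8 high-level consumer configuration.
    // Property names follow the 0.8 consumer documentation; the values
    // below are illustrative defaults.
    private static ConsumerConfig createConsumerConfig(String zookeeperAddresses, String groupId) {
        Properties props = new Properties();
        props.put("zookeeper.connect", zookeeperAddresses); // host:port of Zookeeper
        props.put("group.id", groupId);                     // consumer group id
        props.put("zookeeper.session.timeout.ms", "400");
        props.put("zookeeper.sync.time.ms", "200");
        props.put("auto.commit.interval.ms", "1000");       // auto-commit offsets
        return new ConsumerConfig(props);
    }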
Here is the fixed code that works around the bug:
    /**
     * @param zookeeperAddresses (includes the port number)
     * @param topics all topics to be consumed.
     * @return A list of ConsumerIterators, one per topic.
     */
    public List<ConsumerIterator> getConsumers(String zookeeperAddresses, List<String> topics) {
        String groupId = "client_" + topics.get(0);
        LOGGER.info("Zookeeper address = " + zookeeperAddresses + ", group id = " + groupId);

        List<ConsumerIterator> topicConsumers = new LinkedList<>();
        for (String topic : topics) {
            // A dedicated connector per topic avoids the topic-switching bug.
            ConsumerConnector consumer = kafka.consumer.Consumer.createJavaConsumerConnector(
                    createConsumerConfig(zookeeperAddresses, groupId));
            consumers.add(consumer); // 'consumers' is a member field (not shown) tracking connectors for shutdown.

            Map<String, Integer> topicCountMap = new HashMap<>();
            topicCountMap.put(topic, Integer.valueOf(1));
            Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumer.createMessageStreams(topicCountMap);

            List<KafkaStream<byte[], byte[]>> streams = consumerMap.get(topic);
            assert(streams.size() == 1);
            ConsumerIterator<byte[], byte[]> consumerIterator = streams.get(0).iterator();
            topicConsumers.add(consumerIterator);
        }
        return topicConsumers;
    }
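For completeness, here is one way the returned iterators might be drained. This is a sketch, not part of the original answer: hasNext() on a ConsumerIterator blocks until a message arrives (with the default consumer.timeout.ms of -1), so each iterator gets its own thread; zookeeperAddresses and topics are assumed to be supplied by the caller.

    import java.util.List;
    import kafka.consumer.ConsumerIterator;
    import kafka.message.MessageAndMetadata;

    // Sketch of a caller: one thread per iterator, since hasNext()
    // blocks until a message is available on that iterator's topic.
    List<ConsumerIterator> iterators = getConsumers(zookeeperAddresses, topics);
    for (final ConsumerIterator<byte[], byte[]> it : iterators) {
        new Thread(new Runnable() {
            public void run() {
                while (it.hasNext()) {
                    MessageAndMetadata<byte[], byte[]> record = it.next();
                    System.out.println("topic=" + record.topic()
                            + " message=" + new String(record.message()));
                }
            }
        }).start();
    }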