How can multiple consumers listen to multiple topics with Kafka in Spring Boot?

Date: 2018-08-10 22:24:33

Tags: java spring-boot apache-kafka

I am unable to listen to a Kafka topic when there are multiple consumers (2 topics in my case). In the example below I have 2 consumer factories that consume 2 different JSON messages (one of type User, the other of type Event). Both messages are published to different topics. Here, when I try to consume the Event messages from topic1 I cannot, but I can consume the messages from the User topic.

For example:

@Configuration
@EnableKafka
public class KafkaConsumerConfiguration {

    @Autowired
    private Environment environment;

    @Bean
    public ConsumerFactory<String, User> consumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, environment.getProperty("bootstrap.servers"));
        config.put(ConsumerConfig.GROUP_ID_CONFIG, environment.getProperty("user.consumer.group"));
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);

        return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(),
                new JsonDeserializer<>(User.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, User> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, User> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @Bean
    public ConsumerFactory<String, Event> consumerFactoryEvent() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, environment.getProperty("bootstrap.servers"));
        config.put(ConsumerConfig.GROUP_ID_CONFIG, environment.getProperty("event.consumer.group"));
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);

        return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(),
                new JsonDeserializer<>(Event.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Event> kafkaListenerContainerFactoryEvent() {
        ConcurrentKafkaListenerContainerFactory<String, Event> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactoryEvent());
        return factory;
    }
}

My main application looks like this:

@KafkaListener(topics = "${event.topic}")
public void processEvent(Event event) {
    // ..do something..
    // ..post the message to the User topic
}

@KafkaListener(topics = "${user.topic}")
public void processUser(User user) {
    // ..do something..
}

I need to listen to the Event topic first, do some massaging of the message, and then post it to the User topic, and I have another method that listens to the User topic and does something with that message. I tried passing different options to @KafkaListener, such as

@KafkaListener(topics="${event.topic}",containerFactory="kafkaListenerContainerFactoryEvent")

But it does not work. I am not sure what is going wrong. Any suggestions would be helpful!
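For reference, the flow described above (consume an Event, massage it, post the result to the User topic, and separately consume User messages) could be wired roughly as follows. This is only a sketch: the class and method names are placeholders, and the KafkaTemplate<String, User> bean and the mapToUser helper are assumed for illustration and do not appear in the original code.

@Service
public class EventToUserFlow {

    // Assumed for illustration: a template used to publish User messages
    @Autowired
    private KafkaTemplate<String, User> kafkaTemplate;

    @Value("${user.topic}")
    private String userTopic;

    // Consume Event messages with the Event-specific container factory,
    // transform them, then post the result to the User topic
    @KafkaListener(topics = "${event.topic}", containerFactory = "kafkaListenerContainerFactoryEvent")
    public void processEvent(Event event) {
        User user = mapToUser(event);        // placeholder for the "massaging" step
        kafkaTemplate.send(userTopic, user);
    }

    // Consume User messages with the User-specific container factory
    @KafkaListener(topics = "${user.topic}", containerFactory = "kafkaListenerContainerFactory")
    public void processUser(User user) {
        // ..do something..
    }

    private User mapToUser(Event event) {
        // illustrative transformation only
        return new User();
    }
}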

3 Answers:

Answer 0: (score: 1)

If you do not specify a name on the bean, the method name becomes the bean name. Add the bean name, together with a group id, to @KafkaListener:

@KafkaListener(topics="${event.topic}",containerFactory="kafkaListenerContainerFactoryEvent", groupId="")

@KafkaListener(topics="${user.topic}", containerFactory="kafkaListenerContainerFactory", groupId="")

Alternatively, specify a name in @Bean and then add that name to @KafkaListener:

@Bean(name = "kafkaListenerContainerFactoryEvent")
public ConcurrentKafkaListenerContainerFactory<String, Event> kafkaListenerContainerFactoryEvent() {
    ConcurrentKafkaListenerContainerFactory<String, Event> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactoryEvent());
    return factory;
}
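Combined with the property names already used in the question's consumer factories, the two listeners could then look like this. This is a sketch only; reusing event.consumer.group and user.consumer.group as the listener group ids is an assumption, not something stated in the answer.

@KafkaListener(topics = "${event.topic}",
        containerFactory = "kafkaListenerContainerFactoryEvent",
        groupId = "${event.consumer.group}")   // group id assumed to come from the same property as the factory config
public void processEvent(Event event) {
    // ..do something..
}

@KafkaListener(topics = "${user.topic}",
        containerFactory = "kafkaListenerContainerFactory",
        groupId = "${user.consumer.group}")    // likewise an assumption
public void processUser(User user) {
    // ..do something..
}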

Answer 1: (score: 0)

This is not readily available in any of the documentation.

Here I take consuming messages as an example:

topic = topic1 and bootstrap server = url1 (JSON serializer and deserializer)

topic = topic2 and bootstrap server = url2 (Avro serializer and deserializer)

Step 1:

@Bean
public ConsumerFactory<String, String> consumerFactory1() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
            "localhost1:9092"); //This is dummy
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

    return new DefaultKafkaConsumerFactory<>(props);
}

@Bean
public ConsumerFactory<String, Object> consumerFactory2() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
            "localhost2:9092"); // This is dummy
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    props.put("schema.registry.url", "https://abc.schemaregistery.example.com"); // Again this is dummy, or it can be the Avro serialised class
    return new DefaultKafkaConsumerFactory<>(props);
}


@Bean(name = "kafkaListenerContainerFactory1")
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory1() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory
            = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory1());
    return factory;
}

@Bean(name = "kafkaListenerContainerFactory2")
public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory2() {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory
            = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory2());
    return factory;
}

Step 2:

  • @SpringBootApplication(exclude = KafkaAutoConfiguration.class) => so that values are not read from the yml or properties file via the spring.kafka @ConfigurationProperties (a sketch follows below).
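A minimal sketch of that exclusion on the main class (the class name is illustrative):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration;

// Excluding KafkaAutoConfiguration keeps Spring Boot from building its own
// consumer factories from spring.kafka.* properties, so only the beans above are used
@SpringBootApplication(exclude = KafkaAutoConfiguration.class)
public class MultiClusterKafkaApplication {

    public static void main(String[] args) {
        SpringApplication.run(MultiClusterKafkaApplication.class, args);
    }
}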

Step 3:

@KafkaListener(
        topics = "topic1",
        containerFactory = "kafkaListenerContainerFactory1",
        groupId = "com.groupid1")
public void receiveFromTopic1(ConsumerRecord<String, String> consumerRecord) throws InterruptedException {

    LOGGER.info("consuming from topic1 {}", consumerRecord.value());
    Thread.sleep(1000000); // For testing
}

@KafkaListener(
        topics = "topic2",
        containerFactory = "kafkaListenerContainerFactory2",
        groupId = "com.groupid2")
public void receiveFromTopic2(ConsumerRecord<String, Object> consumerRecord) throws InterruptedException {

    LOGGER.info("consuming from topic2 {}", consumerRecord.value());
    Thread.sleep(1000000); // For testing
}

Answer 2: (score: 0)

Obviously this is quite late, but it may help someone else.


You do not need to create multiple ConsumerFactory beans. You can do this without telling the configuration about the classes (User, Event), i.e. without new JsonDeserializer<>(Event.class), by adding the trusted packages instead.

@Bean
public ConsumerFactory<String,User> consumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, environment.getProperty("bootstrap.servers"));
    config.put(ConsumerConfig.GROUP_ID_CONFIG, environment.getProperty("user.consumer.group"));
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,JsonDeserializer.class);

    // TODO: Remove "*" and add specific package name
    config.put(JsonDeserializer.TRUSTED_PACKAGES, "*"); // <<-- New config added

    return new DefaultKafkaConsumerFactory<>(config);

}

When receiving the records:

@KafkaListener(topics = "${user.topic}")
void receiveUserRecord(User record) { ... } // For the User POJO

@KafkaListener(topics = "${event.topic}")
void receiveEventRecord(Event record) { ... } // For the Event POJO
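Note that this single-factory approach relies on Spring Kafka's JsonDeserializer reading the target class from type headers on each record, so the producer side should add those headers. Below is a minimal sketch of a matching producer configuration, assuming Spring Kafka's JsonSerializer (which adds type-info headers by default) and the same environment lookup for bootstrap.servers as in the question; the bean names are illustrative.

@Bean
public ProducerFactory<String, Object> producerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, environment.getProperty("bootstrap.servers"));
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    // JsonSerializer writes the payload class into the record headers,
    // which lets the single JsonDeserializer on the consumer side rebuild either User or Event
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return new DefaultKafkaProducerFactory<>(config);
}

@Bean
public KafkaTemplate<String, Object> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}

With such a template, both User and Event objects can be sent to their respective topics, and the single consumer factory can deserialize both, as long as their packages are trusted.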