Spring Kafka and exactly-once delivery guarantee

Date: 2018-09-29 16:02:10

Tags: spring spring-boot apache-kafka spring-kafka

I'm using Spring Kafka with Spring Boot and would like to know how to configure my consumer, for example:

@KafkaListener(topics = "${kafka.topic.post.send}", containerFactory = "postKafkaListenerContainerFactory")
public void sendPost(ConsumerRecord<String, Post> consumerRecord, Acknowledgment ack) {

    // do some logic

    ack.acknowledge();
}

with an exactly-once delivery guarantee?

Should I just add the org.springframework.transaction.annotation.Transactional annotation to the sendPost method and that's it, or do I need to perform some additional steps to achieve this?

UPDATED

Here is my current configuration:

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(KafkaProperties kafkaProperties, KafkaTransactionManager<Object, Object> transactionManager) {

        kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, kafkaConsumerMaxPollIntervalMs);
        kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, kafkaConsumerMaxPollRecords);

        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        //factory.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
        factory.getContainerProperties().setTransactionManager(transactionManager);
        factory.setConsumerFactory(consumerFactory(kafkaProperties));

        return factory;
    }


    @Bean
    public Map<String, Object> producerConfigs() {

        Map<String, Object> props = new HashMap<>();

        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 15000000);

        return props;
    }

    @Bean
    public ProducerFactory<String, Post> postProducerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, Post> postKafkaTemplate() {
        return new KafkaTemplate<>(postProducerFactory());
    }

    @Bean
    public ProducerFactory<String, Update> updateProducerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, Update> updateKafkaTemplate() {
        return new KafkaTemplate<>(updateProducerFactory());
    }

    @Bean
    public ProducerFactory<String, Message> messageProducerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, Message> messageKafkaTemplate() {
        return new KafkaTemplate<>(messageProducerFactory());
    }

But it fails with the following error:

***************************
APPLICATION FAILED TO START
***************************

Description:

Parameter 0 of method kafkaTransactionManager in org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration required a single bean, but 3 were found:
    - postProducerFactory: defined by method 'postProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
    - updateProducerFactory: defined by method 'updateProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
    - messageProducerFactory: defined by method 'messageProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]

What am I doing wrong?

1 Answer:

Answer 0 (score: 1):

You should not use manual acknowledgments. Instead, inject a KafkaTransactionManager into the listener container; the container will then send the offsets to the transaction when the listener method exits normally (and roll it back otherwise).

You should not acknowledge offsets via the consumer when you want exactly-once semantics.
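
To make that concrete, here is a minimal sketch (my illustration, not code from the answer) of how the sendPost listener from the question could look once the Acknowledgment parameter is dropped and the container factory carries the KafkaTransactionManager, as shown in the EDIT below:

@KafkaListener(topics = "${kafka.topic.post.send}", containerFactory = "postKafkaListenerContainerFactory")
public void sendPost(ConsumerRecord<String, Post> consumerRecord) {

    // do some logic; records produced here via a transactional KafkaTemplate
    // join the same transaction as the offset commit.
    // If this method throws, the transaction is rolled back and the record is redelivered.
}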

EDIT

application.yml

spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      enable-auto-commit: false
      properties:
        isolation:
          level: read_committed # consume only records from committed transactions
    producer:
      transaction-id-prefix: myTrans. # makes Boot's auto-configured producer factory transactional

Application

@SpringBootApplication
public class So52570118Application {

    public static void main(String[] args) {
        SpringApplication.run(So52570118Application.class, args);
    }

    @Bean // override boot's auto-config to add txm
    public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory,
            KafkaTransactionManager<Object, Object> transactionManager) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.getContainerProperties().setTransactionManager(transactionManager);
        return factory;
    }

    @Autowired
    private KafkaTemplate<String, String> template;

    @KafkaListener(id = "so52570118", topics = "so52570118")
    public void listen(String in) throws Exception {
        System.out.println(in);
        Thread.sleep(5_000);
        this.template.send("so52570118out", in.toUpperCase());
        System.out.println("sent");
    }

    @KafkaListener(id = "so52570118out", topics = "so52570118out")
    public void listenOut(String in) {
        System.out.println(in);
    }

    @Bean
    public ApplicationRunner runner() {
        return args -> this.template.executeInTransaction(t -> t.send("so52570118", "foo"));
    }

    @Bean
    public NewTopic topic1() {
        return new NewTopic("so52570118", 1, (short) 1);
    }

    @Bean
    public NewTopic topic2() {
        return new NewTopic("so52570118out", 1, (short) 1);
    }

}
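
As for the startup failure in the question ("required a single bean, but 3 were found"): Boot's auto-configured kafkaTransactionManager needs a single ProducerFactory bean to bind to, and the three custom factories are all candidates. One way to resolve the ambiguity, shown here only as a sketch of my own and not part of the answer above, is to mark one factory as @Primary and make it transaction-capable (the "post.tx." prefix is just a placeholder):

    @Bean
    @Primary // gives the auto-configured KafkaTransactionManager a single candidate
    public ProducerFactory<String, Post> postProducerFactory() {
        DefaultKafkaProducerFactory<String, Post> factory = new DefaultKafkaProducerFactory<>(producerConfigs());
        // a KafkaTransactionManager requires a transaction-capable producer factory
        factory.setTransactionIdPrefix("post.tx.");
        return factory;
    }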