Spring and Kafka Streams - how to use the interactive query API

Posted: 2019-11-18 16:25:37

Tags: spring apache-kafka apache-kafka-streams spring-kafka

I am new to Kafka and Kafka Streams. I have a basic Spring service that works with a Kafka producer, a consumer, a KStream, and a KTable. Now I want to inspect my KTable records, and to do that I would like to use the Kafka interactive query API.

Without Spring integration, this can be done as follows:

KafkaStreams streams = new KafkaStreams(topology, config);
// Get access to the custom store
MyReadableCustomStore<String,String> store = streams.store("the-custom-store", new MyCustomStoreType<String,String>());
// Query the store
String value = store.read("key");

Now I am trying to run the same kind of query through the Spring-based InteractiveQueryService, but I am running into some issues getting it to work in Spring Boot.

What is the best way to use the Kafka interactive query API with Spring?
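(Editor's note: `InteractiveQueryService` is provided by the Spring Cloud Stream Kafka Streams binder, not by plain spring-kafka, so it is only available when that binder is on the classpath. With the binder, a lookup would look roughly like the sketch below; the service class name is hypothetical, and the store name is assumed to match the one used when materializing the table.)

```java
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;
import org.springframework.cloud.stream.binder.kafka.streams.InteractiveQueryService;
import org.springframework.stereotype.Service;

// Hypothetical service querying a materialized store through the
// Spring Cloud Stream Kafka Streams binder's InteractiveQueryService
@Service
public class StoreQueryService {

    private final InteractiveQueryService interactiveQueryService;

    public StoreQueryService(InteractiveQueryService interactiveQueryService) {
        this.interactiveQueryService = interactiveQueryService;
    }

    public String read(String key) {
        // Look up the state store by name and query it by key
        ReadOnlyKeyValueStore<String, String> store =
                interactiveQueryService.getQueryableStore("the-custom-store",
                        QueryableStoreTypes.<String, String>keyValueStore());
        return store.get(key);
    }
}
```

This only applies if you are using the Spring Cloud Stream binder; the accepted answer below uses plain spring-kafka instead.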

The Spring Kafka configuration in my service looks like this:

@Bean("streamsBuilder")
public StreamsBuilderFactoryBean recordsStreamBuilderFactoryBean() {
    Map<String, Object> config = new HashMap<>();
    // set some properties
    return new StreamsBuilderFactoryBean(new KafkaStreamsConfiguration(config));
}
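(Editor's note: for the factory bean to start, the `// set some properties` placeholder has to supply at least an application id and the broker addresses. A minimal sketch, with `localhost:9092` and the application id as assumed values:)

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;

// Minimal Kafka Streams properties; broker address and application id are assumptions
Map<String, Object> config = new HashMap<>();
config.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");
config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
config.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
config.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
```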

Can anyone advise, please?

1 Answer:

Answer 0 (score: 3)

Here is a Spring Boot application showing how...

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams.State;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology.AutoOffsetReset;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.config.StreamsBuilderFactoryBean;
import org.springframework.kafka.config.TopicBuilder;
import org.springframework.kafka.core.KafkaTemplate;

@SpringBootApplication
@EnableKafkaStreams
public class So58918956Application {

    public static void main(String[] args) {
        SpringApplication.run(So58918956Application.class, args);
    }

    @Bean
    public CountDownLatch latch(StreamsBuilderFactoryBean streamsBuilderFB) {
        // Released once the embedded KafkaStreams instance transitions to RUNNING
        CountDownLatch latch = new CountDownLatch(1);
        streamsBuilderFB.setStateListener((newState, oldState) -> {
            if (State.RUNNING.equals(newState)) {
                latch.countDown();
            }
        });
        return latch;
    }

    @Bean
    public KTable<String, String> table(StreamsBuilder streamsBuilder) {
        Serde<String> serde = Serdes.String();
        KTable<String, String> table = streamsBuilder.table("so58918956",
                Consumed.with(serde, serde)
                        .withOffsetResetPolicy(AutoOffsetReset.EARLIEST), 
                Materialized.as("the-custom-store"));
        return table;
    }

    @Bean
    public ApplicationRunner runner(StreamsBuilderFactoryBean streamsBuilderFB,
            KafkaTemplate<String, String> template, KTable<String, String> table) {

        return args -> {
            template.send("so58918956", "key", "value");
            // Wait (up to 10 seconds) for the streams instance to reach RUNNING before querying
            latch(streamsBuilderFB).await(10, TimeUnit.SECONDS);
            ReadOnlyKeyValueStore<String, String> store = streamsBuilderFB.getKafkaStreams().store(
                    table.queryableStoreName(),
                    QueryableStoreTypes.keyValueStore());
            System.out.println(store.get("key"));
        };
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so58918956").partitions(1).replicas(1).build();
    }

}
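(Editor's note: on Kafka Streams 2.5 and later, the two-argument `streams.store(name, type)` overload used in the runner above is deprecated in favor of `StoreQueryParameters`. Under that assumption, the lookup would become the following sketch:)

```java
import org.apache.kafka.streams.StoreQueryParameters;
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;

// Kafka Streams 2.5+ store lookup, replacing the deprecated two-argument store() call
ReadOnlyKeyValueStore<String, String> store = streamsBuilderFB.getKafkaStreams().store(
        StoreQueryParameters.fromNameAndType(
                table.queryableStoreName(),
                QueryableStoreTypes.keyValueStore()));
```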