Spring Cloud StreamListener @Output KStream Serdes do not seem to work

Date: 2019-02-25 11:28:45

Tags: spring-cloud-stream spring-kafka

I have a stream listener:

@StreamListener(target = "requesti")
@SendTo("responseo")
public KStream<UUID,Account> process(KStream<UUID, Account> events) {
    // Predicate<UUID, Event> isAccount = (key, value) ->
    // value.getEntity().getClass().equals(Account.class);

    // @formatter:off
    return events
            //.filter(isAccount)
            .peek((key, value) -> {
                log.debug("Processing {} {}", key, value);
            });
            /*
            .filter(isAccount)
            .map((key, value) -> process(value))

            .peek((key, value) -> {
                log.debug("Processed {} {}", key, value);
            });
            */
    // @formatter:on

}
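
For reference, the "requesti" and "responseo" names above come from a Kafka Streams bindings interface along these lines (a minimal sketch; the interface name is illustrative, and UUID/Account are the types used in my processor):

import java.util.UUID;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;

public interface AccountStreams {

    @Input("requesti")
    KStream<UUID, Account> requesti();

    @Output("responseo")
    KStream<UUID, Account> responseo();
}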

The @Input("requesti") binding is configured as follows:

spring.cloud.stream.kafka.streams.bindings.requesti.consumer.application-id=repo-event-consumer
spring.cloud.stream.bindings.requesti.destination=request
spring.cloud.stream.bindings.requesti.content-type=application/json
spring.cloud.stream.bindings.requesti.consumer.header-mode=raw

and the @Output("responseo") binding is configured as follows:

spring.cloud.stream.kafka.streams.bindings.responseo.consumer.application-id=repo-response-producer
spring.cloud.stream.bindings.responseo.destination=response
spring.cloud.stream.bindings.responseo.content-type=application/json
spring.cloud.stream.bindings.responseo.producer.header-mode=raw
spring.cloud.stream.bindings.responseo.producer.use-native-encoding=true
spring.cloud.stream.kafka.streams.bindings.responseo.producer.key-serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.bindings.responseo.producer.value-serde=org.springframework.kafka.support.serializer.JsonSerde
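
To sanity-check what JsonSerde itself would produce, independently of the binder plumbing, here is a minimal standalone sketch (the Account class below is just a stand-in for my model class):

import java.util.UUID;
import org.springframework.kafka.support.serializer.JsonSerde;

public class JsonSerdeSmokeTest {

    // Minimal stand-in for the real Account model class.
    public static class Account {
        public String name = "example";
    }

    public static void main(String[] args) {
        try (JsonSerde<Account> valueSerde = new JsonSerde<>(Account.class);
             JsonSerde<UUID> keySerde = new JsonSerde<>(UUID.class)) {

            byte[] value = valueSerde.serializer().serialize("response", new Account());
            byte[] key = keySerde.serializer().serialize("response", UUID.randomUUID());

            // The payload is plain JSON; when the same serializer runs inside Kafka
            // (with record Headers available) it also adds the __TypeId__ /
            // __Key_TypeId__ headers that show up in the logs further down.
            System.out.println(new String(value)); // e.g. {"name":"example"}
            System.out.println(new String(key));   // e.g. "6f0f50e2-..."
        }
    }
}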

My processor receives a request and does send output, but the output record looks like this:

[Producer clientId=repo-event-consumer-49827b40-2357-4af0-8103-228343faa59e-StreamThread-1-producer] Sending record ProducerRecord(topic=response, partition=null, headers=RecordHeaders(headers = [RecordHeader(key = __TypeId__, value = [117, 107, 46, 111, 114, 103, 46, 99, 97, 116, 97, 112, 117, 108, 116, 46, 101, 115, 46, 99, 117, 98, 101, 46, 115, 101, 114, 118, 105, 99, 101, 115, 46, 97, 99, 99, 111, 117, 110, 116, 46, 109, 111, 100, 101, 108, 46, 65, 99, 99, 111, 117, 110, 116])], isReadOnly = true), key = [B@6a5e4294, value = [B@5a0852e1, timestamp = 1551093349173) with callback org.apache.kafka.streams.processor.internals.RecordCollectorImpl$1@336dbba5 to topic response partition 2

A couple of things confuse me here: the producer clientId is not "repo-response-producer", and secondly, shouldn't my key-serde/value-serde be applied, i.e. shouldn't the output look more like the record I originally sent to the request topic:

Sending record ProducerRecord(topic=request, partition=null, headers=RecordHeaders(headers = [RecordHeader(key = __Key_TypeId__, value = [106, 97, 118, 97, 46, 117, 116, 105, 108, 46, 85, 85, 73, 68]), RecordHeader(key = __TypeId__, value = [117, 107, 46, 111, 114, 103, 46, 99, 97, 116, 97, 112, 117, 108, 116, 46, 101, 115, 46, 99, 117, 98, 101, 46, 115, 101, 114, 118, 105, 99, 101, 115, 46, 97, 99, 99, 111, 117, 110, 116, 46, 109, 111, 100, 101, 108, 46, 65, 99, 99, 111, 117, 110, 116])], isReadOnly = true), key = 6f0f50e2-3add-4d22-a370-cac66d016af0, value = Account()) with callback org.springframework.kafka.core.KafkaTemplate$$Lambda$582/533392019@85ab964 to topic request partition 2
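
(For what it's worth, the byte arrays in those type headers are just the ASCII bytes of the class names; a quick check with the values copied from the record above:)

import java.nio.charset.StandardCharsets;

public class DecodeTypeIdHeader {

    public static void main(String[] args) {
        // __Key_TypeId__ bytes from the log above; the longer __TypeId__ array
        // decodes to the Account class's fully qualified name the same way.
        byte[] keyTypeId = {106, 97, 118, 97, 46, 117, 116, 105, 108, 46, 85, 85, 73, 68};
        System.out.println(new String(keyTypeId, StandardCharsets.US_ASCII)); // java.util.UUID
    }
}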

and the default serde config is:

spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.springframework.kafka.support.serializer.JsonSerde

Repo

1 Answer:

Answer 0 (score: 0)

Here is a sample that demonstrates how outbound serialization works with the Kafka Streams binder: https://github.com/schacko-samples/json-serde-example. Run that sample and make sure it works. Look at how the JsonSerde is configured for the details; I put some details in the application.yml provided.
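
For reference, when JsonSerde is used as the default serde it is usually given its Jackson-related settings through the binder's configuration block, roughly along these lines (a sketch, not copied from the sample; the spring.json.* keys are the standard Spring Kafka JsonSerde/JsonDeserializer configuration properties, and the Account class name is the one decoded from the __TypeId__ header in the question's logs):

spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.binder.configuration.spring.json.trusted.packages=*
spring.cloud.stream.kafka.streams.binder.configuration.spring.json.key.default.type=java.util.UUID
spring.cloud.stream.kafka.streams.binder.configuration.spring.json.value.default.type=uk.org.catapult.es.cube.services.account.model.Account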