spring-cloud-stream-binder-kafka-streams - reading Avro messages in a functional implementation

Asked: 2019-11-09 22:27:43

Tags: spring-cloud-stream spring-cloud-stream-binder-kafka

I am trying to consume messages from a topic whose records have been serialized with Avro. The documentation on how this is supposed to work is very confusing: https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/3.0.0.M3/reference/html/spring-cloud-stream-binder-kafka.html#_inbound_deserialization

The messages I am trying to read are Avro-serialized. I have the schemas for both the key and the value in the same project, and I have generated classes from those schemas - Key and Value.

My confusion is that some particular combination of application properties and code is required to make this work. So far I seem to have it wrong: I have been trying various combinations of properties and code, but none of them work.

The error I keep running into is:

Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data [[0, 0, 0, 0, 7, -46, 15]] from topic [dbserver1.inventory.customers]
Caused by: com.fasterxml.jackson.core.JsonParseException: Illegal character ((CTRL-CHAR, code 0)): only regular white space (\r, \n, \t) is allowed between tokens
 at [Source: (byte[])"�"; line: 1, column: 2]
    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1840)
    at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:712)
    at com.fasterxml.jackson.core.base.ParserMinimalBase._throwInvalidSpace(ParserMinimalBase.java:690)

It looks like the default JSON deserializer is kicking in and trying to deserialize the Avro-serialized messages.

My code looks like this:

import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
import org.apache.kafka.streams.kstream.KStream
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.context.annotation.Bean
import java.util.function.Consumer

@SpringBootApplication
class SpringBootKafkaConsumer {

  @Bean
  fun process(): Consumer<KStream<SpecificAvroSerde<Key>, SpecificAvroSerde<Value>>> {
    return Consumer { input -> input.foreach { key, value ->
      println("============key = $key")
      println("===========value = $value")
    }}
  }
}

fun main(args: Array<String>) {
  runApplication<SpringBootKafkaConsumer>(*args)
}
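
For comparison, my reading of the functional-style examples in the docs is that the `KStream` should be typed with the generated record classes themselves, not with their serdes, with the serdes supplied through configuration instead. This is only a sketch based on my understanding of the docs, not something I have gotten to work:

```kotlin
import org.apache.kafka.streams.kstream.KStream
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.context.annotation.Bean
import java.util.function.Consumer

@SpringBootApplication
class SpringBootKafkaConsumer {

  @Bean
  fun process(): Consumer<KStream<Key, Value>> {
    // Key and Value are the classes generated from the Avro schemas;
    // the Avro serdes would be configured in application.yml, not here
    return Consumer { input -> input.foreach { key, value ->
      println("============key = $key")
      println("===========value = $value")
    }}
  }
}

fun main(args: Array<String>) {
  runApplication<SpringBootKafkaConsumer>(*args)
}
```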

application.yml

spring:
  application:
    name: customer-balance
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              application:
                id: customer-balance-1
            consumer-properties:
              key.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
              value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
              schema.registry.url: http://localhost:8081
              specific.avro.reader: true

      bindings:
        process_in:
          destination: "dbserver1.inventory.customers"
          keySerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
          valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
          nativeDecoding: true
          startOffset: earliest
          content-type: application/*+avro

logging:
  level:
    org.springframework.kafka.config: trace
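
And this is the sort of application.yml I would expect to pair with that signature, based on the 3.0.0.M3 docs: the serdes moved under the Kafka Streams binding's consumer section, and the functional binding named `process-in-0`. Again, this is my guess from reading the docs, not a verified configuration:

```yaml
spring:
  application:
    name: customer-balance
  cloud:
    stream:
      function:
        definition: process
      kafka:
        streams:
          binder:
            configuration:
              application.id: customer-balance-1
              schema.registry.url: http://localhost:8081
              specific.avro.reader: true
          bindings:
            process-in-0:
              consumer:
                keySerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
                valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
      bindings:
        process-in-0:
          destination: "dbserver1.inventory.customers"
```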

0 Answers:

No answers yet