I am trying to read a KTable using the spring-cloud-stream-binder-kafka-streams project. We can read a KTable using Spring's @StreamListener together with the interfaces provided by spring-cloud-stream.
I am getting a LongDeserializer exception while reading the KTable.
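For context, a minimal sketch of what such a setup can look like with this binder version. The binding name, key/value types, and class names below are assumptions for illustration, not taken from the linked repository:

```java
import org.apache.kafka.streams.kstream.KTable;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;

// Hypothetical binding interface exposing the KTable as an input.
interface AnalyticsBinding {
    String PAID_ANALYTIC = "policyPaidAnalytic";

    @Input(PAID_ANALYTIC)
    KTable<String, Long> policyPaidAnalytic();
}

@EnableBinding(AnalyticsBinding.class)
class AnalyticsListener {

    // The listener receives the table; types here assume a Long-valued topic.
    @StreamListener(AnalyticsBinding.PAID_ANALYTIC)
    public void process(KTable<String, Long> counts) {
        // Print each update for illustration; real code would transform or materialize it.
        counts.toStream().foreach((key, value) ->
                System.out.println(key + " -> " + value));
    }
}
```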
I am using springCloudVersion = 'Finchley.RC1'
springBootVersion = '2.0.1.RELEASE'
The project is on GitHub at https://github.com/jaysara/KStreamAnalytics
Here is the stack trace:
Exception in thread "panalytics-ac0fa75f-2ae4-4b26-9a04-1f80d1479112-StreamThread-2" org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
    at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:74)
    at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:91)
    at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
    at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:549)
    at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:920)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:821)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:774)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:744)
Caused by: org.apache.kafka.common.errors.SerializationException: Size of data received by LongDeserializer is not 8
Answer (score: 2)

Set the following property:

spring.cloud.stream.bindings.policyPaidAnalytic.producer.useNativeEncoding=true
By default, the binder tries to serialize on the outbound using application/json as the content type. So in your case it produced JSON (a String), which is why you get the Long deserialization exception. By setting the flag above to true, you ask the binder to stay out of the way and let Kafka Streams serialize natively using LongSerde.
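A minimal application.properties sketch of this setup. The binding name comes from the question; the valueSerde line is an illustrative assumption about how the native Serde could be configured with this binder version:

```
# Let Kafka Streams serialize natively instead of using the binder's JSON conversion
spring.cloud.stream.bindings.policyPaidAnalytic.producer.useNativeEncoding=true

# Illustrative assumption: tell Kafka Streams which Serde to use natively on this binding
spring.cloud.stream.kafka.streams.bindings.policyPaidAnalytic.producer.valueSerde=org.apache.kafka.common.serialization.Serdes$LongSerde
```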
When re-running, you may want to clear the policyAnalytic topic or use a new one, since it already contains records written with the old (JSON) encoding.

Hope that helps.