NPE while deserializing Avro messages in Kafka Streams

Asked: 2019-04-02 13:14:37

Tags: nullpointerexception apache-kafka avro confluent-schema-registry

I have written a small Java class to test consumption from an Avro-encoded Kafka topic.

    Properties appProps = new Properties();

    appProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "http://***kfk14bro1.lc:9092");
    appProps.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://***kfk14str1.lc:8081");
    appProps.put(StreamsConfig.APPLICATION_ID_CONFIG, "consumer");
    appProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
    appProps.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,LogAndContinueExceptionHandler.class);


    StreamsBuilder streamsBuilder = new StreamsBuilder();

    streamsBuilder.stream(
                  "coordinates", Consumed.with(Serdes.String(), new GenericAvroSerde()))
              .peek((key, value) -> System.out.println("key=" + key + ", value=" + value));

    new KafkaStreams(streamsBuilder.build(), appProps).start();

When I run this class, the SerdeConfigs are logged fine, as can be seen in the log below:

[consumer-56b0e0ca-d336-45cc-b388-46a68dbfab8b-StreamThread-1] INFO io.confluent.kafka.serializers.KafkaAvroSerializerConfig - KafkaAvroSerializerConfig values: 
    schema.registry.url = [http://***kfk14str1.lc:8081]
    basic.auth.user.info = [hidden]
    auto.register.schemas = true
    max.schemas.per.subject = 1000
    basic.auth.credentials.source = URL
    schema.registry.basic.auth.user.info = [hidden]
    value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
    key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy

[normal-consumer-56b0e0ca-d336-45cc-b388-46a68dbfab8b-StreamThread-1] INFO io.confluent.kafka.serializers.KafkaAvroDeserializerConfig - KafkaAvroDeserializerConfig values: 
    schema.registry.url = [http://***kfk14str1.lc:8081]
    basic.auth.user.info = [hidden]
    auto.register.schemas = true
    max.schemas.per.subject = 1000
    basic.auth.credentials.source = URL
    schema.registry.basic.auth.user.info = [hidden]
    specific.avro.reader = false
    value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
    key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy

But the messages are not consumed, and the following log is generated for every message:

[normal-consumer-56b0e0ca-d336-45cc-b388-46a68dbfab8b-StreamThread-1] WARN org.apache.kafka.streams.errors.LogAndContinueExceptionHandler - Exception caught during Deserialization, taskId: 0_0, topic: coordinates, partition: 0, offset: 782205986
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 83
Caused by: java.lang.NullPointerException
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:116)
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:88)
    at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55)
    at io.confluent.kafka.streams.serdes.avro.GenericAvroDeserializer.deserialize(GenericAvroDeserializer.java:63)
    at io.confluent.kafka.streams.serdes.avro.GenericAvroDeserializer.deserialize(GenericAvroDeserializer.java:39)
    at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:58)
    at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:60)

But I am able to read just fine with the Avro console consumer, so I know there is nothing wrong with the data written to the topic. The command below prints the messages without problems:

~/kafka/confluent-5.1.2/bin/kafka-avro-console-consumer --bootstrap-server http://***kfk14bro1.lc:9092 --topic coordinates --property schema.registry.url=http://***kfk14str1.lc:8081 --property auto.offset.reset=latest

1 Answer:

Answer 0 (score: 1):

When you instantiate the Avro Serde yourself, it does not get configured automatically with the schema registry URL.

So you either have to configure it yourself, or you can define default serdes by adding:

    appProps.setProperty(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    appProps.setProperty(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class.getName());

And then remove

    Consumed.with(Serdes.String(), new GenericAvroSerde())
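
With the default serdes in place, the stream definition no longer needs an explicit Consumed. A minimal sketch of what that might look like, reusing the topic and the peek from the question:

    streamsBuilder.stream("coordinates") // key/value serdes fall back to the defaults configured above
                  .peek((key, value) -> System.out.println("key=" + key + ", value=" + value));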

To configure the Serde yourself, use code like the following (adapt it to your situation):

    GenericAvroSerde genericAvroSerde = new GenericAvroSerde();
    boolean isKeySerde = false; // false because this serde is used for record values, not keys
    genericAvroSerde.configure(
        Collections.singletonMap(
            AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG,
            "http://confluent-schema-registry-server:8081/"),
        isKeySerde);
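
The serde configured this way can then be passed to Consumed.with, as in the original code. A rough sketch, assuming the same topic and builder as in the question:

    streamsBuilder.stream(
                  "coordinates", Consumed.with(Serdes.String(), genericAvroSerde))
              .peek((key, value) -> System.out.println("key=" + key + ", value=" + value));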