NullPointerException / Not Found when I try to process & sink data with an Avro schema

Asked: 2017-07-07 20:48:34

Tags: apache-kafka avro kafka-producer-api apache-kafka-streams confluent

I am using a Processor to consume byte-array data from a topic (with a byte-array serde), map it into GenericRecords (based on a schema I get from an HTTP GET request), and sink them to another topic whose Avro schema is registered in the schema registry.

Retrieving the schema from the HTTP GET request and mapping my data onto it to produce a GenericRecord that follows the schema works fine (see the sketch below). However, when I try to sink the record to the topic, I get a NullPointerException; the stack trace follows the sketch:
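For reference, the mapping step looks roughly like this; a minimal sketch in which the schema JSON comes from the GET response and the field name "someField" is hypothetical:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

// schemaJson is the Avro schema JSON string returned by the HTTP GET request;
// "someField" is a placeholder and must exist in that schema.
public static GenericRecord toGenericRecord(String schemaJson, Object fieldValue) {
    Schema schema = new Schema.Parser().parse(schemaJson);
    GenericRecord record = new GenericData.Record(schema);
    record.put("someField", fieldValue);
    return record;
}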

org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.lang.NullPointerException
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:54)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:78)
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:79)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at streamProcessor.XXXXprocessor.process(XXXXprocessor.java:80)
    at streamProcessor.XXXXprocessor.process(XXXXprocessor.java:1)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:111)
    at streamProcessor.SelectorProcessor.process(SelectorProcessor.java:33)
    at streamProcessor.SelectorProcessor.process(SelectorProcessor.java:1)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:627)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:361)

Here is my topology code:

//Stream Properties
Properties config = new Properties();
config.put(StreamsConfig.APPLICATION_ID_CONFIG, "processor-kafka-streams234");
config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "xxxxxxxxxxxxxxxxxxxxxx:xxxx");
config.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
config.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.ByteArray().getClass().getName());

config.put(StreamsConfig.TIMESTAMP_EXTRACTOR_CLASS_CONFIG, WallclockTimestampExtractor.class);



//Build topology
TopologyBuilder builder = new TopologyBuilder();
builder.addSource("messages-source", "mytest2");
builder.addProcessor("selector-processor", () -> new SelectorProcessor(), "messages-source");

builder.addProcessor("XXXX-processor", () -> new XXXXprocessor(), "selector-processor");
builder.addSink("XXXX-sink", "XXXXavrotest", new KafkaAvroSerializer(), new               
        KafkaAvroSerializer(), "XXXX-processor");



//Start Streaming
KafkaStreams streaming = new KafkaStreams(builder, config);
streaming.start();
System.out.println("processor streaming...");

After reading a bit on question forums, I found that I might need to inject a schema registry client when creating the KafkaAvroSerializer, so I changed that line to:

  SchemaRegistryClient client = new CachedSchemaRegistryClient(
          "xxxxxxxxxxxxxxxxxxxxxx:xxxx/subjects/xxxxschemas/versions", 1000);
  builder.addSink("XXXX-sink", "XXXXavrotest", new KafkaAvroSerializer(client),
          new KafkaAvroSerializer(client), "XXXX-processor");

which resulted in an HTTP 404 Not Found exception...

1 Answer:

Answer 0 (score: 0):

The URL I was using was wrong :P
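In other words, the client should be given only the registry's base URL, since it builds the /subjects/... request paths itself. A sketch of the corrected construction, with a placeholder host and port:

SchemaRegistryClient client =
        new CachedSchemaRegistryClient("http://schema-registry-host:8081", 1000);
builder.addSink("XXXX-sink", "XXXXavrotest", new KafkaAvroSerializer(client),
        new KafkaAvroSerializer(client), "XXXX-processor");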

Also, since I had a cleanup.policy set on my topic, the keys had to be initialized to something other than null.
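A minimal sketch of what that means in the processor, using the older Processor API from the topology above; buildRecord and extractId are hypothetical helpers, and the only point is that the forwarded key is never null:

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.processor.AbstractProcessor;

public class XXXXprocessor extends AbstractProcessor<String, byte[]> {
    @Override
    public void process(String key, byte[] value) {
        GenericRecord record = buildRecord(value);                    // hypothetical mapping helper
        String nonNullKey = (key != null) ? key : extractId(record);  // never forward a null key
        context().forward(nonNullKey, record);
    }

    private GenericRecord buildRecord(byte[] value) {
        throw new UnsupportedOperationException("mapping logic elided in this sketch");
    }

    private String extractId(GenericRecord record) {
        return String.valueOf(record.get("id"));  // "id" field is hypothetical
    }
}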