Apache Kafka KTable aggregation ClassCastException

Asked: 2018-11-29 11:48:33

Tags: java apache-kafka apache-kafka-streams

I have an Apache Kafka topic "historyTopic" that a stream reads from:

KStream<Long, byte[]> stream = builder.stream("historyTopic");

I also have a table built from that stream by aggregation:

KTable<Long, HistoryObject> table = stream.groupByKey().aggregate(
        () -> new HistoryObject(),
        (key, value, aggregate) -> {
            try {
                aggregate.add(deserializeModel(value));
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
            return aggregate;
        },
        Materialized.with(Serdes.Long(), Serdes.serdeFrom(serializer, deserializer)).as("history-store"));
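(As an aside on the snippet above: in the Kafka Streams 2.0 API, `Materialized.as(String)` and `Materialized.with(Serde, Serde)` are both static factory methods, so chaining `.as(...)` after `.with(...)` as written would not compile. Naming the store and attaching custom serdes is normally done through the instance methods `withKeySerde`/`withValueSerde`. A hedged sketch, reusing the `serializer`/`deserializer` instances from the post; the aggregator body is elided:)

```java
KTable<Long, HistoryObject> table = stream
        // make the grouping serdes explicit rather than relying on defaults
        .groupByKey(Serialized.with(Serdes.Long(), Serdes.ByteArray()))
        .aggregate(
                HistoryObject::new,
                (key, value, aggregate) -> { /* ... as above ... */ return aggregate; },
                // name the store AND attach the custom serdes, so the state store
                // does not fall back to the default byte-array serde
                Materialized.<Long, HistoryObject, KeyValueStore<Bytes, byte[]>>as("history-store")
                        .withKeySerde(Serdes.Long())
                        .withValueSerde(Serdes.serdeFrom(serializer, deserializer)));
```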

I expect the value in the aggregation to be a byte[] and the aggregate variable to be an instance of the HistoryObject class. In the initializer I create a new object, and in the aggregator I want to deserialize the byte[] value and add it to the list inside the aggregate (the HistoryObject). However, I get the following exception:

java.lang.ClassCastException: [B cannot be cast to com.maxflow.historyservice.model.HistoryObject
at org.apache.kafka.streams.kstream.internals.KStreamAggregate$KStreamAggregateProcessor.process(KStreamAggregate.java:91) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:50) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode.runAndMeasureLatency(ProcessorNode.java:244) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:133) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:87) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:302) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:94) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:409) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.processAndMaybeCommit(StreamThread.java:964) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:832) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:767) ~[kafka-streams-2.0.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:736) ~[kafka-streams-2.0.1.jar:na]
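For reference, `[B` is the JVM's internal name for `byte[]`, so the message says a raw byte array reached a point where a `HistoryObject` was expected. A minimal, self-contained reproduction of the same failure mode (the `HistoryObject` here is a hypothetical stand-in for the model class):

```java
public class CastDemo {
    // Hypothetical stand-in for the HistoryObject model class from the post.
    static class HistoryObject {}

    // Reproduces the cast that fails when a stage still hands over raw bytes
    // instead of a deserialized HistoryObject.
    static String reproduce() {
        Object fromStore = new byte[] {1, 2, 3}; // value still in serialized form
        try {
            HistoryObject h = (HistoryObject) fromStore; // cast inserted by generics erasure
            return "no exception: " + h;
        } catch (ClassCastException e) {
            return e.getMessage(); // "[B cannot be cast to ...HistoryObject"
        }
    }

    public static void main(String[] args) {
        System.out.println(reproduce());
    }
}
```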

The deserializeModel method:

@SuppressWarnings("unchecked")
private static Card deserializeModel(byte[] serialized) throws IOException {
    ByteArrayInputStream bis = new ByteArrayInputStream(serialized);
    ObjectInput in = null;
    Card historyData = null;
    try {
        in = new ObjectInputStream(bis);
        historyData = (Card) in.readObject();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        bis.close();
    }
    return historyData;
}

In this method I deserialize the byte[] from the stream into a Card object, which I then try to add to the list inside the HistoryObject.
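Note that this method relies on standard Java serialization, so it can only succeed if the producer wrote the bytes with `ObjectOutputStream` and the `Card` class is `Serializable`. A self-contained round-trip sketch of that assumption (the `Card` class here is a minimal stand-in, not the real model):

```java
import java.io.*;

public class RoundTrip {
    // Minimal stand-in for the Card model class; must be Serializable.
    static class Card implements Serializable {
        private static final long serialVersionUID = 1L;
        final String name;
        Card(String name) { this.name = name; }
    }

    // Mirrors how a producer would have to write the bytes for
    // deserializeModel's ObjectInputStream to read them back.
    static byte[] serialize(Card card) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(card);
        }
        return bos.toByteArray();
    }

    // Same logic as the post's deserializeModel, using try-with-resources.
    static Card deserialize(byte[] serialized) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(serialized))) {
            return (Card) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Card original = new Card("ace");
        Card copy = deserialize(serialize(original));
        System.out.println(copy.name); // prints "ace"
    }
}
```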

My serializer and deserializer classes look like this:

public class HistoryDataValueSerializer implements Closeable, AutoCloseable, Serializer<HistoryObject> {

    @SuppressWarnings("unused")
    private boolean isKey;

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        this.isKey = isKey;
    }

    @Override
    public byte[] serialize(String arg0, HistoryObject value) {
        byte[] retVal = null;
        ObjectMapper objectMapper = new ObjectMapper();
        try {
            retVal = objectMapper.writeValueAsString(value).getBytes();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return retVal;
    }

    @Override
    public void close() {

    }

}


public class HistoryDataValueDeserializer implements Closeable, AutoCloseable, Deserializer<HistoryObject> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {

    }

    @Override
    public HistoryObject deserialize(String s, byte[] value) {
        ObjectMapper mapper = new ObjectMapper();
        HistoryObject historyObject = null;
        try {
            historyObject = mapper.readValue(value, HistoryObject.class);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return historyObject;
    }

    @Override
    public void close() {

    }

}

0 Answers