I serialized my Avro data using the following serializer.
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Map;
import javax.xml.bind.DatatypeConverter;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AvroSerializer<T extends GenericRecord> implements Serializer<T> {

  private static final Logger LOGGER = LoggerFactory.getLogger(AvroSerializer.class);

  @Override
  public void close() {
    // No-op
  }

  @Override
  public void configure(Map<String, ?> arg0, boolean arg1) {
    // No-op
  }

  @Override
  public byte[] serialize(String topic, T data) {
    try {
      byte[] result = null;
      if (data != null) {
        LOGGER.debug("data='{}'", data);
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        BinaryEncoder binaryEncoder =
            EncoderFactory.get().binaryEncoder(byteArrayOutputStream, null);
        DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(data.getSchema());
        datumWriter.write(data, binaryEncoder);
        binaryEncoder.flush();
        byteArrayOutputStream.close();
        result = byteArrayOutputStream.toByteArray();
        LOGGER.debug("serialized data='{}'", DatatypeConverter.printHexBinary(result));
      }
      return result;
    } catch (IOException ex) {
      throw new SerializationException(
          "Can't serialize data='" + data + "' for topic='" + topic + "'", ex);
    }
  }
}
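For reference, this serializer would be registered on the producer side by its fully qualified class name, mirroring the consumer configuration shown further below (a minimal sketch; the broker address is an assumption):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Builds producer properties referencing the custom serializer above.
    public static Properties producerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "127.0.0.1:9092");
        props.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Fully qualified name of the AvroSerializer shown above
        props.setProperty("value.serializer",
                "com.rms.rsc.kafkaavro.util.AvroSerializer");
        return props;
    }
}
```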
To deserialize it, I used the following deserializer.
package com.rms.rsc.kafkaavro.util;

import java.util.Arrays;
import java.util.Map;
import javax.xml.bind.DatatypeConverter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AvroDeserializer<T extends GenericRecord> implements Deserializer<T> {

  private static final Logger LOGGER = LoggerFactory.getLogger(AvroDeserializer.class);

  protected final Class<T> targetType;

  public AvroDeserializer(Class<T> targetType) {
    this.targetType = targetType;
  }

  @Override
  public void close() {
    // No-op
  }

  @Override
  public void configure(Map<String, ?> arg0, boolean arg1) {
    // No-op
  }

  @SuppressWarnings("unchecked")
  @Override
  public T deserialize(String topic, byte[] data) {
    try {
      T result = null;
      if (data != null) {
        LOGGER.debug("data='{}'", DatatypeConverter.printHexBinary(data));
        DatumReader<GenericRecord> datumReader =
            new SpecificDatumReader<>(targetType.newInstance().getSchema());
        Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);
        result = (T) datumReader.read(null, decoder);
        LOGGER.debug("deserialized data='{}'", result);
      }
      return result;
    } catch (Exception ex) {
      throw new SerializationException(
          "Can't deserialize data '" + Arrays.toString(data) + "' from topic '" + topic + "'", ex);
    }
  }
}
But now, when I try to deserialize the data, it throws the following error.
2021-03-12 15:45:47.138 ERROR 19984 --- [nio-8089-exec-3] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer] with root cause
java.lang.NoSuchMethodException: com.rms.rsc.kafkaavro.util.AvroDeserializer.<init>()
at java.base/java.lang.Class.getConstructor0(Class.java:3349) ~[na:na]
at java.base/java.lang.Class.getDeclaredConstructor(Class.java:2553) ~[na:na]
at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:351) ~[kafka-clients-2.7.0.jar:na]
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:384) ~[kafka-clients-2.7.0.jar:na]
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:406) ~[kafka-clients-2.7.0.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:713) ~[kafka-clients-2.7.0.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:646) ~[kafka-clients-2.7.0.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:626) ~[kafka-clients-2.7.0.jar:na]
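For context, the `NoSuchMethodException: ...AvroDeserializer.<init>()` in this trace is thrown when Kafka's `Utils.newInstance` tries to construct the configured deserializer reflectively through a no-argument constructor, and the class only declares a one-argument constructor. A minimal stdlib-only sketch of that failure mode (the class names here are illustrative):

```java
// Mimics a deserializer that only declares a one-argument constructor,
// like AvroDeserializer(Class<T> targetType) above.
class OnlyArgCtor {
    OnlyArgCtor(Class<?> targetType) { }
}

public class ReflectiveInstantiation {
    // Returns true if the class can be created the way Kafka's
    // Utils.newInstance does it: via the declared no-arg constructor.
    public static boolean hasNoArgConstructor(Class<?> cls) {
        try {
            cls.getDeclaredConstructor().newInstance();
            return true;
        } catch (NoSuchMethodException e) {
            // Same exception as in the stack trace above
            return false;
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }
}
```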
The consumer properties are configured as follows.
Properties properties = new Properties();
// normal consumer
properties.setProperty("bootstrap.servers","127.0.0.1:9092");
properties.put("auto.commit.enable", "false");
properties.put("auto.offset.reset", "earliest");
// avro part (deserializer)
properties.setProperty("key.deserializer", StringDeserializer.class.getName());
properties.setProperty("value.deserializer", AvroDeserializer.class.getName());
KafkaConsumer<String, GenericRecord> kafkaConsumer = new KafkaConsumer<>(properties);
And the error is thrown on the following line.
KafkaConsumer<String, GenericRecord> kafkaConsumer = new KafkaConsumer<>(properties);
I converted my data to GenericData before sending it to the Kafka topic. What could be wrong here?
Answer 0 (score: 0)
The problem arose because I was generating the schema from the GenericRecord class instead of passing the schema I had originally used to encode the data. Once I passed that schema, I was able to retrieve the data.
DatumReader<GenericRecord> datumReader =
    new SpecificDatumReader<>(targetType.newInstance().getSchema());
Here I was using a GenericRecord with a different schema; passing my own schema fixed the problem.
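The fix described above — decoding with the same schema the data was written with, rather than one derived from another class — can be sketched with plain Avro like this (the record schema here is an illustrative example, not the one actually used in the question):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class SchemaRoundTrip {
    // Illustrative schema; substitute the schema actually used to encode the data.
    static final Schema SCHEMA = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Example\","
        + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

    public static byte[] serialize(GenericRecord record) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(record.getSchema()).write(record, encoder);
        encoder.flush();
        return out.toByteArray();
    }

    public static GenericRecord deserialize(byte[] data) throws IOException {
        // Decode with the writer's schema, not one obtained from an
        // unrelated class via reflection.
        Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);
        return new GenericDatumReader<GenericRecord>(SCHEMA).read(null, decoder);
    }
}
```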