I have two classes for serialization and deserialization in Kafka. Serialization works fine, but deserialization has a problem. I have found many solutions, but none of them worked.

The deserializer with the generic type T:
import java.util.Map;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;

public class DeserializerU<T> implements Deserializer<T> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public void close() {
    }

    @Override
    public T deserialize(String topic, byte[] bytes) {
        ObjectMapper mapper = new ObjectMapper();
        T object = null;
        try {
            object = mapper.readValue(bytes, new TypeReference<T>() {});
        } catch (Exception e) {
            e.printStackTrace();
        }
        return object;
    }
}
The serializer:
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serializer;

public class MyObjectSerializer implements Serializer<Object> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public byte[] serialize(String topic, Object t) {
        byte[] retVal = null;
        ObjectMapper objectMapper = new ObjectMapper();
        try {
            retVal = objectMapper.writeValueAsString(t).getBytes();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return retVal;
    }

    @Override
    public void close() {
    }
}
Properties that set the deserializer:
Properties props = new Properties();
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, new DeserializerU<MyOwnObject>().getClass());
If I replace "new TypeReference<T>() {}" with a specific type, the deserializer works, but I need to use the deserializer for many objects. I also tried convertValue instead of readValue, but everything came back as a LinkedHashMap that cannot be cast to my object. Any suggestions on how to do this? Thanks for your help.
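For context, a minimal sketch (not part of the original question or answer; the class name TypedJsonDeserializer is made up): the usual reason the anonymous "new TypeReference<T>() {}" produces a LinkedHashMap is type erasure, so at runtime Jackson has no concrete class to target. One common workaround is to hand the concrete Class<T> to the deserializer explicitly; since the deserializer then has no default constructor, it is passed to the consumer as an instance rather than through VALUE_DESERIALIZER_CLASS_CONFIG:

import java.io.IOException;
import java.util.Properties;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Sketch: a reusable JSON deserializer that receives its target class explicitly.
public class TypedJsonDeserializer<T> implements Deserializer<T> {

    private final ObjectMapper mapper = new ObjectMapper();
    private final Class<T> targetType;

    public TypedJsonDeserializer(Class<T> targetType) {
        this.targetType = targetType;
    }

    // configure()/close() are omitted here; recent kafka-clients versions
    // provide default no-op implementations on the Deserializer interface.

    @Override
    public T deserialize(String topic, byte[] bytes) {
        if (bytes == null) {
            return null;
        }
        try {
            return mapper.readValue(bytes, targetType);
        } catch (IOException e) {
            throw new SerializationException("Failed to deserialize value", e);
        }
    }
}

Because the constructor takes an argument, the consumer is built with deserializer instances instead of class names:

Properties props = new Properties();
// ... bootstrap servers, group id, and other consumer settings
KafkaConsumer<String, MyOwnObject> consumer = new KafkaConsumer<>(
        props, new StringDeserializer(), new TypedJsonDeserializer<>(MyOwnObject.class));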
Answer 0 (score: 0)
My answer may be a little late, but it might help someone.
With this, I do not need to create a ConsumerFactory per topic, and I can still parse the JSON when needed:
@KafkaListener(topics = "${topic...}")
public void consume(MyObject message) { ... }
There is no need to create a JsonDeserializer for each object.
My custom deserializer class:
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Component;

@Component
public class CustomJsonDeserializer implements Deserializer<Object> {

    private final ObjectMapper mapper;

    // This map maps each topic to the class its payload should be converted into.
    private final Map<String, Class<?>> maps;

    public CustomJsonDeserializer(
            // Environment gives access to the application properties that were defined
            final Environment environment,
            final ObjectMapper mapper
    ) {
        this.mapper = mapper;
        maps = new HashMap<>(2);
        maps.put(environment.getProperty("my-topic-1"), MyClass1.class);
        maps.put(environment.getProperty("my-topic-2"), MyClass2.class);
    }

    @Override
    public Object deserialize(String topic, byte[] data) {
        if (Objects.isNull(data) || data.length == 0) {
            return null;
        }
        try {
            // Look up the target class for this topic and let Jackson deserialize it.
            return mapper.readValue(data, maps.get(topic));
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
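Because the deserializer is a @Component, Spring constructs it and injects the Environment and the ObjectMapper; the ready-made instance is then handed to the consumer factory in the configuration below, which is why it can take constructor arguments at all.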
My KafkaConfiguration class:
@EnableKafka
@Configuration
public class KafkaConfiguration {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Value("${spring.kafka.consumer.group-id}")
    private String groupId;

    // An ObjectMapper bean must exist so Spring can inject it into CustomJsonDeserializer.
    @Autowired
    private CustomJsonDeserializer customJsonDeserializer;

    @Bean
    public ConsumerFactory<String, Object> consumerFactory() {
        final var properties = new HashMap<String, Object>(6);
        properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // The deserializer instances passed to the factory below take precedence over this entry.
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        properties.put(ConsumerConfig.ALLOW_AUTO_CREATE_TOPICS_CONFIG, false);
        return new DefaultKafkaConsumerFactory<String, Object>(properties, new StringDeserializer(), customJsonDeserializer);
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, Object>> kafkaListenerContainerFactory() {
        final var factory = new ConcurrentKafkaListenerContainerFactory<String, Object>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
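The comment above mentions an ObjectMapper bean that Spring injects into CustomJsonDeserializer. Spring Boot normally auto-configures one when Jackson is on the classpath; if your setup does not provide it, a minimal bean definition (illustrative only, the class name JacksonConfiguration is made up) could look like this:

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JacksonConfiguration {

    // Plain Jackson mapper; register any modules you need (for example java.time support).
    @Bean
    public ObjectMapper objectMapper() {
        return new ObjectMapper();
    }
}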
Then, in my consumer class, I can write:
MyConsumers.java
@KafkaListener(topics = "${my-topic-1}")
public void consume(final MyClass1 item) {
//Do what whatever you want
}
@KafkaListener(topics = "${my-topic-2}")
public void consume(final MyClass2 item) {
//Do what whatever you want
}