Cannot deserialize data using Apache Avro

Date: 2018-01-17 09:05:31

Tags: spring-boot apache-kafka avro

I have a Spring Boot application that sends and receives data through a Kafka broker, and I use Apache Avro as the SerDe. What I have done so far is generate the classes with the Avro Maven plugin; the schema is very simple:

{"namespace": "com.domain",
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "favorite_number",  "type": ["int", "null"]},
    {"name": "favorite_color", "type": ["string", "null"]}
  ]
}
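
From this schema the Avro Maven plugin generates a SpecificRecord class with a builder; a minimal sketch of building an instance (the setters follow Avro's camel-casing of the schema field names):

// Instance of the generated class; the values match the payload shown in the exception below.
User user = User.newBuilder()
        .setName("a")                 // "string" field
        .setFavoriteNumber(4)         // ["int", "null"] union, so the Java type is Integer
        .setFavoriteColor("test6")
        .build();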

However, I am getting a very strange exception:

org.springframework.messaging.converter.MessageConversionException: Cannot convert from [com.domain.User] to [com.domain.User] GenericMessage [payload={"name": "a", "favorite_number": 4, "favorite_color": "test6"}, headers={kafka_offset=0, kafka_receivedMessageKey=null, kafka_receivedPartitionId=1, kafka_receivedTopic=batchqueue}]

Here are my configuration classes:

@Configuration
public class ConsumerConfig {
    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(org.apache.kafka.clients.consumer.ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(org.apache.kafka.clients.consumer.ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(org.apache.kafka.clients.consumer.ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, AvroDeserializer.class);
        props.put(org.apache.kafka.clients.consumer.ConsumerConfig.GROUP_ID_CONFIG, "avro");

        return props;
    }

    @Bean
    public ConsumerFactory<String, User> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(),
            new StringDeserializer(),
            new AvroDeserializer<>(User.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, User> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, User> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());

        return factory;
    }

    @Bean
    public Consumer receiver() {
        return new Consumer();
    }
}

@Configuration
public class ProducerConfig {
    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(org.apache.kafka.clients.producer.ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(org.apache.kafka.clients.producer.ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(org.apache.kafka.clients.producer.ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, AvroSerializer.class);

        return props;
    }

    @Bean
    public ProducerFactory<String, User> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, User> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public Producer sender() {
        return new Producer();
    }

}

Any idea why this is happening?

Update: I am not using a package that includes an "AvroSerializer"; I am using the classes copied from this tutorial.
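
An AvroDeserializer of this kind is essentially a thin wrapper around Avro's SpecificDatumReader behind Kafka's Deserializer interface. A minimal sketch along those lines (not the exact tutorial code; error handling simplified):

import java.util.Map;

import org.apache.avro.io.DatumReader;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

public class AvroDeserializer<T extends SpecificRecordBase> implements Deserializer<T> {

    private final Class<T> targetType;

    public AvroDeserializer(Class<T> targetType) {
        this.targetType = targetType;
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // nothing to configure
    }

    @Override
    public T deserialize(String topic, byte[] data) {
        if (data == null) {
            return null;
        }
        try {
            // Decode the raw Avro binary back into the generated SpecificRecord class.
            DatumReader<T> reader = new SpecificDatumReader<>(targetType);
            Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);
            return reader.read(null, decoder);
        } catch (Exception e) {
            throw new SerializationException("Failed to deserialize Avro data from topic " + topic, e);
        }
    }

    @Override
    public void close() {
        // nothing to close
    }
}

The AvroSerializer is the mirror image, writing the record with a SpecificDatumWriter through a BinaryEncoder.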

For the producer class:

public class Producer {
    private static final Logger LOGGER = LoggerFactory.getLogger(Producer.class);

    // Topic to publish to; "batchqueue" is an assumption taken from the kafka_receivedTopic header in the exception above.
    private static final String topicname = "batchqueue";

    @Autowired
    private KafkaTemplate<String, User> kafkaTemplate;

    public void send(User data) {
        LOGGER.info("sending job data='{}'", data.toString());
        kafkaTemplate.send(topicname, data);
    }
}
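
The Consumer bean wired into the configuration is not shown above; the MessageConversionException is raised while the GenericMessage payload is converted to the listener's argument type. A minimal sketch of such a listener (the topic name and method signature are assumptions):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;

public class Consumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(Consumer.class);

    // The conversion from the GenericMessage payload to User happens just before this method is invoked.
    @KafkaListener(topics = "batchqueue")
    public void receive(User user) {
        LOGGER.info("received user='{}'", user.toString());
    }
}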

0 answers:

No answers