ERROR o.s.k.listener.LoggingErrorHandler - Error while processing

Time: 2018-01-22 09:54:27

Tags: maven spring-boot spring-kafka

I am trying to build a simple application that saves user information to a database. I send the user information through a Kafka producer and want to consume it in a consumer. For this I created an EventModel that holds all the user information, and I pass it through the producer. The relevant code is below. As far as I can tell from the logs, the user information is sent, but the consumer fails to process it.

Exception in kafka.log

    []2018-01-22T06:41:15,797Z INFO  o.s.k.l.KafkaMessageListenerContainer - partitions revoked: []
    []2018-01-22T06:41:15,828Z INFO  o.s.k.l.KafkaMessageListenerContainer - partitions assigned: [com.combine.domain.addUser-0]
    []2018-01-22T06:42:30,962Z ERROR o.s.k.listener.LoggingErrorHandler - Error while processing: ConsumerRecord(topic = com.combine.domain.addUser, partition = 0, offset = 1, key = null, value = AddUserEventModel(name=Test-User-Kafka-2, address=Test-User-Kafka-2, age=26))
    org.springframework.kafka.KafkaException: No method found for class com.example.data.combine.eventmodel.AddUserEventModel
        at org.springframework.kafka.listener.adapter.DelegatingInvocableHandler.getHandlerForPayload(DelegatingInvocableHandler.java:92)
        at org.springframework.kafka.listener.adapter.DelegatingInvocableHandler.getMethodNameFor(DelegatingInvocableHandler.java:146)
        at org.springframework.kafka.listener.adapter.HandlerAdapter.getMethodAsString(HandlerAdapter.java:60)
        at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:131)
        at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.onMessage(MessagingMessageListenerAdapter.java:101)
        at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:618)
        at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.access$1500(KafkaMessageListenerContainer.java:236)
        at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer$ListenerInvoker.run(KafkaMessageListenerContainer.java:797)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.lang.Thread.run(Thread.java:745)
    []2018-01-22T08:21:20,074Z INFO  o.s.k.l.KafkaMessageListenerContainer - partitions revoked: [com.combine.domain.addUser-0]
    []2018-01-22T08:21:20,081Z INFO  o.s.k.l.KafkaMessageListenerContainer - partitions assigned: [com.combine.domain.addUser-0]

Exception in app.log

    []2018-01-22T06:42:30,800Z INFO  c.e.d.c.controller.MongoController - # send user by kafka model : AddUserEventModel(name=Test-User-Kafka-2, address=Test-User-Kafka-2, age=26) with parameter
    []2018-01-22T06:42:30,800Z INFO  c.e.d.c.publisher.AddUserPublished - #sending addUserEventModel
    []2018-01-22T06:42:30,809Z INFO  o.a.k.c.producer.ProducerConfig - ProducerConfig values:
        compression.type = none
        metric.reporters = []
        metadata.max.age.ms = 300000
        metadata.fetch.timeout.ms = 60000
        reconnect.backoff.ms = 50
        sasl.kerberos.ticket.renew.window.factor = 0.8
        bootstrap.servers = [localhost:9092]
        retry.backoff.ms = 100
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        buffer.memory = 33554432
        timeout.ms = 30000
        key.serializer = class org.springframework.kafka.support.serializer.JsonSerializer
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        ssl.keystore.type = JKS
        ssl.trustmanager.algorithm = PKIX
        block.on.buffer.full = false
        ssl.key.password = null
        max.block.ms = 60000
        sasl.kerberos.min.time.before.relogin = 60000
        connections.max.idle.ms = 540000
        ssl.truststore.password = null
        max.in.flight.requests.per.connection = 5
        metrics.num.samples = 2
        client.id =
        ssl.endpoint.identification.algorithm = null
        ssl.protocol = TLS
        request.timeout.ms = 30000
        ssl.provider = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        acks = 1
        batch.size = 16384
        ssl.keystore.location = null
        receive.buffer.bytes = 32768
        ssl.cipher.suites = null
        ssl.truststore.type = JKS
        security.protocol = PLAINTEXT
        retries = 0
        max.request.size = 1048576
        value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer
        ssl.truststore.location = null
        ssl.keystore.password = null
        ssl.keymanager.algorithm = SunX509
        metrics.sample.window.ms = 30000
        partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
        send.buffer.bytes = 131072
        linger.ms = 0
    []2018-01-22T06:42:30,832Z INFO  o.a.k.common.utils.AppInfoParser - Kafka version : 0.9.0.1
    []2018-01-22T06:42:30,833Z INFO  o.a.k.common.utils.AppInfoParser - Kafka commitId : 23c69d62a0cabf06
    []2018-01-22T08:21:15,047Z INFO  o.a.k.c.c.i.AbstractCoordinator - Marking the coordinator 2147483647 dead.
    []2018-01-22T08:21:19,485Z ERROR o.a.k.c.c.i.ConsumerCoordinator - Error UNKNOWN_MEMBER_ID occurred while committing offsets for group com.combine.domain.addUser
    []2018-01-22T08:21:19,486Z WARN  o.a.k.c.c.i.ConsumerCoordinator - Auto offset commit failed: Commit cannot be completed due to group rebalance
    []2018-01-22T08:21:19,487Z ERROR o.a.k.c.c.i.ConsumerCoordinator - Error UNKNOWN_MEMBER_ID occurred while committing offsets for group com.combine.domain.addUser
    []2018-01-22T08:21:19,487Z WARN  o.a.k.c.c.i.ConsumerCoordinator - Auto offset commit failed:
    []2018-01-22T08:21:20,075Z INFO  o.a.k.c.c.i.AbstractCoordinator - Attempt to join group com.combine.domain.addUser failed due to unknown member id, resetting and retrying.

pom.xml

    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
        <version>${spring-kafka-version}</version>
    </dependency>

Kafka configuration

    @Configuration
    public class BaseKafkaConfiguration {

        @Value("${spring.kafka.bootstrap-servers}")
        private String servers;

        @Bean
        public ProducerFactory<String, AddUserEventModel> producerFactory() {
            Map<String, Object> property = new HashMap<>();
            property.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
            property.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
            property.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
            return new DefaultKafkaProducerFactory<>(property);
        }

        @Bean
        public KafkaTemplate<String, AddUserEventModel> kafkaTemplate() {
            return new KafkaTemplate<>(producerFactory());
        }
    }

BasicConsumerConfig

    @EnableKafka
    @Configuration
    public class BasicConsumerConfig {

        @Value("${spring.kafka.bootstrap-servers}")
        private String servers;

        public ConsumerFactory<String, AddUserEventModel> kafkaConsumerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
            props.put(ConsumerConfig.GROUP_ID_CONFIG, DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER);
            return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
                    new JsonDeserializer<>(AddUserEventModel.class));
        }

        @Bean
        public ConcurrentKafkaListenerContainerFactory<String, AddUserEventModel> containerFactory() {
            ConcurrentKafkaListenerContainerFactory<String, AddUserEventModel> factory =
                    new ConcurrentKafkaListenerContainerFactory<>();
            factory.setConsumerFactory(kafkaConsumerFactory());
            return factory;
        }
    }

Publisher

    @Slf4j
    @Component
    public class AddUserPublished {

        @Autowired
        private KafkaTemplate<String, AddUserEventModel> kafkaTemplate;

        public void publish(AddUserEventModel addUserEventModel) {
            log.info("#sending addUserEventModel");
            kafkaTemplate.send(DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER, addUserEventModel);
            try {
                TimeUnit.MILLISECONDS.sleep(2000);
            } catch (InterruptedException e) {
                log.error("exception at addUserEventModel while thread sleep", e);
            }
        }
    }

Consumer

    @Component
    @KafkaListener(topics = DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER,
            containerFactory = "containerFactory")
    @Slf4j
    public class AddUserConsumer {

        @Autowired
        private MongoUserRepository mongoUserRepository;

        @Autowired
        private ObjectMapper objectMapper;

        public void addUserConsumer(AddUserEventModel addUserEventModel) {
            log.info("#AdduserConsumer consuming addUserEventModel : {} ", addUserEventModel);
            try {
                MongoUser mongoUser = new MongoUser();
                BeanUtils.copyProperties(addUserEventModel, mongoUser);
                this.mongoUserRepository.save(mongoUser);
                log.info("#SuccessFully saved consumed object : {}", mongoUser);
            } catch (Exception e) {
                log.error("#AddUserConsumer exception during consume addUserEventModel : {}, with error : {}",
                        addUserEventModel, e);
            }
        }
    }

DomainEventNames

    public final class DomainEventNames {
        public static final String COM_COMBINE_DOMAIN_ADD_USER = "com.combine.domain.addUser";
    }

application.properties

    spring.kafka.bootstrap-servers=localhost:9092

topicName

    com.combine.domain.addUser    

I created the above topic locally.

If my question is unclear at any point, please comment and I will try to make it clearer.

Thanks in advance.

1 answer:

Answer 0: (score: 2)

When you use @KafkaListener at the class level, you must use @KafkaHandler at the method level: https://docs.spring.io/spring-kafka/docs/2.1.1.RELEASE/reference/html/_reference.html#class-level-kafkalistener
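As a minimal sketch of that fix, keeping the class, topic constant, and method names from the question (the Mongo save logic is unchanged, the try/catch and extra logging are trimmed for brevity):

    @Component
    @Slf4j
    @KafkaListener(topics = DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER,
            containerFactory = "containerFactory")
    public class AddUserConsumer {

        @Autowired
        private MongoUserRepository mongoUserRepository;

        // @KafkaHandler (org.springframework.kafka.annotation.KafkaHandler) marks this
        // method as the handler for AddUserEventModel payloads of the class-level listener;
        // without it Spring Kafka throws "No method found for class ...AddUserEventModel".
        @KafkaHandler
        public void addUserConsumer(AddUserEventModel addUserEventModel) {
            log.info("#AdduserConsumer consuming addUserEventModel : {}", addUserEventModel);
            MongoUser mongoUser = new MongoUser();
            BeanUtils.copyProperties(addUserEventModel, mongoUser);
            this.mongoUserRepository.save(mongoUser);
        }
    }

Alternatively, you can drop the class-level annotation and put @KafkaListener(topics = ..., containerFactory = "containerFactory") directly on the addUserConsumer method; either way Spring Kafka can then resolve a handler method for the incoming AddUserEventModel.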