EFK: log entries not parsed

Date: 2018-07-11 13:26:30

Tags: kubernetes logstash kibana efk fluent-bit

We use an EFK stack in which the F stands for Fluent Bit. In my Kotlin Spring Boot application I have configured logging with Logback and the Logstash encoder as follows:

<appender name="STDOUT_JSON" class="ch.qos.logback.core.ConsoleAppender">
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <timestampPattern>yyyy-MM-dd' 'HH:mm:ss.SSS</timestampPattern>
    <fieldNames>
      <timestamp>timestamp</timestamp>
      <logger>logger</logger>
      <version>[ignore]</version>
    </fieldNames>
  </encoder>
</appender>
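
For completeness, this appender is referenced from the root logger in the same Logback configuration. A minimal sketch of the surrounding file (the log level here is an assumption, not part of the original config):

<configuration>
  <!-- appender as defined above -->
  <appender name="STDOUT_JSON" class="ch.qos.logback.core.ConsoleAppender">...</appender>
  <!-- route all logging through the JSON console appender; level is assumed -->
  <root level="INFO">
    <appender-ref ref="STDOUT_JSON"/>
  </root>
</configuration>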

We run the application in Kubernetes. Now, occasionally, for very verbose exceptions (such as the one below) we see unparsed log entries in Kibana: Kibana detects neither logger nor message, even though both fields are present in the JSON. (A sketch of the Fluent Bit side of the pipeline follows the example.)

{"timestamp":"2018-07-11 12:59:40.973","message":"Container exception","logger":"org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer","thread_name":"org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1","level":"ERROR","level_value":40000,"stack_trace":"org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition offer-mail-crawler-new-mails-2 at offset 2. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data [[123, 34, 101, 118, 101, 110, 116, 73, 100, 34, 58, 34, 98, 51, 57, 100, 49, 102, 54, 49, 45, 99, 57, 51, 53, 45, 52, 48, 52, 53, 45, 57, 52, 51, 51, 45, 98, 49, 100, 98, 98, 54, 97, 57, 49, 48, 49, 53, 34, 44, 34, 101, 118, 101, 110, 116, 84, 105, 109, 101, 34, 58, 123, 34, 110, 97, 110, 111, 34, 58, 50, 56, 51, 49, 50, 57, 48, 48, 48, 44, 34, 101, 112, 111, 99, 104, 83, 101, 99, 111, 110, 100, 34, 58, 49, 53, 51, 49, 51, 49, 50, 48, 56, 48, 125, 44, 34, 101, 118, 101, 110, 116, 86, 101, 114, 115, 105, 111, 110, 34, 58, 34, 50, 48, 49, 56, 45, 48, 55, 45, 49, 49, 34, 44, 34, 115, 104, 97, 114, 101, 100, 77, 97, 105, 108, 98, 111, 120, 34, 58, 34, 111, 102, 102, 101, 114, 115, 46, 116, 101, 115, 116, 64, 97, 107, 101, 108, 105, 117, 115, 46, 100, 101, 34, 44, 34, 97, 122, 117, 114, 101, 83, 116, 111, 114, 97, 103, 101, 77, 97, 105, 108, 65, 115, 69, 109, 108, 66, 108, 111, 98, 78, 97, 109, 101, 34, 58, 34, 55, 97, 98, 54, 49, 57, 52, 97, 45, 99, 57, 101, 98, 45, 52, 55, 99, 53, 45, 56, 53, 54, 51, 45, 56, 52, 54, 54, 53, 48, 99, 51, 52, 57, 99, 48, 47, 109, 105, 109, 101, 45, 99, 111, 110, 116, 101, 110, 116, 46, 101, 109, 108, 34, 44, 34, 97, 122, 117, 114, 101, 83, 116, 111, 114, 97, 103, 101, 65, 116, 116, 97, 99, 104, 109, 101, 110, 116, 66, 108, 111, 98, 78, 97, 109, 101, 115, 34, 58, 91, 93, 44, 34, 102, 114, 111, 109, 34, 58, 34, 82, 111, 109, 97, 110, 46, 84, 117, 99, 104, 105, 110, 64, 97, 107, 101, 108, 105, 117, 115, 46, 100, 101, 34, 44, 34, 115, 117, 98, 106, 101, 99, 116, 34, 58, 34, 116, 101, 115, 116, 34, 125]] from topic [new-mails]
Caused by: com.fasterxml.jackson.module.kotlin.MissingKotlinParameterException: Instantiation of [simple type, class com.akelius.crawledmails.NewMailEvent] value failed for JSON property azureStorageMailUuid due to missing (therefore NULL) value for creator parameter azureStorageMailUuid which is a non-nullable type
 at [Source: [B@66872193; line: 1, column: 350] (through reference chain: com.akelius.crawledmails.NewMailEvent["azureStorageMailUuid"])
    at com.fasterxml.jackson.module.kotlin.KotlinValueInstantiator.createFromObjectWith(KotlinValueInstantiator.kt:53)
    at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:138)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:471)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1191)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:314)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:148)
    at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1626)
    at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1237)
    at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:86)
    at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:65)
    at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:55)
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:918)
    at org.apache.kafka.clients.consumer.internals.Fetcher.access$2600(Fetcher.java:93)
    at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1095)
    at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1200(Fetcher.java:944)
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:567)
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:528)
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1086)
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1043)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:628)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.lang.Thread.run(Thread.java:748)
"}

0 Answers
