Serilog HTTP sink + Logstash: splitting a Serilog message array into individual log events

Asked: 2017-01-19 16:10:31

Tags: c# logging logstash elastic-stack serilog

We are using the Serilog HTTP sink to send messages to Logstash, but the HTTP message body looks like this:

{
  "events": [
    {
      "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    },
    {
      "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    }
  ]
}

That is, the logging events are batched into an array. It would be possible to send the messages one by one, but the body would still be an array, just with a single item.
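For context, a minimal Serilog setup along these lines produces such a body (a sketch, not code from the question; the endpoint URL is a placeholder, and the exact WriteTo.Http overload differs between Serilog.Sinks.Http versions):

using Serilog;

// Hypothetical Logstash endpoint; adjust host and port to your environment.
var log = new LoggerConfiguration()
    .WriteTo.Http("http://logstash.example.com:8080")
    .CreateLogger();

// Emits the "Logging {@Heartbeat} from {Computer}" events shown above.
var heartbeat = new { UserName = "Mike", UserDomainName = "Home" };
log.Debug("Logging {@Heartbeat} from {Computer}", heartbeat, "Workstation");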

Each event then shows up in Kibana with a message field whose value is:

{
  "events": [
    {
      // ...
    },
    {
      // ...
    }
  ]
}

That is, literally the content received from the HTTP input.

How can I split the items of the events array into individual logging events and "pull up" the properties to the top level, so that there would be two logging events in ElasticSearch:

  "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
  "Level": "Debug",
  "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
  "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
  "Properties": {
    "Heartbeat": {
      "UserName": "Mike",
      "UserDomainName": "Home"
    },
    "Computer": "Workstation"
  }

and

  "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
  "Level": "Debug",
  "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
  "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
  "Properties": {
    "Heartbeat": {
      "UserName": "Mike",
      "UserDomainName": "Home"
    },
    "Computer": "Workstation"
  }

I have tried the Logstash json and split filters, but I could not get them to work.

3 Answers:

Answer 0 (score: 3)

You can achieve what you want with an additional ruby filter that pulls the fields out of the sub-structure:

filter {
  split {
    # Emit one event per element of the "events" array
    field => "events"
  }
  ruby {
    # Merge the fields of the remaining element into the top level of the
    # event, then drop the wrapper (Logstash 2.x API; see the last answer
    # for Logstash 5.0+)
    code => "
      event.to_hash.update(event['events'].to_hash)
      event.to_hash.delete_if {|k, v| k == 'events'}
    "
  }
}

The resulting event will look like this:

{
           "@version" => "1",
         "@timestamp" => "2017-01-20T04:51:39.223Z",
               "host" => "iMac.local",
          "Timestamp" => "2016-11-03T00:09:12.4905685+01:00",
              "Level" => "Debug",
    "MessageTemplate" => "Logging {@Heartbeat} from {Computer}",
    "RenderedMessage" => "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
         "Properties" => {
        "Heartbeat" => {
                  "UserName" => "Mike",
            "UserDomainName" => "Home"
        },
         "Computer" => "Workstation"
    }
}

Answer 1 (score: 1)

This is now possible by setting the batchFormatter. The default batch formatter creates incorrect events, but the ArrayBatchFormatter fixes this:

using Serilog.Sinks.Http.BatchFormatters;

logger.WriteTo.DurableHttpUsingFileSizeRolledBuffers(
    requestUri: new Uri($"http://{elasticHost}:{elasticPort}").ToString(),
    batchFormatter: new ArrayBatchFormatter());
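
For reference (this is my reading of the sink's behavior, not spelled out in the answer): the default batch formatter wraps the batch in an object with an events property, exactly as shown in the question, while the ArrayBatchFormatter posts a plain JSON array, roughly:

[
  { "Timestamp": "...", "Level": "Debug", ... },
  { "Timestamp": "...", "Level": "Debug", ... }
]

Logstash's json codec creates one event per element when the body is a top-level JSON array, so no split filter is needed in that case.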

Answer 2 (score: 0)

After upgrading to Logstash 5.0, Val's solution stopped working because of a change in the Event API: updates to event.to_hash are no longer reflected in the original event. For Logstash 5.0+, the accessors event.get('field') and event.set('field', value) must be used instead.

The updated solution now becomes:

input {
  http {
    port => 8080
    codec => json
  }
}

filter {
  split {
    field => "events"
  }
  ruby {
    # Logstash 5.0+ Event API: copy each field of the remaining element
    # to the top level of the event
    code => "
      event.get('events').each do |k, v|
        event.set(k, v)
      end
    "
  }
  mutate {
    # Drop the now-redundant wrapper field
    remove_field => [ "events" ]
  }
}
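
As a side note, the Logstash 5.0+ Event API also provides event.remove('events'), so the wrapper field could alternatively be dropped inside the ruby filter itself, making the extra mutate block unnecessary.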