CSV filter in Logstash: "_csvparsefailure" error

Date: 2016-01-12 03:23:16

Tags: elasticsearch logstash kibana

I asked another question which I think may be related to this one: JSON parser in logstash ignoring data? The reason I suspect they are related is that in the previous question, Kibana wasn't showing results from the JSON parser whose "PROGRAM" field was "mfd_status". Now I've changed how I do things and removed the JSON parser in case it was interfering with something, but I still don't see any logs with "mfd_status" in them.

csv {
    columns => ["unixTime", "unixTime2", "FACILITY_NUM", "LEVEL_NUM", "PROGRAM", "PID", "MSG_FULL"]
    source => "message"
    separator => "  "
}

In the previous question's filter I used two grok filters; I've now replaced them with a CSV filter. I also have two date filters and a fingerprint filter, but I don't think they're relevant to this question.

Sample log message:

  

"1452564798.76\t1452496397.00\t1\t4\tkernel\t\t[ 6252.000246] sonar: sonar_write(): waiting..."

Output:

        "unixTime" => "1452564798.76",
       "unixTime2" => "1452496397.00",
    "FACILITY_NUM" => "1",
       "LEVEL_NUM" => "4",
         "PROGRAM" => "kernel",
             "PID" => nil,
        "MSG_FULL" => "[ 6252.000246] sonar: sonar_write(): waiting...",
       "TIMESTAMP" => "2016-01-12T02:13:18.760Z",
"TIMESTAMP_second" => "2016-01-11T07:13:17.000Z"
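For reference, the tab-split behind this output can be reproduced with a quick Ruby sketch (column names taken from the csv filter above; this is an illustration, not what Logstash runs internally):

```ruby
columns = ["unixTime", "unixTime2", "FACILITY_NUM", "LEVEL_NUM",
           "PROGRAM", "PID", "MSG_FULL"]
line = "1452564798.76\t1452496397.00\t1\t4\tkernel\t\t" \
       "[ 6252.000246] sonar: sonar_write(): waiting..."

# Split on tabs (the separator the filter is configured with) and pair
# each value with its column name; the empty sixth field becomes an
# empty PID, matching the nil shown in the output above.
fields = line.split("\t")
event = columns.zip(fields).to_h
puts event
```

Since this first message contains exactly seven tab-separated fields and no quote characters, it maps one-to-one onto the configured columns.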
  

"1452564804.57\t1452496403.00\t1\t7\tmfd_status\t\t00800F08CFB0\textra\t{\"date\":1452543203,\"host\":\"ABCD1234\",\"inet\":[\"169.254.42.207/16\",\"10.8.207.176/32\",\"172.22.42.207/16\"],\"fb0\":[\"U:1280x800p-60\",32]}"

Output:

       "tags" => [
    [0] "_csvparsefailure"
]

After kernel/mfd_status appears in the log, there shouldn't be any further separators; everything after that point should end up in the MSG_FULL field.

In summary: why does one of my log messages get parsed correctly while the other doesn't? Also, even when it fails to parse, I'd have thought it would still be sent to Elasticsearch, just with empty fields; why isn't that happening either?

1 Answer:

Answer 0 (score: 3):

You're almost there: you just need to override two more parameters in your CSV filter, and both lines will then be parsed correctly.

The first is skip_empty_columns => true, because your second log line contains an empty field that you need to ignore.

The second is quote_char => "'" (or anything other than the double quote "), because your JSON contains double quotes.
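The quoting failure can be reproduced outside Logstash with Ruby's own CSV library, which the csv filter uses under the hood. The following is a sketch using a shortened version of the second log line (the truncated JSON payload is my own abbreviation):

```ruby
require 'csv'

# Shortened second log line: tab-separated, with an embedded JSON
# payload that contains double quotes in the middle of a field.
line = "1452564804.57\t1452496403.00\t1\t7\tmfd_status\t\t" \
       "00800F08CFB0\textra\t{\"date\":1452543203,\"host\":\"ABCD1234\"}"

# With the default quote character ("), the parser treats the JSON's
# quotes as illegal quoting and raises an error -- the same error that
# Logstash turns into the _csvparsefailure tag.
begin
  CSV.parse_line(line, col_sep: "\t")
  failed = false
rescue CSV::MalformedCSVError
  failed = true
end

# With a quote character that never occurs in the data, the line splits
# cleanly; the empty sixth field comes back as nil, which is what
# skip_empty_columns => true tells Logstash to ignore.
fields = CSV.parse_line(line, col_sep: "\t", quote_char: "'")
puts failed
puts fields.length
```

This also shows why the first line parsed fine all along: it contains no quote characters, so the default quote_char never gets triggered.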

csv {
    columns => ["unixTime", "unixTime2", "FACILITY_NUM", "LEVEL_NUM", "PROGRAM", "PID", "MSG_FULL"]
    source => "message"
    separator => "  "
    skip_empty_columns => true
    quote_char => "'"
}

With these settings, your first log line is parsed as:

{
         "message" => "1452564798.76\t1452496397.00\t1\t4\tkernel\t\t[ 6252.000246] sonar: sonar_write(): waiting...",
        "@version" => "1",
      "@timestamp" => "2016-01-12T04:21:34.051Z",
            "host" => "iMac.local",
        "unixTime" => "1452564798.76",
       "unixTime2" => "1452496397.00",
    "FACILITY_NUM" => "1",
       "LEVEL_NUM" => "4",
         "PROGRAM" => "kernel",
        "MSG_FULL" => "[ 6252.000246] sonar: sonar_write(): waiting..."
}

And your second log line as:

{
         "message" => "1452564804.57\t1452496403.00\t1\t7\tmfd_status\t\t00800F08CFB0\textra\t{\"date\":1452543203,\"host\":\"ABCD1234\",\"inet\":[\"169.254.42.207/16\",\"10.8.207.176/32\",\"172.22.42.207/16\"],\"fb0\":[\"U:1280x800p-60\",32]}",
        "@version" => "1",
      "@timestamp" => "2016-01-12T04:21:07.974Z",
            "host" => "iMac.local",
        "unixTime" => "1452564804.57",
       "unixTime2" => "1452496403.00",
    "FACILITY_NUM" => "1",
       "LEVEL_NUM" => "7",
         "PROGRAM" => "mfd_status",
        "MSG_FULL" => "00800F08CFB0",
         "column8" => "extra",
         "column9" => "{\"date\":1452543203,\"host\":\"ABCD1234\",\"inet\":[\"169.254.42.207/16\",\"10.8.207.176/32\",\"172.22.42.207/16\"],\"fb0\":[\"U:1280x800p-60\",32]}"
}