logstash: http input only picks up the first line (with csv filter)

Date: 2015-08-28 07:48:02

Tags: csv elastic-stack

I am new to the ELK stack and am trying to monitor logs sent over HTTP. I have the Logstash configuration below, but it only reads the first line of the data and sends that to Elasticsearch, even though I am sending multiple lines in my HTTP POST request body (I use Chrome's DHC extension to send the HTTP request to Logstash). How can I get the complete data read and sent to Elasticsearch?

input {
  http {
    host => "127.0.0.1" # default: 0.0.0.0
    port => 8081 # default: 8080
    threads => 10
  }
}

filter {
  csv {
      separator => ","
      columns => ["posTimestamp","posCode","logLevel","location","errCode","errDesc","detail"]
  }
  date {
    match => ["posTimestamp", "ISO8601"]
  }
  mutate {
     strip => ["posCode", "logLevel", "location", "errCode", "errDesc" ]
     remove_field => [ "path", "message", "headers" ]
  }
}

output { 
    elasticsearch {
      protocol => "http"
      host => "localhost"
      index => "temp"
    }
    stdout { 
        codec => rubydebug
    }
}

Sample data:

2015-08-24T05:21:40.468,352701060205140,ERROR,Colombo,ERR_01,INVALID_CARD,Test POS error
2015-08-24T05:21:41.468,352701060205140,ERROR,Colombo,ERR_01,INVALID_CARD,Test POS error
2015-08-24T05:23:40.468,81021320,ERROR,Colombo,ERR_01,INVALID_CARD,Test POS error
2015-08-25T05:23:50.468,352701060205140,ERROR,Colombo,ERR_02,TIME_OUT,Test POS error
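To see why only one event reaches Elasticsearch, note that the whole POST body arrives as a single `message` field; the csv filter then parses only the first record it finds. A minimal Python sketch (illustration only, not Logstash code) of what the desired behavior looks like, i.e. splitting the body into lines before mapping each one onto the configured column names:

```python
import csv
import io

# The POST body as the http input would store it: one string
# containing several newline-separated CSV records.
payload = (
    "2015-08-24T05:21:40.468,352701060205140,ERROR,Colombo,ERR_01,INVALID_CARD,Test POS error\n"
    "2015-08-24T05:21:41.468,352701060205140,ERROR,Colombo,ERR_01,INVALID_CARD,Test POS error\n"
    "2015-08-24T05:23:40.468,81021320,ERROR,Colombo,ERR_01,INVALID_CARD,Test POS error\n"
    "2015-08-25T05:23:50.468,352701060205140,ERROR,Colombo,ERR_02,TIME_OUT,Test POS error"
)

# Column names taken from the csv filter in the question's config.
columns = ["posTimestamp", "posCode", "logLevel", "location",
           "errCode", "errDesc", "detail"]

# Mimic split + csv: one event per line, fields mapped to the columns.
events = [dict(zip(columns, row)) for row in csv.reader(io.StringIO(payload))]
for e in events:
    print(e["posTimestamp"], e["errCode"], e["errDesc"])
```

Four events come out, one per input line, which is what the Logstash pipeline should emit as well.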

1 answer:

Answer 0 (score: 1)

Managed to solve this by adding a split filter:

split { }
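For completeness, a sketch of where the split filter would go in the question's pipeline. By default it splits the `message` field on `"\n"`, which matches this use case, so no options are needed; it must run before the csv filter so each line becomes its own event:

```
filter {
  split {
    # defaults: split the "message" field on "\n",
    # producing one event per line of the POST body
  }
  csv {
      separator => ","
      columns => ["posTimestamp","posCode","logLevel","location","errCode","errDesc","detail"]
  }
  # ... date and mutate filters unchanged from the question ...
}
```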