Why doesn't Logstash produce any logs?

Time: 2017-03-12 18:34:56

Tags: elasticsearch logging logstash devops logstash-grok

I read the article below to learn the Logstash part of my ELK setup: https://tpodolak.com/blog/tag/kibana/


input {
    file {
        path => ["C:/logs/*.log"]
        start_position => beginning
        ignore_older => 0

    }
}
filter {
    grok {
        match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:logdate} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
    }
    # set the event timestamp from the log
    # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
    date {
        match => [ "logdate", "yyyy-MM-dd HH:mm:ss.SSSS" ]
        target => "@timestamp"
    }
}
output {
    elasticsearch {
        hosts => "localhost:9200"
    }
    stdout {}
}

I added the input path C:/logs/*.log to logstash.conf. There I have a non-empty test.log file, which contains:


TimeStamp=2016-07-20 21:22:46.0079 CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 Level=INFO Message=About
TimeStamp=2016-07-20 21:22:46.0079 CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 Level=INFO Message=About
TimeStamp=2016-11-01 00:13:01.1669 CorrelationId=77530786-8e6b-45c2-bbc1-31837d911c14 Level=INFO Message=Request completed with status code: 200
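Before pointing Logstash at the file, the grok pattern can be sanity-checked against one of these lines with an equivalent Python regex (the sub-patterns below are simplified stand-ins for grok's TIMESTAMP_ISO8601, UUID, LOGLEVEL and GREEDYDATA, not the real grok definitions):

```python
import re

# Rough Python stand-ins for the grok patterns used in the filter
# (approximations for illustration, not the actual grok regexes).
LINE = re.compile(
    r"TimeStamp=(?P<logdate>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
    r"CorrelationId=(?P<correlationId>[0-9a-fA-F-]{36}) "
    r"Level=(?P<logLevel>[A-Z]+) "
    r"Message=(?P<logMessage>.*)"
)

sample = ("TimeStamp=2016-07-20 21:22:46.0079 "
          "CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 "
          "Level=INFO Message=About")

m = LINE.match(sample)
print(m.group("logdate"), m.group("logLevel"), m.group("logMessage"))
```

If a line fails to match here, it would be tagged `_grokparsefailure` by Logstash rather than silently dropped, so a match is a good sign the pattern itself is not the problem.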

According to the article above, I should now see my logs inside Elasticsearch (as in the sample result shown at https://tpodolak.com/blog/tag/kibana/). But in my case, when I open http://localhost:9200/_cat/indices?v in the browser, I see no Logstash logs in Elasticsearch. Where are the logs that Logstash stores in Elasticsearch? My logstash.conf looks fine, yet there is no result. I want to get all logs under C:/logs/*.log into Elasticsearch via Logstash. What is wrong with my logstash.conf?
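The `_cat/indices?v` check can also be scripted. Here is a sketch that scans the plain-text response for Logstash's default daily indices (the sample response below is made up for illustration, not taken from the author's cluster):

```python
# _cat/indices?v returns column-aligned plain text with a header row;
# the index name is the third column. This sample output is illustrative.
sample_response = """\
health status index               uuid pri rep docs.count
yellow open   logstash-2016.11.01 abcd   5   1         42
yellow open   .kibana             efgh   1   1          2
"""

def logstash_indices(cat_output):
    """Return the names of logstash-* indices found in a _cat/indices response."""
    rows = cat_output.splitlines()[1:]  # skip the header row
    names = [row.split()[2] for row in rows if row.split()]
    return [n for n in names if n.startswith("logstash-")]

print(logstash_indices(sample_response))
```

An empty list from the real response means no events ever reached Elasticsearch, which points at the input or filter stage rather than the output.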

My logs (C:\monitoring\logstash\logs.log):


[2017-03-13T10:47:17,849][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:46:35,123][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:48:20,023][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:55:10,808][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-03-13T11:55:10,871][INFO ][logstash.pipeline        ] Pipeline main started
[2017-03-13T11:55:11,316][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-03-13T12:00:52,188][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T12:02:48,309][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T12:06:33,270][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 52 (byte 52) after output { elasticsearch { hosts "}
[2017-03-13T12:08:51,636][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input {  file { path "}
[2017-03-13T12:09:48,114][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input {  file { path "}
[2017-03-13T12:11:40,200][ERROR][logstash.agent           ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input {  file { path "}
[2017-03-13T12:19:17,622][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash


1 Answer:

Answer 0 (score: 3):

First of all, you have a few configuration issues:

  • The hosts option of the elasticsearch output should be an array (e.g. hosts => ["myHost:myPort"]), see the doc
  • File paths on Windows should use forward slashes instead of backslashes when using wildcards (see this issue)
  • Your date filter is looking for a field named "logdate"; given your log file, the grok capture name and the date filter's match field must agree, e.g. capture the timestamp as "TimeStamp" in grok and match "TimeStamp" in the date filter
  • One setting I find handy is sincedb_path, because Logstash will not try to parse again a file it has already parsed (it checks the .sincedb file, located by default at $HOME/.sincedb, to see whether a file was already processed); when testing with the same log file, you need to delete it between runs
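The timestamp format is also worth double-checking: the sample log lines carry a four-digit fraction of a second (e.g. .0079), which the question's SSSS pattern reflects. As a rough illustration of what that value represents (Python's strptime is only a stand-in here; the Logstash date filter itself uses Joda-style patterns):

```python
from datetime import datetime

# The sample log lines use a 4-digit fraction of a second (.0079).
# Python's %f accepts 1-6 fractional digits and right-pads with zeros,
# so "0079" is read as 7900 microseconds (7.9 ms).
raw = "2016-07-20 21:22:46.0079"
parsed = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S.%f")
print(parsed.microsecond)
```

If the date filter's pattern does not consume the whole fraction, the filter fails and @timestamp falls back to the ingestion time instead of the log time.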

That is why, after some research (quite a lot actually, as I am not a Windows user), I could come up with this working config:

input {
    file {
        path => "C:/some/log/dir/*"
        start_position => "beginning"
        ignore_older => 0
        sincedb_path => "NIL" # easier to remove from the current directory; the file will be NIL.sincedb

    }
}
filter {
    grok {
        # capture the timestamp as "TimeStamp" so the date filter below can find it
        match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:TimeStamp} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
    }
    # set the event timestamp from the log
    # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
    date {
        # the log lines carry a 4-digit fraction of a second (e.g. .0079)
        match => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss.SSSS" ]
        target => "@timestamp"
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
    }
    stdout {}
}
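To see what this filter chain should emit for each log line, here is a minimal Python sketch of the grok + date steps (a plain stand-in for Logstash, assuming the timestamp is captured into a field named "TimeStamp" as in the config above):

```python
import re
from datetime import datetime, timezone

# Simplified stand-in for the grok pattern in the config above.
PATTERN = re.compile(
    r"TimeStamp=(?P<TimeStamp>\S+ \S+) "
    r"CorrelationId=(?P<correlationId>\S+) "
    r"Level=(?P<logLevel>\S+) "
    r"Message=(?P<logMessage>.*)"
)

def to_event(line):
    """Mimic the grok + date filters: extract fields and set @timestamp."""
    m = PATTERN.match(line)
    if m is None:
        # Logstash would keep the event and tag it instead of dropping it
        return {"message": line, "tags": ["_grokparsefailure"]}
    event = m.groupdict()
    ts = datetime.strptime(event["TimeStamp"], "%Y-%m-%d %H:%M:%S.%f")
    event["@timestamp"] = ts.replace(tzinfo=timezone.utc).isoformat()
    return event

line = ("TimeStamp=2016-11-01 00:13:01.1669 "
        "CorrelationId=77530786-8e6b-45c2-bbc1-31837d911c14 "
        "Level=INFO Message=Request completed with status code: 200")
event = to_event(line)
print(event["@timestamp"], event["logLevel"])
```

Each resulting dict corresponds to one document the elasticsearch output would index; if @timestamp is missing or wrong, the date filter is the first place to look.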