Logstash redis configuration not pushing logs to ES

Date: 2016-12-12 10:02:25

Tags: elasticsearch redis logstash elastic-stack

We are using the ELK stack to monitor our logs. I am completely new to the ELK environment, and recently I have been working on a task where I need to configure Logstash with Redis to push our logs.

Below is the configuration I am using. It works with Elasticsearch, but not with Redis:

 input {
  file {
        path => "E:/Logs/**/*.log"
        start_position => beginning 
        codec => json       
    }
}
filter {
    date {
      match => [ "TimeCreated", "YYYY-MM-dd HH:mm:ss Z" ]
    }
    mutate {
        add_field => {
            #"debug" => "true"
            "index_prefix" => "logstash-app"
        }
    }

}
output {
    #elasticsearch { 
        #local env
        #hosts => ["localhost:9200"]

        #preprod env
        #hosts => ["elk.logs.abc.corp.com:9200"]

        #prod env
        #hosts => ["elk.logs.abc.prod.com:9200"]

        #index => "logstash-app"
    #}
    redis { 
        #local env
        #host => "localhost:5300"

        #preprod env
        host => "redis.logs.abc.corp.com"

        #prod env
        #host => "redis.logs.abc.prod.com"

        data_type => "list"
        key => "logstash" 
    }

    if[debug] == "true" {
        stdout {
            codec => rubydebug 
        }
        file { 
            path => "../data/logstash-app-%{+YYYYMMdd}.log" 
        }
    }
}

I commented out the Elasticsearch output; with Elasticsearch I could view the logs in Kibana, but with Redis I cannot see them.

Can someone point out what I am doing wrong? How can I debug this or check whether my logs are being shipped correctly?

1 answer:

Answer 0 (score: 1):

Based on the documentation of the logstash plugin:

host should be an array:

redis { 
    #preprod env
    host => ["redis.logs.abc.corp.com"]
    data_type => "list"
    key => "logstash" 
}
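
To answer the debugging part of the question: one way to confirm that Logstash is processing events at all is to re-enable the commented-out debug field from the question's own filter block, so that the existing stdout/file branch in the output fires. A minimal diagnostic sketch based on that config (not a required change):

filter {
    mutate {
        add_field => {
            "debug" => "true"              # uncommented so the if [debug] == "true" branch runs
            "index_prefix" => "logstash-app"
        }
    }
}
output {
    if [debug] == "true" {
        stdout { codec => rubydebug }      # print every processed event to the console
    }
}

On the Redis side, running LLEN logstash in redis-cli (using the key configured above) should show a growing list length if events are actually being delivered; if it stays at 0, the events are not reaching Redis.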