Events are not being sent from my Spring Boot application to Logstash. Here is my logback.xml file:
<configuration>
  <appender name="STASH-C" class="net.logstash.logback.appender.LogstashAccessTcpSocketAppender">
    <destination>arc-poc01:5044</destination>
    <encoder class="net.logstash.logback.encoder.LogstashAccessEncoder" />
    <keepAliveDuration>5 minutes</keepAliveDuration>
  </appender>
  <appender name="STASH-B" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>arc-poc01:5045</destination>
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
      <providers>
        <timestamp/>
        <version/>
        <loggerName/>
        <pattern>
          <pattern>
            {
              "custom_constant": "cfg",
              "level": "%level",
              "thread": "%thread",
              "message": "%message"
            }
          </pattern>
        </pattern>
      </providers>
    </encoder>
    <keepAliveDuration>5 minutes</keepAliveDuration>
  </appender>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!-- encoders are assigned the type ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="com.gw.test" level="INFO" />
  <logger name="org.springframework" level="INFO" />
  <logger name="com.netflix.astyanax" level="INFO" />
  <root level="DEBUG">
    <appender-ref ref="STASH-B" />
    <appender-ref ref="STASH-C" />
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
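For context, the events come from ordinary SLF4J logging calls in the application. The snippet below is only an illustrative placeholder (the class name and message are made up, not my actual code), but anything logged like this should appear on STDOUT and, per the configuration above, be serialized as JSON and shipped to arc-poc01:5045 by the STASH-B appender:

package com.gw.test;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LogSmokeTest {
    private static final Logger log = LoggerFactory.getLogger(LogSmokeTest.class);

    public static void main(String[] args) {
        // INFO passes both the com.gw.test logger level and the DEBUG root level,
        // so this event should reach every appender referenced from <root>.
        log.info("logstash connectivity smoke test");
    }
}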
Here is my Logstash pipeline configuration:
input {
  tcp {
    port => 5044
    codec => json
    data_timeout => -1
  }
  log4j {
    mode => "server"
    host => "0.0.0.0"
    port => 5045
    type => "log4j"
    codec => json
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "http://arc-poc01:9200" ]
  }
}
When I run the application, console output is produced as expected, but no events are sent to Logstash. I ran curl -XGET 'localhost:9600/_node/stats/pipeline?pretty' to verify the traffic, and this is the output:
{
  "host" : "55357b6f0969",
  "version" : "5.4.1",
  "http_address" : "0.0.0.0:9600",
  "id" : "65153f2f-10af-48c3-9be5-5db0913bf7d8",
  "name" : "55357b6f0969",
  "pipeline" : {
    "events" : {
      "duration_in_millis" : 0,
      "in" : 0,
      "filtered" : 0,
      "out" : 0,
      "queue_push_duration_in_millis" : 0
    },
    "plugins" : {
      "inputs" : [ {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-2",
        "events" : {
          "out" : 0,
          "queue_push_duration_in_millis" : 0
        },
        "name" : "log4j"
      }, {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-1",
        "events" : {
          "out" : 0,
          "queue_push_duration_in_millis" : 0
        },
        "name" : "tcp"
      } ],
      "filters" : [ ],
      "outputs" : [ {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-3",
        "name" : "stdout"
      }, {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-4",
        "name" : "elasticsearch"
      } ]
    },
    "reloads" : {
      "last_error" : null,
      "successes" : 0,
      "last_success_timestamp" : null,
      "last_failure_timestamp" : null,
      "failures" : 0
    },
    "queue" : {
      "type" : "memory"
    },
    "id" : "main"
  }
}
No errors show up in the logs of either my Spring Boot application or Logstash. Is there something specific I'm missing when configuring logback under a Spring Boot application? I've spent two days digging into this and have run out of ideas. Any hints on how to troubleshoot it would be greatly appreciated!
Answer 0 (score: 1)
I should also have mentioned in the original description that I am running the ELK stack in Docker containers deployed on Mesos via Marathon scripts. Here is what finally got logs from my application streaming through the ELK pipeline:
1) Change the Logstash pipeline definition as follows:
input {
  tcp {
    port => 5045
    codec => json
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "http://arc-poc01:9200" ]
  }
}
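Compared with the original pipeline, this drops the log4j input and the data_timeout setting entirely and keeps a single tcp input with the json codec on port 5045, the port the LogstashTcpSocketAppender points at.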
2) Configure the Logstash appender in the logback.xml file as follows:
<appender name="STASH-B" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <destination>arc-poc01:5045</destination>
  <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
      <timestamp/>
      <version/>
      <loggerName/>
      <pattern>
        <pattern>
          {
            "custom_constant": "cfg",
            "level": "%level",
            "thread": "%thread",
            "message": "%message"
          }
        </pattern>
      </pattern>
    </providers>
  </encoder>
  <keepAliveDuration>5 minutes</keepAliveDuration>
</appender>
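Note that LogstashTcpSocketAppender and LoggingEventCompositeJsonEncoder come from the logstash-logback-encoder library (net.logstash.logback:logstash-logback-encoder), so that dependency has to be on the application's classpath alongside logback itself.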
3) Get the relevant debug traces from the Logstash appender in my client application, showing the Logstash connection going up and down, by including a StatusListener in the logback.xml file as follows:
<statusListener class="ch.qos.logback.core.status.OnConsoleStatusListener" />
The log entries indicated that the Logstash connection was being established and then immediately closed by the Logstash server. I believe the problem was caused by an apparent mismatch between the codec configuration defined in the pipeline and the type of content sent by the Logstash appender on the client application side.
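To rule out networking or codec problems independently of logback, it can also help to push a single hand-written JSON line at the tcp input. The sketch below (host, port, and payload are assumptions matching the configuration above, not part of the original setup) does this over a plain socket; if the event shows up in the rubydebug output, the pipeline side is reachable and accepting JSON, and the mismatch is on the client side.

import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LogstashTcpProbe {
    public static void main(String[] args) throws Exception {
        // Open a plain TCP connection to the Logstash tcp input and write one
        // newline-terminated JSON document, then close the connection.
        try (Socket socket = new Socket("arc-poc01", 5045);
             OutputStream out = socket.getOutputStream()) {
            out.write("{\"message\":\"manual probe\",\"level\":\"INFO\"}\n"
                    .getBytes(StandardCharsets.UTF_8));
            out.flush();
        }
    }
}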
Answer 1 (score: 0)
Try moving the json codec from the input to a filter:
input {
  tcp {
    port => 5045
  }
}
filter {
  json {
    source => "message"
    remove_field => "message"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "http://arc-poc01:9200" ]
  }
}
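With this variant the tcp input (using its default plain codec) turns each newline-terminated line into an event whose message field holds the JSON string produced by the logback encoder; the json filter then parses that string into top-level fields and removes the raw message field.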