I have Logstash pushing events to Elasticsearch; some of them make it through and some don't.
Here is an example of a document that does appear in Elasticsearch:
{
"Application" => "API",
"Environment" => "myenv",
"LoggerName" => "MyLoggerName",
"@timestamp" => "2015-02-01T17:18:30.454Z",
"LogLevel" => "Warn",
"DeploymentId" => "ebb9e128b8d44994b7bbbf27b6893b03",
"RoleInstanceId" => "MyRoleInstance",
"Message" => "The message.",
"@version" => "1"
}
However, the following one (and many more like it) does not:
{
"Application" => "API",
"Environment" => "myenv",
"LoggerName" => "Common.Services.RequestLogger",
"@timestamp" => "2015-02-01T17:19:46.265Z",
"LogLevel" => "Info",
"DeploymentId" => "0a56017c4ad14cfe818afdbc52dabe76",
"RoleInstanceId" => "Instance",
"Data" => {
"Elapsed" => "PT0.0119377S",
"RequestDto" => {
"__type" => "Structure.Definition.LoginRequest, Structure.Definition",
"Email" => "email@domain.com"
},
"ResponseDto" => {
"__type" => "ServiceStack.HttpResult, ServiceStack",
"Headers" => {},
"AllowsPartialResponse" => false,
"Options" => {},
"Status" => 200,
"StatusCode" => "OK",
"Response" => {
"__type" => "Structure.Definition.LoginResponse, Structure.Definition",
"UserId" => "xxxx",
"OrganisationId" => "xxxx",
"Token" => "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"TimeMs" => 0
},
"ResponseFilter" => {
"__type" => "ServiceStack.Host.ContentTypes, ServiceStack",
"ContentTypeFormats" => {
"csv" => "text/csv",
"markdown" => "text/markdown",
"plain" => "text/plain",
"x-protobuf" => "application/x-protobuf"
}
},
"PaddingLength" => 0,
"IsPartialRequest" => false
},
"OperationName" => "LoginRequest",
"Verb" => "POST",
"AbsoluteUri" => "https://xxxx/auth/login",
"RawUrl" => "/auth/login",
"IsLocal" => false,
"IsSecureConnection" => true,
"RemoteIp" => "000.000.000.000",
"UserHostAddress" => "000.000.000.000",
"UserAgent" => "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.93 Safari/537.36",
"StatusCode" => 200,
"StatusDescription" => "OK",
"Type" => "Request"
},
"@version" => "1"
}
Both examples were copied from the stdout { codec => rubydebug {} } output, so I know Logstash parses these documents correctly, yet the second kind never shows up in Elasticsearch.
I have looked into whether this is related to document size, but I found no limit that would explain it.
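Since the events do print on stdout, one way to narrow this down is to write every event to disk alongside the elasticsearch output, so that what leaves Logstash can be diffed against what actually lands in the index. A minimal sketch of such an output section, assuming the same elasticsearch settings as the config below; the file path is hypothetical:

```
output {
  elasticsearch {
    host => "127.0.0.1"
    port => "9201"
    protocol => http
  }
  # Also write every event to disk so failing events can be compared
  # against what reaches Elasticsearch (path is a placeholder).
  file { path => "/var/log/logstash/events-%{+YYYY-MM-dd}.log" }
  stdout { codec => rubydebug {} }
}
```

If an event appears in the file but not in the index, the rejection is happening on the Elasticsearch side, and its logs should contain the mapper exception.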
Here is the Logstash config:
input {
redis {
codec => json {}
data_type => "list"
host => "xxxx.redis.cache.windows.net"
key => "logstash"
password => "xxxxxxx"
}
redis {
codec => json {}
data_type => "list"
host => "xxxx.redis.cache.windows.net"
key => "logstash"
password => "xxxxxx"
}
}
output {
elasticsearch {
host => "127.0.0.1"
port => "9201"
protocol => http
}
stdout { codec => rubydebug{} }
}
What am I missing in the Elasticsearch config? The Logstash node connects to Elasticsearch over HTTP through an SSH port mapping.
Answer 0 (score: 0)
In the failing documents you have fields named "__type". Field names beginning with an underscore are restricted and can cause problems, so I suggest you rename the "__type" fields to "type" and try again. This and other field-name restrictions are described here: http://grokbase.com/t/gg/elasticsearch/144s6e3877/illegal-characters-in-elasticsearch-field-names
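If the events cannot easily be changed at the source, the rename can also be done in Logstash itself. A minimal filter sketch, assuming a Logstash version that supports the mutate rename hash syntax; the field paths are taken from the failing document in the question, and nested fields use Logstash's [outer][inner] reference syntax:

```
filter {
  # Rename the underscore-prefixed fields before they reach Elasticsearch.
  # Only the paths present in the failing sample are listed here; any
  # other documents with "__type" fields would need their own entries.
  mutate {
    rename => {
      "[Data][RequestDto][__type]"                  => "[Data][RequestDto][type]"
      "[Data][ResponseDto][__type]"                 => "[Data][ResponseDto][type]"
      "[Data][ResponseDto][Response][__type]"       => "[Data][ResponseDto][Response][type]"
      "[Data][ResponseDto][ResponseFilter][__type]" => "[Data][ResponseDto][ResponseFilter][type]"
    }
  }
}
```

This keeps the payload intact while removing the field names Elasticsearch objects to.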