I am trying to implement the solution described in the following answer:
https://stackoverflow.com/a/27867252/740839
but Elasticsearch throws the following exception, saying it cannot parse the @timestamp field:
[2015-01-30 12:09:39,513][DEBUG][action.bulk ] [perfgen 1] [logaggr-2015.01.30][2] failed to execute bulk item (index) index {[logaggr-2015.01.30][logs][c2s5PliTSGKmZSXUWzlkNw], source[{"message":"2015-01-29 17:30:31,579 [ERROR] [pool-1-thread-9] [LogGenerator] invocation count=813,time=2015-01-29 17:30:31,578,metric=-9080142057551045424","@version":"1","@timestamp":"2015-01-30T19:10:53.891Z","host":"perfdev","path":"/home/user/work/elk/logaggr-test/LogAggr_Test.log","logts":"2015-01-29 17:30:31,579","level":"ERROR","thread":"pool-1-thread-9","classname":"LogGenerator","details":"invocation count=813,time=2015-01-29 17:30:31,578,metric=-9080142057551045424"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [@timestamp]
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:414)
at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:648)
at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:501)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:542)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:491)
at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:376)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:451)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:157)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:535)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:434)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to parse date field [2015-01-30T19:10:53.891Z], tried both date format [yyyy-MM-dd HH:mm:ss,SSS], and timestamp number with locale []
at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:610)
at org.elasticsearch.index.mapper.core.DateFieldMapper.innerParseCreateField(DateFieldMapper.java:538)
at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:223)
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:404)
... 12 more
Caused by: java.lang.IllegalArgumentException: Invalid format: "2015-01-30T19:10:53.891Z" is malformed at "T19:10:53.891Z"
at org.elasticsearch.common.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:754)
at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:604)
... 15 more
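The "Caused by" lines point at the root cause: the mapping in effect only allows the date format yyyy-MM-dd HH:mm:ss,SSS, while the @timestamp value Logstash writes is ISO 8601. A small Python sketch of the same mismatch (the strptime pattern is the Python equivalent of that Joda format; this only illustrates the error, it is not part of the pipeline):

```python
from datetime import datetime

# Python equivalent of the Joda pattern yyyy-MM-dd HH:mm:ss,SSS
# that the mapping allows for @timestamp.
log_pattern = "%Y-%m-%d %H:%M:%S,%f"

# The timestamp taken from the log line parses fine under that pattern:
print(datetime.strptime("2015-01-29 17:30:31,579", log_pattern))

# ...but the ISO 8601 @timestamp that Logstash generates does not,
# failing exactly where the exception says: at "T19:10:53.891Z".
try:
    datetime.strptime("2015-01-30T19:10:53.891Z", log_pattern)
except ValueError as e:
    print(e)
```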
As shown in the "message" field above, my log statements look like this:
2015-01-29 17:30:31,579 [ERROR] [pool-1-thread-9] [LogGenerator] invocation count=813,time=2015-01-29 17:30:31,578,metric=-9080142057551045424
I am not sure whether the problem is related to the Logstash configuration. My Logstash filter looks like this:
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:logts}%{SPACE}\[%{LOGLEVEL:level}%{SPACE}]%{SPACE}\[%{DATA:thread}]%{SPACE}\[%{DATA:classname}]%{SPACE}%{GREEDYDATA:details}" ]
  }
}
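As an aside: since the template also maps logts as a date, one common approach is to parse the captured field with a date filter so Elasticsearch receives a value it can index. A hedged sketch only (standard Logstash date filter; the pattern matches the comma-millisecond format of the log lines, and target keeps the parsed value in logts instead of overwriting @timestamp):

```
date {
  # Parse the grok-captured logts field using the log's own format.
  match  => [ "logts", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "logts"
}
```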
My Logstash output is:
output {
  elasticsearch {
    cluster => "perfgen"
    host => "10.1.1.1"
    port => 9201
    index => "logaggr-%{+YYYY.MM.dd}"
    protocol => "http"
    template => "logaggr-test.json"
    template_name => "logaggr"
  }
}
My template "logaggr-test.json" is:
{
  "template": "logaggr-*",
  "mappings": {
    "logaggr": {
      "date_detection": false,
      "properties": {
        "_timestamp": { "type": "date", "enabled": true, "store": true },
        "logts": { "type": "date" },
        "level": { "type": "string" },
        "thread": { "type": "string" },
        "classname": { "type": "string" },
        "details": { "type": "string" }
      }
    }
  }
}
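Since the exception shows Elasticsearch trying only [yyyy-MM-dd HH:mm:ss,SSS] for @timestamp, one way to make a mapping accept both that pattern and Logstash's ISO 8601 value is Elasticsearch's multi-format syntax, where alternatives are separated by || (a sketch of a property entry, not the full template; date_time is a built-in ISO 8601 format name):

```
"@timestamp": {
  "type": "date",
  "format": "yyyy-MM-dd HH:mm:ss,SSS||date_time"
}
```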
I have tried adding a default mapping and so on, but I cannot get past the parsing exception.
To restate the problem I am trying to solve: I am setting up Logstash to parse my log file and index it into Elasticsearch. In the process, I want to capture the timestamp of my log messages, @timestamp (added by Logstash), and _timestamp (added by Elasticsearch).
Any help is appreciated.
Answer 0 (score: 0):
It turns out I had a template left over from some earlier testing that specified a different format for @timestamp. I deleted that template, and I can now ingest my logs.
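For anyone hitting the same symptom: leftover templates can be inspected and removed through the index template API. A sketch using the host and port from the question (the template name "old-template" is a placeholder; substitute whatever name the listing shows):

```
# List all index templates currently registered on the cluster:
curl -s 'http://10.1.1.1:9201/_template?pretty'

# Delete a specific leftover template by name:
curl -s -XDELETE 'http://10.1.1.1:9201/_template/old-template'
```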