Can not instantiate value of type [map type; class java.util.LinkedHashMap]

Asked: 2015-12-26 07:27:25

Tags: json parsing elasticsearch apache-kafka

I am pushing a JSON file from Kafka into Elasticsearch and then visualizing the data in Kibana.
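For context, a minimal Kafka 0.8-era producer for this kind of pipeline might look like the sketch below. This is illustrative only; the broker address and topic name are placeholders, not my actual configuration:

import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class JsonProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("metadata.broker.list", "localhost:9092"); // placeholder broker
        props.put("serializer.class", "kafka.serializer.StringEncoder");

        Producer<String, String> producer =
                new Producer<String, String>(new ProducerConfig(props));

        // Each message carries one JSON object as a raw string; it is not
        // passed through a JSON serializer again, which would wrap the whole
        // payload in an extra pair of quotes.
        String doc = "{\"A\": \"---\", \"B\": \"---\"}";
        producer.send(new KeyedMessage<String, String>("my-topic", doc)); // placeholder topic
        producer.close();
    }
}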

This is the format of my JSON file:

[{
  "A": "---",
  "B": "---",
  "C": "---",
  "D": "---",
  "ABC": "---",
  "CDE": "---",
  "FGY": "1110",
  "ADF": "226",
  "SSS": "nil",
  "ASA": "9.5",
  "DFGHJKLIWSSFFFSF": "12121",
  "sasfasfafasfsa": "0.21212",
  "TEST": "12121121",
  "AGAIN_TEST": "1.23456",
  "SSS": "---",
  "ASD": "---",
  "ASSDFFF": "---",
  "QQQQ": "61.2793",
  "UYTR": "3619",
  "testing": "58.3649",
  "fffff": "1010",
  "Fasa_sasfaf": "9.000"
}, {
  "A": "1616161",
  "B": "0.234",
  "C": "---",
  "D": "---",
  "ABC": "1.11",
  "CDE": "---",
  "FGY": "323",
  "ADF": "121",
  "SSS": "---",
  "ASA": "9.5",
  "DFGHJKLIWSSFFFSF": "12121",
  "sasfasfafasfsa": "0.21212",
  "TEST": "---",
  "AGAIN_TEST": "1.23456",
  "SSS": "---",
  "ASD": "121212",
  "ASSDFFF": "---",
  "QQQQ": "61.2793",
  "UYTR": "3619",
  "testing": "50.3649",
  "fffff": "1030",
  "Fasa_sasfaf": "123.012"
}]
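To double-check the structure independently of Kafka, the file can be bound with the same Jackson 1.x (org.codehaus) classes that appear in the stack trace below. A minimal sketch, where input.json is a placeholder for the file above:

import java.io.File;
import java.util.List;
import java.util.Map;

import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.type.TypeReference;

public class ParseCheck {
    public static void main(String[] args) throws Exception {
        // Same Jackson 1.x (org.codehaus) mapper family the river plugin uses.
        ObjectMapper mapper = new ObjectMapper();

        // Bind the file as a JSON array of objects; this succeeds for
        // well-formed JSON like the sample above.
        List<Map<String, Object>> records = mapper.readValue(
                new File("input.json"), // placeholder path
                new TypeReference<List<Map<String, Object>>>() {});

        System.out.println("Parsed " + records.size() + " records");
    }
}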

According to http://jsonlint.com/, the JSON file I am using is valid. But when I pass the file through Kafka, Elasticsearch throws an error:

Can not instantiate value of type [map type; class java.util.LinkedHashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]] from JSON String; no single-String constructor/factory method

Here is the full stack trace:

Can not instantiate value of type [map type; class java.util.LinkedHashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]] from JSON String; no single-String constructor/factory method
at org.codehaus.jackson.map.deser.std.StdValueInstantiator._createFromStringFallbacks(StdValueInstantiator.java:379)
at org.codehaus.jackson.map.deser.std.StdValueInstantiator.createFromString(StdValueInstantiator.java:268)
at org.codehaus.jackson.map.deser.std.MapDeserializer.deserialize(MapDeserializer.java:244)
at org.codehaus.jackson.map.deser.std.MapDeserializer.deserialize(MapDeserializer.java:33)
at org.codehaus.jackson.map.ObjectReader._bindAndClose(ObjectReader.java:768)
at org.codehaus.jackson.map.ObjectReader.readValue(ObjectReader.java:473)
at org.elasticsearch.river.kafka.IndexDocumentProducer.addMessagesToBulkProcessor(IndexDocumentProducer.java:71)
at org.elasticsearch.river.kafka.KafkaWorker.consumeMessagesAndAddToBulkProcessor(KafkaWorker.java:107)
at org.elasticsearch.river.kafka.KafkaWorker.run(KafkaWorker.java:78)
at java.lang.Thread.run(Thread.java:745)
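Judging from the MapDeserializer and StdValueInstantiator frames, IndexDocumentProducer asks Jackson to bind each Kafka message to a Map<String, Object>, and this exception is what Jackson 1.x raises when the top-level JSON value is a string (a quoted value) rather than an object. A minimal sketch that reproduces the same exception with a double-encoded payload:

import java.util.Map;

import org.codehaus.jackson.map.ObjectMapper;

public class ReproduceError {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // A payload that is a JSON *string* containing JSON text, e.g. the
        // result of serializing an already-serialized JSON document again.
        String doubleEncoded = "\"{\\\"A\\\": \\\"---\\\"}\"";

        // Throws: Can not instantiate value of type [map type; class
        // java.util.LinkedHashMap ...] from JSON String; no single-String
        // constructor/factory method
        Map<?, ?> doc = mapper.readValue(doubleEncoded, Map.class);
        System.out.println(doc);
    }
}

Note that sending the whole file (a top-level array) as one message would likely fail with a different Jackson message (something like "out of START_ARRAY token"), so the "from JSON String" wording here suggests the payload is being serialized twice somewhere between the file and the topic.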

0 Answers:

No answers yet.