I have a fairly simple pipeline for pulling JSON messages from Kafka and sending them to Elasticsearch:
    input {
      kafka {
        bootstrap_servers => "kafka04-prod01.messagehub.services.eu-de.bluemix.net:9093,kafka05-prod01.messagehub.services.eu-de.bluemix.net:9093,kafka01-prod01.messagehub.services.eu-de.bluemix.net:9093,kafka03-prod01.messagehub.services.eu-de.bluemix.net:9093,kafka02-prod01.messagehub.services.eu-de.bluemix.net:9093"
        topics => [ "transactions_load" ]
      }
    }
    filter {
      json {
        source => "message"
      }
      mutate {
        remove_field => ["kafka"]
        remove_field => ["@version"]
        remove_field => ["@timestamp"]
        remove_field => ["message"]
        remove_tag => ["multiline"]
      }
    }
    output {
      elasticsearch {
        hosts => [
          "xxxxx.ibm-343.composedb.com:16915",
          "xxxxx.ibm-343.composedb.com:16915"
        ]
        ssl => true
        user => "logstash_kafka"
        password => "*****"
        index => "pos_transactions"
      }
    }
The JSON records have a TransactionID field that uniquely identifies each record:
    {"TransactionID": "5440772161", "InvoiceNo": 5440772, "StockCode": 22294, "Description": "HEART FILIGREE DOVE SMALL", "Quantity": 4, "InvoiceDate": 1507777440000, "UnitPrice": 1.25, "CustomerID": 14825, "Country": "United Kingdom", "LineNo": 16, "InvoiceTime": "03:04:00", "StoreID": 1}
    {"TransactionID": "5440772191", "InvoiceNo": 5440772, "StockCode": 21733, "Description": "RED HANGING HEART T-LIGHT HOLDER", "Quantity": 4, "InvoiceDate": 1507777440000, "UnitPrice": 2.95, "CustomerID": 14825, "Country": "United Kingdom", "LineNo": 19, "InvoiceTime": "03:04:00", "StoreID": 1}
Can I configure Logstash to use the TransactionID as the _id field, so that if I process duplicate records for the same transaction, the updates are idempotent?
Answer 0 (score: 0)
I figured out the answer myself. Posting it here in case it is useful to others:
    output {
      elasticsearch {
        hosts => [
          "xxxxx.ibm-343.composedb.com:16915",
          "xxxxx.ibm-343.composedb.com:16915"
        ]
        ssl => true
        user => "logstash_kafka"
        password => "*****"
        index => "pos_transactions"
        document_id => "%{TransactionID}"
      }
    }
The document_id => "%{TransactionID}" configuration entry tells the elasticsearch output to use the TransactionID field of the incoming document as the Elasticsearch _id. Because indexing a document with an existing _id overwrites it rather than creating a new one, reprocessing a duplicate record is idempotent.
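To see why a fixed _id makes replays idempotent, here is a minimal Python sketch. It does not use the real Elasticsearch client; the dict simply stands in for an index keyed by _id, and the duplicate message and field subset are made up for illustration:

```python
import json

# Two deliveries of the same Kafka message, plus one distinct record.
messages = [
    '{"TransactionID": "5440772161", "Quantity": 4}',
    '{"TransactionID": "5440772161", "Quantity": 4}',  # duplicate delivery
    '{"TransactionID": "5440772191", "Quantity": 4}',
]

index = {}  # stands in for the pos_transactions index, keyed by _id
for message in messages:
    doc = json.loads(message)
    # Logstash's sprintf reference %{TransactionID} resolves to this value:
    doc_id = doc["TransactionID"]
    index[doc_id] = doc  # same _id -> overwrite, never a second copy

print(len(index))  # 2: the duplicate collapsed into a single document
```

Without document_id, Elasticsearch would auto-generate a fresh _id for every event, so each redelivery would become an extra document instead of an overwrite.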