1. My version information
jdk-8u191-linux-x64.tar.gz
kibana-6.5.0-linux-x86_64.tar.gz
elasticsearch-6.5.0.tar.gz
logstash-6.5.0.tar.gz
filebeat-6.5.0-linux-x86_64.tar.gz
kafka_2.11-2.1.0.tgz
zookeeper-3.4.12.tar.gz
2. Problem description
I have a log file in XML format. I use Filebeat to collect this file and push it to Kafka, but the content that arrives in Kafka is garbled.
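One common source of garbled characters in a Filebeat pipeline is an encoding mismatch: the log input reads files as `plain` (UTF-8-compatible) by default, so a file written in another encoding such as GBK gets its bytes re-interpreted incorrectly. A minimal sketch of that effect (the GBK source encoding here is an assumption, not something confirmed by the post):

```python
# Demonstrate how bytes written in one encoding look garbled when
# decoded as another. This mirrors a GBK log file being read as UTF-8.
original = "乱码示例"          # "garbled-text example" in Chinese
raw_bytes = original.encode("gbk")

# Decoding the GBK bytes as UTF-8 yields replacement characters
# instead of the original text.
garbled = raw_bytes.decode("utf-8", errors="replace")

print(garbled == original)  # False: the round trip destroys the text
```

If this were the cause, setting `encoding` on the Filebeat input to the file's actual encoding would be the matching fix; note, however, that the sample log lines below are pure ASCII, so this only applies if the real file contains non-ASCII text.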
Here is my Filebeat configuration:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /data/reporttg/ChannelServer.log
  include_lines: ['\<\bProcID.*\<\/ProcID\b\>']

### Filebeat modules
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

### Elasticsearch template setting
setup.template.settings:
  index.number_of_shards: 3

### Kibana
setup.kibana:

### Kafka
output.kafka:
  enabled: true
  hosts: ["IP:9092", "IP:9092", "IP:9092"]
  topic: houry

### Processors
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
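Another thing worth checking: Filebeat's Kafka output compresses message batches with gzip by default (no `compression` option is set above). Kafka consumers normally decompress this transparently, but a tool that dumps the raw message bytes will show binary noise instead of the JSON events. A small sketch of what those raw bytes look like (the payload string is only an illustrative stand-in for a Filebeat event):

```python
import gzip

# A Filebeat event is serialized as JSON before being sent to Kafka;
# this payload is an illustrative stand-in, not real Filebeat output.
event = b'{"message": "<ProcID>PROC201901231142020023206514</ProcID>"}'

compressed = gzip.compress(event)

# gzip data starts with the magic bytes 0x1f 0x8b and is unreadable as text.
print(compressed[:2] == b"\x1f\x8b")          # True
print(gzip.decompress(compressed) == event)   # True: content is intact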
My log content:
<OrigDomain>ECIP</OrigDomain>
<HomeDomain>UCRM</HomeDomain>
<BIPCode>BIP2A011</BIPCode>
<BIPVer>0100</BIPVer>
<ActivityCode>T2000111</ActivityCode>
<ActionCode>1</ActionCode>
<ActionRelation>0</ActionRelation>
<Routing>
<RouteType>01</RouteType>
<RouteValue>13033935743</RouteValue>
</Routing>
<ProcID>PROC201901231142020023206514</ProcID>
<TransIDO>SSP201901231142020023206513</TransIDO>
<TransIDH>2019012311420257864666</TransIDH>
<ProcessTime>20190123114202</ProcessTime>
<Response>
<RspType>0</RspType>
<RspCode>0000</RspCode>
<RspDesc>success</RspDesc>
</Response>
Testing the regular expression:
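The regex test above can be reproduced quickly. Filebeat compiles `include_lines` with Go's regexp engine, so Python's `re` is only an approximation, but it is close enough for a syntax and match sanity check against one of the sample lines:

```python
import re

# The include_lines pattern from the Filebeat config above.
pattern = re.compile(r'\<\bProcID.*\<\/ProcID\b\>')

# One of the sample log lines from the post.
line = "<ProcID>PROC201901231142020023206514</ProcID>"

print(bool(pattern.search(line)))   # True
```

A simpler pattern such as `<ProcID>.*</ProcID>` expresses the same match while avoiding the extra `\<`, `\/`, and `\b` escapes.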
3. Start Filebeat and check the Kafka content
4. I tested Filebeat collecting the same content and pushing it to Logstash, and that works normally.
How should this problem be solved?