Logstash: converting Kafka Avro bytes (decimal logical type) to a decimal value

Date: 2019-11-19 07:13:36

Tags: apache-kafka logstash avro

I can't convert the bytes to a decimal. Here is my sample Logstash configuration.

All the data is read correctly, except for the bytes field.

The bytes field looks like this: "\u0003\ae"

input {
    kafka {
      bootstrap_servers => "${INPUT_KAFKA_BOOTSTRAP_SERVERS:kafka1.x.x.net:9092}"
      topics => ["${INPUT_KAFKA_TOPIC:test-name}"]
      auto_offset_reset => "earliest"
      group_id => "${GROUP_ID:logstash-elastic-test}"
      client_id => "${CLIENT_ID:logstash-elastic-test}"

      codec => avro_schema_registry {
        endpoint => "${SCHEMA_REGISTRY:http://kafka-schema-registry.x.x.x.net:8080}"
      }
      value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
      key_deserializer_class => "org.apache.kafka.common.serialization.StringDeserializer"
      decorate_events => "true"
    } 
  }

filter {
}

output {
  stdout { codec => rubydebug }
}
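
The empty filter section above could host the conversion. A minimal sketch, assuming the codec leaves the raw bytes in a field literally named decimal_value (hypothetical; adjust to your schema) and using the schema's scale of 5:

```
filter {
  ruby {
    # Sketch: reinterpret the Avro decimal bytes as a big-endian
    # two's-complement unscaled integer, then apply the scale of 5.
    code => "
      raw = event.get('decimal_value')
      unless raw.nil?
        bytes = raw.to_s.b
        unscaled = bytes.each_byte.inject(0) { |acc, b| (acc << 8) | b }
        bits = bytes.bytesize * 8
        unscaled -= (1 << bits) if bits > 0 && unscaled >= (1 << (bits - 1))
        event.set('decimal_value', unscaled / (10.0 ** 5))
      end
    "
  }
}
```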

Here is a sample of my Avro field with the bytes type (in the config I only use the Avro schema registry):

{
  "name": "decimal_value",
  "type": [
    "null",
    {
      "type": "bytes",
      "logicalType": "decimal",
      "precision": 9,
      "scale": 5
    }
  ],
  "default": null
},

How do I write the correct conversion, or is there a plugin that can parse this?
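
For reference, the Avro specification stores a decimal logical type as the two's-complement big-endian representation of the unscaled integer, so the value is that integer divided by 10^scale. A hedged sketch in Ruby (the helper name and the sample bytes are hypothetical, chosen only to illustrate the decoding; the schema's scale of 5 is assumed):

```ruby
# Hypothetical helper: decode an Avro "decimal" logical-type value.
# `bytes` is the raw binary string; `scale` comes from the schema (5 here).
def avro_decimal_to_f(bytes, scale)
  return nil if bytes.nil? || bytes.empty?
  # Read the bytes as a big-endian unsigned integer
  unscaled = bytes.each_byte.inject(0) { |acc, b| (acc << 8) | b }
  bits = bytes.bytesize * 8
  # Apply two's-complement sign if the most significant bit is set
  unscaled -= (1 << bits) if unscaled >= (1 << (bits - 1))
  unscaled / (10.0 ** scale)
end

avro_decimal_to_f("\x03\xAE".b, 5)  # => 0.00942  (0x03AE = 942, scaled by 10^-5)
```

Inside Logstash this logic would sit in a `ruby { code => "..." }` filter, reading the field with `event.get` and writing the converted number back with `event.set`.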

0 Answers:

There are no answers.