Missing required configuration "key.converter" which has no default value

Date: 2018-05-01 01:47:50

Tags: apache-kafka apache-kafka-connect

When I try to start Kafka Connect for the Elasticsearch stream reactor in standalone mode, I get the following error:

Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "key.converter" which has no default value.
        at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:463)
        at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:453)
        at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
        at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
        at org.apache.kafka.connect.runtime.WorkerConfig.<init>(WorkerConfig.java:218)
        at org.apache.kafka.connect.runtime.distributed.DistributedConfig.<init>(DistributedConfig.java:272)
        at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:72)

How can I fix this error?

Edit 01/05/2018: Sorry, I'll try to be more specific. I am using the Stream Reactor connectors: https://github.com/Landoop/stream-reactor This is the command I run on the EC2 instance hosting my Kafka cluster's only broker:

./bin/connect-standalone.sh config/elastic-config.properties config/connect-standalone.properties

This is connect-standalone.properties:

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=localhost:9092

# The converters specify the format of data in Kafka and how to translate it
# into Connect data. Every Connect user will need to configure these based on
# the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's
# setting with the converter we want to apply it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

# The internal converter used for offsets and config data is configurable and
# must be specified, but most users will always want to use the built-in
# default. Offset and config data is never visible outside of Copycat in this
# format.
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
plugin.path=/home/ubuntu/kafka_2.11-1.0.1/libs

This is the other file (elastic-config.properties):

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=test
topic.index.map=test:test_index
connection.url=myurl
type.name=log
key.ignore=true
schema.ignore=true
1 Answer:

Answer 0 (score: 2):

The error says it all, really. You are missing the required configuration entry key.converter. This tells Kafka Connect how to deserialize the data on the Kafka topic (typically JSON or Avro).
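As an illustration (not part of the original answer): with org.apache.kafka.connect.json.JsonConverter and schemas.enable=true, as in the connect-standalone.properties above, every message on the topic must carry a schema/payload envelope. A minimal Python sketch of what such a message looks like on the wire; the field names in the schema are hypothetical:

```python
import json

# With JsonConverter and schemas.enable=true, Connect expects each message
# to wrap its data in a {"schema": ..., "payload": ...} envelope.
message = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "level", "type": "string", "optional": False},
            {"field": "text", "type": "string", "optional": False},
        ],
        "optional": False,
    },
    "payload": {"level": "INFO", "text": "hello from kafka"},
}

# These are the raw bytes a producer would write to the "test" topic.
raw = json.dumps(message).encode("utf-8")

# The converter splits the envelope back into schema and payload.
decoded = json.loads(raw)
print(decoded["payload"]["text"])  # -> hello from kafka
```

If your producers write plain JSON without this envelope, set schemas.enable=false for the converter instead.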

You can see an example of a working connector configuration for Elasticsearch in this gist. If you update your question to include the configuration you're using, I can point out how to incorporate it.

Having seen your configuration: the cause of your error is that you are passing the Connect configuration files in the wrong order, so Connect cannot find the settings it expects.

It should be:

./bin/connect-standalone.sh config/connect-standalone.properties config/elastic-config.properties

Read more about streaming from Kafka to Elasticsearch in this article, as well as in a general series on using Kafka Connect.