I am trying to run Kafka Connect on EC2 and send data from MSK to Elasticsearch. This is what I am doing:
sudo yum install java-1.8.0
wget https://archive.apache.org/dist/kafka/2.2.1/kafka_2.12-2.2.1.tgz
tar -xzf kafka_2.12-2.2.1.tgz
Kafka Connect installation:
wget http://packages.confluent.io/archive/5.2/confluent-5.2.0-2.11.tar.gz -P ~/Downloads/
tar -zxvf ~/Downloads/confluent-5.2.0-2.11.tar.gz -C ~/Downloads/
sudo mv ~/Downloads/confluent-5.2.0 /usr/local/confluent
I have modified two property files.
vim /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties
Here I provided the name of the topic I created, i.e. the Kafka topic, and the connection URL for Elasticsearch.
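For reference, a minimal sketch of what that file could look like after the edits (the topic name and Elasticsearch endpoint below are placeholders, not the values from the original setup):

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
# placeholder topic name
topics=my-msk-topic
key.ignore=true
# placeholder Elasticsearch endpoint
connection.url=https://my-es-domain.us-east-1.es.amazonaws.com:443
type.name=kafka-connect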
The second property file:
vim /usr/local/confluent/etc/kafka/connect-standalone.properties
Here I only changed the bootstrap servers setting, supplying all 3 MSK bootstrap broker URLs.
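For example, the relevant line might look like this (the broker hostnames are placeholders for the three MSK bootstrap brokers):

# placeholder MSK bootstrap brokers
bootstrap.servers=b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092,b-2.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092,b-3.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092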
Finally, I run my connector like this:
/usr/local/confluent/bin/connect-standalone /usr/local/confluent/etc/kafka/connect-standalone.properties /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties
Then I get the following error:
[2019-12-30 20:35:38,109] INFO Kafka Connect standalone worker initialization took 3890ms (org.apache.kafka.connect.cli.ConnectStandalone:96)
[2019-12-30 20:35:38,109] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50)
[2019-12-30 20:35:38,109] INFO Herder starting (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:87)
[2019-12-30 20:35:38,109] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:162)
[2019-12-30 20:35:38,109] INFO Starting FileOffsetBackingStore with file /tmp/connect.offsets (org.apache.kafka.connect.storage.FileOffsetBackingStore:58)
[2019-12-30 20:35:38,111] INFO Worker started (org.apache.kafka.connect.runtime.Worker:167)
[2019-12-30 20:35:38,111] INFO Herder started (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:89)
[2019-12-30 20:35:38,111] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:55)
[2019-12-30 20:35:38,113] ERROR Failed to create job for /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties (org.apache.kafka.connect.cli.ConnectStandalone:108)
[2019-12-30 20:35:38,113] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:119)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}
I am confused about plugin.path. How do I set it? Does it ship with Kafka Connect, or do I have to set it myself?
Answer 0 (score: 1)
I will point out that Logstash can also be used to send data from Kafka to Elasticsearch.
"I am confused about plugin.path. How do I set it? Does it come with kafka-connect?"
Look at the bottom of connect-standalone.properties, read the comments there, and uncomment the plugin.path property:
https://github.com/apache/kafka/blob/trunk/config/connect-standalone.properties#L32-L41
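As a rough sketch, uncommenting that property and pointing it at the directory where the Confluent tarball keeps its connector jars (assuming the /usr/local/confluent layout used above) would look like this:

# directory containing the Elasticsearch connector jars from the Confluent tarball
plugin.path=/usr/local/confluent/share/java

After that, restart the connect-standalone command and the ElasticsearchSinkConnector class should be found on the plugin path.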
As I answered here, it would be better not to use the tarballs; by downloading Confluent Platform and Kafka separately, you have already ended up with duplicate copies of Zookeeper and Kafka.
Instead, use YUM to install Confluent Platform, which includes Zookeeper, Apache Kafka, and your Elasticsearch connector.
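A rough sketch of such an install on an RPM-based EC2 instance, adapted from the Confluent 5.2 documentation (treat the repository definition and the package name as assumptions to verify against the docs for your version):

# add the Confluent 5.2 yum repository
sudo tee /etc/yum.repos.d/confluent.repo <<'EOF'
[Confluent.dist]
name=Confluent repository (dist)
baseurl=https://packages.confluent.io/rpm/5.2/7
gpgcheck=1
gpgkey=https://packages.confluent.io/rpm/5.2/archive.key
enabled=1
[Confluent]
name=Confluent repository
baseurl=https://packages.confluent.io/rpm/5.2
gpgcheck=1
gpgkey=https://packages.confluent.io/rpm/5.2/archive.key
enabled=1
EOF
# install the full platform (Scala 2.11 build, matching the tarball above)
sudo yum clean all && sudo yum install -y confluent-platform-2.11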