Unable to load AWS credentials from any provider in the chain - kinesis-kafka-connector

Time: 2019-01-26 12:00:27

Tags: java apache-kafka apache-kafka-connect amazon-kinesis-firehose

I am trying to use the Kafka-Kinesis-Connector, a connector for Kafka Connect described at https://github.com/awslabs/kinesis-kafka-connector, to publish messages from Kafka to Amazon Kinesis Firehose, and I am getting the error below. I am using Cloudera version CDH-6.1.0-1.cdh6.1.0.p0.770702, which ships with Kafka 2.1.2 (0.10.0.1+kafka2.1.2+6).

I have already loaded the AWS credentials in my current session, but that did not help.

export AWS_ACCESS_KEY_ID="XXX"
export AWS_SECRET_ACCESS_KEY="YYYYY"
export AWS_DEFAULT_REGION="sssss"
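
These exports only help if they end up in the environment of the JVM that runs the Connect worker; exporting them in one interactive shell does nothing for a worker started elsewhere (for example by Cloudera Manager) or under a different user. A minimal sketch, assuming the worker is launched by hand from the same shell with the stock connect-standalone script (the script name and file paths are illustrative and may differ in a CDH install):

# run in the same shell where the AWS_* variables above were exported,
# so the worker JVM inherits them and the SDK's default provider chain can find them
connect-standalone worker.properties kinesis-firehose-kafka-connector.properties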

My worker.properties looks like this:

bootstrap.servers=kafkanode:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
#internal.value.converter=org.apache.kafka.connect.storage.StringConverter
#internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
internal.key.converter.schemas.enable=true
internal.value.converter.schemas.enable=true
offset.storage.file.filename=offset.log
schemas.enable=false
#Rest API
rest.port=8096
plugin.path=/home/opc/kinesis-kafka-connector-master/target/
#rest.host.name=

My kinesis-firehose-kafka-connector.properties looks like this:

name=kafka_kinesis_sink_connector
connector.class=com.amazon.kinesis.kafka.FirehoseSinkConnector
tasks.max=1
topics=OGGTest
region=eu-central-1
batch=true
batchSize=500
batchSizeInBytes=1024
deliveryStream=kafka-s3-stream

The error output looks like this:

  [2019-01-26 11:32:24,446] INFO Kafka version : 2.0.0-cdh6.1.0 (org.apache.kafka.common.utils.AppInfoParser:109)
  [2019-01-26 11:32:24,446] INFO Kafka commitId : unknown (org.apache.kafka.common.utils.AppInfoParser:110)
  [2019-01-26 11:32:24,449] INFO Created connector kafka_kinesis_sink_connector (org.apache.kafka.connect.cli.ConnectStandalone:104)
  [2019-01-26 11:32:25,296] ERROR WorkerSinkTask{id=kafka_kinesis_sink_connector-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
  com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain
    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
    at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.doInvoke(AmazonKinesisFirehoseClient.java:826)
    at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.invoke(AmazonKinesisFirehoseClient.java:802)
    at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.describeDeliveryStream(AmazonKinesisFirehoseClient.java:451)
    at com.amazon.kinesis.kafka.FirehoseSinkTask.validateDeliveryStream(FirehoseSinkTask.java:95)
    at com.amazon.kinesis.kafka.FirehoseSinkTask.start(FirehoseSinkTask.java:77)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:301)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:190)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
 [2019-01-26 11:32:25,299] ERROR WorkerSinkTask{id=kafka_kinesis_sink_connector-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)
 [2019-01-26 11:32:33,375] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
 [2019-01-26 11:32:33,375] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)

Please advise. Thanks in advance!

1 answer:

Answer 0 (score: 0):

The ~/.aws/credentials file belongs in the home directory of the operating-system user that runs the Connect worker process. Most AWS SDKs and the AWS CLI recognize credentials stored there. Create the credentials file with the following AWS CLI command:


aws configure

You can also create the credentials file manually with a text editor. The file should contain lines in the following format:


[default]
aws_access_key_id = 
aws_secret_access_key = 

Note: when you create the credentials file, make sure it is created by the same user that runs the Connect worker process, and that it sits in that user's home directory. Otherwise the connector will not be able to find the credentials.
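
If the worker does not run under your own account, a quick way to check what that user actually sees is to run the AWS CLI as that user. A sketch, where the username kafka is only an example; substitute the account that owns the Connect process:

# -H sets HOME to the target user's home so the CLI reads that user's ~/.aws/credentials
sudo -u kafka -H aws configure list
# confirms that the resolved credentials are actually valid
sudo -u kafka -H aws sts get-caller-identity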