Druid command-line Hadoop indexer: no AWS credentials provided

Time: 2020-03-03 16:22:47

Tags: amazon-s3 druid

I am trying to use S3 as the deep storage for my Druid cluster, running the command-line Hadoop indexer with the following command:


cd /opt/apache-druid-0.17.0; java -Xmx512m \
    -Daws.region=us-east-1 \
    -Ddruid.storage.bucket=TEST_BUCKET \
    -Ddruid.storage.baseKey=druid_indexed_data \
    -Ddruid.storage.useS3aSchema=True \
    -Ddruid.s3.accessKey=ACCESS_KEY \
    -Ddruid.s3.secretKey=SECRET_KEY \
    -Ddruid.storage.storageDirectory=s3a://TEST_BUCKET/druid_indexed_data \
    -Ddruid.storage.type=hdfs \
    -Dfile.encoding=UTF-8 \
    -classpath extensions/druid-parquet-extensions/*:extensions/druid-avro-extensions:extensions/druid-hdfs-storage:extensions/druid-s3-storage:lib/*:/opt/apache-druid-0.17.0/conf/druid/single-server/micro-quickstart/_common:/opt/hadoop-2.8.5/bin:/opt/hadoop-2.8.5/share/hadoop/tools/lib/* \
    org.apache.druid.cli.Main index hadoop /path/to/specfile

and received the following exception:

    Caused by: java.io.InterruptedIOException: doesBucketExist on TEST_BUCKET: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

I am trying to avoid modifying the common.runtime.properties file, because I will be using different buckets for different datasources and therefore cannot hard-code the configuration. That is why I am passing

  • -Ddruid.s3.accessKey=ACCESS_KEY
  • -Ddruid.s3.secretKey=SECRET_KEY

on the command line. I also tried changing those variables to druid.s3.access.key in common.runtime.properties, which did not help at all. Any idea how to make this work?
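Since the stack trace comes from Hadoop's S3A filesystem (doesBucketExist, BasicAWSCredentialsProvider) rather than from Druid's own S3 extension, the Hadoop job itself may not be receiving the druid.s3.* properties at all. One possible direction that still avoids hard-coding anything in common.runtime.properties is to pass the credentials per job through the jobProperties section of the ingestion spec, using Hadoop's own fs.s3a.* keys. The fragment below is only a sketch: the surrounding spec layout is assumed, not taken from this question, and ACCESS_KEY / SECRET_KEY are placeholders.

```json
{
  "type": "index_hadoop",
  "spec": {
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties": {
        "fs.s3a.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
        "fs.s3a.access.key": "ACCESS_KEY",
        "fs.s3a.secret.key": "SECRET_KEY"
      }
    }
  }
}
```

Because a spec file is supplied per indexing run anyway, different datasources could point at different buckets and credentials this way without touching the shared runtime properties.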

0 Answers:

There are no answers yet.