Writing from Hive to S3 fails

Date: 2018-06-06 00:12:01

Tags: amazon-s3 hive cloudera

I am trying to set up a Hadoop cluster that writes Hive tables to S3.

I am getting the following error back from S3 (a single-line error, broken up here for readability):

FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: 
Unable to determine if s3a://<MyBucket>/hive/warehouse/<My>.db/<MyTable> is encrypted: 
java.io.InterruptedIOException: doesBucketExist on <MyBucket>: 
com.amazonaws.AmazonClientException: 
No AWS Credentials provided by 
  BasicAWSCredentialsProvider
  EnvironmentVariableCredentialsProvider 
  SharedInstanceProfileCredentialsProvider : 
com.amazonaws.SdkClientException:
Unable to load credentials from service endpoint

A similar issue is described here.

1 Answer:

Answer 0 (score: 1)

It looks like a good workout and a good rest were the solution:

This link discusses the fs.s3a.aws.credentials.provider property:

If unspecified, then the default list of credential provider classes,
queried in sequence, is:
1. org.apache.hadoop.fs.s3a.BasicAWSCredentialsProvider: supports
    static configuration of AWS access key ID and secret access key.
    See also fs.s3a.access.key and fs.s3a.secret.key.
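As a sketch of how that property is set, an explicit entry in core-site.xml (or hive-site.xml) pinning the chain to the static-key provider might look like the snippet below. The class name is taken from the documentation quoted above; newer Hadoop releases may name the provider differently, so check the docs for your version:

    <property>
      <name>fs.s3a.aws.credentials.provider</name>
      <!-- Static access-key/secret-key provider, per the quoted default list -->
      <value>org.apache.hadoop.fs.s3a.BasicAWSCredentialsProvider</value>
    </property>

If the property is left unset, the providers are tried in the default order listed above, which is why statically configured keys are picked up first when present.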

The problem was that I had specified the keys in the Hadoop conf (/etc/hadoop/conf) instead of the Hive conf (/etc/hive/conf). Moving fs.s3a.access.key and fs.s3a.secret.key over fixed the problem.
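For illustration, the fix amounts to adding entries like these to /etc/hive/conf/hive-site.xml (placeholder values shown; in practice, real keys belong in a Hadoop credential store rather than plain-text config):

    <property>
      <name>fs.s3a.access.key</name>
      <!-- Placeholder: your AWS access key ID -->
      <value>YOUR_AWS_ACCESS_KEY_ID</value>
    </property>
    <property>
      <name>fs.s3a.secret.key</name>
      <!-- Placeholder: your AWS secret access key -->
      <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
    </property>

With the keys visible to Hive's own configuration, the BasicAWSCredentialsProvider in the chain can resolve them and the doesBucketExist check no longer fails.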