Spark: Unable to load AWS credentials from any provider in the chain on the cluster slaves

Date: 2016-06-18 17:53:07

Tags: amazon-web-services apache-spark amazon-ec2 aws-sdk

I have a cluster launched with spark-ec2. When it calls the AWS API from the slave machines, I get this error:

Caused by: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131)
    at com.amazonaws.http.AmazonHttpClient.getCredentialsFromContext(AmazonHttpClient.java:774)
    at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:800)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:695)
    at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:447)
    at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:409)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:358)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:2051)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:2021)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.describeTable(AmazonDynamoDBClient.java:1299)
    at com.amazon.titan.diskstorage.dynamodb.DynamoDBDelegate.describeTable(DynamoDBDelegate.java:635)
    ... 27 more
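
If I read the trace right, the chain in question is presumably the SDK's default credential lookup, which tries environment variables, the Java system properties aws.accessKeyId / aws.secretKey, the ~/.aws/credentials profile file, and finally the EC2 instance profile, in that order. A rough way to check each of those on a slave (assuming SSH access; the hostname below is a placeholder):

ssh root@<slave-hostname> 'env | grep AWS_'          # environment variables (of the login shell, not of the running worker JVM)
ssh root@<slave-hostname> 'cat ~/.aws/credentials'   # shared credentials file, if any
ssh root@<slave-hostname> 'curl -s http://169.254.169.254/latest/meta-data/iam/security-credentials/'   # IAM instance role, if any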

This clearly says that the SDK cannot load AWS credentials from any provider. I added all the credentials in spark-env.sh, like this:

...

export AWS_ACCESS_KEY_ID="XXX"
export AWS_SECRET_ACCESS_KEY="YYYYY"

...

This evidently does not work.
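
Presumably that is because spark-env.sh on the master is not seen by the executor JVMs on the slaves. If so, Spark's spark.executorEnv.* settings might be a way to pass the keys through to the executors; this is only an untested sketch, and the class and jar names are placeholders:

spark-submit \
  --conf spark.executorEnv.AWS_ACCESS_KEY_ID=XXX \
  --conf spark.executorEnv.AWS_SECRET_ACCESS_KEY=YYYYY \
  --class com.example.MyJob \
  my-job.jar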

I also tried exporting them on the slaves with pssh:

pssh -i -h /root/spark-ec2/slaves export AWS_ACCESS_KEY_ID=XXX
pssh -i -h /root/spark-ec2/slaves export AWS_SECRET_ACCESS_KEY=YYYYY

That does not work either.
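
My understanding is that each pssh invocation runs in its own short-lived shell, so an export there vanishes as soon as the command returns and never reaches the already-running worker JVMs. A sketch of what persisting the variables could look like (the /root/spark path and the restart scripts are assumptions about the spark-ec2 layout):

pssh -i -h /root/spark-ec2/slaves "echo 'export AWS_ACCESS_KEY_ID=XXX' >> /root/spark/conf/spark-env.sh"
pssh -i -h /root/spark-ec2/slaves "echo 'export AWS_SECRET_ACCESS_KEY=YYYYY' >> /root/spark/conf/spark-env.sh"
# restart the standalone cluster so the workers re-read spark-env.sh
/root/spark/sbin/stop-all.sh && /root/spark/sbin/start-all.sh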

Any suggestions?

0 Answers:

No answers yet.