I am trying to launch a Spark cluster on EC2 following https://spark.apache.org/docs/latest/ec2-scripts.html. I tried the two commands below, with the env variables set before running the script, but I keep getting an "AWS was not able to validate the provided access credentials" error. I don't think there is anything wrong with the keys. I am on an Ubuntu t2.micro instance. Any help resolving this would be appreciated.
export AWS_SECRET_ACCESS_KEY=
export AWS_ACCESS_KEY_ID=
./spark-ec2 -k admin-key1 -i /home/ubuntu/admin-key1.pem -s 3 launch my-spark-cluster
./spark-ec2 --key-pair=admin-key1 --identity-file=/home/ubuntu/admin-key1.pem --region=ap-southeast-2 --zone=ap-southeast-2a launch my-spark-cluster
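To rule out stray whitespace or quoting problems in the exported values, here is a quick sanity check (plain Python, nothing spark-specific; the variable names match the exports above):

import os

# Report length and whitespace status of each credential without echoing the
# secret itself; a trailing newline or quote here could explain an AuthFailure.
for var in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"):
    val = os.environ.get(var, "")
    print(var, "set:", bool(val), "len:", len(val), "clean:", val == val.strip())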
Both commands fail with the same error:

AuthFailure: AWS was not able to validate the provided access credentials
Traceback (most recent call last):
  File "./spark_ec2.py", line 1465, in <module>
    main()
  File "./spark_ec2.py", line 1457, in main
    real_main()
  File "./spark_ec2.py", line 1277, in real_main
    opts.zone = random.choice(conn.get_all_zones()).name
  File "/cskmohan/spark-1.4.1/ec2/lib/boto-2.34.0/boto/ec2/connection.py", line 1759, in get_all_zones
    [('item', Zone)], verb='POST')
  File "/cskmohan/spark-1.4.1/ec2/lib/boto-2.34.0/boto/connection.py", line 1182, in get_list
    raise self.ResponseError(response.status, response.reason, body)
boto.exception.EC2ResponseError: EC2ResponseError: 401 Unauthorized
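For reference, the failing call can be reproduced outside the script with the boto 2.34.0 that spark-ec2 bundles (a minimal sketch; the region is the ap-southeast-2 from my second command):

import os

import boto.ec2

# Connect with exactly the credentials exported above; connect_to_region is
# the standard boto 2.x entry point for EC2.
conn = boto.ec2.connect_to_region(
    "ap-southeast-2",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)

# get_all_zones() is the same call that raises at spark_ec2.py line 1277;
# with valid keys it returns the availability zones instead of a 401.
print([zone.name for zone in conn.get_all_zones()])

If this standalone check also returns 401 Unauthorized, the problem would be with the keys or the account rather than with the spark-ec2 script itself.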