ERROR: The environment variable AWS_ACCESS_KEY_ID must be set

Asked: 2020-01-05 19:31:36

Tags: linux amazon-web-services apache-spark

I am using Ubuntu 18.04 and want to launch a Spark cluster on EC2.

I set the environment variables with the export command:

export AWS_ACCESS_KEY_ID=MyAccesskey
export AWS_SECRET_ACCESS_KEY=Mysecretkey

But when I run the launch command, I get:

ERROR: The environment variable AWS_ACCESS_KEY_ID must be set

Here are all the commands I used, in case I made a mistake somewhere:


sudo mv ~/Downloads/keypair.pem   /usr/local/spark/keypair.pem
sudo mv ~/Downloads/credentials.csv   /usr/local/spark/credentials.csv
# Make sure the .pem file is readable by the current user.
chmod 400 "keypair.pem"
# Go into the spark directory and set the environment variables with the credentials information
cd spark
export AWS_ACCESS_KEY_ID=ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY=SECRET_KEY
# To install Spark 2.0 on the cluster:
sudo spark-ec2/spark-ec2 -k keypair --identity-file=keypair.pem --region=us-west-2 --zone=us-west-2a --copy-aws-credentials --instance-type t2.micro --worker-instances 1 launch project-launch

I am new to all of this and would really appreciate any help.

2 answers:

Answer 0 (score: 3)

By default, sudo runs the command with a reset environment, so variables exported in your shell are not visible to it. You can pass environment variables explicitly after sudo in the form NAME=VALUE, and they will be picked up by the command that follows. I don't know whether this usage has limitations, but it solves the problem in the example:

sudo AWS_ACCESS_KEY_ID=ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY=SECRET_KEY spark-ec2/spark-ec2 -k keypair --identity-file=keypair.pem --region=us-west-2 --zone=us-west-2a --copy-aws-credentials --instance-type t2.micro --worker-instances 1 launch project-launch
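The effect can be sketched without sudo by using env -i, which starts a child process with an empty environment much like sudo's default env_reset does (a demonstration, not the exact sudo mechanism):

```shell
# An exported variable is visible in the current shell...
export AWS_ACCESS_KEY_ID=MyAccesskey

# ...but a child started with a clean environment does not see it:
env -i sh -c 'echo "AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID:-unset}"'
# prints: AWS_ACCESS_KEY_ID=unset

# Passing it as NAME=VALUE on the command line, as with sudo, restores it:
env -i AWS_ACCESS_KEY_ID=MyAccesskey sh -c 'echo "AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID"'
# prints: AWS_ACCESS_KEY_ID=MyAccesskey
```

This is why the export commands run before sudo had no effect on the spark-ec2 script.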

Answer 1 (score: 1)

You can also retrieve the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY values with the get subcommand of aws configure:

AWS_ACCESS_KEY_ID=$(aws configure get aws_access_key_id)
AWS_SECRET_ACCESS_KEY=$(aws configure get aws_secret_access_key) 

Or inline on the command line:

sudo AWS_ACCESS_KEY_ID=$(aws configure get aws_access_key_id) AWS_SECRET_ACCESS_KEY=$(aws configure get aws_secret_access_key) spark-ec2/spark-ec2 -k keypair --identity-file=keypair.pem --region=us-west-2 --zone=us-west-2a --copy-aws-credentials --instance-type t2.micro --worker-instances 1 launch project-launch
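Note that `aws configure get` prints nothing if the profile was never configured, so the launch would fail with the same error. A small guard (a sketch; the function name is my own) can fail fast before launching:

```shell
# Hypothetical helper: abort if a required credential value came back empty,
# e.g. because `aws configure` was never run on this machine.
require_nonempty() {
  name=$1
  value=$2
  if [ -z "$value" ]; then
    echo "ERROR: $name is empty; run 'aws configure' first" >&2
    return 1
  fi
}

# Example usage with a placeholder value (substitute the real
# $(aws configure get ...) output in practice):
require_nonempty AWS_ACCESS_KEY_ID "AKIAEXAMPLE" && echo "credentials look set"
# prints: credentials look set
```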

Source: AWS Command Line Interface User Guide