I am running a PySpark job on a Google Dataproc cluster through Airflow.
The job downloads data from AWS S3 and, after processing, stores it in Google Cloud Storage. To let the executors on Google Dataproc access the S3 bucket, I store the AWS credentials in environment variables (appended to /etc/environment) through an initialization action while creating the Dataproc cluster.
Using boto3, I fetch the credentials and then set them on the Spark configuration:
import boto3

# Fetch AWS credentials through boto3's default credential chain
# (environment variables, shared config files, instance profile, ...)
boto3_session = boto3.Session()
aws_credentials = boto3_session.get_credentials()
aws_credentials = aws_credentials.get_frozen_credentials()
aws_access_key = aws_credentials.access_key
aws_secret_key = aws_credentials.secret_key

# Hand the keys to the s3a connector via the Hadoop configuration
spark._jsc.hadoopConfiguration().set("fs.s3a.secret.key", aws_secret_key)
spark._jsc.hadoopConfiguration().set("fs.s3a.access.key", aws_access_key)
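For context, the rest of the job then uses that configuration roughly like this (the bucket names and paths below are placeholders, not the real ones):

# Placeholder paths: read from S3 via the s3a connector and write the
# processed result to Google Cloud Storage
df = spark.read.json("s3a://source-bucket/path/to/logs/")
df.write.parquet("gs://target-bucket/path/to/output/")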
Initialization action file:
#!/usr/bin/env bash
# This script installs the required packages and configures the environment
wget https://bootstrap.pypa.io/get-pip.py
sudo python get-pip.py
sudo pip install boto3
sudo pip install google-cloud-storage
echo "AWS_ACCESS_KEY_ID=XXXXXXXXXXXXX" | sudo tee --append /etc/environment
echo "AWS_SECRET_ACCESS_KEY=xXXxXXXXXX" | sudo tee --append /etc/environment
source /etc/environment
But I am getting the following error, which means my Spark process could not pick up the credentials from the environment variables:
18/07/19 22:02:16 INFO org.spark_project.jetty.util.log: Logging initialized @2351ms
18/07/19 22:02:16 INFO org.spark_project.jetty.server.Server: jetty-9.3.z-SNAPSHOT
18/07/19 22:02:16 INFO org.spark_project.jetty.server.Server: Started @2454ms
18/07/19 22:02:16 INFO org.spark_project.jetty.server.AbstractConnector: Started ServerConnector@75b67e54{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/07/19 22:02:16 INFO com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase: GHFS version: 1.6.7-hadoop2
18/07/19 22:02:17 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at cluster-1-m/10.164.0.2:8032
18/07/19 22:02:19 INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl: Submitted application application_1532036330220_0004
ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/conf.dist/ivysettings.xml will be used
18/07/19 22:02:23 INFO DependencyResolver: ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/conf.dist/ivysettings.xml will be used
18/07/19 22:02:23 INFO hive.metastore: Trying to connect to metastore with URI thrift://cluster-1-m:9083
18/07/19 22:02:23 INFO hive.metastore: Connected to metastore.
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created local directory: /tmp/952f73b3-a59c-4a23-a04a-f05dc4e67d89_resources
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created HDFS directory: /tmp/hive/root/952f73b3-a59c-4a23-a04a-f05dc4e67d89
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created local directory: /tmp/root/952f73b3-a59c-4a23-a04a-f05dc4e67d89
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created HDFS directory: /tmp/hive/root/952f73b3-a59c-4a23-a04a-f05dc4e67d89/_tmp_space.db
18/07/19 22:02:24 INFO DependencyResolver: ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/conf.dist/ivysettings.xml will be used
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created local directory: /tmp/59c3fef5-6c9e-49c9-bf31-69634430e4e6_resources
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created HDFS directory: /tmp/hive/root/59c3fef5-6c9e-49c9-bf31-69634430e4e6
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created local directory: /tmp/root/59c3fef5-6c9e-49c9-bf31-69634430e4e6
18/07/19 22:02:24 INFO org.apache.hadoop.hive.ql.session.SessionState: Created HDFS directory: /tmp/hive/root/59c3fef5-6c9e-49c9-bf31-69634430e4e6/_tmp_space.db
Traceback (most recent call last):
File "/tmp/get_search_query_logs_620dea04/download_elk_event_logs.py", line 237, in <module>
aws_credentials = aws_credentials.get_frozen_credentials()
AttributeError: 'NoneType' object has no attribute 'get_frozen_credentials'
18/07/19 22:02:24 INFO org.spark_project.jetty.server.AbstractConnector: Stopped Spark@75b67e54{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
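A quick sanity check for this (a debugging sketch, not part of the job itself) is to print the variables from within the driver process:

import os

# If the variables appended to /etc/environment were visible to this
# process, these would not be None
print(os.environ.get("AWS_ACCESS_KEY_ID"))
print(os.environ.get("AWS_SECRET_ACCESS_KEY"))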
When I SSH into a Dataproc node and submit the job manually, the Spark job picks up the credentials and runs fine.
Can someone help me with this?
Answer 0 (score: 2)
After playing around in the Linux environment for a while, fetching the AWS credentials from environment variables through a boto3 session did not get me anywhere. So, following the boto3 documentation, I modified the initialization action script as follows:
echo "[default]" | sudo tee --append /root/.aws/config
echo "aws_access_key_id = XXXXXXXX" | sudo tee --append /root/.aws/config
echo "aws_secret_access_key = xxxxxxxx" | sudo tee --append /root/.aws/config
There are multiple ways for a boto3 Session to pick up AWS credentials, and one of them is the shared ~/.aws/config file.
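For completeness, a small sketch (with placeholder keys) of two of those ways: letting the session resolve credentials from the shared config file written above, or passing the keys to the session explicitly:

import boto3

# Option 1: let boto3 resolve credentials through its default chain,
# which now finds the /root/.aws/config written by the init action
session = boto3.Session()
credentials = session.get_credentials().get_frozen_credentials()

# Option 2: pass placeholder keys to the session explicitly
session = boto3.Session(
    aws_access_key_id="XXXXXXXX",
    aws_secret_access_key="xxxxxxxx",
)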