Passing AWS credentials in a Python script

Asked: 2017-07-20 09:24:53

Tags: python amazon-web-services boto3 amazon-ses

I have a Python script that is invoked by PHP. The user running the PHP script is apache, so apache ends up invoking the Python file as well, and it fails with "Unable to locate credentials". I set up default credentials via the awscli, and when I run the Python script as root it works.

Here is my line of code:

client = boto3.client('ses', region_name=awsregion, aws_access_key_id='AJHHJHJHJ', aws_secret_access_key='asdasd/asdasd/asd')

However, this gives an "invalid syntax" error, so I tried this:

client = boto3.Session(aws_access_key_id='ASDASD', aws_secret_access_key='asd/asdasd/asdasd')
client = boto3.client('ses', region_name=awsregion, aws_access_key_id='ASDASD', aws_secret_access_key='asd/asdasd/asdasd')

This gives the same error as above. The strange part is that this exact approach is mentioned in the documentation. Although it is not recommended, it should work.

Can someone help me fix this?

1 answer:

Answer 0: (score: 0)

Did you ever solve this? Here is how I connect with boto3 in my Python scripts:

import boto3
from botocore.exceptions import ClientError
import re
from io import BytesIO
import gzip
import datetime
import dateutil.parser as dparser
from datetime import datetime
import tarfile
import requests
import sys
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

## Needed glue stuff
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)

##
## currently this will run for everything that is in the staging directory of omniture

# set needed params
myProfileName = 'MyDataLake'
dhiBucket = 'data-lake'
#create boto3 session
try:
    session = boto3.Session(aws_access_key_id='aaaaaaaaaaaa',
                            aws_secret_access_key='abcdefghijklmnopqrstuvwxyz',
                            region_name='us-east-1')
    s3 = session.resource('s3') #establish connection to s3
except Exception as conne:
    print("Unable to connect: " + str(conne))
    errtxt = requests.post("https://errorcapturesite", data={'message': 'Unable to connect to : ' + myProfileName, 'notify': True, 'color': 'red'})
    print(errtxt.text)
    sys.exit()