I am trying to use the AWS Cognito service to authenticate and then upload a file. I have been given my regionType, identityPool, AWS account ID, and UnAuthRole, and I also know the names of the production and development buckets.
I think I am setting the AWS access key and AWS secret key... I want to authenticate with Cognito and use the resulting credentials to list buckets and, later on, to upload files.
What am I doing wrong? How do I establish an S3 connection using the Cognito identity?
Here is my code and the error it produces:
#!/usr/bin/python
import boto3
import boto
#boto.set_stream_logger('foo')
import json
# Obtain a Cognito identity ID from the identity pool (unauthenticated flow)
client = boto3.client('cognito-identity', 'us-east-1')
resp = client.get_id(AccountId='<ACCNTID>', IdentityPoolId='<IDPOOLID>')
print "\nIdentity ID: %s"%(resp['IdentityId'])
print "\nRequest ID: %s"%(resp['ResponseMetadata']['RequestId'])

# Get an OpenID token for that identity
resp = client.get_open_id_token(IdentityId=resp['IdentityId'])
token = resp['Token']
print "\nToken: %s"%(token)
print "\nIdentity ID: %s"%(resp['IdentityId'])

# Exchange the identity for temporary AWS credentials
resp = client.get_credentials_for_identity(IdentityId=resp['IdentityId'])
secretKey = resp['Credentials']['SecretKey']
accessKey = resp['Credentials']['AccessKeyId']
print "\nSecretKey: %s"%(secretKey)
print "\nAccessKey ID: %s"%(accessKey)
print resp

# Connect to S3 with (legacy) boto using the temporary access/secret keys
conn = boto.connect_s3(aws_access_key_id=accessKey, aws_secret_access_key=secretKey, debug=0)
print "\nConnection: %s"%(conn)
for bucket in conn.get_all_buckets():
    print bucket.name
Error:
Traceback (most recent call last):
File "./test.py", line 32, in <module>
for bucket in conn.get_all_buckets():
File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 440, in get_all_buckets
response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>InvalidAccessKeyId</Code><Message>The AWS Access Key Id you provided does not exist in our records.</Message><AWSAccessKeyId>ASIAILXMPZEMJAVZN7TQ</AWSAccessKeyId><RequestId>10631ACFF95610DD</RequestId><HostId>PGWDRBmhLjjv8Ast8v6kVHOG3xR8erJRV2ob3/2RmqHXwrg8HCZV578YsNLaoL24Hknr+nh033U=</HostId></Error>
The corresponding iOS code works fine:
AWSCognitoCredentialsProvider *credentialsProvider =
[AWSCognitoCredentialsProvider credentialsWithRegionType:awsCognitoRegionType
accountId:awsAccountId
identityPoolId:awsCognitoIdentityPool
unauthRoleArn:unauthRoleArn
authRoleArn:nil];
AWSServiceConfiguration *configuration = [AWSServiceConfiguration configurationWithRegion:awsCognitoRegionType
credentialsProvider:credentialsProvider];
....
AWSS3TransferManagerUploadRequest *uploadRequest = [AWSS3TransferManagerUploadRequest new];
uploadRequest.bucket = [ELEEnvironment currentEnvironment].userDataS3Bucket;
uploadRequest.key = key;
uploadRequest.body = uploadFileURL;
[[self uploadTask:uploadRequest] continueWithExecutor:[BFExecutor mainThreadExecutor]...
Thanks for your help!
Answer 0 (score: 2):
This question is actually invalid: authentication was not failing when the session was created, but only when trying to list all buckets.
Uploading to and downloading from a specific bucket work fine with the code above; it just does not work for listing all buckets.
# Upload a new file
data = open('test.jpg', 'rb')
s3.Bucket('mybucket').put_object(Key='test.jpg', Body=data)
# S3 Object
obj = s3.Object(bucket_name='mybucket', key='test.jpg')
response = obj.get()
data = response['Body'].read()
print len(data)
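The s3 object used above is a boto3 S3 resource, which the snippet does not show being created. A minimal sketch of building it from the temporary Cognito credentials obtained in the question (reusing the resp, accessKey, and secretKey variables from there, plus the session token; the region is an assumption) might look like this:
import boto3

# Sketch: build a boto3 S3 resource from the temporary Cognito credentials.
# Temporary credentials are only valid together with their session token.
s3 = boto3.resource('s3',
                    region_name='us-east-1',
                    aws_access_key_id=accessKey,
                    aws_secret_access_key=secretKey,
                    aws_session_token=resp['Credentials']['SessionToken'])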
Answer 1 (score: 1):
PhilBot, I am not sure why your original code example uses boto (rather than boto3) to connect to S3, while it uses boto3 to connect to Cognito. As of now boto3 is stable, and there is probably not much reason to keep using boto. (Perhaps boto3 was not as stable as it is today when you originally posted your question.)
When I tried to use your code to connect to Kinesis with boto3, it did not work; I had to pass response["Credentials"]["SessionToken"] as aws_session_token to the client() function.
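A minimal sketch of the same idea applied to S3, reusing the unauthenticated Cognito flow from the question (the account and pool IDs are placeholders and the region is an assumption), might look like:
import boto3

# Get an identity and temporary credentials from the Cognito identity pool.
cognito = boto3.client('cognito-identity', region_name='us-east-1')
identity_id = cognito.get_id(AccountId='<ACCNTID>',
                             IdentityPoolId='<IDPOOLID>')['IdentityId']
creds = cognito.get_credentials_for_identity(IdentityId=identity_id)['Credentials']

# The temporary credentials are only accepted when the SessionToken is passed as well.
s3 = boto3.client('s3',
                  region_name='us-east-1',
                  aws_access_key_id=creds['AccessKeyId'],
                  aws_secret_access_key=creds['SecretKey'],
                  aws_session_token=creds['SessionToken'])
# s3 can now be used for whatever operations the unauthenticated role allows.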
Answer 2 (score: 0):
This is your error:
File "./test.py", line 32, in <module>
bucket = conn.get_bucket("elektradevbucket")
and this is the part of your code where you reference the bucket:
bucket = conn.get_bucket("testbucket")
For reference, listing buckets directly with boto3 looks like this:
s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)
s3.Bucket('testbucket')
Are you sure you are running or calling the right script?
Best, -Iulian