Can someone help me use an `aws lambda invoke` CLI call to move multiple files from one S3 folder to another?

Updated question

Requirements:

Existing process: an S3 event notification is configured on the source bucket, with a prefix enabled for a specific folder. The Lambda code (pasted below) works.

Problem: if I have to invoke the Lambda function manually for a given day, how do I pass the key, since there are multiple files (and therefore multiple keys)?

Existing command for a single file:
aws lambda invoke --function-name abcdeffe_job --payload '{"Records":[{"eventTime": "2020-03-07T23:38:16.762Z","s3":{"bucket": {"name": "xxxx-test"},"object": {"key": "lambda-test/account.csv"}}}]}' abc.txt
I have multiple account files, e.g. account_1.csv, account_2.csv, etc. How do I pass the keys here?
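Since the handler only reads `Records[0]`, one option is to invoke the function once per file, building the same single-record event each time. A minimal sketch, reusing the bucket, keys, and function name from the question above; the actual `invoke` call is left commented because it needs AWS credentials:

```python
import json

def build_s3_event(bucket, key, event_time="2020-03-07T23:38:16.762Z"):
    # Build the minimal S3 put-event shape that lambda_handler reads
    return {"Records": [{"eventTime": event_time,
                         "s3": {"bucket": {"name": bucket},
                                "object": {"key": key}}}]}

keys = ["lambda-test/account_1.csv", "lambda-test/account_2.csv"]
for key in keys:
    payload = json.dumps(build_s3_event("xxxx-test", key))
    print(payload)
    # boto3.client("lambda").invoke(FunctionName="abcdeffe_job",
    #                               InvocationType="Event", Payload=payload)
```

Each printed payload is exactly what the existing `aws lambda invoke --payload '...'` command passes for a single file.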
Lambda code:
import json
import boto3
import urllib.parse
import time

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    file_event_time = event['Records'][0]['eventTime']
    print("file_event_time :", file_event_time)
    ts = time.strptime(file_event_time[:19], "%Y-%m-%dT%H:%M:%S")
    dt = time.strftime("%Y-%m-%d", ts)

    # Bucket where the file was uploaded
    source_bucket_name = event['Records'][0]['s3']['bucket']['name']
    print("source_bucket_name : ", source_bucket_name)

    # Bucket where the file is to be copied
    destination_bucket_name = 'test'

    # Key of the object (with path)
    # file_key_name = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    file_key_name = event['Records'][0]['s3']['object']['key']
    print("file_key_name : ", file_key_name)
    file_name = file_key_name.split("/")[1]
    print("file_name : ", file_name)

    # Copy source object
    copy_source_object = {'Bucket': source_bucket_name, 'Key': file_key_name}
    print("The target folder to be created for :", dt)
    destination_file_path = "art_jobs" + "/" + dt + "/" + file_name
    print("destination_file_path : ", destination_file_path)

    try:
        response = s3_client.get_object(Bucket=source_bucket_name, Key=file_key_name)
        print("response :", response)
        s3_client.copy_object(CopySource=copy_source_object, Bucket=destination_bucket_name, Key=destination_file_path)
    except Exception as e:
        print(e)
        raise e
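For reference, the handler's date-folder logic can be isolated as a pure function and checked without AWS. The handler indexes `split("/")[1]`, which assumes exactly one folder level; `[-1]` gives the same result in that case:

```python
import time

def destination_path(event_time, file_key_name, top_folder="art_jobs"):
    # Same logic as the handler: the date folder is derived from eventTime
    ts = time.strptime(event_time[:19], "%Y-%m-%dT%H:%M:%S")
    dt = time.strftime("%Y-%m-%d", ts)
    file_name = file_key_name.split("/")[-1]
    return top_folder + "/" + dt + "/" + file_name

print(destination_path("2020-03-07T23:38:16.762Z", "lambda-test/account.csv"))
# → art_jobs/2020-03-07/account.csv
```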
Answer 0 (score: 0)
import json
import boto3

s3_resource = boto3.resource("s3")
Lambda_Client = boto3.client("lambda")
Target_Bucket = s3_resource.Bucket("Bucket-Name-source")

# Invoke the function once per object, passing that object's key in the event
for My_Bucket_Object in Target_Bucket.objects.all():
    Sample_Event = {"Records": [{"s3": {"bucket": {"name": "Bucket-Name-source"},
                                        "object": {"key": My_Bucket_Object.key}}}]}
    Lambda_Client.invoke(FunctionName="Fn-Name", InvocationType='Event',
                         Payload=json.dumps(Sample_Event))

# Inside the function: copy the object, then delete the original
sourceLoc = "Bucket-Name-source/Key.csv"
s3_resource.Object("Bucket-Name-destination", "Key.csv").copy_from(CopySource=sourceLoc)
response = s3_resource.meta.client.delete_object(Bucket='Bucket-Name-source', Key="Key.csv")
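Iterating over `objects.all()` hits every key in the bucket. To reprocess only one folder's files, you can filter by prefix before invoking. A sketch assuming the question's bucket, prefix, and function names; the `boto3` import is deferred so the pure helper runs without AWS credentials, and the hard-coded `eventTime` is an assumption that controls which dated folder the handler writes to:

```python
import json

def keys_for_day(all_keys, prefix):
    # Pure helper: pick the .csv objects under one prefix
    return [k for k in all_keys if k.startswith(prefix) and k.endswith(".csv")]

def reprocess(bucket="xxxx-test", prefix="lambda-test/", fn="abcdeffe_job"):
    import boto3  # deferred: only needed when actually calling AWS
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    keys = keys_for_day([o["Key"] for o in resp.get("Contents", [])], prefix)
    lam = boto3.client("lambda")
    for key in keys:
        event = {"Records": [{"eventTime": "2020-03-07T23:38:16.762Z",
                              "s3": {"bucket": {"name": bucket},
                                     "object": {"key": key}}}]}
        lam.invoke(FunctionName=fn, InvocationType="Event",
                   Payload=json.dumps(event))

print(keys_for_day(["lambda-test/account_1.csv", "other/x.csv"], "lambda-test/"))
# → ['lambda-test/account_1.csv']
```

Note that `list_objects_v2` returns at most 1000 keys per call; for larger folders, use a paginator.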