Uploading a log file to an S3 bucket?

Posted: 2020-10-06 15:21:27

Tags: python-3.x amazon-s3 logging aws-glue

I have a program written in Python that runs fine. However, I decided to add some logging to track progress and write each step to a log file. Since this is my first time using Python's logging library, I keep running into problems. The goal is to record the steps in a file and then upload that file to S3. What am I missing? Please see the code below:

import io
import logging
import time

import boto3

start_time = time.time()
logging.basicConfig(filename='myLogFile.log', format='%(asctime)s %(levelname)s %(name)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S', level=logging.INFO)
logger = logging.getLogger("GlueJob")
logging.info("Program started ....")
logger.setLevel(logging.INFO)
log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
logger.addHandler(handler)

# Do something .................
logging.info("List has all objects from S3 ... good")

# Do something ........................
logging.info("All created lists are populated with elements from S3 ... good")

# Do something ...........................
logging.info("Dictionary and Dataframe has been created ... good")

# Do something .......................
logging.info("Convert dataframe to csv ... good")

# Here is the problem ....... the log file is not uploading to S3. What am I missing?
s3 = boto3.resource('s3')
s3.Bucket('my-bucket').upload_file(Filename='myLogFile.log', Key='/Asset_Filename_Database/folder1/folder2/myLogFile.log')

print("Process Finished --- %s seconds ---" % (time.time() - start_time))

Thanks!!!

1 answer:

Answer 0 (score: 0)

When you start the key name with a /, as in Key='/Asset_Filename_Database/, S3 creates a folder with an empty name; use Key='Asset_Filename_Database instead.
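One defensive way to guard against this (a minimal sketch, not part of the original answer; the helper name normalize_key is hypothetical) is to strip any leading slashes from the key before uploading:

```python
def normalize_key(key):
    """Strip leading slashes so S3 does not create a nameless top-level 'folder'."""
    return key.lstrip('/')

# The cleaned key keeps the rest of the path intact.
print(normalize_key('/Asset_Filename_Database/folder1/folder2/myLogFile.log'))
# → Asset_Filename_Database/folder1/folder2/myLogFile.log

# The cleaned key can then be passed to the upload call, e.g.:
# s3.Bucket('my-bucket').upload_file(Filename='myLogFile.log', Key=normalize_key(key))
```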


I tried running this example with all three approaches (object, client, and bucket), and it worked for me.

import logging
import io
import time
import boto3

start_time = time.time()
logging.basicConfig(filename='myLogFile.log', format='%(asctime)s %(levelname)s %(name)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S', level=logging.INFO)
logger = logging.getLogger("GlueJob")
logging.info("Program started ....")
logger.setLevel(logging.INFO)
log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
logger.addHandler(handler)

logging.info("List has all objects from S3 ... good")
logging.info("All created lists are populated with elements from S3 ... good")
logging.info("Dictionary and Dataframe has been created ... good")
logging.info("Convert dataframe to csv ... good")

s3 = boto3.resource('s3')
s3_client = boto3.client('s3')

s3_client.upload_file('myLogFile.log', 'test-kayd-bucket', 'client/myLogFile.log')

s3.Object('test-kayd-bucket', 'object/myLogFile.log').upload_file('myLogFile.log')
s3.Bucket('test-kayd-bucket').upload_file(Filename='myLogFile.log',  Key='bucket/myLogFile.log1')

print("Process Finished --- %s seconds ---" % (time.time() - start_time))