Writing the contents of S3 to CSV

Date: 2017-02-13 16:52:21

Tags: python amazon-s3 boto

I am writing a script to pull my S3 data down to my local machine. The data I receive is usually Hive-partitioned. Even though the file exists, I get a No such file or directory error. Can someone explain what I am doing wrong and what I should do differently? Here is the code snippet the error refers to:

    bucket = conn.get_bucket(bucket_name)
    for sub in bucket.list(prefix='some_prefix'):
        matched = re.search(re.compile(read_key_pattern), sub.name)
        if matched:
            with open(sub.name, 'rb') as fin:
                reader = csv.reader(fin, delimiter='\x01')
                contents = [line for line in reader]
            with open('output.csv', 'wb') as fout:
                writer = csv.writer(fout, quotechar='', quoting=csv.QUOTE_NONE, escapechar='\\')
                writer.writerows(contents)

    IOError: [Errno 2] No such file or directory: 'my_prefix/54c91e35-4dd0-4da6-a7b7-283dff0f4483-000000'

The file exists, and this is the correct folder and file that I am trying to retrieve.

1 Answer:

Answer 0 (score: 1)

As @roganjosh said, it looks like you never downloaded the file after testing that the name matched: `bucket.list` returns remote S3 keys, not local paths, so passing `sub.name` to `open` fails. I've added comments below to show you how to handle the file in memory in Python 2:

    from io import StringIO # alternatively use BytesIO
    import contextlib

    bucket = conn.get_bucket(bucket_name)
    # use re.compile outside of the for loop
    # it has slightly better performance characteristics
    matcher = re.compile(read_key_pattern)

    for sub in bucket.list(prefix = 'some_prefix'):
        # bucket.list returns an iterator over s3.Key objects
        # so we can use `sub` directly as the Key object
        matched = matcher.search(sub.name)
        if matched:
            # download the file to an in-memory buffer
            with contextlib.closing(StringIO()) as fp:
                sub.get_contents_to_file(fp)
                fp.seek(0)
                # read straight from the memory buffer
                reader = csv.reader(fp, delimiter = '\x01')
                contents = [line for line in reader]
            with open('output.csv', 'wb') as fout:
                writer = csv.writer(fout, quotechar = '', quoting = csv.QUOTE_NONE, escapechar = '\\')
                writer.writerows(contents)

For Python 3, you would need to change the `with` statement as discussed in the comments on the answer for this question.
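As a rough sketch of that Python 3 change: `csv.reader` expects text there, so the downloaded bytes buffer needs to be wrapped in an `io.TextIOWrapper` (the sample data below is made up for illustration; with boto you would fill the buffer via `sub.get_contents_to_file(buf)`):

```python
import csv
import io

# Simulate the downloaded S3 object with a bytes buffer (hypothetical data);
# with boto you would populate it with sub.get_contents_to_file(buf).
buf = io.BytesIO(b"a\x01b\x01c\nd\x01e\x01f\n")
buf.seek(0)

# Wrap the byte stream so csv.reader receives str lines; newline='' is the
# setting the csv module documentation recommends.
with io.TextIOWrapper(buf, encoding="utf-8", newline="") as text:
    reader = csv.reader(text, delimiter="\x01")
    contents = [row for row in reader]

# In Python 3 the output file is opened in text mode ('w', newline=''),
# not 'wb' as in Python 2, and quotechar='' is no longer accepted, so it
# is simply omitted here.
with open("output.csv", "w", newline="") as fout:
    writer = csv.writer(fout, quoting=csv.QUOTE_NONE, escapechar="\\")
    writer.writerows(contents)
```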