Saving a dataframe to S3 in Feather format

Asked: 2018-02-07 13:36:23

Tags: python python-3.x amazon-s3 feather

I have a dataframe, say:

import pandas as pd
df = pd.DataFrame({'a': [1, 4], 'b': [1, 3]})

I want to save it to S3 as a Feather file, but I can't find a working way to do it.

I tried using s3bp and s3fs, but they didn't do the trick.

Any suggestions?

3 answers:

Answer 0 (score: 0)

You can use storefact / simplekv without writing to disk:

import pyarrow as pa
from pyarrow.feather import write_feather
import storefact

df = …
store = storefact.get_store('hs3', host="…", bucket="…", access_key="…", secret_key="…")

# Serialize the dataframe into an in-memory Arrow buffer
buf = pa.BufferOutputStream()
write_feather(df, buf)

# Push the raw bytes to the store (the original had a typo: `storage` instead of `store`)
store.put('filename.feather', buf.getvalue().to_pybytes())

Answer 1 (score: 0)

The solution that worked for me:

import boto3
import pandas as pd

from io import BytesIO
from pyarrow.feather import write_feather

df = pd.DataFrame({'a': [1, 4], 'b': [1, 3]})

s3_resource = boto3.resource('s3')
with BytesIO() as f:
    write_feather(df, f)
    s3_resource.Object('bucket-name', 'file_name').put(Body=f.getvalue())

Answer 2 (score: 0)

A simple solution using only PyArrow and Pandas:

import pandas as pd
import pyarrow.feather
from pyarrow import fs  # pyarrow.fs must be imported explicitly; `pa.fs` alone raises AttributeError

s3 = fs.S3FileSystem(region='us-east-1')

df = pd.DataFrame(data={'col1': [1, 2], 'col2': [3, 4]})

with s3.open_output_stream('my-bucket/path/to.feather') as f:
    pyarrow.feather.write_feather(df, f)