Pandas: continuously write to csv from a function

Date: 2015-06-27 15:03:57

Tags: python pandas

I have set up a function with Pandas that iterates over a large number of rows in input.csv and feeds the results into a Series. It then writes the Series to output.csv.

However, if the process is interrupted (for example by an unexpected event), the program terminates and all data that would have gone into the csv is lost.

Is there a way to write the data to the csv continuously, whether or not the function finishes for all rows?

For what it's worth, a blank output.csv is created each time the program starts, and it is appended to while the function runs.

import pandas as pd

df = pd.read_csv("read.csv")

def crawl(a):
    #Create x, y
    return pd.Series([x, y])

df[["Column X", "Column Y"]] = df["Column A"].apply(crawl)
df.to_csv("write.csv", index=False)

3 Answers:

Answer 0 (score: 13):

Here is one possible solution that appends the data to a new file as it reads the csv in chunks. If the process is interrupted, the new file will contain all the information read before the interruption.

import pandas as pd

#csv file to be read in 
in_csv = '/path/to/read/file.csv'

#csv to write data to 
out_csv = 'path/to/write/file.csv'

#get the number of lines of the csv file to be read
with open(in_csv) as f:
    number_lines = sum(1 for row in f)

#size of chunks of data to write to the csv
chunksize = 10

#start looping through data writing it to a new file for each chunk
for i in range(1, number_lines, chunksize):
    df = pd.read_csv(in_csv,
        header=None,
        nrows=chunksize,  #number of rows to read at each loop
        skiprows=i)       #skip rows that have already been read

    df.to_csv(out_csv,
        index=False,
        header=False,
        mode='a',             #append data to csv file
        chunksize=chunksize)  #size of data to append for each loop

Answer 1 (score: 3):

In the end, this is what I came up with. Thanks for the help!

import pandas as pd

df1 = pd.read_csv("read.csv")

run = 0

def crawl(a):

    global run
    run = run + 1

    #Create x, y

    df2 = pd.DataFrame([[x, y]], columns=["X", "Y"])

    if run == 1:
        df2.to_csv("output.csv")
    else:
        df2.to_csv("output.csv", header=False, mode="a")

df1["Column A"].apply(crawl)
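A small variation on the same idea (my suggestion, not from the original answer): checking whether the output file already exists with `os.path.exists`, instead of keeping a global counter, moves the header logic inside the writing function:

```python
import os
import pandas as pd

PATH = "output_alt.csv"  # hypothetical file name for this sketch
if os.path.exists(PATH):
    os.remove(PATH)      # start from a blank file, as in the question

def append_row(x, y):
    # write the header only on the very first append
    first = not os.path.exists(PATH)
    pd.DataFrame([[x, y]], columns=["X", "Y"]).to_csv(
        PATH, mode="a", header=first, index=False)

append_row(1, 2)
append_row(3, 4)
```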

Answer 2 (score: 1):

I found a solution to a similar problem by looping over the dataframe with iterrows() and saving each row to the csv file. In your case it could look like this:

for ix, row in df.iterrows():
    # assign through df, since `row` is a copy and changes to it are lost
    df.loc[ix, 'Column A'] = crawl(row['Column A'])

    # if you wish to maintain the header
    if ix == 0:
        df.iloc[ix:ix + 1].to_csv('output.csv', mode='a', index=False, sep=',', encoding='utf-8')
    else:
        df.iloc[ix:ix + 1].to_csv('output.csv', mode='a', index=False, sep=',', encoding='utf-8', header=False)