Efficient way to combine pandas dataframes row-wise

Asked: 2016-07-07 13:00:27

Tags: python numpy pandas

I have 14 dataframes, each with 14 columns and more than 250,000 rows. The dataframes all have the same column headers, and I want to merge them row-wise. I tried concatenating the dataframes onto a "growing" DataFrame, and it took several hours.

Basically, I did the following 13 times:

DF = pd.DataFrame()
for i in range(13):   
    DF = pd.concat([DF, subDF])

A stackoverflow answer here suggested appending all the sub-dataframes to a list and then concatenating that list of sub-dataframes.

That sounds like it would be doing something like this:

DF = pd.DataFrame()
lst = [subDF, subDF, subDF....subDF] #up to 13 times
for subDF in lst:
    DF = pd.concat([DF, subDF])

Aren't they the same thing? Or maybe I'm misunderstanding the suggested workflow. This is what I tested:

import numpy
import pandas as pd
import timeit

def test1():
    "make all subDF and then concatenate them"
    numpy.random.seed(1)
    subDF = pd.DataFrame(numpy.random.rand(1))
    lst = [subDF, subDF, subDF]
    DF = pd.DataFrame()
    for subDF in lst:
        DF = pd.concat([DF, subDF], axis=0, ignore_index=True)

def test2():
    "add each subDF to the collecitng DF as you're making the subDF"
    numpy.random.seed(1)
    DF = pd.DataFrame()
    for i in range(3):
        subDF = pd.DataFrame(numpy.random.rand(1))
        DF = pd.concat([DF, subDF], axis=0, ignore_index=True)

print('test1() takes {0} sec'.format(timeit.timeit(test1, number=1000)))
print('test2() takes {0} sec'.format(timeit.timeit(test2, number=1000)))

>> Output

test1() takes 12.732409087137057 sec
test2() takes 15.097430311612698 sec

Any advice on an efficient way to concatenate multiple large dataframes would be appreciated. Thanks!

1 Answer:

Answer 0: (score: 6)

Create a list containing all of your dataframes:

dfs = []
for i in range(13):
    df = ... # However it is that you create your dataframes   
    dfs.append(df)

Then concatenate them all in one go:

merged = pd.concat(dfs) # add ignore_index=True if appropriate

This is much faster than your code, because it creates 14 dataframes in total (your original 13 plus merged), whereas your code creates 26 (your original 13 plus 13 intermediate merges), and each of those intermediate merges copies all of the rows accumulated so far.
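
As a concrete illustration of the same idea, here is a minimal sketch. It assumes, purely for the example, that the 13 sub-dataframes live in hypothetical CSV files named part_0.csv through part_12.csv; replace the loading step with however you actually build them:

import pandas as pd

# Hypothetical input files, used only for illustration
dfs = [pd.read_csv('part_{0}.csv'.format(i)) for i in range(13)]
merged = pd.concat(dfs, ignore_index=True)  # pd.concat is called exactly once

The important point is that pd.concat runs once, on the whole list, rather than once per sub-dataframe.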

Edit:

Here is a variation on your test code.

import numpy
import pandas as pd
import timeit

def test_gen_time():
    """Create three large dataframes, but don't concatenate them"""
    for i in range(3):
        df = pd.DataFrame(numpy.random.rand(10**6))

def test_sequential_concat():
    """Create three large dataframes, concatenate them one by one"""
    DF = pd.DataFrame()
    for i in range(3):
        df = pd.DataFrame(numpy.random.rand(10**6))
        DF = pd.concat([DF, df], ignore_index=True)

def test_batch_concat():
    """Create three large dataframes, concatenate them at the end"""
    dfs = []
    for i in range(3):
        df = pd.DataFrame(numpy.random.rand(10**6))
        dfs.append(df)
    DF = pd.concat(dfs, ignore_index=True)

print('test_gen_time() takes {0} sec'
          .format(timeit.timeit(test_gen_time, number=200)))
print('test_sequential_concat() takes {0} sec'
          .format(timeit.timeit(test_sequential_concat, number=200)))
print('test_batch_concat() takes {0} sec'
          .format(timeit.timeit(test_batch_concat, number=200)))

Output:

test_gen_time() takes 10.095820872998956 sec
test_sequential_concat() takes 17.144756617000894 sec
test_batch_concat() takes 12.99131180600125 sec

The lion's share of the time goes to generating the dataframes. Batch concatenation takes roughly 2.9 seconds; sequential concatenation takes more than 7 seconds.
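
If you want to isolate the cost of the concatenation itself, one further variation (a sketch along the same lines, not part of the original timing code) pre-builds the three dataframes once and times only the pd.concat calls:

import numpy
import pandas as pd
import timeit

# Build the three large dataframes once, outside the timed functions
dfs = [pd.DataFrame(numpy.random.rand(10**6)) for _ in range(3)]

def concat_sequential():
    """Concatenate one by one into a growing DataFrame"""
    DF = pd.DataFrame()
    for df in dfs:
        DF = pd.concat([DF, df], ignore_index=True)

def concat_batch():
    """Concatenate the whole list in a single call"""
    pd.concat(dfs, ignore_index=True)

print('concat_sequential() takes {0} sec'
          .format(timeit.timeit(concat_sequential, number=200)))
print('concat_batch() takes {0} sec'
          .format(timeit.timeit(concat_batch, number=200)))

The two numbers should show the same ordering as the estimates above, with the sequential version clearly slower.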