Aggregation in a MultiIndex

Date: 2017-10-19 19:44:52

Tags: python pandas pandas-groupby

I have a DataFrame with a two-level column index. I need different aggregation functions for the two keys (columns). However, my code raises an error. How can I aggregate over multiple columns in a multi-level DataFrame?

import numpy as np
import pandas as pd
from pandas import Period

nan = np.nan

dic1 = {('count', 'N.A.'): {Period('1993-01', 'M'): 0,
  Period('1993-02', 'M'): 0,
  Period('1993-03', 'M'): 0},
 ('count', 'No'): {Period('1993-01', 'M'): 1,
  Period('1993-02', 'M'): 1,
  Period('1993-03', 'M'): 1},
 ('count', 'Yes'): {Period('1993-01', 'M'): 0,
  Period('1993-02', 'M'): 0,
  Period('1993-03', 'M'): 0},
 ('sum', 'N.A.'): {Period('1993-01', 'M'): nan,
  Period('1993-02', 'M'): nan,
  Period('1993-03', 'M'): nan},
 ('sum', 'No'): {Period('1993-01', 'M'): 6.5820000000000007,
  Period('1993-02', 'M'): 131.1865,
  Period('1993-03', 'M'): 133.31049999999999},
 ('sum', 'Yes'): {Period('1993-01', 'M'): nan,
  Period('1993-02', 'M'): nan,
  Period('1993-03', 'M'): nan}}

df1 = pd.DataFrame(dic1)

df1.to_timestamp(how='end').groupby(pd.TimeGrouper('A')).agg(
    {'count': ['max', 'min', 'median', 'last'],
     'sum': ['mean', 'max', 'last']})

Error: KeyError: 'sum'


2 Answers:

Answer 0 (score: 2):

A hacky approach is to build the aggregation dict over the actual tuple column labels, i.e. to pick out all the count and sum columns separately:

In [11]: agg_dict = {col: ['mean', 'max' , 'median', 'last'] for col in df1.columns[df1.columns.get_level_values(0) == "count"]}

In [12]: agg_dict.update({col: ['mean', 'max' , 'last'] for col in df1.columns[df1.columns.get_level_values(0) == "sum"]})

In [13]: g = df1.to_timestamp(how='end').groupby(pd.TimeGrouper('A') )

In [14]: g.agg(agg_dict)
Out[14]:
            sum                                                       count
           N.A.                  No                      Yes           N.A.                   No                  Yes
           mean max last       mean       max      last mean max last  mean max median last mean max median last mean max median last
1993-12-31  NaN NaN  NaN  90.359667  133.3105  133.3105  NaN NaN  NaN     0   0      0    0    1   1      1    1    0   0      0    0
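For reference (not shown in the original answer), agg_dict ends up mapping each full tuple column label to its list of functions; tuple keys like ('sum', 'No') are what .agg can match when the columns form a MultiIndex, which appears to be why the plain 'count'/'sum' keys in the question fail with KeyError: 'sum'.

In [15]: agg_dict
Out[15]:
{('count', 'N.A.'): ['mean', 'max', 'median', 'last'],
 ('count', 'No'): ['mean', 'max', 'median', 'last'],
 ('count', 'Yes'): ['mean', 'max', 'median', 'last'],
 ('sum', 'N.A.'): ['mean', 'max', 'last'],
 ('sum', 'No'): ['mean', 'max', 'last'],
 ('sum', 'Yes'): ['mean', 'max', 'last']}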

Answer 1 (score: 1):

You can flatten the MultiIndex columns before grouping:

df1 = pd.DataFrame(dic1)
df2 = df1.to_timestamp(how='end')
df2 = df2.rename_axis(['operation', 'YN'], axis=1)
df3 = df2.stack(level='YN').reset_index('YN')
# operation     YN  count       sum
# 1993-01-31  N.A.      0       NaN
# 1993-01-31    No      1    6.5820
# 1993-01-31   Yes      0       NaN
# 1993-02-28  N.A.      0       NaN
# 1993-02-28    No      1  131.1865
# 1993-02-28   Yes      0       NaN
# 1993-03-31  N.A.      0       NaN
# 1993-03-31    No      1  133.3105
# 1993-03-31   Yes      0       NaN

Once the YN column-index level has been moved into a regular column (by calling stack/reset_index), you can solve the problem in the usual way:

import numpy as np
import pandas as pd
Period = pd.Period
nan = np.nan

dic1 = {('count', 'N.A.'): {Period('1993-01', 'M'): 0, Period('1993-02', 'M'): 0, Period('1993-03', 'M'): 0},
        ('count', 'No'): {Period('1993-01', 'M'): 1, Period('1993-02', 'M'): 1, Period('1993-03', 'M'): 1},
        ('count', 'Yes'): {Period('1993-01', 'M'): 0, Period('1993-02', 'M'): 0, Period('1993-03', 'M'): 0},
        ('sum', 'N.A.'): {Period('1993-01', 'M'): nan, Period('1993-02', 'M'): nan, Period('1993-03', 'M'): nan},
        ('sum', 'No'): {Period('1993-01', 'M'): 6.5820000000000007, Period('1993-02', 'M'): 131.1865, Period('1993-03', 'M'): 133.31049999999999},
        ('sum', 'Yes'): {Period('1993-01', 'M'): nan, Period('1993-02', 'M'): nan, Period('1993-03', 'M'): nan}}

df1 = pd.DataFrame(dic1)
df2 = df1.to_timestamp(how='end')
df2 = df2.rename_axis(['operation', 'YN'], axis=1)
df3 = df2.stack(level='YN').reset_index('YN')

grouped = df3.groupby([pd.TimeGrouper('A'), 'YN'])
result = grouped.agg(
    {'count':['max', 'min', 'median', 'last'],  'sum':['mean', 'max' , 'last']})
result = result.unstack('YN')
print(result)

which yields

            sum                                                      count  \
           mean                 max               last                 max   
YN         N.A.         No Yes N.A.        No Yes N.A.        No Yes  N.A.   
1993-12-31  NaN  90.359667 NaN  NaN  133.3105 NaN  NaN  133.3105 NaN     0   

           ...                                            
           ...      min        median        last         
YN         ... Yes N.A. No Yes   N.A. No Yes N.A. No Yes  
1993-12-31 ...   0    0  1   0      0  1   0    0  1   0
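A further sketch (not from either answer above): since .agg's dict keys must correspond to actual column labels, and with MultiIndex columns those labels are tuples such as ('sum', 'No'), another workaround is to select each top-level block first and aggregate it on its own. The snippet below assumes the df1 from the question and uses pd.Grouper(freq='A'), which replaces pd.TimeGrouper in newer pandas versions:

ts = df1.to_timestamp(how='end')

# Select each first-level block ('count' / 'sum'), aggregate it separately
# with the function lists from the question, then glue the results back
# together under a new outer column level.
counts = ts['count'].groupby(pd.Grouper(freq='A')).agg(['max', 'min', 'median', 'last'])
sums = ts['sum'].groupby(pd.Grouper(freq='A')).agg(['mean', 'max', 'last'])
result2 = pd.concat({'count': counts, 'sum': sums}, axis=1)
print(result2)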