I am trying to compute a weighted standard deviation on top of the weighted average for a pandas DataFrame. I have a pandas DataFrame like:
import numpy as np
import pandas as pd
df = pd.DataFrame({"Date": pd.date_range(start='2018-01-01', end='2018-01-03 18:00:00', freq='6H'),
                   "Weight": np.random.uniform(3, 5, 12),
                   "V1": np.random.uniform(10, 15, 12),
                   "V2": np.random.uniform(10, 15, 12),
                   "V3": np.random.uniform(10, 15, 12)})
Currently, to get the weighted average (inspired by this post), I am doing the following:
def weighted_average_std(grp):
    return grp._get_numeric_data().multiply(grp['Weight'], axis=0).sum() / grp['Weight'].sum()
df.index = df["Date"]
df_agg = df.groupby(pd.Grouper(freq='1D')).apply(weighted_average_std).reset_index()
df_agg
I get the following:
Date V1 V2 V3 Weight
0 2018-01-01 11.421749 13.090178 11.639424 3.630196
1 2018-01-02 12.142917 11.605284 12.187473 4.056303
2 2018-01-03 12.034015 13.159132 11.658969 4.318753
I would like to modify weighted_average_std so that it also returns the standard deviation of each column in addition to the weighted average. The idea is to use the weighted average of each group in a vectorized way. The new column names for the weighted standard deviation could be something like V1_WSD, V2_WSD and V3_WSD.
PS1: This post goes through the theory of weighted standard deviation.
PS2: The Weight column in df_agg is meaningless.
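For clarity, this is the formula I have in mind, as a small NumPy sketch with made-up numbers:

import numpy as np

x = np.array([10.0, 12.0, 14.0])  # values
w = np.array([3.0, 4.0, 5.0])     # weights

wmean = np.average(x, weights=w)                 # sum(w * x) / sum(w)
wvar = np.average((x - wmean) ** 2, weights=w)   # weighted average of squared deviations
wstd = np.sqrt(wvar)                             # weighted standard deviation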
Answer (score: 1)
You can use EOL's NumPy-based code to compute the weighted average and standard deviation. To use it in a Pandas groupby/apply operation, make weighted_average_std return a DataFrame:
import numpy as np
import pandas as pd
def weighted_average_std(grp):
    """
    Based on http://stackoverflow.com/a/2415343/190597 (EOL)
    """
    # Keep only the numeric columns and split off the weights.
    tmp = grp.select_dtypes(include=[np.number])
    weights = tmp['Weight']
    values = tmp.drop('Weight', axis=1)
    # Weighted mean of each column, then the weighted average of the
    # squared deviations from it (biased weighted variance).
    average = np.ma.average(values, weights=weights, axis=0)
    variance = np.dot(weights, (values - average) ** 2) / weights.sum()
    std = np.sqrt(variance)
    # Return one row per value column so groupby/apply builds a MultiIndex.
    return pd.DataFrame({'mean': average, 'std': std}, index=values.columns)
np.random.seed(0)
df = pd.DataFrame({
    "Date": pd.date_range(start='2018-01-01', end='2018-01-03 18:00:00', freq='6H'),
    "Weight": np.random.uniform(3, 5, 12),
    "V1": np.random.uniform(10, 15, 12),
    "V2": np.random.uniform(10, 15, 12),
    "V3": np.random.uniform(10, 15, 12)})
df.index = df["Date"]
df_agg = df.groupby(pd.Grouper(freq='1D')).apply(weighted_average_std).unstack(-1)
print(df_agg)
This yields
                 mean                              std                    
                   V1         V2         V3         V1        V2        V3
Date                                                                      
2018-01-01  12.105253  12.314079  13.566136  1.803014  1.725761  0.679279
2018-01-02  13.223172  12.534893  11.860456  1.709583  0.950338  1.153895
2018-01-03  13.782625  12.013557  12.105231  0.969099  1.189149  1.249064
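If you prefer flat column names like V1_WSD, as asked in the question, one way (a sketch, continuing from the df_agg above) is to flatten the MultiIndex columns:

# df_agg has two-level columns: ('mean', 'V1'), ..., ('std', 'V3').
# Rename the 'std' columns to the V1_WSD style and keep the means
# under the original column names.
df_flat = df_agg.copy()
df_flat.columns = [
    '{}_WSD'.format(col) if stat == 'std' else col
    for stat, col in df_flat.columns
]
print(df_flat)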