Fill gaps in a DatetimeIndex by date

Posted: 2015-11-27 10:57:56

Tags: python pandas reindex gaps-in-data datetimeindex

I have two DatetimeIndexed DataFrames. One is missing some of the datetimes (df1), while the other is complete (it has regular timestamps with no gaps in the series) and is filled entirely with NaNs (df2).

I am trying to match the values from df1 against the index of df2, filling df2 with df1's values where the timestamps exist in df1 and leaving NaN where they do not.

Example:

In  [51]: df1
Out [51]:                       value
          2015-01-01 14:00:00   20
          2015-01-01 15:00:00   29
          2015-01-01 16:00:00   41
          2015-01-01 17:00:00   43
          2015-01-01 18:00:00   26
          2015-01-01 19:00:00   20
          2015-01-01 20:00:00   31
          2015-01-01 21:00:00   35
          2015-01-01 22:00:00   39
          2015-01-01 23:00:00   17
          2015-03-01 00:00:00   6
          2015-03-01 01:00:00   37
          2015-03-01 02:00:00   56
          2015-03-01 03:00:00   12
          2015-03-01 04:00:00   41
          2015-03-01 05:00:00   31
          ...                   ...
          2018-12-25 23:00:00   41

          <34843 rows × 1 columns>

In  [52]: df2 = pd.DataFrame(data=None, index=pd.date_range(freq='60Min', start=df1.index.min(), end=df1.index.max()))
          df2['value'] = np.NaN
          df2
Out [52]:                       value
          2015-01-01 14:00:00   NaN
          2015-01-01 15:00:00   NaN
          2015-01-01 16:00:00   NaN
          2015-01-01 17:00:00   NaN
          2015-01-01 18:00:00   NaN
          2015-01-01 19:00:00   NaN
          2015-01-01 20:00:00   NaN
          2015-01-01 21:00:00   NaN
          2015-01-01 22:00:00   NaN
          2015-01-01 23:00:00   NaN
          2015-01-02 00:00:00   NaN
          2015-01-02 01:00:00   NaN
          2015-01-02 02:00:00   NaN
          2015-01-02 03:00:00   NaN
          2015-01-02 04:00:00   NaN
          2015-01-02 05:00:00   NaN
          ...                   ...
          2018-12-25 23:00:00   NaN

          <34906 rows × 1 columns>

Using

df1.reindex(index=df2.index)

returns the same data as df2.combine_first(df1), which fills gaps that should not hold any data with some value instead of NaN.

Here is what I hoped to get:

In  [53]: Result = df2.combine_first(df1)
          Result
Out [53]:                       value
          2015-01-01 14:00:00   20
          2015-01-01 15:00:00   29
          2015-01-01 16:00:00   41
          2015-01-01 17:00:00   43
          2015-01-01 18:00:00   26
          2015-01-01 19:00:00   20
          2015-01-01 20:00:00   31
          2015-01-01 21:00:00   35
          2015-01-01 22:00:00   39
          2015-01-01 23:00:00   17
          2015-01-02 00:00:00   35
          2015-01-02 01:00:00   53
          2015-01-02 02:00:00   28
          2015-01-02 03:00:00   48
          2015-01-02 04:00:00   42
          2015-01-02 05:00:00   51
          ...                   ...
          2018-12-25 23:00:00   41

          <34906 rows × 1 columns>

Can someone explain why this happens, and how I can control what these values get filled with?

1 Answer:

Answer 0: (score: 1)

IIUC you need to resample df1, because it has an irregular frequency and you need a regular frequency:

print(df1.index.freq)
# None

print(Result.index.freq)
# <60 * Minutes>
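
Below is a minimal, self-contained sketch of the same idea on toy data (the frame df_gappy and the name regular are mine, not from the answer). Note that in current pandas versions resample() returns a Resampler object, so an explicit .asfreq() is needed to get the NaN-filled frame:

import pandas as pd

# toy frame with one missing hour (02:00) in an otherwise hourly series
idx = pd.to_datetime(['2015-01-01 00:00', '2015-01-01 01:00', '2015-01-01 03:00'])
df_gappy = pd.DataFrame({'value': [52, 5, 54]}, index=idx)

print(df_gappy.index.freq)                      # None - irregular index
regular = df_gappy.resample('60min').asfreq()   # conform to a regular 60-minute grid
print(regular.index.freq)                       # <60 * Minutes>
print(regular)
#                      value
# 2015-01-01 00:00:00   52.0
# 2015-01-01 01:00:00    5.0
# 2015-01-01 02:00:00    NaN   <- the gap now exists and holds NaN
# 2015-01-01 03:00:00   54.0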

EDIT1
You can use the function asfreq instead of resample - see the docs and the discussion of resample vs asfreq.
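
A rough sketch of that alternative, reusing the toy df_gappy frame from the previous snippet (again my naming; the reindex variant is only an equivalent spelling, not something prescribed by the answer):

# asfreq conforms the frame to a fixed 60-minute frequency directly;
# timestamps that were absent from the original index come back as NaN
regular = df_gappy.asfreq('60min')

# the same idea spelled out with reindex against an explicit date_range
full_index = pd.date_range(start=df_gappy.index.min(), end=df_gappy.index.max(), freq='60min')
regular = df_gappy.reindex(full_index)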

EDIT2
At first I thought resample did not work, because after resampling Result looked the same as df1. But then I compared print(df1.info()) and print(Result.info()) and got different results - 34857 entries vs 34920 entries. So I looked for the rows containing NaN values, and that returned 63 rows.

So I think resample works well.

import pandas as pd

df1 = pd.read_csv('test/GapInTimestamps.csv', sep=",", index_col=[0], parse_dates=[0])
print(df1.head())

#                     value
#Date/Time                 
#2015-01-01 00:00:00     52
#2015-01-01 01:00:00      5
#2015-01-01 02:00:00     12
#2015-01-01 03:00:00     54
#2015-01-01 04:00:00     47
print(df1.info())

#<class 'pandas.core.frame.DataFrame'>
#DatetimeIndex: 34857 entries, 2015-01-01 00:00:00 to 2018-12-25 23:00:00
#Data columns (total 1 columns):
#value    34857 non-null int64
#dtypes: int64(1)
#memory usage: 544.6 KB
#None

# .asfreq() keeps df1's values and inserts NaN for the new regular timestamps
# (needed in current pandas, where resample() alone only returns a Resampler object)
Result = df1.resample('60min').asfreq()
print(Result.head())

#                     value
#Date/Time                 
#2015-01-01 00:00:00     52
#2015-01-01 01:00:00      5
#2015-01-01 02:00:00     12
#2015-01-01 03:00:00     54
#2015-01-01 04:00:00     47
print(Result.info())

#<class 'pandas.core.frame.DataFrame'>
#DatetimeIndex: 34920 entries, 2015-01-01 00:00:00 to 2018-12-25 23:00:00
#Freq: 60T
#Data columns (total 1 columns):
#value    34857 non-null float64
#dtypes: float64(1)
#memory usage: 545.6 KB
#None

# find the rows that contain NaN values
resultnan = Result[Result.isnull().any(axis=1)]
# temporarily allow up to 999 rows and 15 columns to be displayed
with pd.option_context('display.max_rows', 999, 'display.max_columns', 15):
    print(resultnan)

#                     value
#Date/Time                 
#2015-01-13 19:00:00    NaN
#2015-01-13 20:00:00    NaN
#2015-01-13 21:00:00    NaN
#2015-01-13 22:00:00    NaN
#2015-01-13 23:00:00    NaN
#2015-01-14 00:00:00    NaN
#2015-01-14 01:00:00    NaN
#2015-01-14 02:00:00    NaN
#2015-01-14 03:00:00    NaN
#2015-01-14 04:00:00    NaN
#2015-01-14 05:00:00    NaN
#2015-01-14 06:00:00    NaN
#2015-01-14 07:00:00    NaN
#2015-01-14 08:00:00    NaN
#2015-01-14 09:00:00    NaN
#2015-02-01 00:00:00    NaN
#2015-02-01 01:00:00    NaN
#2015-02-01 02:00:00    NaN
#2015-02-01 03:00:00    NaN
#2015-02-01 04:00:00    NaN
#2015-02-01 05:00:00    NaN
#2015-02-01 06:00:00    NaN
#2015-02-01 07:00:00    NaN
#2015-02-01 08:00:00    NaN
#2015-02-01 09:00:00    NaN
#2015-02-01 10:00:00    NaN
#2015-02-01 11:00:00    NaN
#2015-02-01 12:00:00    NaN
#2015-02-01 13:00:00    NaN
#2015-02-01 14:00:00    NaN
#2015-02-01 15:00:00    NaN
#2015-02-01 16:00:00    NaN
#2015-02-01 17:00:00    NaN
#2015-02-01 18:00:00    NaN
#2015-02-01 19:00:00    NaN
#2015-02-01 20:00:00    NaN
#2015-02-01 21:00:00    NaN
#2015-02-01 22:00:00    NaN
#2015-02-01 23:00:00    NaN
#2015-11-01 00:00:00    NaN
#2015-11-01 01:00:00    NaN
#2015-11-01 02:00:00    NaN
#2015-11-01 03:00:00    NaN
#2015-11-01 04:00:00    NaN
#2015-11-01 05:00:00    NaN
#2015-11-01 06:00:00    NaN
#2015-11-01 07:00:00    NaN
#2015-11-01 08:00:00    NaN
#2015-11-01 09:00:00    NaN
#2015-11-01 10:00:00    NaN
#2015-11-01 11:00:00    NaN
#2015-11-01 12:00:00    NaN
#2015-11-01 13:00:00    NaN
#2015-11-01 14:00:00    NaN
#2015-11-01 15:00:00    NaN
#2015-11-01 16:00:00    NaN
#2015-11-01 17:00:00    NaN
#2015-11-01 18:00:00    NaN
#2015-11-01 19:00:00    NaN
#2015-11-01 20:00:00    NaN
#2015-11-01 21:00:00    NaN
#2015-11-01 22:00:00    NaN
#2015-11-01 23:00:00    NaN
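
The question also asked how to control what the gaps get filled with. Once the index is regular, any of the usual fillers can be applied to Result; a hedged sketch of a few common choices follows (which one is appropriate depends on the data, and none of them is prescribed by the answer itself):

# keep the gaps as NaN (what resample/asfreq already give), or pick a fill strategy:
filled_ffill  = Result.ffill()                     # carry the last known value forward
filled_interp = Result.interpolate(method='time')  # time-weighted linear interpolation
filled_zero   = Result.fillna(0)                   # treat missing hours as zero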