Resampling with OHLC in pandas

Time: 2019-01-21 14:00:07

Tags: python pandas

I'm new to pandas, so let me know if I'm doing something silly.

Input file (only the head is shown below; the full file has 10K+ rows):

$ head /var/tmp/ticks_data.csv 
2019-01-18 14:55:00,296
2019-01-18 14:55:01,296
2019-01-18 14:55:02,296
2019-01-18 14:55:03,296.05
2019-01-18 14:55:04,296.05
2019-01-18 14:55:05,296
2019-01-18 14:55:06,296
2019-01-18 14:55:08,296
2019-01-18 14:55:09,296
2019-01-18 14:55:10,296.05

Code:

$ cat create_candles.py 

import pandas as pd

filename = '/var/tmp/ticks_data.csv'
df = pd.read_csv(filename, names=['timestamp', 'ltp'], index_col=1, parse_dates=['timestamp'])
# print(df.head())
data = df['ltp'].resample('1min').ohlc()
print(data)

Error:

$ python3 create_candles.py 
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/indexes/base.py", line 3078, in get_loc
    return self._engine.get_loc(key)
  File "pandas/_libs/index.pyx", line 140, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 162, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1492, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1500, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'ltp'

I thought the file might contain unknown characters, so I ran dos2unix on /var/tmp/ticks_data.csv, but the problem remains.

If I try removing index_col=1 from the read_csv call:

df = pd.read_csv(filename, names=['timestamp', 'ltp'], parse_dates=['timestamp'])

then I get the following error:

Traceback (most recent call last):
  File "/Users/dheeraj.kabra/Desktop/Ticks/create_candles.py", line 6, in <module>
    data = df['ltp'].resample('1min').ohlc()
  File "/usr/local/lib/python3.7/site-packages/pandas/core/generic.py", line 7110, in resample
    base=base, key=on, level=level)
  File "/usr/local/lib/python3.7/site-packages/pandas/core/resample.py", line 1148, in resample
    return tg._get_resampler(obj, kind=kind)
  File "/usr/local/lib/python3.7/site-packages/pandas/core/resample.py", line 1276, in _get_resampler
    "but got an instance of %r" % type(ax).__name__)
TypeError: Only valid with DatetimeIndex, TimedeltaIndex or PeriodIndex, but got an instance of 'RangeIndex'
[Finished in 0.5s with exit code 1]

Any pointers to resolve this issue would be very helpful.

1 Answer:

Answer 0 (score: 1):

Change index_col to 0 (or to ['timestamp']) so that the first column is parsed into a DatetimeIndex:

import pandas as pd

temp=u"""2019-01-18 14:55:00,296
2019-01-18 14:55:01,296
2019-01-18 14:55:02,296
2019-01-18 14:55:03,296.05
2019-01-18 14:55:04,296.05
2019-01-18 14:55:05,296
2019-01-18 14:55:06,296
2019-01-18 14:55:08,296
2019-01-18 14:55:09,296
2019-01-18 14:55:10,296.05"""
# after testing, replace pd.compat.StringIO(temp) with the real filename, e.g. '/var/tmp/ticks_data.csv'
df = pd.read_csv(pd.compat.StringIO(temp), 
                 names=['timestamp', 'ltp'], 
                 index_col=0, 
                 parse_dates=['timestamp'])
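
Note: pd.compat.StringIO is only used here so the sample data above is self-contained; it has been removed in newer pandas versions, so (assuming a recent pandas install) the standard-library io.StringIO can be used the same way while testing:

from io import StringIO

# same read as above, but with the standard-library StringIO
df = pd.read_csv(StringIO(temp),
                 names=['timestamp', 'ltp'],
                 index_col=0,
                 parse_dates=['timestamp'])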

Alternative solution:

df = pd.read_csv(pd.compat.StringIO(temp), 
                 names=['timestamp', 'ltp'], 
                 index_col=['timestamp'], 
                 parse_dates=['timestamp'])

print (df)
                        ltp
timestamp                  
2019-01-18 14:55:00  296.00
2019-01-18 14:55:01  296.00
2019-01-18 14:55:02  296.00
2019-01-18 14:55:03  296.05
2019-01-18 14:55:04  296.05
2019-01-18 14:55:05  296.00
2019-01-18 14:55:06  296.00
2019-01-18 14:55:08  296.00
2019-01-18 14:55:09  296.00
2019-01-18 14:55:10  296.05

data = df.resample('1min')['ltp'].ohlc()
print(data)
                      open    high    low   close
timestamp                                        
2019-01-18 14:55:00  296.0  296.05  296.0  296.05
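
As a further sketch (not part of the original answer), a DataFrame that still has a plain RangeIndex can also be resampled by passing the datetime column through resample's on= parameter, which avoids setting the index at all:

import pandas as pd
from io import StringIO

# read the ticks without making timestamp the index
df = pd.read_csv(StringIO(temp),
                 names=['timestamp', 'ltp'],
                 parse_dates=['timestamp'])

# resample directly on the timestamp column, then build the OHLC candles
data = df.resample('1min', on='timestamp')['ltp'].ohlc()
print(data)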

Details of the original problem: index_col=1 sets the second column, here ltp, as the index, which is why df['ltp'] raises a KeyError and the timestamps never become the index:

df = pd.read_csv(pd.compat.StringIO(temp), 
                 names=['timestamp', 'ltp'], 
                 index_col=1, 
                 parse_dates=['timestamp'])


print (df)
                 timestamp
ltp                       
296.00 2019-01-18 14:55:00
296.00 2019-01-18 14:55:01
296.00 2019-01-18 14:55:02
296.05 2019-01-18 14:55:03
296.05 2019-01-18 14:55:04
296.00 2019-01-18 14:55:05
296.00 2019-01-18 14:55:06
296.00 2019-01-18 14:55:08
296.00 2019-01-18 14:55:09
296.05 2019-01-18 14:55:10
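
If a DataFrame has already been loaded with ltp as the index like this, one way (a sketch, not from the original answer) to recover is to move ltp back into a column and promote timestamp to the index before resampling:

# move ltp back to a regular column and set timestamp as the DatetimeIndex
fixed = df.reset_index().set_index('timestamp')
data = fixed['ltp'].resample('1min').ohlc()
print(data)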