Cannot get an LSTM to converge on a simple sequence

Time: 2018-06-03 01:31:59

Tags: python keras lstm

I am trying to understand how LSTMs work, so I decided to apply one to a trivial regression problem. I have a simple, normalized sequence that repeats every 7 elements.

[plot of the normalized, 7-element cyclic sequence]

Given 4 elements of the sequence, I want to predict the next 4 elements. I build sliding windows and feed them to the network.

import keras
from keras.models import Sequential
from keras.layers import LSTM, Lambda, Dense
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

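# build a series that cycles 1..7 ten times, take percent changes, and min-max scale to [0, 1]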
df = pd.DataFrame([1,2,3,4,5,6,7]*10, columns=['close'])
df = df.pct_change().dropna()
df = (df-df.min()) / (df.max() - df.min())
window_len = 4
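# overlapping input windows of 4 values; each target is the window 4 steps ahead,
# i.e. the 4 values that immediately follow the input window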
xs = np.array([list(df.close[i:i+window_len]) for i in range(len(df) - window_len)])
xs = xs.reshape((xs.shape[0], xs.shape[1], 1))
ys = np.array([x for x in xs[window_len:]])
xs = xs[:-window_len]

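# two stacked LSTMs, both with return_sequences=True, so each sample's output has shape (window_len, 1)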
model = Sequential()
model.add(LSTM(units=10, input_shape=(window_len,1), return_sequences=True, stateful=False))
model.add(LSTM(1, return_sequences=True, stateful=False))

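# regression setup: mean squared error loss with the Nadam optimizer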
model.compile(loss = keras.losses.mean_squared_error, optimizer='nadam', metrics=['accuracy'])

fit_res = model.fit(xs, ys, epochs=300, batch_size=5)

plt.plot(fit_res.history['loss'])

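# predict on the training windows and drop the trailing feature axis for comparison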
pred = model.predict(xs)

pred_2d = pred[:,:,0]
ys_2d = ys[:,:,0]
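For reference, here is a small sanity check of the shapes and of how each input window lines up with its target (this only inspects the arrays built above; the expected shapes follow from the slicing):

# df has 69 rows after pct_change/dropna, so 65 windows are built;
# after shifting by window_len, xs and ys should both be (61, 4, 1)
print(xs.shape, ys.shape)
print(xs[0, :, 0])   # first input window
print(ys[0, :, 0])   # the 4 values that immediately follow it
print(pred_2d[0])    # model output for the first window
print(ys_2d[0])      # target for the first window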

I have let it train for a very long time and cannot get anywhere near the expected result.

I have tried different numbers of units and different batch sizes, without success.

What am I doing wrong?

0 answers:

There are no answers yet.