LSTM time-series prediction produces strange results

Asked: 2018-04-08 23:13:37

Tags: machine-learning keras time-series lstm rnn

I am trying to predict time-series data several days ahead with Keras. My label data consists of target values for multiple future days, and the regression model has multiple output neurons (the "direct method" for multi-step time series).

Here is a 10-day prediction on the test data, using a 60-day history:

[Plot: 10-day prediction for the test data]

As you can see, the predicted future values are roughly the same for every day. I have spent a lot of time on this and have to admit that I am probably missing something about LSTMs...

Here is the training data with predictions:

[Plot: 10-day prediction for the training data]

To confirm that I am preparing the data correctly, I created a "tracking dataset" that I use to visualize the data transformations. Here it is...

Dataset:

Open,High,Low,Close,Volume,OpenInt
111,112,113,114,115,0
121,122,123,124,125,0
131,132,133,134,135,0
141,142,143,144,145,0
151,152,153,154,155,0
161,162,163,164,165,0
171,172,173,174,175,0
181,182,183,184,185,0
191,192,193,194,195,0
201,202,203,204,205,0
211,212,213,214,215,0
221,222,223,224,225,0
231,232,233,234,235,0
241,242,243,244,245,0
251,252,253,254,255,0
261,262,263,264,265,0
271,272,273,274,275,0
281,282,283,284,285,0
291,292,293,294,295,0

Training set built from a 2-day history to predict the values of the next 3 days (I use different values for the history days and the future days, which all makes sense to me), with no feature scaling, so the data transformations stay easy to follow:

X train (6, 2, 5)
[[[111 112 113 114 115]
  [121 122 123 124 125]]

 [[121 122 123 124 125]
  [131 132 133 134 135]]

 [[131 132 133 134 135]
  [141 142 143 144 145]]

 [[141 142 143 144 145]
  [151 152 153 154 155]]

 [[151 152 153 154 155]
  [161 162 163 164 165]]

 [[161 162 163 164 165]
  [171 172 173 174 175]]]
Y train (6, 3)
[[131 141 151]
 [141 151 161]
 [151 161 171]
 [161 171 181]
 [171 181 191]
 [181 191 201]]

Test set:

X test (6, 2, 5)
[[[201 202 203 204 205]
  [211 212 213 214 215]]

 [[211 212 213 214 215]
  [221 222 223 224 225]]

 [[221 222 223 224 225]
  [231 232 233 234 235]]

 [[231 232 233 234 235]
  [241 242 243 244 245]]

 [[241 242 243 244 245]
  [251 252 253 254 255]]

 [[251 252 253 254 255]
  [261 262 263 264 265]]]
Y test (6, 3)
[[221 231 241]
 [231 241 251]
 [241 251 261]
 [251 261 271]
 [261 271 281]
 [271 281 291]]
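The windowing above can be reproduced with a short NumPy sketch (the `make_windows` helper and its parameter names are my own, not from the question; as the printed arrays show, the targets are taken from the first column, stepped one row per future day):

```python
import numpy as np

def make_windows(data, target_col=0, n_past=2, n_future=3):
    """Direct multi-step windowing: n_past rows of all features as input,
    the next n_future values of one target column as the label vector."""
    X, y = [], []
    for i in range(n_past, len(data) - n_future + 1):
        X.append(data[i - n_past:i, :])
        y.append(data[i:i + n_future, target_col])
    return np.array(X), np.array(y)

# rebuild the 19-row tracking dataset (OpenInt column dropped)
rows = np.array([[111 + 10 * r + c for c in range(5)] for r in range(19)])

X_train, y_train = make_windows(rows[:10])   # first 10 rows -> train windows
X_test, y_test = make_windows(rows[9:])      # last 10 rows  -> test windows
print(X_train.shape, y_train.shape)          # (6, 2, 5) (6, 3)
print(y_train[0])                            # [131 141 151]
```

Running this reproduces exactly the shapes and values printed above, so the sliding-window step itself checks out.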

Model:

# Imports assumed by this snippet (Keras 2.x API):
from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM
from keras.optimizers import SGD
from keras.constraints import maxnorm

def CreateRegressor(self,
                    optimizer='adam',
                    activation='tanh', # RNN activation
                    init_mode='glorot_uniform',
                    hidden_neurons=50,
                    dropout_rate=0.0,
                    weight_constraint=0,
                    stateful=False,
                    # SGD parameters
                    learn_rate=0.01,
                    momentum=0):
    kernel_constraint = maxnorm(weight_constraint) if weight_constraint > 0 else None
    model = Sequential()

    # Four stacked LSTM layers; all but the last return the full sequence
    model.add(LSTM(units=hidden_neurons, activation=activation, kernel_initializer=init_mode, kernel_constraint=kernel_constraint,
                   return_sequences=True, input_shape=(self.X_train.shape[1], self.X_train.shape[2]), stateful=stateful))
    model.add(Dropout(dropout_rate))

    model.add(LSTM(units=hidden_neurons, activation=activation, kernel_initializer=init_mode, kernel_constraint=kernel_constraint,
                   return_sequences=True, stateful=stateful))
    model.add(Dropout(dropout_rate))

    model.add(LSTM(units=hidden_neurons, activation=activation, kernel_initializer=init_mode, kernel_constraint=kernel_constraint,
                   return_sequences=True, stateful=stateful))
    model.add(Dropout(dropout_rate))

    model.add(LSTM(units=hidden_neurons, activation=activation, kernel_initializer=init_mode, kernel_constraint=kernel_constraint,
                   return_sequences=False, stateful=stateful))
    model.add(Dropout(dropout_rate))

    # One output neuron per future day (direct multi-step method)
    model.add(Dense(units=self.y_train.shape[1]))

    if optimizer == 'SGD':
        optimizer = SGD(lr=learn_rate, momentum=momentum)

    model.compile(optimizer=optimizer, loss='mean_squared_error')
    return model

...which I create with these parameters:

    self.CreateRegressor(optimizer = 'adam', hidden_neurons = 100)

...then fit it like this:

    self.regressor.fit(self.X_train, self.y_train, epochs=100, batch_size=32)
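One thing to note: with tanh activations, LSTMs generally need inputs scaled to a small range before fitting, otherwise raw price-level values saturate the gates and the outputs collapse toward a constant. The tracking example above skips scaling only for visualization; the sketch below shows the usual min-max preparation (NumPy only, equivalent to scikit-learn's MinMaxScaler; the helper names and stand-in array are mine, this is an assumption about the real pipeline, not code from the question):

```python
import numpy as np

def minmax_fit(a):
    """Per-feature min and max taken over every sample and timestep."""
    flat = a.reshape(-1, a.shape[-1])
    return flat.min(axis=0), flat.max(axis=0)

def minmax_transform(a, lo, hi):
    return (a - lo) / (hi - lo)

def minmax_inverse(a, lo, hi):
    return a * (hi - lo) + lo

# stand-in for self.X_train with shape (samples, timesteps, features)
X_train = np.arange(60, dtype=float).reshape(6, 2, 5)
lo, hi = minmax_fit(X_train)
X_scaled = minmax_transform(X_train, lo, hi)
print(X_scaled.min(), X_scaled.max())  # 0.0 1.0
```

The targets (already 2-D) would be scaled the same way, and predictions mapped back to the price scale with `minmax_inverse` after `predict()`.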

...and predict:

    y_pred = self.regressor.predict(X_test)

...or:

    y_pred_train = self.regressor.predict(X_train)

What am I missing?

0 answers