Forecasting future values of a time series with an LSTM

Date: 2019-06-06 15:40:38

Tags: keras, time-series, lstm

I have 18 months of daily data, which gives me 549 data points, and I want to generate a forecast for the next 90 days.
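
For completeness, this is roughly how my imports and data are set up (the CSV file name and column name below are just placeholders for my actual daily series, not the real ones):

import math
import numpy
import numpy as np   # the code below mixes both aliases
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_squared_error
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout

# Placeholder loading step: 549 rows of daily values in a single column
daily_data = pd.read_csv('daily_data.csv')[['value']].values   # shape (549, 1)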

# Scale the daily values to [-1, 1] before feeding them to the LSTM
scaler = MinMaxScaler(feature_range=(-1, 1))
dataset = scaler.fit_transform(daily_data)

# Use the first 365 days for training and the remaining 184 for testing
train, test = dataset[:365, :], dataset[365:, :]


def create_dataset(dataset, look_back=1):
    # Build supervised samples: X = the previous `look_back` values, Y = the next value
    dataX, dataY = [], []
    for i in range(len(dataset) - look_back - 1):
        a = dataset[i:(i + look_back), 0]
        dataX.append(a)
        dataY.append(dataset[i + look_back, 0])
    return numpy.array(dataX), numpy.array(dataY)

look_back = 1
trainX, trainY = create_dataset(train, look_back)
testX, testY = create_dataset(test, look_back)

# Reshape the inputs to the [samples, time steps, features] layout expected by the LSTM
trainX = numpy.reshape(trainX, (trainX.shape[0], 1, trainX.shape[1]))
testX = numpy.reshape(testX, (testX.shape[0], 1, testX.shape[1]))
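
To sanity-check the windowing, here is a tiny example (not part of my real pipeline) of what create_dataset and the reshape produce with look_back=1: each input is the previous value, the target is the next value, and the reshape gives the [samples, time steps, features] layout Keras expects.

toy = numpy.arange(6, dtype=float).reshape(-1, 1)   # values 0..5 in one column
X, y = create_dataset(toy, look_back=1)
print(X)         # [[0.] [1.] [2.] [3.]]  -> previous value
print(y)         # [1. 2. 3. 4.]          -> next value
X = numpy.reshape(X, (X.shape[0], 1, X.shape[1]))
print(X.shape)   # (4, 1, 1) == (samples, time steps, features)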

# Define the LSTM model
model = Sequential()
model.add(LSTM(150, input_shape=(1, look_back)))
model.add(Dropout(0.5))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')

model.fit(trainX, trainY, epochs=150, batch_size=1, verbose=2)


# make predictions
trainPredict = model.predict(trainX)
testPredict = model.predict(testX)
# invert the scaling so predictions and targets are back in the original units
# (inverse_transform expects 2-D input, hence the [trainY] / [testY] wrapping)
trainPredict = scaler.inverse_transform(trainPredict)
trainY = scaler.inverse_transform([trainY])

testPredict = scaler.inverse_transform(testPredict)
testY = scaler.inverse_transform([testY])

# calculate root mean squared error
trainScore = math.sqrt(mean_squared_error(trainY[0], trainPredict[:, 0]))
print('Train Score: %.2f RMSE' % (trainScore))
testScore = math.sqrt(mean_squared_error(testY[0], testPredict[:, 0]))
print('Test Score: %.2f RMSE' % (testScore))

I tried to generate the next 90 time steps with the following code:

def moving_test_window_preds(n_future_preds):
    preds_moving = []                                  # store the prediction made at each future step
    moving_test_window = [testX[0, :].tolist()]        # start from the first window of the test set
    moving_test_window = np.array(moving_test_window)  # shape (1, 1, look_back)

    for i in range(n_future_preds):
        preds_one_step = model.predict(moving_test_window)   # predict one step ahead
        preds_moving.append(preds_one_step[0, 0])
        preds_one_step = preds_one_step.reshape(1, 1, 1)
        # drop the oldest value and append the new prediction as the next input
        moving_test_window = np.concatenate((moving_test_window[:, 1:, :], preds_one_step), axis=1)

    preds_moving = scaler.inverse_transform([preds_moving])

    return preds_moving
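
This is how I call it to get 90 future values (a usage sketch; the shapes assume look_back=1 as above):

preds_90 = moving_test_window_preds(90)
print(preds_90.shape)   # (1, 90) after inverse_transform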

But I don't think this is the right approach. I tried to follow this article:

https://machinelearningmastery.com/how-to-develop-lstm-models-for-multi-step-time-series-forecasting-of-household-power-consumption/
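
As far as I understand, that article frames this as direct multi-step forecasting: the model reads a window of past days and outputs all 90 future days at once. Below is my rough sketch of that idea on my data (the 30-day window, layer sizes, and training settings are my own assumptions, not taken from the article):

n_in, n_out = 30, 90   # assumed: use the last 30 days to predict the next 90

def make_multistep_dataset(series, n_in, n_out):
    # Each sample: X = n_in consecutive days, Y = the following n_out days
    X, Y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in, 0])
        Y.append(series[i + n_in:i + n_in + n_out, 0])
    return np.array(X), np.array(Y)

X, Y = make_multistep_dataset(dataset, n_in, n_out)
X = X.reshape((X.shape[0], n_in, 1))            # [samples, time steps, features]

multi_model = Sequential()
multi_model.add(LSTM(150, input_shape=(n_in, 1)))
multi_model.add(Dense(n_out))                   # one output unit per future day
multi_model.compile(loss='mean_squared_error', optimizer='adam')
multi_model.fit(X, Y, epochs=50, batch_size=16, verbose=2)

# Forecast the 90 days after the last observed window, then undo the scaling
last_window = dataset[-n_in:, 0].reshape(1, n_in, 1)
future_scaled = multi_model.predict(last_window)     # shape (1, 90)
future = scaler.inverse_transform(future_scaled.T)   # back to the original units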

I am new to deep learning, and to LSTMs in particular. It would be great if someone could help.

0 Answers:

There are no answers yet.