I'm new to Keras, so I'd really appreciate any help here. For my project, I'm trying to train a neural network on multiple time series. I got it working by running a for loop that fits the model to each time series one at a time. The code is below:
for i in range(len(train)):
    history = model.fit(train_X[i], train_Y[i], epochs=20, batch_size=6, verbose=0, shuffle=True)
If I'm not mistaken, this amounts to online training. Now I'm trying to do batch training to see whether the results are better. I tried to fit a list containing all the time series (each converted to a numpy array), but I get this error:
Error when checking model input: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 array(s), but instead got the following list of 56 arrays:
Here is some information about the dataset and the model:
model = Sequential()
model.add(LSTM(1, input_shape=(1,16),return_sequences=True))
model.add(Flatten())
model.add(Dense(1, activation='tanh'))
model.compile(loss='mae', optimizer='adam', metrics=['accuracy'])
model.summary()
Layer (type) Output Shape Param #
=================================================================
lstm_2 (LSTM) (None, 1, 1) 72
_________________________________________________________________
flatten_2 (Flatten) (None, 1) 0
_________________________________________________________________
dense_2 (Dense) (None, 1) 2
=================================================================
Total params: 74
Trainable params: 74
Non-trainable params: 0
print(len(train_X), train_X[0].shape, len(train_Y), train_Y[0].shape)
56 (1, 23, 16) 56 (1, 23, 1)
And here is the code block that gives me the error:
pyplot.figure(figsize=(16, 25))
history = model.fit(train_X, train_Y, epochs=1, verbose=1, shuffle=False, batch_size = len(train_X))
Answer 0 (score: 0)
The input shape for an LSTM should be (batch_size, timesteps, features). However, if you want to fit on all the series at once, you need to pass a single numpy array with that shape (with batch_size as the leading dimension) rather than a list of 56 separate arrays.
I'm not sure whether it serves your purpose.
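A minimal sketch of what combining the list into one batch could look like, assuming train_X and train_Y are the lists described above (56 elements shaped (1, 23, 16) and (1, 23, 1) respectively); the exact reshaping the answer had in mind is not recoverable here:

import numpy as np

# Stack the per-series arrays along the batch dimension:
# 56 arrays of shape (1, 23, 16) -> one array of shape (56, 23, 16).
train_X_all = np.concatenate(train_X, axis=0)
train_Y_all = np.concatenate(train_Y, axis=0)   # (56, 23, 1)

print(train_X_all.shape, train_Y_all.shape)

# With this layout, the LSTM's input_shape would need to be
# (timesteps, features) = (23, 16) instead of (1, 16), and the final
# layer would have to produce outputs matching train_Y_all's shape.

Passing train_X_all and train_Y_all to model.fit then gives Keras the single array it expects, instead of a list of 56 arrays.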