ValueError: Input 0 of layer sequential is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 7]

Date: 2021-02-18 10:58:28

Tags: python keras deep-learning lstm recurrent-neural-network

I am building an RNN using LSTM. The X matrix has shape (1824, 7) and Y has shape (1824, 1). This is my model:

  num_units = 64
  learning_rate = 0.0001
  activation_function = 'sigmoid'
  adam = Adam(lr=learning_rate)
  loss_function = 'mse'
  batch_size = 5
  num_epochs = 50

  # Initialize the RNN
  model = Sequential()
  model.add(LSTM(units = num_units, activation=activation_function, input_shape=(1824, 7, )))
  model.add(LeakyReLU(alpha=0.5))
  model.add(Dropout(0.1))
  model.add(Dense(units = 1))

  # Compiling the RNN
  model.compile(optimizer=adam, loss=loss_function, metrics=['accuracy'])

  history = model.fit(
        X,
        y,
        validation_split=0.1,
        batch_size=batch_size,
        epochs=num_epochs,
        shuffle=False
  )

I know the error is in the input_shape argument. When I try to fit the model, I get this error:

ValueError: Input 0 of layer sequential is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 7]

I have seen similar questions and tried applying some of the changes suggested there, for example:

input_dim = X.shape
input_dim=(7,)
input_dim=(1824, 7, 1)

but in every case I still get this kind of error. How can I fix it?

1 Answer:

Answer 0 (score: 0)

As @Nicolas Gervais commented,

the TensorFlow Keras LSTM expects inputs: a 3D tensor with shape [batch, timesteps, feature].

Working sample code

import tensorflow as tf

# A batch of 32 sequences, each with 10 timesteps of 8 features
inputs = tf.random.normal([32, 10, 8])
print(inputs.shape)

# An LSTM with 4 units returns one 4-dimensional vector per sequence
lstm = tf.keras.layers.LSTM(4)
output = lstm(inputs)
print(output.shape)

Output

(32, 10, 8)
(32, 4)
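
Applied to the model in the question, a minimal sketch could look like the one below. It assumes the 7 columns of X are treated as 7 timesteps with a single feature each (treating them instead as 1 timestep with 7 features is another option), and uses random placeholder arrays with the shapes given in the question. The key changes are reshaping X from 2D to 3D and passing only (timesteps, features) as input_shape, since Keras adds the batch dimension itself.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, LeakyReLU, Dropout, Dense
from tensorflow.keras.optimizers import Adam

# Placeholder data with the shapes from the question
X = np.random.rand(1824, 7)
y = np.random.rand(1824, 1)

# Add a trailing feature axis: (1824, 7) -> (1824, 7, 1),
# i.e. 1824 samples, 7 timesteps, 1 feature per timestep
X = np.expand_dims(X, axis=-1)

model = Sequential()
# input_shape excludes the batch dimension: (timesteps, features)
model.add(LSTM(units=64, activation='sigmoid', input_shape=(7, 1)))
model.add(LeakyReLU(alpha=0.5))
model.add(Dropout(0.1))
model.add(Dense(units=1))

model.compile(optimizer=Adam(learning_rate=0.0001), loss='mse')

history = model.fit(X, y, validation_split=0.1, batch_size=5,
                    epochs=50, shuffle=False)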