I have:
def forward(self, x, hidden=None):
    lstm_out, hidden = self.lstm(x, hidden)
    print('lstm_out.size', lstm_out.size())
    lstm_out = lstm_out.view(-1, lstm_out.shape[2])
    out = self.linear(lstm_out)
    print('out', out.size())
This doesn't work. My linear layer is self.linear = nn.Linear(64 * seq_length, 5) (I can change the 5 later). The size of my lstm_out is torch.Size([64, 20, 322]), but when self.linear is executed I get the error: RuntimeError: size mismatch, m1: [1280 x 322], m2: [1280 x 5] at. What am I doing wrong?
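A minimal sketch of where the mismatch likely comes from, assuming (from the printed size) batch=64, seq_length=20, and hidden=322: view(-1, 322) flattens batch and time into one dimension, producing a (1280, 322) tensor, while Linear(64 * seq_length, 5) = Linear(1280, 5) expects 1280 input features per row, not 322. The two usual fixes are shown below; the variable names here are illustrative, not from the original model.

```python
import torch
import torch.nn as nn

batch, seq_len, hidden = 64, 20, 322   # shapes taken from the question

lstm_out = torch.randn(batch, seq_len, hidden)  # stand-in for the LSTM output

# view(-1, hidden) merges batch and time: (64*20, 322) = (1280, 322)
flat = lstm_out.view(-1, lstm_out.shape[2])

# Option 1: apply the linear layer per timestep, so in_features = hidden (322)
linear_per_step = nn.Linear(hidden, 5)
out1 = linear_per_step(flat)                          # shape (1280, 5)

# Option 2: feed the whole sequence at once, so in_features = seq_len * hidden
linear_whole_seq = nn.Linear(seq_len * hidden, 5)
out2 = linear_whole_seq(lstm_out.reshape(batch, -1))  # shape (64, 5)
```

In neither case does 64 * seq_length make sense as in_features: 64 is the batch size, which a Linear layer never sees as a feature dimension.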