How to customize number of multiple hidden layer units in pytorch LSTM?

Time: 2019-01-07 13:20:39

Tags: lstm pytorch recurrent-neural-network

In PyTorch's LSTM, RNN, and GRU models, there is a parameter called "num_layers" that controls the number of stacked recurrent layers. Since an LSTM can have multiple layers, I wonder why the "hidden_size" parameter is a single number rather than a list specifying the hidden size of each layer, such as [10, 20, 30].
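To illustrate the question: with `nn.LSTM`, a single `hidden_size` applies to every stacked layer, as this small sketch shows (tensor sizes here are arbitrary examples):

```python
import torch
import torch.nn as nn

# A 3-layer LSTM: every layer gets the same hidden_size (20 here),
# because nn.LSTM accepts only a single scalar for hidden_size.
lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=3)

x = torch.randn(5, 4, 10)          # (seq_len, batch, feature)
out, (h_n, c_n) = lstm(x)

print(out.shape)   # torch.Size([5, 4, 20]) - outputs of the last layer
print(h_n.shape)   # torch.Size([3, 4, 20]) - one final hidden state per layer
```

There is no built-in way to pass per-layer sizes like [10, 20, 30] to `nn.LSTM` directly.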

I came across this question while working on a regression project, in which I feed sequence data of shape (seq_len, batch, feature) to an LSTM and want a scalar output at every time step.
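For the per-time-step scalar output described above, a common pattern is to put a linear head on top of the LSTM outputs. A minimal sketch (the class name `SeqRegressor` and all dimensions are hypothetical):

```python
import torch
import torch.nn as nn

class SeqRegressor(nn.Module):
    """LSTM followed by a linear head that emits one scalar per time step."""
    def __init__(self, feature_dim=10, hidden_size=20, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_size, num_layers)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                   # x: (seq_len, batch, feature)
        out, _ = self.lstm(x)               # (seq_len, batch, hidden_size)
        return self.head(out).squeeze(-1)   # (seq_len, batch) - one scalar per step

model = SeqRegressor()
y = model(torch.randn(5, 4, 10))
print(y.shape)   # torch.Size([5, 4])
```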

A helpful link for understanding the PyTorch LSTM framework is here. I'd really appreciate it if anyone could answer this.

1 Answer:

Answer 0 (score: 0)

It seems I have found a solution: use LSTMCell instead. Helpful links: [1][2]. But is there a simpler way?
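The LSTMCell approach can be sketched as follows: stack one `nn.LSTMCell` per layer with its own hidden size and unroll the time dimension manually (the sizes [10, 20, 30] below are just the example list from the question):

```python
import torch
import torch.nn as nn

feature_dim = 8
sizes = [10, 20, 30]                          # per-layer hidden sizes
cells = nn.ModuleList(
    nn.LSTMCell(in_dim, out_dim)
    for in_dim, out_dim in zip([feature_dim] + sizes[:-1], sizes)
)

x = torch.randn(5, 4, feature_dim)            # (seq_len, batch, feature)
h = [torch.zeros(4, s) for s in sizes]        # per-layer hidden states
c = [torch.zeros(4, s) for s in sizes]        # per-layer cell states

outputs = []
for t in range(x.size(0)):                    # manual unroll over time
    inp = x[t]
    for i, cell in enumerate(cells):
        h[i], c[i] = cell(inp, (h[i], c[i]))
        inp = h[i]                            # this layer's h feeds the next layer
    outputs.append(inp)

out = torch.stack(outputs)                    # (seq_len, batch, sizes[-1])
print(out.shape)   # torch.Size([5, 4, 30])
```

A possibly simpler alternative, if per-step control is not needed, is to chain several single-layer `nn.LSTM` modules with different hidden sizes, passing each one's output sequence to the next.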