Is there any difference between these two ways of using a Dense layer? The output shapes and parameter counts appear to be identical.
from tensorflow.keras.layers import Input, LSTM, Dense, TimeDistributed
from tensorflow.keras.models import Model

def test_rnn_output_v1():
    max_seq_length = 10
    n_features = 8
    rnn_dim = 64
    dense_dim = 16
    input = Input(shape=(max_seq_length, n_features))
    out = LSTM(rnn_dim, return_sequences=True)(input)
    out = Dense(dense_dim)(out)
    model = Model(inputs=[input], outputs=out)
    print(model.summary())
    # input shape:  (None, max_seq_length, n_features)
    # output shape: (None, max_seq_length, dense_dim)
def test_rnn_output_v2():
    max_seq_length = 10
    n_features = 8
    rnn_dim = 64
    dense_dim = 16
    input = Input(shape=(max_seq_length, n_features))
    out = LSTM(rnn_dim, return_sequences=True)(input)
    out = TimeDistributed(Dense(dense_dim))(out)
    model = Model(inputs=[input], outputs=out)
    print(model.summary())
    # input shape:  (None, max_seq_length, n_features)
    # output shape: (None, max_seq_length, dense_dim)
Answer (score: 2)
There is no difference between TimeDistributed(Dense(...)) and Dense(...): they have exactly the same output shape and connectivity. This is because a Dense layer is applied to the last axis of its input, so wrapping it in a TimeDistributed layer makes no difference. This answer explains in more detail how the Dense layer works.
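The equivalence can be checked without Keras at all. A minimal NumPy sketch (the shapes and variable names below are chosen to mirror the question; the shared kernel W and bias b are hypothetical): applying one dense kernel to the last axis of a 3D tensor in a single matmul gives the same result as applying it to each timestep separately, which is what TimeDistributed does.

```python
import numpy as np

# Shapes mirroring the question: (batch, max_seq_length, rnn_dim) -> dense_dim
batch, max_seq_length, rnn_dim, dense_dim = 4, 10, 64, 16
rng = np.random.default_rng(0)

x = rng.standard_normal((batch, max_seq_length, rnn_dim))
W = rng.standard_normal((rnn_dim, dense_dim))  # shared dense kernel
b = rng.standard_normal(dense_dim)             # shared bias

# Dense on a 3D tensor: a single matmul over the last axis
dense_out = x @ W + b

# TimeDistributed(Dense): apply the same kernel to each timestep independently
td_out = np.stack(
    [x[:, t, :] @ W + b for t in range(max_seq_length)], axis=1
)

assert np.allclose(dense_out, td_out)
print(dense_out.shape)  # (4, 10, 16)
```

Both paths use the same (rnn_dim x dense_dim) kernel and dense_dim bias, which is why the parameter counts in the two model summaries match as well.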