Subsequent Dropout layers all connected to the first one in Keras?

Asked: 2018-10-31 22:16:23

Tags: tensorflow neural-network keras lstm dropout

So I just got TensorBoard set up and took a look at the LSTM I've been working on, and I saw something in the graph I didn't expect. I have 5 hidden LSTM layers, each followed by a Dropout layer, and the graph shows the first Dropout layer connected directly to each of the 4 Dropout layers below it:

[Image: LSTM deep neural network graph]

What is going on here? I expected all of these to just be simple straight lines from top to bottom. Here's what my structure looks like in code; I haven't changed many of the default parameters...

# dataset.NUM_STEPS = 64
# dataset.NUM_FEATURES = 1109
# len(dataset.note_to_one_hot) = 123
# len(dataset.duration_to_one_hot) = 846
# len(dataset.offset_to_one_hot) = 140

import keras  # standalone Keras 2.x API (circa 2018)

inputs = keras.layers.Input(shape=(dataset.NUM_STEPS, dataset.NUM_FEATURES))

# Five stacked LSTM layers, each followed by a Dropout layer;
# only the last LSTM omits return_sequences, so the final output is 2-D.
x = keras.layers.LSTM(units=512, return_sequences=True)(inputs)
x = keras.layers.Dropout(.333)(x)
x = keras.layers.LSTM(units=512, return_sequences=True)(x)
x = keras.layers.Dropout(.333)(x)
x = keras.layers.LSTM(units=512, return_sequences=True)(x)
x = keras.layers.Dropout(.333)(x)
x = keras.layers.LSTM(units=512, return_sequences=True)(x)
x = keras.layers.Dropout(.333)(x)
x = keras.layers.LSTM(units=512)(x)
x = keras.layers.Dropout(.333)(x)

# Three parallel softmax heads: note ("n"), duration ("d"), and offset ("o").
note_output = keras.layers.Dense(name="n", units=len(dataset.note_to_one_hot), activation='softmax')(x)
duration_output = keras.layers.Dense(name="d", units=len(dataset.duration_to_one_hot), activation='softmax')(x)
offset_output = keras.layers.Dense(name="o", units=len(dataset.offset_to_one_hot), activation='softmax')(x)

# model_name is a string defined elsewhere in the script (not shown).
model = keras.models.Model(name=model_name, inputs=inputs, outputs=[note_output, duration_output, offset_output])
optimizer = keras.optimizers.RMSprop(lr=6.66e-5, rho=0.9, epsilon=None, decay=0.0)
losses = {"n": "categorical_crossentropy", "d": "categorical_crossentropy", "o": "categorical_crossentropy"}
metrics = {"n": "categorical_accuracy", "d": "categorical_accuracy", "o": "categorical_accuracy"}
model.compile(optimizer=optimizer, loss=losses, metrics=metrics)
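
For reference, a minimal sketch of how a graph like this is typically written out for TensorBoard with standalone Keras 2.x. The training arrays X, Y_note, Y_duration, and Y_offset are hypothetical placeholders (not part of the original question), with shapes matching the model above:

# Hypothetical data shapes:
# X: (num_samples, 64, 1109); Y_note: (num_samples, 123);
# Y_duration: (num_samples, 846); Y_offset: (num_samples, 140)
tb = keras.callbacks.TensorBoard(log_dir="./logs", write_graph=True)
model.fit(X, {"n": Y_note, "d": Y_duration, "o": Y_offset},
          batch_size=32, epochs=1, callbacks=[tb])

# model.summary() prints the actual layer-to-layer connectivity
# (the "Connected to" column), which is strictly sequential for the
# functional-API model built above, however TensorBoard chooses to
# draw the Dropout nodes.
model.summary()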

0 Answers:

No answers yet.