The initial code is below. How should I change the activation to use LeakyReLU?
import tensorflow as tf
import tensorflow_addons as tfa

L = tf.keras.layers

# shape and output_shape are defined elsewhere in my script
model = tf.keras.models.Sequential([
    L.InputLayer(input_shape=shape),
    L.BatchNormalization(),
    L.Dropout(0.25),
    tfa.layers.WeightNormalization(
        L.Dense(512, activation='relu', kernel_initializer="he_normal")),
    L.BatchNormalization(),
    L.Dropout(0.3),
    tfa.layers.WeightNormalization(
        L.Dense(output_shape, activation="sigmoid",
                kernel_initializer="he_normal")),
])
I have tried many ways to combine LeakyReLU with WeightNormalization, for example wrapping the activation layer itself:

L.BatchNormalization(),
L.Dropout(0.25),
L.Dense(512, kernel_initializer='he_normal'),
tfa.layers.WeightNormalization(L.LeakyReLU(alpha=0.1)),

or passing a LeakyReLU layer as the Dense activation:

L.BatchNormalization(),
L.Dropout(0.25),
tfa.layers.WeightNormalization(
    L.Dense(512, activation=L.LeakyReLU(alpha=0.1),
            kernel_initializer="he_normal")),

but neither of them is correct.
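
For reference, a minimal sketch of the pattern that is usually suggested for this combination: keep the wrapped Dense layer linear and apply LeakyReLU as its own layer right after it, so that WeightNormalization only ever wraps a layer that has a kernel. This is an assumption based on how the wrapper is commonly used, not a confirmed fix; shape and output_shape are taken to be defined as in the snippets above.

import tensorflow as tf
import tensorflow_addons as tfa

L = tf.keras.layers

model = tf.keras.models.Sequential([
    L.InputLayer(input_shape=shape),
    L.BatchNormalization(),
    L.Dropout(0.25),
    # WeightNormalization wraps only the linear Dense layer
    # (no activation argument) ...
    tfa.layers.WeightNormalization(
        L.Dense(512, kernel_initializer="he_normal")),
    # ... and LeakyReLU is applied as a separate layer afterwards
    L.LeakyReLU(alpha=0.1),
    L.BatchNormalization(),
    L.Dropout(0.3),
    tfa.layers.WeightNormalization(
        L.Dense(output_shape, activation="sigmoid",
                kernel_initializer="he_normal")),
])

An alternative that may also work is passing a plain function rather than a Layer instance as the activation, e.g. activation=lambda x: tf.nn.leaky_relu(x, alpha=0.1), since tf.nn.leaky_relu is an ordinary callable.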