How should I use LeakyReLU together with WeightNormalization?

Asked: 2020-10-31 07:37:23

Tags: python keras activation

The initial code is below. How should I change the activation so that it uses LeakyReLU?

import tensorflow as tf
import tensorflow_addons as tfa

L = tf.keras.layers

# shape and output_shape are defined elsewhere
model = tf.keras.models.Sequential([
    L.InputLayer(input_shape=shape),

    L.BatchNormalization(),
    L.Dropout(0.25),
    tfa.layers.WeightNormalization(L.Dense(512, activation='relu',
                                           kernel_initializer="he_normal")),

    L.BatchNormalization(),
    L.Dropout(0.3),

    tfa.layers.WeightNormalization(L.Dense(output_shape, activation="sigmoid",
                                           kernel_initializer="he_normal")),
])

I have tried many ways to put LeakyReLU and WeightNormalization together, for example:

L.BatchNormalization(),
L.Dropout(0.25),
L.Dense(512, kernel_initializer='he_normal'),
tfa.layers.WeightNormalization(L.LeakyReLU(alpha=0.1)),

or:

L.BatchNormalization(),
L.Dropout(0.25),
tfa.layers.WeightNormalization(L.Dense(512, activation=L.LeakyReLU(alpha=0.1),
                                       kernel_initializer="he_normal")),

But none of them are correct.
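For reference, one pattern that should work is to wrap only the layer that owns the weights (the Dense layer) in tfa.layers.WeightNormalization, and to apply LeakyReLU as a separate layer immediately after it. A minimal sketch of that pattern, assuming shape and output_shape are defined as in the code above (a sketch, not a verified answer):

import tensorflow as tf
import tensorflow_addons as tfa

L = tf.keras.layers

model = tf.keras.models.Sequential([
    L.InputLayer(input_shape=shape),

    L.BatchNormalization(),
    L.Dropout(0.25),
    # Wrap only the Dense layer in WeightNormalization, leaving its
    # activation as the default (linear)...
    tfa.layers.WeightNormalization(L.Dense(512, kernel_initializer="he_normal")),
    # ...and apply LeakyReLU as its own layer right after it.
    L.LeakyReLU(alpha=0.1),

    L.BatchNormalization(),
    L.Dropout(0.3),

    tfa.layers.WeightNormalization(L.Dense(output_shape, activation="sigmoid",
                                           kernel_initializer="he_normal")),
])

The idea is that WeightNormalization reparameterizes the wrapped layer's kernel, while LeakyReLU has no trainable kernel to normalize, which is likely why wrapping the activation layer itself (the first attempt above) fails.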

0 Answers:

No answers