Concatenating layers in Keras

Date: 2019-10-16 23:01:24

Tags: keras deep-learning keras-layer

I am trying to implement the code from this paper. As a beginner in deep learning, I cannot fully understand how they use concatenation to build their "wide and deep neural network" (WDNN). This is the function they use to build the WDNN:

    import numpy as np
    from tensorflow.keras.layers import Input, Dense, BatchNormalization, Dropout, concatenate
    from tensorflow.keras.models import Model
    from tensorflow.keras.optimizers import Adam
    from tensorflow.keras import regularizers

    def WDNN(data):
        input = Input(shape=(data.shape[1],))
        # Deep path: three Dense/BatchNorm/Dropout blocks
        x = Dense(256, activation='relu', kernel_regularizer=regularizers.l2(1e-8))(input)
        x = BatchNormalization()(x)
        x = Dropout(0.5)(x)
        x = Dense(256, activation='relu', kernel_regularizer=regularizers.l2(1e-8))(x)
        x = BatchNormalization()(x)
        x = Dropout(0.5)(x)
        x = Dense(256, activation='relu', kernel_regularizer=regularizers.l2(1e-8))(x)
        x = BatchNormalization()(x)
        x = Dropout(0.5)(x)
        # Wide path: the raw input tensor, joined with the deep output
        wide_deep = concatenate([input, x])
        preds = Dense(1, activation='sigmoid', kernel_regularizer=regularizers.l2(1e-8))(wide_deep)
        model = Model(inputs=input, outputs=preds)  # the paper used the legacy keywords input=/output=
        opt = Adam(learning_rate=np.exp(-1.0 * 9))  # the paper used the legacy keyword lr=
        model.compile(optimizer=opt,
                      loss='binary_crossentropy',
                      metrics=['accuracy'])
        return model

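For context, here is a minimal sketch (mine, not from the paper) of what concatenate([input, x]) produces. The "wide" path is the raw input tensor and the "deep" path is the 256-unit output of the last Dropout; concatenate joins them along the feature axis, so the final Dense layer sees both. Assuming a hypothetical 100-feature input:

    from tensorflow.keras.layers import Input, Dense, concatenate

    inp = Input(shape=(100,))                  # wide path: the raw features (hypothetical size)
    deep = Dense(256, activation='relu')(inp)  # stand-in for the three-block deep stack
    merged = concatenate([inp, deep])          # joins along the last axis by default
    print(merged.shape)                        # (None, 356): 100 wide + 256 deep features

This is why the architecture is called "wide and deep": the sigmoid output layer receives the learned 256-feature representation and the untouched input features side by side.
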
Following the guidance in the book Deep Learning with Keras, written by the Keras developers, I came up with the function below. But I cannot figure out how the original function actually uses concatenation. How can I implement it in my own code to do the same thing? Any hints are appreciated.

    import numpy as np
    from tensorflow.keras import layers, models, regularizers
    from tensorflow.keras.optimizers import Adam

    def WDNN(data):
        model = models.Sequential()
        model.add(layers.Dense(256, activation='relu', kernel_regularizer=regularizers.l2(1e-8), input_shape=(data.shape[1],)))
        model.add(layers.BatchNormalization())
        model.add(layers.Dropout(0.5))
        model.add(layers.Dense(256, activation='relu', kernel_regularizer=regularizers.l2(1e-8)))
        model.add(layers.BatchNormalization())
        model.add(layers.Dropout(0.5))
        model.add(layers.Dense(256, activation='relu', kernel_regularizer=regularizers.l2(1e-8)))
        model.add(layers.BatchNormalization())
        model.add(layers.Dropout(0.5))
        # Note: a Sequential model has no way to route the raw input
        # around the deep stack, so there is no concatenate step here.
        model.add(layers.Dense(1, activation='sigmoid', kernel_regularizer=regularizers.l2(1e-8)))
        # Compile model
        opt = Adam(learning_rate=np.exp(-1.0 * 9))
        model.compile(optimizer=opt,
                      loss='binary_crossentropy',
                      metrics=['accuracy'])
        return model
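
For what it's worth, a Sequential model cannot express this on its own: each layer feeds only into the next, so the raw input tensor is no longer in scope when the output layer is added, and there is nothing to concatenate. One workaround (a sketch under my own assumptions, not the paper's code; WDNN_functional is a hypothetical name) is to keep the Sequential stack as the deep path and call it inside the functional API:

    import numpy as np
    from tensorflow.keras import layers, models, regularizers
    from tensorflow.keras.optimizers import Adam

    def WDNN_functional(data):
        # Deep path: the same three Dense/BatchNorm/Dropout blocks,
        # built as a Sequential sub-model without the final sigmoid.
        deep = models.Sequential()
        deep.add(layers.Dense(256, activation='relu',
                              kernel_regularizer=regularizers.l2(1e-8),
                              input_shape=(data.shape[1],)))
        deep.add(layers.BatchNormalization())
        deep.add(layers.Dropout(0.5))
        for _ in range(2):
            deep.add(layers.Dense(256, activation='relu',
                                  kernel_regularizer=regularizers.l2(1e-8)))
            deep.add(layers.BatchNormalization())
            deep.add(layers.Dropout(0.5))

        # Wide path: the raw input tensor, merged with the deep output.
        inputs = layers.Input(shape=(data.shape[1],))
        wide_deep = layers.concatenate([inputs, deep(inputs)])
        preds = layers.Dense(1, activation='sigmoid',
                             kernel_regularizer=regularizers.l2(1e-8))(wide_deep)

        model = models.Model(inputs=inputs, outputs=preds)
        model.compile(optimizer=Adam(learning_rate=np.exp(-1.0 * 9)),
                      loss='binary_crossentropy',
                      metrics=['accuracy'])
        return model

Calling deep(inputs) treats the entire Sequential stack as a single layer, so the inputs tensor stays available and can be concatenated with the stack's output exactly as in the paper's version.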

0 Answers