Add extra kernels to a CNN layer while keeping the weights learned by the other kernels

Date: 2019-06-24 19:03:26

Tags: tensorflow keras deep-learning conv-neural-network

I am training a simple feed-forward convolutional neural network on the CIFAR-10 dataset. After running it for a few epochs, I want to increase the number of kernels (filters) in the second conv layer from 16 to k.

How can I do this while keeping the trained weights of all the other kernels and layers in the model unchanged?


from tensorflow.keras.layers import (Input, Conv2D, BatchNormalization, Dropout,
                                     Activation, MaxPooling2D, Flatten, Dense)
from tensorflow.keras.models import Model

num_classes = 10  # CIFAR-10 has 10 classes

def conv_layer(inp, fltrs):
    inp = Conv2D(filters = fltrs, kernel_size = 3, strides = 1, padding = 'valid')(inp)
    inp = BatchNormalization()(inp)
    inp = Dropout(0.25)(inp)
    inp = Activation('relu')(inp)
    return inp

inp = Input(shape = (32, 32, 3))

x0 = conv_layer(inp, 8)
x1 = conv_layer(x0, 16)
x2 = MaxPooling2D(pool_size= 2, strides=None, padding='valid', data_format=None)(x1)
x3 = conv_layer(x2, 32)
x4 = conv_layer(x3, 48)
x5 = conv_layer(x4, 64)
x6 = MaxPooling2D(pool_size= 2, strides=None, padding='valid', data_format=None)(x5)
x7 = Flatten()(x6)
x8 = Dense(512)(x7)
x9 = BatchNormalization()(x8)
x10 = Dropout(0.25)(x9)
x11 = Activation('relu')(x10)
x12 = Dense(num_classes, activation='softmax')(x11)


model = Model(inputs = [inp], outputs = [x12])
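One possible approach (a sketch, not from the original post): rebuild the model with k filters in the second conv layer, then copy the trained weights over layer by layer, padding only the grown tensors. The helper below uses plain NumPy to illustrate the weight surgery on the conv kernels; the function name `grow_conv_filters` and the small-Gaussian init for the new filters are my own choices, not anything from the question.

```python
import numpy as np

def grow_conv_filters(kernel, bias, next_kernel, k, rng=None):
    """Expand a conv layer from its current filter count to k filters.

    kernel:      (kh, kw, in_ch, old_k) trained weights of the layer being grown
    bias:        (old_k,) trained bias of that layer
    next_kernel: (kh, kw, old_k, out_ch) trained weights of the following conv layer
    Returns new (kernel, bias, next_kernel) with all trained values preserved.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    old_k = kernel.shape[-1]
    extra = k - old_k
    # New output filters: small random init, appended along the filter axis
    new_filters = rng.normal(0.0, 0.05, kernel.shape[:-1] + (extra,))
    kernel2 = np.concatenate([kernel, new_filters], axis=-1)
    bias2 = np.concatenate([bias, np.zeros(extra)])
    # The next layer must now accept k input channels; zero-init the new
    # input slices so the network computes the same function right after growing.
    new_in = np.zeros(next_kernel.shape[:2] + (extra,) + next_kernel.shape[3:])
    next_kernel2 = np.concatenate([next_kernel, new_in], axis=2)
    return kernel2, bias2, next_kernel2
```

To apply this to the model above, you would build a second model with `conv_layer(x0, k)` instead of `conv_layer(x0, 16)` and call `set_weights` on each layer, passing the old weights unchanged except for the grown conv layer and its successor. Note that the `BatchNormalization` layer after the grown conv also has per-channel parameters (gamma, beta, moving mean/variance) of length 16 that would need the same padding to length k.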

0 Answers:

No answers yet.