keras: How to freeze convolutional layer weights

Asked: 2016-12-27 18:39:57

Tags: python machine-learning keras conv-neural-network

Here I have a GoogleNet model for Keras. Is there any way to keep individual layers of the network from changing? I want to prevent the first two layers of the pre-trained model from being modified during training.

1 Answer:

Answer 0 (score: 3)

By "keeping individual layers from changing" I assume you mean that you don't want to train those layers, i.e. you don't want to modify the loaded weights (possibly learned in earlier training).

If that is the case, you can pass trainable=False to the layer, and its parameters will not be touched by the training update rule.

Example:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential([
    Dense(32, input_dim=100),
    Dense(output_dim=10),
    Activation('sigmoid'),
])

model.summary()

# Same architecture, but the first Dense layer is frozen with trainable=False
model2 = Sequential([
    Dense(32, input_dim=100, trainable=False),
    Dense(output_dim=10),
    Activation('sigmoid'),
])

model2.summary()

In the summary of the second model you can see that the parameters of the frozen layer are counted as non-trainable parameters.

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
dense_1 (Dense)                  (None, 32)            3232        dense_input_1[0][0]              
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 10)            330         dense_1[0][0]                    
____________________________________________________________________________________________________
activation_1 (Activation)        (None, 10)            0           dense_2[0][0]                    
====================================================================================================
Total params: 3,562
Trainable params: 3,562
Non-trainable params: 0
____________________________________________________________________________________________________
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
dense_3 (Dense)                  (None, 32)            3232        dense_input_2[0][0]              
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 10)            330         dense_3[0][0]                    
____________________________________________________________________________________________________
activation_2 (Activation)        (None, 10)            0           dense_4[0][0]                    
====================================================================================================
Total params: 3,562
Trainable params: 330
Non-trainable params: 3,232
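
For a model that has already been built or loaded (such as the pre-trained GoogleNet in the question), you can also set the trainable attribute on the layers after the fact and then recompile. A minimal sketch, using a small Sequential model as a stand-in for the loaded pre-trained model; the optimizer and loss below are placeholders:

from keras.models import Sequential
from keras.layers import Dense, Activation

# Stand-in for an already-loaded pre-trained model
model3 = Sequential([
    Dense(32, input_dim=100),
    Dense(output_dim=10),
    Activation('sigmoid'),
])

# Freeze the first two layers after the model has been built/loaded
for layer in model3.layers[:2]:
    layer.trainable = False

# The change only takes effect once the model is (re)compiled
model3.compile(optimizer='sgd', loss='categorical_crossentropy')
model3.summary()  # all 3,562 parameters now show up as non-trainable

Note that trainable must be set before compiling; if the model was already compiled, compile it again after changing the attribute so the frozen weights are excluded from the updates.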