How to check the learning rate with train_on_batch [Keras]

Asked: 2018-04-18 13:29:34

Tags: machine-learning neural-network deep-learning keras conv-neural-network

I am using Keras on Python 2. Does anyone know how to check and modify the learning rate of the ADAM optimizer? Here is my neural network, where I define my own optimizer. When training in batches with model.train_on_batch(...) I have no way to track the learning rate. Thanks for your help.

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dropout, Dense
from keras.optimizers import Adam

def CNN_model():
    # Create model; input is a single-channel 256x256 image
    # (channels_first data format)
    model = Sequential()
    model.add(Conv2D(12, (5, 5), input_shape=(1, 256, 256), activation='elu'))
    model.add(MaxPooling2D(pool_size=(3, 3)))
    model.add(Conv2D(12, (5, 5), activation='elu'))
    model.add(MaxPooling2D(pool_size=(4, 4)))
    model.add(Conv2D(12, (3, 3), activation='elu'))
    model.add(MaxPooling2D(pool_size=(3, 3)))
    model.add(Flatten())
    model.add(Dropout(0.3))
    model.add(Dense(128, activation='elu'))
    model.add(Dropout(0.3))
    model.add(Dense(32, activation='elu'))
    model.add(Dense(2, activation='softmax'))
    # Compile model
    my_optimizer = Adam(lr=0.001, decay=0.05)
    model.compile(loss='categorical_crossentropy', optimizer=my_optimizer, metrics=['accuracy'])
    return model
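Note that with decay set, the value stored in optimizer.lr never changes; Keras' legacy time-based decay scales it per update as lr / (1 + decay * iterations), where iterations is the optimizer's batch counter. A minimal sketch of that formula (effective_lr is an illustrative helper, not a Keras API):

```python
def effective_lr(base_lr, decay, iterations):
    # Keras' legacy time-based decay: lr_t = lr / (1 + decay * t),
    # where t counts optimizer updates (batches), not epochs
    return base_lr / (1.0 + decay * iterations)

print(effective_lr(0.001, 0.05, 0))    # 0.001 on the first update
print(effective_lr(0.001, 0.05, 100))  # ~0.000167 after 100 updates
```

So with decay=0.05 the rate you actually train with shrinks every batch even though reading optimizer.lr keeps returning 0.001.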

2 Answers:

Answer 0 (score: 3)

You can do this in several ways. The simplest I can think of is via a callback:

from keras.callbacks import Callback
from keras import backend as K

class showLR(Callback):
    def on_epoch_begin(self, epoch, logs=None):
        # Read the current value of the optimizer's lr variable
        lr = float(K.get_value(self.model.optimizer.lr))
        print("epoch={:02d}, lr={:.5f}".format(epoch, lr))
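One caveat: Keras callbacks only fire from fit() / fit_generator(), so a callback like the one above will not run when you train with model.train_on_batch(). In a manual batch loop you can read and modify the rate yourself with K.get_value / K.set_value. A minimal sketch, where step_decay is a hypothetical schedule helper (not part of Keras):

```python
def step_decay(initial_lr, epoch, drop=0.5, every=10):
    # Halve the learning rate every `every` epochs (hypothetical schedule)
    return initial_lr * (drop ** (epoch // every))

# In the manual training loop one would then do, for example:
# K.set_value(model.optimizer.lr, step_decay(0.001, epoch))
# print(K.get_value(model.optimizer.lr))  # confirm the new value
# model.train_on_batch(x_batch, y_batch)
```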

Answer 1 (score: 0)

You can use the ReduceLROnPlateau callback. Add it to your callbacks list and pass that list to your training call:

from keras.callbacks import ReduceLROnPlateau

callbacks = [ReduceLROnPlateau(monitor='val_acc',
                               patience=5,
                               verbose=1,
                               factor=0.5,
                               min_lr=0.00001)]
model = CNN_model()
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          validation_data=(x_valid, y_valid),
          callbacks=callbacks)