How to provide the learning rate value to TensorBoard in Keras

Date: 2017-09-19 11:05:21

Tags: python tensorflow keras tensorboard

I'm using Keras and want to implement a custom learning rate schedule via keras.callbacks.LearningRateScheduler.

How can I pass this learning rate to keras.callbacks.TensorBoard so that it can be monitored in TensorBoard?

Currently I have:

# Decay the learning rate by 5% per epoch
lrate = LearningRateScheduler(lambda epoch: initial_lr * 0.95 ** epoch)

tensorboard = TensorBoard(log_dir=LOGDIR, histogram_freq=1,
                          batch_size=batch_size, embeddings_freq=1,
                          embeddings_layer_names=embedding_layer_names)

model.fit_generator(train_generator, steps_per_epoch=n_steps,
                    epochs=n_epochs,
                    validation_data=(val_x, val_y),
                    callbacks=[lrate, tensorboard])

2 Answers:

Answer 0 (score: 1)

I don't know how to pass it to TensorBoard, but you can monitor it from Python:

from keras.callbacks import Callback

class LossHistory(Callback):
    def on_train_begin(self, logs=None):
        self.losses = []
        self.lr = []

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # Record the loss and the learning rate the scheduler used this epoch
        self.losses.append(logs.get('loss'))
        self.lr.append(initial_lr * 0.95 ** epoch)

loss_hist = LossHistory()

Then just add loss_hist to the callbacks list.
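
For example, a minimal usage sketch (model, the training data, lrate, and initial_lr are assumed to exist as in the question):

# Attach the history callback alongside the scheduler
model.fit(x_train, y_train,
          epochs=n_epochs,
          callbacks=[lrate, loss_hist])

# After training, loss_hist.lr holds one learning-rate value per epoch
print(loss_hist.lr)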

Update

Based on this answer:

import tensorflow as tf
from keras.callbacks import TensorBoard

class LRTensorBoard(TensorBoard):

    def __init__(self, log_dir='./logs', **kwargs):
        super(LRTensorBoard, self).__init__(log_dir, **kwargs)

        self.lr_log_dir = log_dir

    def set_model(self, model):
        # Separate writer used only for the learning-rate summaries
        self.lr_writer = tf.summary.FileWriter(self.lr_log_dir)
        super(LRTensorBoard, self).set_model(model)

    def on_epoch_end(self, epoch, logs=None):
        # Recompute the scheduled learning rate for this epoch
        lr = initial_lr * 0.95 ** epoch

        summary = tf.Summary(value=[tf.Summary.Value(tag='lr',
                                                     simple_value=lr)])
        self.lr_writer.add_summary(summary, epoch)
        self.lr_writer.flush()

        super(LRTensorBoard, self).on_epoch_end(epoch, logs)

    def on_train_end(self, logs=None):
        super(LRTensorBoard, self).on_train_end(logs)
        self.lr_writer.close()

Just use it like the normal TensorBoard callback.
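
For instance (a sketch reusing the lrate scheduler and data objects from the question):

tensorboard = LRTensorBoard(log_dir=LOGDIR, histogram_freq=1)

model.fit_generator(train_generator, steps_per_epoch=n_steps,
                    epochs=n_epochs,
                    validation_data=(val_x, val_y),
                    callbacks=[lrate, tensorboard])

The 'lr' scalar then shows up in TensorBoard's Scalars tab alongside the other metrics.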

Answer 1 (score: 1)

TensorFlow 2.x:

A LearningRateScheduler log for TensorBoard can be created as follows:

import tensorflow as tf
from tensorflow.keras.callbacks import LearningRateScheduler, TensorBoard

# Define your scheduling function
def scheduler(epoch):
    return 0.001 * 0.95 ** epoch

# Define scheduler
lr_scheduler = LearningRateScheduler(scheduler)

# Alternatively, use an anonymous function
# lr_scheduler = LearningRateScheduler(lambda epoch: initial_lr * 0.95 ** epoch)

# Define TensorBoard callback child class
class LRTensorBoard(TensorBoard):
    def __init__(self, log_dir, **kwargs):
        super().__init__(log_dir, **kwargs)
        self.lr_writer = tf.summary.create_file_writer(self.log_dir + '/learning')

    def on_epoch_end(self, epoch, logs=None):
        # Read the optimizer's learning-rate variable and log it as a scalar
        lr = getattr(self.model.optimizer, 'lr', None)
        with self.lr_writer.as_default():
            tf.summary.scalar('learning_rate', lr, step=epoch)
        super().on_epoch_end(epoch, logs)

    def on_train_end(self, logs=None):
        super().on_train_end(logs)
        self.lr_writer.close()

# Create callback object
tensorboard_callback = LRTensorBoard(log_dir='./logs/', histogram_freq=1)

# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
r = model.fit(X_train, y_train, validation_data=(X_val, y_val),
              epochs=25, batch_size=200,
              callbacks=[tensorboard_callback, lr_scheduler])

The learning rate can then be viewed in TensorBoard:

[Image: Learning rate logging in TensorBoard]
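
As a lighter-weight alternative (a sketch of my own, not part of the original answers; model and the './logs/lr' directory are assumptions), the optimizer's current learning rate can also be logged in TF 2.x with a LambdaCallback instead of subclassing TensorBoard:

import tensorflow as tf
from tensorflow.keras.callbacks import LambdaCallback

# Hypothetical writer directory; pair this with a regular TensorBoard callback
lr_writer = tf.summary.create_file_writer('./logs/lr')

def log_lr(epoch, logs):
    # Read the optimizer's current learning rate as a plain float
    lr = tf.keras.backend.get_value(model.optimizer.lr)
    with lr_writer.as_default():
        tf.summary.scalar('learning_rate', lr, step=epoch)

lr_logger = LambdaCallback(on_epoch_end=log_lr)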