How to set a hyperparameter (learning_rate) schedule in TensorFlow?

Asked: 2017-12-19 23:26:36

Tags: tensorflow hyperparameters

What is the way to schedule hyperparameters in TensorFlow?

That is, for reproducibility I would like to implement, say, ResNet (you name it) with the suggested learning-rate schedule {0: 0.1, 1: 1, 100: 0.01, 150: 0.001}, or to enable weight decay only after the first few epochs.

For example, tensorpack offers the following option:

ScheduledHyperParamSetter('learning_rate', [(1, 0.1), (82, 0.01), (123, 0.001), (300, 0.0002)])

How can this be done in native TF?

1 answer:

Answer 0 (score: 0)

Well, it turns out this isn't that hard:

    # Schedule maps epoch -> learning rate.
    schedule = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4, 100: 0.01, 150: 0.001}
    schedule = sorted(schedule.items(), key=lambda x: x[0])

    # num_train_iter: number of training iterations per epoch.
    boundaries = [num_train_iter * int(x[0]) for x in schedule]
    rates = [x[1] for x in schedule]
    # piecewise_constant expects one more value than boundaries; the
    # duplicated first rate covers the steps before the first boundary.
    rates = rates[:1] + rates
    assert len(boundaries) + 1 == len(rates)

    learning_rate = tf.train.piecewise_constant(
        tf.cast(global_step, tf.int32), boundaries, rates)
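To make the boundaries/rates semantics concrete, here is a minimal pure-Python sketch of the lookup that `tf.train.piecewise_constant` performs (the epoch keys and the per-epoch iteration count `num_train_iter` below are made-up illustration values, not from the answer):

```python
import bisect

def piecewise_constant(step, boundaries, rates):
    """Return the rate active at `step`: rates[0] while step <= boundaries[0],
    rates[i] while boundaries[i-1] < step <= boundaries[i], and so on.
    `rates` must have exactly one more entry than `boundaries`."""
    assert len(rates) == len(boundaries) + 1
    return rates[bisect.bisect_left(boundaries, step)]

# Hypothetical example: 100 iterations per epoch, schedule keyed by epoch.
num_train_iter = 100
schedule = sorted({1: 0.1, 82: 0.01, 123: 0.001}.items())
boundaries = [num_train_iter * epoch for epoch, _ in schedule]
rates = [r for _, r in schedule]
rates = rates[:1] + rates  # duplicate first rate for steps before boundary 0

print(piecewise_constant(0, boundaries, rates))      # 0.1
print(piecewise_constant(8201, boundaries, rates))   # 0.01
print(piecewise_constant(12301, boundaries, rates))  # 0.001
```

This is only a sketch of the selection logic; in the graph version above, TensorFlow builds the same comparison chain as ops over the `global_step` tensor.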