I want to use inverse time decay of the learning rate, with the formula
decayed_learning_rate = learning_rate / (1 + decay_rate * t)
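For intuition, here is what that formula produces over the first few steps (plain Python; the constants 0.1 and 0.5 are just illustrative):

learning_rate = 0.1
decay_rate = 0.5
for t in range(4):
    # decayed_learning_rate = learning_rate / (1 + decay_rate * t)
    print(t, learning_rate / (1 + decay_rate * t))
# -> 0.1, 0.0667, 0.05, 0.04 (rounded)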
Here is an example:
...
global_step = tf.Variable(0, trainable=False)
learning_rate = 0.1
k = 0.5
# NOTE: inverse_time_decay also takes decay_steps; with decay_steps=1,
# k is the decay_rate in the formula above.
learning_rate = tf.train.inverse_time_decay(learning_rate, global_step,
                                            decay_steps=1, decay_rate=k)
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)
    .minimize(...my loss..., global_step=global_step)
)
In the example above, k is the decay_rate, but how do I set t? Is t the same as global_step?
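For reference, one way to check this empirically is a small TF1-style sketch like the one below. decay_steps=1 is my assumption (the library formula divides global_step by decay_steps, so decay_steps=1 reduces it to the formula above), and tf.assign_add simply stands in for the increment that minimize() would perform:

import tensorflow as tf

learning_rate = 0.1
k = 0.5
global_step = tf.Variable(0, trainable=False)
# With decay_steps=1 the library computes
#   learning_rate / (1 + decay_rate * global_step / decay_steps)
# which reduces to learning_rate / (1 + k * t) with t = global_step.
decayed_lr = tf.train.inverse_time_decay(learning_rate, global_step,
                                         decay_steps=1, decay_rate=k)
increment = tf.assign_add(global_step, 1)  # stands in for minimize() here

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(4):
        print(sess.run([global_step, decayed_lr]))  # t and decayed rate
        sess.run(increment)
# prints roughly: [0, 0.1], [1, 0.0667], [2, 0.05], [3, 0.04]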
Answer 0 (score: 0)