The Adam optimizer's learning rate never changes

Asked: 2018-10-16 07:11:30

Tags: tensorflow tensorflow-estimator

I want to log the Adam optimizer's learning rate from a TensorFlow Estimator, like this:

def model_fn(features, labels, mode):
    ...
    optimizer = tf.train.AdamOptimizer(learning_rate=0.1)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    log_hook = tf.train.LoggingTensorHook({"lr": optimizer._lr_t}, every_n_iter=10)
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op,
                                      training_hooks=[log_hook])

We know that tf.train.AdamOptimizer's learning rate decays over training. But the value I log is always 0.1:

INFO:tensorflow:lr = 0.1 (4.537 sec)
INFO:tensorflow:global_step/sec: 2.18827
INFO:tensorflow:loss = 8.285036e-07, step = 16180 (4.570 sec)
INFO:tensorflow:lr = 0.1 (4.570 sec)
INFO:tensorflow:global_step/sec: 2.21156
INFO:tensorflow:loss = 8.225431e-07, step = 16190 (4.521 sec)
INFO:tensorflow:lr = 0.1 (4.521 sec)
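For context, the "decay" usually attributed to Adam is its bias-corrected step size, which is derived each step from a *constant* base rate, so logging the base rate will always print 0.1. A minimal sketch of that formula (plain Python for illustration, not Estimator code; default beta values assumed):

```python
import math

def adam_effective_stepsize(lr, t, beta1=0.9, beta2=0.999):
    """Bias-corrected step size alpha_t = lr * sqrt(1 - beta2^t) / (1 - beta1^t).

    The base lr stays fixed; only this correction factor varies with step t,
    and it converges back toward lr as t grows.
    """
    return lr * math.sqrt(1.0 - beta2 ** t) / (1.0 - beta1 ** t)

# The base learning rate never changes; only the correction factor does.
for t in (1, 10, 100, 1000):
    print(t, adam_effective_stepsize(0.1, t))
```

Since this factor tends to 1 for large `t`, a constant logged value of 0.1 is consistent with how Adam is defined, even though per-parameter updates are still scaled adaptively by the moment estimates.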

Am I logging the AdamOptimizer learning rate the right way?

Update: I logged optimizer._lr instead, following this answer, but got this error:

ValueError: Passed 0.1 should have graph attribute that is equal to current graph <tensorflow.python.framework.ops.Graph object at 0x7f96a290a350>.
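A likely cause (an assumption based on the traceback, not confirmed from the TensorFlow source here): `optimizer._lr` holds the plain Python float passed to the constructor, while `optimizer._lr_t` is the tensor created from it, and `LoggingTensorHook` rejects anything that lacks a graph. A stand-in sketch of that distinction, using hypothetical fake classes rather than the real optimizer:

```python
# Stand-in illustrating the assumed internals: the optimizer keeps both the
# raw Python value (_lr) and a tensor built from it (_lr_t).
class FakeTensor:
    def __init__(self, value, graph="g"):
        self.value, self.graph = value, graph  # real tensors carry a .graph


class FakeAdamOptimizer:
    def __init__(self, learning_rate):
        self._lr = learning_rate               # plain float: no .graph attribute
        self._lr_t = FakeTensor(learning_rate)


def logging_hook_check(item):
    # LoggingTensorHook-style validation: reject anything without a graph.
    if not hasattr(item, "graph"):
        raise ValueError(f"Passed {item} should have graph attribute ...")
    return item


opt = FakeAdamOptimizer(0.1)
logging_hook_check(opt._lr_t)      # accepted: tensor-like, has .graph
try:
    logging_hook_check(opt._lr)    # raises ValueError, like the traceback above
except ValueError as e:
    print(e)
```

Under that assumption, logging `_lr_t` (as in the original code) is the tensor-valued choice, and the constant 0.1 it prints is simply the fixed base rate.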

0 Answers:

There are no answers yet.