Using the Adam optimizer twice in TensorFlow

Time: 2017-07-07 08:44:17

Tags: python tensorflow

I am trying to use the Adam optimizer twice to minimize two different tensors in my code. Using GradientDescentOptimizer twice works fine, but when I use the Adam optimizer twice I get an error. I asked another question about this, tensorflowVariable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but that solution does not work here. I also looked at https://github.com/tensorflow/tensorflow/issues/6220, but I still don't understand what is going on.

Here is my code. I get the error message: ValueError: Variable NN/NN/W/Adam_2/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

I then tried the solution from tensorflowVariable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but it did not work.

import tensorflow as tf

def main():
    optimizer = tf.train.GradientDescentOptimizer(0.005)
    # optimizer = tf.train.AdamOptimizer(0.005)

    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))


        # train_op1 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_1)
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)
        # with tf.variable_scope('opt'):
        #     train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)

        ##############################################################################################
        # From here on, every tf.get_variable() in this scope must reuse an
        # existing variable.
        scope.reuse_variables()

        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y_ - y2))

        # train_op2 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_2)
        # This second AdamOptimizer is what raises the ValueError: with reuse now
        # enabled on the scope, it cannot create its new slot variables
        # (NN/NN/W/Adam_2/, ...).
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)
        # with tf.variable_scope('opt'):
        #     train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)


if __name__ == '__main__':
    main()

2 Answers:

Answer 0 (score: 0)

If you have to do this within the same scope, make sure all the variables are defined before reuse is enabled. I would have to do more research on exactly why it works this way, but the optimizer settings get locked into the graph at a lower level and are no longer dynamically accessible.

A minimal working example:

import tensorflow as tf

def main():

    with tf.variable_scope('NN') as scope:
        assert scope.reuse == False  # all variables below are created fresh in this scope
        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y2_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.get_variable(name='b', initializer=tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y2_ - y2))

        # train_op2 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_2)
        # The first AdamOptimizer is created while the scope is still non-reusing,
        # so its slot variables can be added to the graph.
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)



    ##############################################################################################
    with tf.variable_scope('NN', reuse=True) as scope:
        # With reuse=True, every tf.get_variable() below returns the variables
        # that were already created in the first 'NN' block.
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.get_variable(name='b', initializer=tf.random_normal(shape=[5, 1], dtype=tf.float32))

        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))


        # train_op1 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_1)
        # With all of the variables already defined, this second AdamOptimizer no
        # longer raises the ValueError from the question.
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)


if __name__ == '__main__':
    main()
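
If you want to actually run the two training ops, something like the following could be appended at the end of main(). This is only an illustrative sketch; the step count and the alternating update order are arbitrary choices, not part of the answer:

    # Illustrative: initialize the variables and alternate the two training ops.
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            _, l2 = sess.run([train_op2, loss_2])
            _, l1 = sess.run([train_op1, loss_1])
        print('loss_1 = %.4f, loss_2 = %.4f' % (l1, l2))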

Answer 1 (score: 0)

The easiest way to solve this problem is to put the second optimizer in a different variable scope. That way the naming will not cause any confusion.
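
For example, here is a minimal sketch of that suggestion applied to the toy graph from the question; the scope names adam_1 and adam_2 are just illustrative:

import tensorflow as tf

def main():
    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))

        scope.reuse_variables()
        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y_ - y2))

    # Each minimize() call gets its own fresh, non-reusing scope, so the slot
    # variables Adam creates no longer collide with the reusing 'NN' scope.
    with tf.variable_scope('adam_1'):
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)
    with tf.variable_scope('adam_2'):
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

if __name__ == '__main__':
    main()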