TL;DR: The following code describes the desired behavior, i.e. the optimizer modifies the variable's value:
import tensorflow as tf

x = tf.Variable(5.0)
# x = x.assign(tf.constant(5.0))
opt = tf.train.MomentumOptimizer(learning_rate=0.0001, momentum=0.9)
train_op = opt.minimize(x)
with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    sess.run(train_op)
    print(x.eval())  # => 4.9999 - desired result
However, if the value is assigned to the variable at run time, the assignment is not an operation that gradients can flow through:
x = tf.Variable(5.0)
x_ = x.assign(tf.constant(5.0))
opt = tf.train.MomentumOptimizer(learning_rate=0.0001, momentum=0.9)
train_op = opt.minimize(x_)
with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    sess.run(train_op)  # => ERROR: No gradients provided for any variable
    print(x.eval())
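The error occurs because the assign op is not differentiable, so there is no gradient path from x_ back to x. A minimal check of this (a sketch, assuming the same TF 1.x APIs used in the question):

import tensorflow as tf

x = tf.Variable(5.0)
x_ = x.assign(tf.constant(5.0))
# The Assign op has no registered gradient, so no trainable path exists from x_ to x.
print(tf.gradients(x_, [x]))  # expected: [None]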
Is there a way to set a variable to a specific value and still have the train op modify it through gradients?
Edit: code fixed.
Answer 0 (score: 0)
This dirty hack works:
x = tf.Variable(99.0)
const = tf.constant(5.0)
x_ = x + tf.stop_gradient(-x) + const # ARGHH
opt = tf.train.MomentumOptimizer(learning_rate=0.0001, momentum=0.9)
train = opt.minimize(x_)
with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    print(x_.eval())
    x_original = x.eval()
    sess.run(train)
    print(x.eval() - x_original + const.eval())
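The trick works because the forward value of x_ equals const (x and -x cancel out), while the gradient of x_ with respect to x is 1, since tf.stop_gradient blocks the -x term; the optimizer therefore updates x as if it currently held the value const. A plainer alternative, if the assignment does not have to live inside the training graph, is to run the assign op separately and minimize the variable itself. This is a sketch under the same TF 1.x assumptions, not part of the original answer:

import tensorflow as tf

x = tf.Variable(99.0)
assign_op = x.assign(5.0)            # set the value outside the training path
opt = tf.train.MomentumOptimizer(learning_rate=0.0001, momentum=0.9)
train_op = opt.minimize(x)           # gradients flow through the variable itself
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(assign_op)              # x is now 5.0
    sess.run(train_op)
    print(x.eval())                  # expected: 4.9999

Whether this fits depends on why the assignment had to be part of the graph in the first place; if it does, the stop_gradient construction above is the way to keep the value and the gradient path in a single op.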