Optimizer.apply_gradients() returns None

Date: 2018-09-24 03:22:58

Tags: tensorflow gradient-descent

I am trying to apply an extra step, say a simple multiplication, to the gradients of a subset of the trainable variables. Here is what I have:

def do_something(tgvt):
    new_tgvt = []
    for gv in tgvt:
        if gv[0] is None:
            # No gradient for this variable: substitute zeros of the same shape.
            sh = tf.shape(gv[1])
            gv0 = tf.zeros(sh)
            gv0t = tf.convert_to_tensor(gv0)
            new_tgvt.append((gv0t, gv[1]))
        else:
            # The extra step: scale the gradient by 5.
            new_tgvt.append((gv[0] * 5, gv[1]))
    return new_tgvt

optimizer = tf.train.GradientDescentOptimizer(learning_rate=1e-5)
params = tf.trainable_variables()
pars = [params[27], params[29]]
gradients = optimizer.compute_gradients(cost, pars)
tgv = [(g, v) for (g, v) in gradients]

new_gradients = do_something(tgv)
train_op = optimizer.apply_gradients(new_gradients)

session = tf.Session()
session.run(tf.global_variables_initializer())
total_iterations = 0  # record the total iterations
for i in range(total_iterations,total_iterations + num_iterations):
    x_batch, y_batch = data.train.next_batch(batch_size)
    feed_dict = {X: x_batch, y_true: y_batch, keep_prob: 0.5}
    result = session.run([train_op, pars], feed_dict=feed_dict)

When I print result, the gradients are None:

print(result[0])
print((result[1][0]).shape)      
print((result[1][1]).shape)

None
(5, 5, 1, 36)
(5, 5, 36, 64)

Is there any way to fix this?

1 Answer:

Answer 0 (score: 1):

From the docs, train_op (the value returned by apply_gradients) is:

  An Operation that applies the specified gradients.

Calling sess.run on train_op returns None because this op does not produce a value; it only applies the gradients.
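If you want to see the modified gradient values themselves, fetch the gradient tensors rather than the op that applies them. A minimal sketch, reusing the new_gradients, train_op, session, and feed_dict names from the question:

# Collect the modified gradient tensors built by do_something().
grad_tensors = [g for g, v in new_gradients]

# Run the update and fetch the gradient values in the same step.
_, grad_values = session.run([train_op, grad_tensors], feed_dict=feed_dict)

for g in grad_values:
    print(g.shape)   # numpy arrays holding the scaled gradients, not None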

Why not check for yourself by printing the old and updated values of one of the variables? For example:
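A minimal check along those lines, assuming the same session, pars, and feed_dict as in the question:

# Snapshot one variable, apply the update, and compare.
before = session.run(pars[0])
session.run(train_op, feed_dict=feed_dict)
after = session.run(pars[0])

# A non-zero difference confirms apply_gradients did its job.
print("max absolute update:", abs(after - before).max())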