Scaling gradients from sparse_softmax_cross_entropy_with_logits

Date: 2018-01-19 21:59:44

Tags: softmax cross-entropy

How can I scale the gradients of a loss computed with sparse_softmax_cross_entropy_with_logits? For example, I tried to divide each gradient by 128 as follows, but I get this error:

new_gradients = [(grad/128, var) for (grad, var) in gradients] 
TypeError: unsupported operand type(s) for /: 'IndexedSlices' and 'int'

The code I am using is as follows:

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)

gradients = opt.compute_gradients(loss)  # list of (gradient, variable) pairs

new_gradients = [(grad/128, var) for (grad, var) in gradients]

train_step = opt.apply_gradients(new_gradients)
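
For context, a minimal end-to-end sketch of this setup in TensorFlow 1.x is shown below (assuming a toy one-layer model and a plain GradientDescentOptimizer; the names inputs and labels are placeholders invented for illustration). Note that sparse_softmax_cross_entropy_with_logits returns one loss value per example, so it is common to reduce it to a scalar before computing gradients:

import tensorflow as tf

inputs = tf.placeholder(tf.float32, shape=[None, 10])  # hypothetical feature batch
labels = tf.placeholder(tf.int32, shape=[None])        # hypothetical class ids
logits = tf.layers.dense(inputs, 5)                    # toy model with 5 classes

# Reduce the per-example losses to a scalar before differentiating.
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels))

opt = tf.train.GradientDescentOptimizer(0.1)
gradients = opt.compute_gradients(loss)  # [(gradient, variable), ...]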

1 answer:

Answer 0 (score: 0)

I found the following solution to the problem:

new_gradients = [(grad/128, var) for (grad, var) in gradients]

should be

new_gradients = [(tf.div(grad, 128), var) for (grad, var) in gradients]
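
One caveat (not part of the original answer): tf.div converts an IndexedSlices gradient to a dense tensor, which TensorFlow 1.x typically flags with a "Converting sparse IndexedSlices to a dense Tensor" warning and which can use a lot of memory for large embedding variables. A sparsity-preserving alternative is to scale only the values of the IndexedSlices; scale_gradient below is a hypothetical helper sketched for illustration:

def scale_gradient(grad, scale):
    # Keep IndexedSlices sparse by scaling only its values.
    if isinstance(grad, tf.IndexedSlices):
        return tf.IndexedSlices(grad.values / scale, grad.indices, grad.dense_shape)
    return grad / scale  # dense gradients support plain division

new_gradients = [(scale_gradient(grad, 128.0), var) for (grad, var) in gradients]
train_step = opt.apply_gradients(new_gradients)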