gradient registry has no entry for: FloorMod

Time: 2017-10-24 01:01:13

Tags: python tensorflow python-3.5

The mod operation apparently has no gradient defined. When I run the code at the bottom of this post, I get the following messages:

LookupError: gradient registry has no entry for: FloorMod

LookupError: No gradient defined for operation 'mod' (op type: FloorMod)

System information

  • OS platform and distribution: Ubuntu 16.04
  • TensorFlow installed from: docker gpu image
  • TensorFlow version: 1.3.0 (v1.3.0-rc2-20-g0787eee)
  • Python version: 3.5.2
  • CUDA/cuDNN version: V8.0.61
  • GPU model and memory: GeForce GTX 1080, 8GB

Source code / logs

Here is a minimal reproducible example:

import tensorflow as tf

sess = tf.InteractiveSession()
# Dynamic batch dimension, so any number of rows can be fed.
a = tf.placeholder(dtype=tf.float32, shape=[None, 2])

W = tf.Variable(tf.zeros([2, 10]))
b = tf.Variable(tf.zeros([10]))
b = tf.matmul(a, W) + b

loss = tf.reduce_sum(b) % 2  # the % operator lowers to a FloorMod op
update_op = tf.train.AdamOptimizer(learning_rate=0.0001).minimize(loss)

sess.run(tf.global_variables_initializer())
sess.run(update_op, {a: [[1, 2], [3, 4]]})

This produces the following traceback:

---------------------------------------------------------------------------
LookupError                               Traceback (most recent call last)
/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/gradients_impl.py in gradients(ys, xs, grad_ys, name, colocate_gradients_with_ops, gate_gradients, aggregation_method)
    511             try:
--> 512               grad_fn = ops.get_gradient_function(op)
    513             except LookupError:

/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py in 
get_gradient_function(op)
   1835     op_type = op.type
-> 1836   return _gradient_registry.lookup(op_type)
   1837 

/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/registry.py in lookup(self, name)
     92       raise LookupError(
---> 93           "%s registry has no entry for: %s" % (self._name, name))

LookupError: gradient registry has no entry for: FloorMod

During handling of the above exception, another exception occurred:

LookupError                               Traceback (most recent call last)
<ipython-input-1-7b9ad04151d6> in <module>()
     10 
     11 loss = tf.reduce_sum(b) % 2
---> 12 update_op = tf.train.AdamOptimizer(learning_rate=0.0001).minimize(loss)
     13 
     14 sess.run(tf.global_variables_initializer())

/usr/local/lib/python3.5/dist-packages/tensorflow/python/training/optimizer.py in minimize(self, loss, global_step, var_list, gate_gradients, aggregation_method, colocate_gradients_with_ops, name, grad_loss)
    313         aggregation_method=aggregation_method,
    314         colocate_gradients_with_ops=colocate_gradients_with_ops,
--> 315         grad_loss=grad_loss)
    316 
    317     vars_with_grad = [v for g, v in grads_and_vars if g is not None]

/usr/local/lib/python3.5/dist-packages/tensorflow/python/training/optimizer.py in compute_gradients(self, loss, var_list, gate_gradients, aggregation_method, colocate_gradients_with_ops, grad_loss)
    384         gate_gradients=(gate_gradients == Optimizer.GATE_OP),
    385         aggregation_method=aggregation_method,
--> 386         colocate_gradients_with_ops=colocate_gradients_with_ops)
    387     if gate_gradients == Optimizer.GATE_GRAPH:
    388       grads = control_flow_ops.tuple(grads)

/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/gradients_impl.py in gradients(ys, xs, grad_ys, name, colocate_gradients_with_ops, gate_gradients, aggregation_method)
    514               raise LookupError(
    515                   "No gradient defined for operation '%s' (op type: %s)" %
--> 516                   (op.name, op.type))
    517         if loop_state:
    518           loop_state.EnterGradWhileContext(op, before=False)

LookupError: No gradient defined for operation 'mod' (op type: FloorMod)

Any ideas for a workaround? I need to use modulo as part of my loss function. I use it to convert some coordinates from global space to their positions relative to a particular grid cell (ask if you need a better explanation - I don't think it's particularly relevant).
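For context, the coordinate conversion described above amounts to a floor-division/modulo pair. A minimal pure-Python sketch of that idea (the function name and cell size are hypothetical, not from the question):

```python
import math

def to_cell_relative(coord, cell_size):
    """Split a global coordinate into (grid cell index, offset within that cell)."""
    cell = math.floor(coord / cell_size)  # which grid cell the coordinate falls in
    offset = coord % cell_size            # position relative to that cell's origin
    return cell, offset

print(to_cell_relative(7.5, 2.0))  # → (3, 1.5)
```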

Thanks!

1 answer:

Answer 0 (score: 0)

A gradient for FloorMod was added recently, but it is not included in TensorFlow 1.3.0. I have verified that your program runs fine if you build TensorFlow from the latest source.

Some discussion can be found here.
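If building from source is not an option, one possible workaround (my suggestion, not part of the answer above) is to rewrite the modulo in terms of ops that already have registered gradients, using the identity x mod y = x - floor(x / y) * y; since floor has zero gradient almost everywhere, the result behaves like the identity with respect to x. The identity itself, checked in pure Python against the `%` operator:

```python
import math

def floormod(x, y):
    # x mod y expressed via floor, which TF 1.3 can differentiate through:
    # x - floor(x / y) * y  ==  x % y  (for y != 0)
    return x - math.floor(x / y) * y

for x, y in [(7.5, 2.0), (-3.0, 2.0), (5.0, 3.0)]:
    assert math.isclose(floormod(x, y), x % y)
```

In the question's graph this would correspond to something like `loss = s - tf.floor(s / 2.0) * 2.0` (with `s = tf.reduce_sum(b)`) instead of `s % 2` - untested on 1.3.0, so treat it as a sketch.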