Changing the gradient estimation in the graph

Date: 2017-10-30 17:11:32

Tags: tensorflow

I am trying to estimate the forward pass and the backward gradient of the function below:

  def func(img_batch, X1, X2):
    L = 1
    A1 = X1 * L**2
    A2 = X2 * L**2
    AA1 = A1 * A1
    AA2 = A2 * A2
    A1A2 = A1 * A2
    v = tf.nn.conv2d(img_batch, A1A2, strides=[1, 1, 1, 1], padding='SAME')
    v = v + AA1 + AA2
    return v

When I add this function to the network, gradients are by default computed for every operation inside it.

How can I use this function, computing it in the forward pass, while ignoring the gradients of every operation inside it and instead providing a different gradient estimate that is added to the model's main gradient?

1 Answer:

Answer 0 (score: 1)

You can use py_func to ignore the gradients of the operations in this function, and gradient_override_map to provide a custom gradient. Here is an example:

import tensorflow as tf

def myfunc(X1, X2):
  # Forward pass, executed as a plain Python function by py_func;
  # none of these operations appear in the TensorFlow graph.
  L = 1
  A1 = X1 * L**2
  A2 = X2 * L**2
  AA1 = A1 * A1
  AA2 = A2 * A2
  A1A2 = A1 * A2
  ...
  v = AA1 + AA2 + A1A2
  return v


@tf.RegisterGradient("GradMyfunc")
def grad_myfunc(op, grad):
  # Custom gradient: return one tensor per input of the py_func op,
  # in the same order as the inputs.
  X1 = op.inputs[0]
  X2 = op.inputs[1]
  return [grad * X2, grad * X1]


X1 = tf.Variable(tf.constant(1.1, dtype=tf.float64))
X2 = tf.Variable(tf.constant(2.2, dtype=tf.float64))
g = tf.get_default_graph()
# Map the "PyFunc" op type to the custom gradient registered above.
with g.gradient_override_map({"PyFunc": "GradMyfunc"}):
  y = tf.py_func(myfunc, [X1, X2], [tf.float64])


with tf.Session() as sess:
  grad = tf.gradients(y, [X1, X2])  # uses GradMyfunc, not the ops inside myfunc
  sess.run(tf.global_variables_initializer())
  print(sess.run(y))
  print(sess.run(grad))
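
For the original func(img_batch, X1, X2), the same pattern could look roughly like the sketch below. This is only a sketch under assumptions: the placeholder shape, the numpy stand-in for tf.nn.conv2d (py_func hands the wrapped function numpy arrays, so the real convolution would have to be written in numpy/scipy), and the gradient expressions returned by grad_func are all placeholders to be replaced with your own estimates.

import numpy as np
import tensorflow as tf

def func_forward(img_batch, X1, X2):
  # Runs as plain Python on numpy arrays; replace the marked line with
  # the real numpy computation standing in for tf.nn.conv2d.
  L = 1
  A1 = X1 * L**2
  A2 = X2 * L**2
  AA1 = A1 * A1
  AA2 = A2 * A2
  A1A2 = A1 * A2
  v = img_batch * A1A2 + AA1 + AA2   # placeholder for the convolution
  return v.astype(np.float64)

@tf.RegisterGradient("GradFunc")
def grad_func(op, grad):
  img_batch, X1, X2 = op.inputs
  # Assumed gradient estimates: one tensor per input, in the same order.
  return [tf.zeros_like(img_batch), grad * X2, grad * X1]

img_batch = tf.placeholder(tf.float64, shape=[None, 8, 8, 1])  # assumed shape
X1 = tf.Variable(tf.constant(1.1, dtype=tf.float64))
X2 = tf.Variable(tf.constant(2.2, dtype=tf.float64))

g = tf.get_default_graph()
with g.gradient_override_map({"PyFunc": "GradFunc"}):
  v = tf.py_func(func_forward, [img_batch, X1, X2], tf.float64)

As in the example above, the list returned from grad_func must contain exactly one gradient per input passed to py_func, in the same order as those inputs.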