Here is the traceback:
Traceback (most recent call last):
  File "test.py", line 39, in <module>
    hess = tf.hessians(loss, wrt_variables)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/gradients_impl.py", line 970, in hessians
    _gradients = array_ops.unstack(_gradients)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/array_ops.py", line 952, in unstack
    value = ops.convert_to_tensor(value)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 639, in convert_to_tensor
    as_ref=False)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 704, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/constant_op.py", line 113, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/constant_op.py", line 102, in constant
    tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/tensor_util.py", line 360, in make_tensor_proto
    raise ValueError("None values not supported.")
ValueError: None values not supported.
The variables:
import tensorflow as tf
data_x = [0., 1., 2.]
data_y = [-1., 1., 3.]
batch_size = len(data_x)
x = tf.placeholder(shape=[batch_size], dtype=tf.float32, name="x")
y = tf.placeholder(shape=[batch_size], dtype=tf.float32, name="y")
W = tf.Variable(tf.ones(shape=[1]), dtype=tf.float32, name="W")
b = tf.Variable(tf.zeros(shape=[1]), dtype=tf.float32, name="b")
pred = x * W + b
loss = tf.reduce_mean(0.5 * (y - pred)**2)
Then, following up with this code works:
wrt_variables = [W, b]
hess = tf.hessians(loss, wrt_variables)
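For completeness, the working case can be evaluated along these lines (a minimal sketch; the session code is not part of the original snippet):

sess = tf.Session()
sess.run(tf.global_variables_initializer())
# hess is a list with one Hessian block per variable in wrt_variables,
# here two 1x1 matrices: one for W and one for b.
print(sess.run(hess, feed_dict={x: data_x, y: data_y}))
sess.close()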
But this fails:
wrt_variables = tf.concat([W, b], axis=0)
hess = tf.hessians(loss, wrt_variables)
This also fails:
wrt_variables = [tf.concat([W, b], axis=0)]
hess = tf.hessians(loss, wrt_variables)
A reshape operation fails too; the attempt is presumably something like the hypothetical reconstruction below.
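# Hypothetical reconstruction of the reshape variant (not shown in the question);
# it fails with the same "None values not supported" ValueError.
wrt_variables = tf.reshape(tf.concat([W, b], axis=0), [2])
hess = tf.hessians(loss, wrt_variables)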
The full version of this code, with comments, can be seen here: https://gist.github.com/guillaume-chevalier/6b01c4e43a123abf8db69fa97532993f
Thanks!
Answer (score: 1):
This is because, in your graph, the node loss does not depend on the node tf.concat([W, b], axis=0). Neither one backpropagates to the other, so there is no derivative between them.
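One possible way around this (a sketch of a workaround, not from the original answer) is to make the combined vector the actual variable, so that the loss genuinely is downstream of the node passed to tf.hessians:

# Hypothetical workaround sketch: store W and b together in a single [2]-vector.
params = tf.Variable(tf.concat([tf.ones([1]), tf.zeros([1])], axis=0), name="params")
pred2 = x * params[0] + params[1]
loss2 = tf.reduce_mean(0.5 * (y - pred2)**2)
hess2 = tf.hessians(loss2, params)  # works: hess2[0] is the full 2x2 Hessian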
TensorFlow is not a formal calculus engine; it can only estimate the derivative of one node with respect to another node if the former is downstream of the latter. So, for example, even tf.hessians(loss, 2*W) will fail for the same reason (2*W is a new node and loss does not depend on it), even though its relationship to tf.hessians(loss, W) is straightforward.
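To make the contrast concrete (using the same graph as in the question's snippet):

hess_ok = tf.hessians(loss, W)       # fine: loss is downstream of W
hess_bad = tf.hessians(loss, 2 * W)  # raises the same ValueError: loss never uses the node 2*W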
Note that the situation is the same with tf.gradients, even though it fails differently: it returns None values instead of raising an exception.
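A minimal illustration of that difference in failure modes (again assuming the question's graph):

grads = tf.gradients(loss, [tf.concat([W, b], axis=0)])
print(grads)  # [None]: the gradient is simply None, and no exception is raised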