I am trying to build a simple neural network using the reuse option, but I get a strange error and I don't understand where the problem is. Maybe I am not using mse correctly.
import tensorflow as tf

n_inputs = 8

x_ = tf.placeholder(tf.float32, [None, n_inputs])
l1 = tf.layers.dense(x_, 100, activation=tf.nn.relu, use_bias=True, name='l1', reuse=None)
l2 = tf.layers.dense(l1, 100, activation=tf.nn.relu, use_bias=True, name='l2', reuse=None)
l3 = tf.layers.dense(l2, 20, activation=tf.nn.relu, use_bias=True, name='l3', reuse=None)

y_ = tf.placeholder(tf.float32, [None, n_inputs])
w1 = tf.layers.dense(y_, 100, activation=tf.nn.relu, use_bias=True, name='l1', reuse=True)
w2 = tf.layers.dense(w1, 100, activation=tf.nn.relu, use_bias=True, name='l2', reuse=True)
w3 = tf.layers.dense(w2, 20, activation=tf.nn.relu, use_bias=True, name='l3', reuse=True)

z_ = tf.placeholder(tf.float32, [None, n_inputs])
u1 = tf.layers.dense(z_, 100, activation=tf.nn.relu, use_bias=True, name='l1', reuse=True)
u2 = tf.layers.dense(u1, 100, activation=tf.nn.relu, use_bias=True, name='l2', reuse=True)
u3 = tf.layers.dense(u2, 20, activation=tf.nn.relu, use_bias=True, name='l3', reuse=True)

mse1, _ = tf.metrics.mean_squared_error(l3, w3)
mse2, _ = tf.metrics.mean_squared_error(l3, u3)
cost = tf.subtract(mse1, mse2)

opts = tf.train.AdamOptimizer().minimize(cost)
sess = tf.InteractiveSession()
sess = tf.InteractiveSession()
ERROR:
ValueError                                Traceback (most recent call last)
<ipython-input-4-0e3679c2a898> in <module>()
----> 1 __pyfile = open('''/tmp/py3823Cbm''');exec(compile(__pyfile.read(), '''/home/lpuggini/mlp/scratch/Kerberos/flow_ui.py''', 'exec'));__pyfile.close()

/home/lpuggini/mlp/scratch/Kerberos/flow_ui.py in <module>()
     33 cost = tf.subtract(mse1, mse2)
     34
---> 35 opts = tf.train.AdamOptimizer().minimize(cost)
     36 sess = tf.InteractiveSession()
     37

/home/lpuggini/MyApps/scientific_python_2_7/lib/python2.7/site-packages/tensorflow/python/training/optimizer.pyc in minimize(self, loss, global_step, var_list, gate_gradients, aggregation_method, colocate_gradients_with_ops, name, grad_loss)
    320           "No gradients provided for any variable, check your graph for ops"
    321           " that do not support gradients, between variables %s and loss %s." %
--> 322           ([str(v) for _, v in grads_and_vars], loss))
    323
    324     return self.apply_gradients(grads_and_vars, global_step=global_step,

ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients, between variables ["<tf.Variable 'l1/kernel:0' shape=(8, 100) dtype=float32_ref>", "<tf.Variable 'l1/bias:0' shape=(100,) dtype=float32_ref>", "<tf.Variable 'l2/kernel:0' shape=(100, 100) dtype=float32_ref>", "<tf.Variable 'l2/bias:0' shape=(100,) dtype=float32_ref>", "<tf.Variable 'l3/kernel:0' shape=(100, 20) dtype=float32_ref>", "<tf.Variable 'l3/bias:0' shape=(20,) dtype=float32_ref>"] and loss Tensor("Sub:0", shape=(), dtype=float32).
Answer 0 (score: 2)
metrics are not losses. A metric records some statistic over time; differentiating through one makes no sense, which is why the optimizer finds no gradients. Besides the TF core documentation on metrics, there is also a nice write-up on the topic.
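The distinction can be sketched outside TensorFlow entirely. A loss is a pure, differentiable function of the current inputs, while a streaming metric accumulates state (a running total and count) across calls, and it is exactly those stateful update ops that gradients cannot flow through. A minimal illustration (plain NumPy, names are my own, not a TF API):

```python
import numpy as np

# A loss: a pure function of its inputs, so it can be differentiated.
def mse_loss(a, b):
    return np.mean((a - b) ** 2)

# A streaming metric: keeps a running total/count across batches, like
# tf.metrics.mean_squared_error. Its value depends on state updates
# (assign ops in TF), which is why backprop through it is impossible.
class StreamingMSE:
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def update(self, a, b):
        self.total += np.sum((a - b) ** 2)
        self.count += a.size
        return self.total / self.count  # running average over all batches seen
```

Note that after a second batch the metric's value no longer equals the loss of that batch; it mixes in every batch seen so far.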
What you want is https://www.tensorflow.org/api_docs/python/tf/losses, and more specifically https://www.tensorflow.org/api_docs/python/tf/losses/mean_squared_error.
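A hedged sketch of the fix applied to the question's graph: swap tf.metrics.mean_squared_error (a streaming metric, returns an (value, update_op) pair with no gradient) for tf.losses.mean_squared_error (an ordinary differentiable tensor). I write it through tf.compat.v1 so it also runs under TF 2.x, and hand-roll the dense layers with get_variable instead of the deprecated tf.layers.dense, purely for portability; the variable sharing mirrors the question's reuse= option.

```python
import numpy as np
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()
tf1.reset_default_graph()

n_inputs = 8

def dense_relu(x, units, name, reuse):
    # Shares 'kernel'/'bias' across calls when reuse=True, mirroring
    # tf.layers.dense(..., name=name, reuse=reuse) in the question.
    with tf1.variable_scope(name, reuse=reuse):
        w = tf1.get_variable('kernel', [int(x.shape[-1]), units], tf.float32)
        b = tf1.get_variable('bias', [units], tf.float32)
        return tf.nn.relu(tf.matmul(x, w) + b)

def network(inputs, reuse):
    h = dense_relu(inputs, 100, 'l1', reuse)
    h = dense_relu(h, 100, 'l2', reuse)
    return dense_relu(h, 20, 'l3', reuse)

x_ = tf1.placeholder(tf.float32, [None, n_inputs])
y_ = tf1.placeholder(tf.float32, [None, n_inputs])
z_ = tf1.placeholder(tf.float32, [None, n_inputs])

l3 = network(x_, reuse=None)   # creates the variables
w3 = network(y_, reuse=True)   # reuses them
u3 = network(z_, reuse=True)

# Losses are plain differentiable tensors, so minimize() can find gradients.
mse1 = tf1.losses.mean_squared_error(l3, w3)
mse2 = tf1.losses.mean_squared_error(l3, u3)
cost = tf.subtract(mse1, mse2)
opts = tf1.train.AdamOptimizer().minimize(cost)  # no ValueError this time

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    feed = {p: np.random.rand(4, n_inputs).astype(np.float32)
            for p in (x_, y_, z_)}
    _, c = sess.run([opts, cost], feed_dict=feed)
```

Note that cost = mse1 - mse2 can go arbitrarily negative, so whether subtracting two MSEs is a sensible objective is a separate question from the gradient error.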