I want to run multiple forward passes in TensorFlow but only a single backward pass. Is that possible?
For example: run five forward passes, and perform the backward pass only once some condition for stepping backward has been met. Pseudocode:
for i in range(5):
    forward_pass()
    if (i + 1) % 5 == 0:   # backward step after every fifth forward pass
        backward_pass()
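To make the intent concrete, here is a minimal NumPy sketch of that accumulate-then-step idea (the toy linear model, learning rate, and all names are my own illustration, not the TensorFlow graph below):

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(3)               # toy linear model: pred = X @ w
grad_acc = np.zeros_like(w)   # gradient accumulated across forward passes
lr = 0.1

for i in range(5):
    X = rng.normal(size=(4, 3))            # a fresh mini-batch
    y = X @ np.array([1.0, -2.0, 0.5])     # synthetic targets
    pred = X @ w                           # forward pass
    err = pred - y
    grad_acc += X.T @ err / len(X)         # accumulate the MSE gradient
    if (i + 1) % 5 == 0:                   # backward step every 5th pass
        w -= lr * grad_acc / 5             # apply the averaged gradient once
        grad_acc[:] = 0.0                  # reset the accumulator
```

The key point is that the gradient is computed against the same parameters on every pass and only *applied* once.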
I tried to do it this way:
import numpy as np
import tensorflow as tf

batch_t = np.array(...)
batch_l = np.array(...)

x = tf.placeholder(tf.float32, (None, EVENT_SIZE))

# Hidden layer 1
W1 = tf.Variable(tf.random_normal([FEATURE_NO, HIDDEN_1]))
b1 = tf.Variable(tf.ones([HIDDEN_1]))
z1 = tf.add(tf.matmul(x, W1), b1)
a1 = tf.nn.relu(z1)

# Forward pass: evaluate the activations and pull them out as a NumPy array
last = sess.run(a1, feed_dict={x: batch_t})

if SOME_CONDITION:
    # Feed the previously computed activations back in through a new placeholder
    target = tf.placeholder(tf.int32)
    forward = tf.placeholder(tf.float32)
    probs = classifier(logits=forward, labels=target)
    loss = tf.reduce_mean(probs)
    train_op = optimizer.minimize(loss)
    sess.run(train_op, feed_dict={forward: last, target: batch_l})
However, I have a problem: the graph is cut at the `sess.run(a1, ...)` line, so I cannot connect the classifier-and-optimizer part of the "model" to the variables in the first part.
I get this error message:
ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients, between variables ["<tf.Variable 'Variable:0' shape=(10000, 10) dtype=float32_ref>", "<tf.Variable 'Variable_1:0' shape=(10,) dtype=float32_ref>", "<tf.Variable 'Variable_2:0' shape=(10, 15) dtype=float32_ref>", "<tf.Variable 'Variable_3:0' shape=(15,) dtype=float32_ref>", "<tf.Variable 'Variable_4:0' shape=(15, 20) dtype=float32_ref>", "<tf.Variable 'Variable_5:0' shape=(20,) dtype=float32_ref>", "<tf.Variable 'Variable_6:0' shape=(20, 15) dtype=float32_ref>", "<tf.Variable 'Variable_7:0' shape=(15,) dtype=float32_ref>", "<tf.Variable 'Variable_8:0' shape=(15, 10) dtype=float32_ref>", "<tf.Variable 'Variable_9:0' shape=(10,) dtype=float32_ref>"] and loss Tensor("Mean:0", shape=(), dtype=float32).
I would be grateful if anyone could solve this in either the first or the second way. Thanks.