My network wasn't training properly, so I tried removing one part of the loss function. But TensorFlow does not simply ignore the parts that are no longer needed; it still tries to compute gradients for them, and that breaks the code.
How can I effectively remove parts of the loss calculation without deleting the corresponding parts of the code entirely?
Here is roughly what it looks like:
variable_scope(image_generation)
    convolutions
    upconvolutions with output
variable_scope(pose_estimation)
    convolutions with output

loss a = similarity between input and image_generation output
loss b = other factors between input and image_generation output
loss c = calculated from pose_estimation output

total_loss = a + b + c  ->  changed to total_loss = a
opt_step = tf.train.AdamOptimizer(learning_rate)
loss = model.total_loss
grads = opt_step.compute_gradients(loss)
`grads` still contains entries for the parts of the graph that are no longer relevant to `total_loss`.
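One way this is commonly handled (a minimal sketch, not your actual model: the scope names mirror the pseudocode above, and the losses are stand-ins) is to pass `var_list` to `compute_gradients` so that gradients are only computed for the variables the remaining loss actually depends on:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Hypothetical stand-ins for the two sub-networks described above.
with tf.variable_scope("image_generation"):
    w_gen = tf.get_variable("w", shape=[2], initializer=tf.ones_initializer())
with tf.variable_scope("pose_estimation"):
    w_pose = tf.get_variable("w", shape=[2], initializer=tf.ones_initializer())

loss_a = tf.reduce_sum(tf.square(w_gen))   # depends only on image_generation
loss_c = tf.reduce_sum(tf.square(w_pose))  # depends only on pose_estimation
total_loss = loss_a                        # losses b and c dropped

opt_step = tf.train.AdamOptimizer(0.001)

# Restrict gradient computation to the variables under the scope
# that the remaining loss actually uses:
gen_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                             scope="image_generation")
grads = opt_step.compute_gradients(total_loss, var_list=gen_vars)
train_op = opt_step.apply_gradients(grads)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
```

With `var_list` set, `grads` only contains (gradient, variable) pairs for `image_generation`; the `pose_estimation` variables never appear, so no gradient ops are built for them. Without `var_list`, `compute_gradients` defaults to all trainable variables and returns `None` gradients for the unused ones.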