Adding reuse for trainable variables in TensorFlow

Asked: 2018-11-19 11:48:30

Tags: python tensorflow

I am new to TensorFlow, but I am trying to run an old script that was uploaded to GitHub. At the moment I am stuck because TensorFlow is trying to create a variable that already exists, and I don't know where to add reuse. Any ideas/suggestions would be very helpful! Thanks!

Code:

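# group the trainable variables by name prefix: encoder (q), generator (g), discriminator (d)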
self.t_vars = tf.trainable_variables()

self.q_vars = [var for var in self.t_vars if (self.model_name+'_q_') in var.name]
self.g_vars = [var for var in self.t_vars if (self.model_name+'_g_') in var.name]
self.d_vars = [var for var in self.t_vars if (self.model_name+'_d_') in var.name]
self.both_vars = self.q_vars+self.g_vars
#self.vae_vars = self.q_vars # in this version, g_vars don't concern vae_loss

# clip gradients
d_opt_real_grads, _ = tf.clip_by_global_norm(tf.gradients(self.d_loss_real, self.d_vars), self.grad_clip)
d_opt_grads, _ = tf.clip_by_global_norm(tf.gradients(self.d_loss, self.d_vars), self.grad_clip)
g_opt_grads, _ = tf.clip_by_global_norm(tf.gradients(self.balanced_loss, self.both_vars), self.grad_clip)
vae_opt_grads, _ = tf.clip_by_global_norm(tf.gradients(self.vae_loss, self.q_vars), self.grad_clip)

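# one Adam optimizer per loss, each applied only to the variables that loss depends on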
d_real_optimizer = tf.train.AdamOptimizer(self.learning_rate_d, beta1=self.beta1)
d_optimizer = tf.train.AdamOptimizer(self.learning_rate_d, beta1=self.beta1)
g_optimizer = tf.train.AdamOptimizer(self.learning_rate_g, beta1=self.beta1)
vae_optimizer = tf.train.AdamOptimizer(self.learning_rate_vae, beta1=self.beta1)

self.d_opt_real = d_real_optimizer.apply_gradients(zip(d_opt_real_grads, self.d_vars))
self.d_opt = d_optimizer.apply_gradients(zip(d_opt_grads, self.d_vars))
self.g_opt = g_optimizer.apply_gradients(zip(g_opt_grads, self.both_vars))
self.vae_opt = vae_optimizer.apply_gradients(zip(vae_opt_grads, self.q_vars))

Error: ValueError: Variable cppnvae_d_h0_conv/w/Adam/ already exists, disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?
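
From what I have read, this error usually appears when the graph-building code that defines cppnvae_d_h0_conv/w runs more than once in the same default graph (for example, constructing the model twice in one notebook session). Below is a rough sketch of the two workarounds I am considering; build_model() and model are hypothetical stand-ins for whatever call in the script actually defines the *_q_ / *_g_ / *_d_ layers, not names from the original repository:

import tensorflow as tf

# Option 1 (assumption): start from a clean graph before rebuilding the model,
# so no variable name can collide with a previous construction.
tf.reset_default_graph()

# Option 2 (assumption): open the current variable scope with reuse=tf.AUTO_REUSE,
# so tf.get_variable() returns existing variables instead of raising.
with tf.variable_scope(tf.get_variable_scope(), reuse=tf.AUTO_REUSE):
    model.build_model()  # hypothetical: the call that creates the conv/deconv variables

I am not sure which of these is the right place to add reuse for this particular script.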

0 Answers:

No answers yet.