I am training an RNN followed by a few fully connected layers. I am trying to inspect the value of each kernel, but I don't understand the 'OptimizeLoss/...' part. For each variable name there are two values: 'OptimizeLoss/id_classifyer/dense/kernel/RMSProp:0' and 'OptimizeLoss/id_classifyer/dense/kernel/RMSProp_1:0'. Can someone explain why there are two instances of the same variable? (See the attached image for clarity.) Thanks in advance!
def classifier(self):
    with tf.variable_scope('id_classifyer', reuse=tf.AUTO_REUSE):
        self.id_logits = self.dense_layer(self.combined_feats, units=64, use_bias=True, activation=tf.nn.relu)
        self.id_logits = self.dense_layer(self.id_logits, units=self.out_units, use_bias=True, activation=None)
        self.id_scores = tf.nn.softmax(self.id_logits, name='softmax_id_scores')
        self.id_loss = tf.losses.softmax_cross_entropy(self.labels, self.id_logits, scope='softmax_id_loss')
The relevant part of the optimizer is as follows:
def optimizer(self):
    optimizer = tf.train.RMSPropOptimizer(self.adp_lr, name='RMSProp')
    self.train_op = tf.contrib.layers.optimize_loss(loss=self.id_loss, global_step=self.global_step, learning_rate=None, optimizer=optimizer, clip_gradients=config.grad_clip, learning_rate_decay_fn=None)
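For context, here is a minimal standalone sketch of how one might list the optimizer's slot variables to see what those extra names are. It assumes TensorFlow 1.x-style graph execution (via `tf.compat.v1` on newer installs); the variable name `kernel` here is illustrative, not from the model above:

```python
import tensorflow as tf

# Assumes TF 1.x graph-mode APIs, accessed through tf.compat.v1.
tf.compat.v1.disable_eager_execution()

# A toy variable standing in for a dense layer's kernel (name is hypothetical).
w = tf.compat.v1.get_variable("kernel", shape=[2, 2])
loss = tf.reduce_sum(tf.square(w))

opt = tf.compat.v1.train.RMSPropOptimizer(0.01, name="RMSProp")
train_op = opt.minimize(loss)  # building the train op also creates the slots

# RMSProp keeps per-variable accumulator ("slot") variables; each slot
# variable is named after the optimizer, so the second one created for the
# same primary variable gets a "_1" suffix.
for slot_name in opt.get_slot_names():
    slot = opt.get_slot(w, slot_name)
    print(slot_name, "->", slot.name)
```

Under these assumptions, RMSProp maintains one slot for the moving average of squared gradients and one for momentum, which would explain seeing both `.../kernel/RMSProp:0` and `.../kernel/RMSProp_1:0` for every trainable variable.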