How do I restore a model in TensorFlow without cluttering TensorBoard?

Asked: 2018-09-16 11:36:14

Tags: tensorflow

When I restore a model via tf.train.Saver, the loss curve on TensorBoard becomes very messy.

(screenshot: the TensorBoard loss curve after restoring)

It looks like the step counter restarts from iteration 0 after restoring. Is there a way to make this look better?
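A minimal sketch of the kind of training loop that produces this, assuming the summary step comes from a plain Python counter that restarts at 0 every time the script is relaunched (build_model, ckpt_path, logdir, and num_steps are hypothetical placeholders):

import tensorflow as tf

# hypothetical placeholders: build_model, ckpt_path, logdir, num_steps
loss, train_op, merged_op = build_model()
saver = tf.train.Saver()

with tf.Session() as sess:
    saver.restore(sess, ckpt_path)            # weights resume from the checkpoint...
    writer = tf.summary.FileWriter(logdir)
    for step in range(num_steps):             # ...but this counter restarts at 0,
        _, summary = sess.run([train_op, merged_op])
        writer.add_summary(summary, step)     # so new points overlap the old curve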

1 Answer:

Answer 0 (score: 2)

Use a variable (trainable=False) to keep track of the training step, and increment it as training progresses. Whenever you want to add a summary to TensorBoard, run that variable and pass the result as the global_step argument to add_summary. Because the counter is an ordinary variable, tf.train.Saver saves and restores it along with the model, so the step count picks up where it left off instead of restarting at 0.

For example:

import tensorflow as tf

# the variable mentioned before, counting training steps;
# trainable=False keeps it out of gradient updates, and as an ordinary
# variable it is saved and restored together with the rest of the model
train_steps = tf.get_variable('train_steps', shape=[], dtype=tf.int32,
                              initializer=tf.constant_initializer(0),
                              trainable=False)

# op that increments the counter by one
step_op = tf.assign(train_steps, train_steps + 1)

# increment train_steps whenever an optimization step is performed
with tf.control_dependencies([step_op]):
    opt_op = optimize(loss)  # optimize() stands in for your optimizer's minimize op

...
# fetch the counter under a different name so the Python name `train_steps`
# keeps referring to the tf.Variable
step_val, _, summary = sess.run([train_steps, opt_op, merged_op], feed_dict=feed_dict)
# write summary to tensorboard at the correct global step
writer.add_summary(summary, step_val)
# save model (the step counter is saved along with it)
saver.save(sess, filename)
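
When training resumes later, restoring the checkpoint brings back the train_steps counter together with the weights, so newly written summaries continue from the saved step instead of overlapping the old curve. A minimal sketch of the restore side, assuming the same filename as above:

# restore both the weights and the train_steps counter
saver.restore(sess, filename)

# the counter resumes from its saved value, so subsequent
# writer.add_summary(summary, step_val) calls continue the curve in TensorBoard
print(sess.run(train_steps))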