TypeError: Variable to save is not a Variable

Asked: 2017-01-06 04:21:12

Tags: python session tensorflow distributed-computing

I have the following code that does some simple arithmetic in distributed TensorFlow. A minimal reproducible example:

import tensorflow as tf

global_step_tensor = tf.Variable(10, trainable=False, name='global_step')

cluster = tf.train.ClusterSpec({"local": ["localhost:2222", "localhost:2223","localhost:2224", "localhost:2225"]})
x = tf.constant(2)

with tf.device("/job:local/task:0"):
    y = x + 300

model = tf.global_variables_initializer()

saver = tf.train.Saver([y])

ChiefSessionCreator = tf.train.ChiefSessionCreator(
    scaffold=None, master='grpc://localhost:2222', config=None,
    checkpoint_dir='/home/chaitanya/tensorflow/codes/checkpoints')
saver_hook = tf.train.CheckpointSaverHook(
    checkpoint_dir='/home/chaitanya/tensorflow/codes/checkpoints',
    save_secs=10, save_steps=None, saver=y,
    checkpoint_basename='model.ckpt', scaffold=None)
summary_hook = tf.train.SummarySaverHook(
    save_steps=None, save_secs=10,
    output_dir='/home/chaitanya/tensorflow/codes/savepoints',
    summary_writer=None, scaffold=None, summary_op=y)

with tf.train.MonitoredTrainingSession(
        master='grpc://localhost:2222', is_chief=True,
        checkpoint_dir='/home/chaitanya/tensorflow/codes/checkpoints',
        scaffold=None, hooks=[saver_hook, summary_hook], chief_only_hooks=None,
        save_checkpoint_secs=10, save_summaries_steps=None, config=None) as sess:

    while not sess.should_stop():
        sess.run(model)

    while not sess.should_stop():
        result = sess.run(y)
        print(result)

Here is the error:

Traceback (most recent call last):
  File "add_1.py", line 13, in <module>
    saver = tf.train.Saver([y])
    raise TypeError("Variable to save is not a Variable: %s" % var)
TypeError: Variable to save is not a Variable: Tensor("add_3:0", shape=(), dtype=int32, device=/job:local/task:3)

Please help me figure out the correct way to use this functionality.

1 Answer:

Answer 0 (score: 1)

When you just write x + 300, you are not creating a tf.Variable; the result is an ordinary Tensor, which tf.train.Saver cannot save. You need to explicitly use tf.get_variable() or tf.Variable() to create a variable that can be saved:

y = tf.Variable(x + 300)