I am trying to understand TensorFlow, and I have been looking at an official example, the CIFAR-10 model.
In cifar10.py, inside inference(), you can see the following lines:
with tf.variable_scope('softmax_linear') as scope:
    weights = _variable_with_weight_decay('weights', [192, NUM_CLASSES],
                                          stddev=1/192.0, wd=0.0)
    biases = _variable_on_cpu('biases', [NUM_CLASSES],
                              tf.constant_initializer(0.0))
    softmax_linear = tf.add(tf.matmul(local4, weights), biases, name=scope.name)
    _activation_summary(softmax_linear)
scope.name should be softmax_linear, and that should be the node's name. I saved the graph proto with the following lines (this differs from the tutorial):
with tf.Graph().as_default():
    global_step = tf.Variable(0, trainable=False)
    # Get images and labels
    images, labels = cifar10.distorted_inputs()
    # Build a Graph that computes the logits predictions from the
    # inference model.
    logits = cifar10.inference(images)
    # Calculate loss.
    loss = cifar10.loss(logits, labels)
    # Build a Graph that trains the model with one batch of examples and
    # updates the model parameters.
    train_op = cifar10.train(loss, global_step)
    # Create a saver.
    saver = tf.train.Saver(tf.global_variables())
    # Build the summary operation based on the TF collection of Summaries.
    summary_op = tf.summary.merge_all()
    # Build an initialization operation to run below.
    init = tf.global_variables_initializer()
    # Start running operations on the Graph.
    sess = tf.Session(config=tf.ConfigProto(
        log_device_placement=FLAGS.log_device_placement))
    sess.run(init)
    # save the graph
    tf.train.write_graph(sess.graph_def, FLAGS.train_dir, 'model.pbtxt')
    ....
But I cannot find any node named softmax_linear in model.pbtxt. What am I doing wrong? All I want is to print out the node names so that I can export the graph.
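For reference, here is a minimal sketch of how I would expect to list the node names recorded in the GraphDef (this assumes the sess from the snippet above is still open; the 'softmax' filter is only for illustration):

# Minimal sketch: print the op names stored in the saved GraphDef.
# Assumes `sess` from the snippet above; drop the filter to list every node.
for node in sess.graph_def.node:
    if 'softmax' in node.name:
        print(node.name)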
Answer 0 (score: 1)
The op's name won't be "softmax_linear". tf.name_scope() prefixes the names of the ops created inside it with the scope name, separated by a /, and each op gets its own name. For example, if you write

with tf.name_scope("foo"):
    a = tf.constant(1, name="bar")

then the constant will have the name "foo/bar".
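Here is a quick self-contained sketch of that naming behavior (assuming a fresh TF1-style graph). Note that tf.variable_scope('softmax_linear') also opens a name scope, so the tf.add op from the question should show up in model.pbtxt as something like "softmax_linear/softmax_linear" rather than plain "softmax_linear":

import tensorflow as tf

with tf.name_scope("foo"):
    a = tf.constant(1, name="bar")

print(a.op.name)  # -> "foo/bar"    (the op name, as it appears in the GraphDef)
print(a.name)     # -> "foo/bar:0"  (the tensor name: op name plus output index)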
Hope that helps!