The subtle difference between variable_scope and name_scope in TensorFlow

Date: 2017-08-23 22:38:57

Tags: tensorflow

I have already read the existing pages on this question: "Difference between variable_scope and name_scope in TensorFlow" and "Difference between variable_ops_scope and variable_scope in tensorflow?"

But I still cannot fully understand the difference between them. I tried running the same code with tf.variable_scope and with tf.name_scope, and they produced identical graphs in TensorBoard.

Others have explained that the main difference lies in the names they generate in the graph, but why do those names matter? I have also read that variables with the same name can be reused. In what situations does reuse happen?

Thanks a lot

1 answer:

Answer 0 (score: 0)

The key is to understand the difference between variables and other tensors in the graph. Any newly created tensor gets a prefix from the enclosing name scope. tf.get_variable, however, looks up existing variables by a name without the name-scope modifier. Variables newly created via tf.get_variable can still have their names prefixed, but only by the enclosing variable scope, not by a name scope.
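This asymmetry can be shown with a minimal sketch. It is written here against the tf.compat.v1 API so it also runs under TensorFlow 2; in the 1.x-era script below, the same calls live directly under tf:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # A name scope prefixes ordinary ops, but tf.get_variable ignores it.
    with tf.name_scope('ns'):
        v = tf.get_variable('v', shape=(), dtype=tf.float32)
        c = tf.constant(1.0, name='c')
    # A variable scope prefixes both the variable and the ops created inside it.
    with tf.variable_scope('vs'):
        w = tf.get_variable('w', shape=(), dtype=tf.float32)
        d = tf.constant(1.0, name='d')

print(v.name)  # v:0     -- no 'ns/' prefix on the variable
print(c.name)  # ns/c:0  -- the op is prefixed
print(w.name)  # vs/w:0  -- variable scope does prefix the variable
print(d.name)  # vs/d:0
```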

The script below highlights these differences. The goal is to reproduce the simple function by refactoring the variable creation and the tf.matmul(x, A) + b line into a separate add_layer function.

import tensorflow as tf


def get_x():
    return tf.constant([[1., 2., 3.]], dtype=tf.float32)


def report(out1, out2):
    print(out1.name)
    print(out2.name)
    variables = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES)
    print([v.name for v in variables])


def simple():
    A = tf.get_variable(shape=(3, 3), dtype=tf.float32, name='A')
    b = tf.get_variable(shape=(3,), dtype=tf.float32, name='b')
    x = get_x()
    out1 = tf.matmul(x, A) + b
    out2 = tf.matmul(out1, A) + b
    return out1, out2


def add_layer(x):
    A = tf.get_variable(shape=(3, 3), dtype=tf.float32, name='A')
    b = tf.get_variable(shape=(3,), dtype=tf.float32, name='b')
    return tf.matmul(x, A) + b


def no_scoping():
    x = get_x()
    out1 = add_layer(x)
    out2 = add_layer(out1)
    return out1, out2


def different_name_scopes():
    x = get_x()
    with tf.name_scope('first_layer'):
        out1 = add_layer(x)
    with tf.name_scope('second_layer'):
        out2 = add_layer(out1)
    return out1, out2


def same_name_scope():
    x = get_x()
    with tf.name_scope('first_layer'):
        out1 = add_layer(x)
    with tf.name_scope('first_layer'):
        out2 = add_layer(out1)
    return out1, out2


def different_variable_scopes():
    x = get_x()
    with tf.variable_scope('first_layer'):
        out1 = add_layer(x)
    with tf.variable_scope('second_layer'):
        out2 = add_layer(out1)
    return out1, out2


def same_variable_scope():
    x = get_x()
    with tf.variable_scope('first_layer'):
        out1 = add_layer(x)
    with tf.variable_scope('first_layer'):
        out2 = add_layer(out1)
    return out1, out2


def same_variable_scope_reuse():
    x = get_x()
    with tf.variable_scope('first_layer'):
        out1 = add_layer(x)
    with tf.variable_scope('first_layer', reuse=True):
        out2 = add_layer(out1)
    return out1, out2


def test_fn(fn, name):
    graph = tf.Graph()
    with graph.as_default():
        try:
            print('****************')
            print(name)
            print('****************')
            out1, out2 = fn()
            report(out1, out2)
            print('----------------')
            print('SUCCESS')
            print('----------------')
        except Exception:
            print('----------------')
            print('FAILED')
            print('----------------')


for fn, name in [
        [simple, 'simple'],
        [no_scoping, 'no_scoping'],
        [different_name_scopes, 'different_name_scopes'],
        [same_name_scope, 'same_name_scope'],
        [different_variable_scopes, 'different_variable_scopes'],
        [same_variable_scope, 'same_variable_scope'],
        [same_variable_scope_reuse, 'same_variable_scope_reuse']
        ]:
    test_fn(fn, name)

Results:

****************
simple
****************
add:0
add_1:0
[u'A:0', u'b:0']
----------------
SUCCESS
----------------
****************
no_scoping
****************
----------------
FAILED
----------------
****************
different_name_scopes
****************
----------------
FAILED
----------------
****************
same_name_scope
****************
----------------
FAILED
----------------
****************
different_variable_scopes
****************
first_layer/add:0
second_layer/add:0
[u'first_layer/A:0', u'first_layer/b:0', u'second_layer/A:0', u'second_layer/b:0']
----------------
SUCCESS
----------------
****************
same_variable_scope
****************
----------------
FAILED
----------------
****************
same_variable_scope_reuse
****************
first_layer/add:0
first_layer_1/add:0
[u'first_layer/A:0', u'first_layer/b:0']
----------------
SUCCESS
----------------

Note that using different variable_scopes without reuse does not raise an error, but it creates multiple copies of A and b, which may be unintentional.
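If the intent is to share A and b rather than duplicate them, one option is reuse=tf.AUTO_REUSE, which creates a variable on first use and reuses it afterwards, so the second scope needs no explicit reuse=True. A sketch (again written with tf.compat.v1 so it runs on TensorFlow 2 as well):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()


def add_layer(x):
    # Same layer as in the script above: variables obtained via tf.get_variable.
    A = tf.get_variable(shape=(3, 3), dtype=tf.float32, name='A')
    b = tf.get_variable(shape=(3,), dtype=tf.float32, name='b')
    return tf.matmul(x, A) + b


graph = tf.Graph()
with graph.as_default():
    x = tf.constant([[1., 2., 3.]], dtype=tf.float32)
    # AUTO_REUSE: the first scope creates A and b, the second reuses them.
    with tf.variable_scope('first_layer', reuse=tf.AUTO_REUSE):
        out1 = add_layer(x)
    with tf.variable_scope('first_layer', reuse=tf.AUTO_REUSE):
        out2 = add_layer(out1)
    names = [v.name for v in tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES)]

print(names)  # a single copy: ['first_layer/A:0', 'first_layer/b:0']
```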