There is a lot of code out there that reuses a subset of a neural network's layers. I keep seeing the following pattern, for example here:
def discriminator(self, image, y=None, reuse=False):
    with tf.variable_scope("discriminator") as scope:
        if reuse:
            scope.reuse_variables()
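(For context, in typical GAN code that uses this pattern the method is called twice, once on real images and once on generated ones, with reuse=True the second time. Below is a minimal self-contained sketch of that call pattern that I put together for illustration, with a plain tf.get_variable standing in for the original conv layers:)

import tensorflow as tf

def discriminator(image, reuse=False):
    with tf.variable_scope("discriminator") as scope:
        if reuse:
            scope.reuse_variables()
        # tf.get_variable is what the reuse machinery actually acts on.
        w = tf.get_variable("w", shape=(784, 1))
        return tf.matmul(tf.reshape(image, (-1, 784)), w)

real = tf.placeholder(tf.float32, (None, 28, 28, 1))
fake = tf.placeholder(tf.float32, (None, 28, 28, 1))

d_real = discriminator(real)              # first call creates discriminator/w
d_fake = discriminator(fake, reuse=True)  # second call reuses discriminator/w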
Moreover, when a layer has a reuse parameter of its own, that flag is supposed to be passed through as well. But when I tested this, it did not work for me:
import tensorflow as tf

a = tf.placeholder(shape=(3, 10), dtype=tf.float32)

def func(a, reuse=False):
    with tf.variable_scope("discriminator") as scope:
        if reuse:
            scope.reuse_variables()
        b = tf.layers.dense(a, 10, name='dense1', reuse=reuse)
        print(b)
        return b

print(tf.__version__)
b1 = func(a)
b2 = func(a, reuse=True)
Output:
1.10.1
Tensor("discriminator/dense1/BiasAdd:0", shape=(3, 10), dtype=float32)
Tensor("discriminator_1/dense1/BiasAdd:0", shape=(3, 10), dtype=float32)
TensorFlow simply created a different scope! Putting the flag on the scope itself,
with tf.variable_scope("discriminator", reuse=reuse) as scope:
does not help either.
After reading this, I managed to get one level down:
import tensorflow as tf

a = tf.placeholder(shape=(3, 10), dtype=tf.float32)

def func(a, reuse=False):
    with tf.variable_scope('discriminator/', reuse=reuse) as scope:
        if reuse:
            scope.reuse_variables()
        b = tf.layers.dense(a, 10, name='dense1', reuse=reuse)
        print(b)
        return b

print(tf.__version__)
b1 = func(a)
b2 = func(a, reuse=True)
1.10.1
Tensor("discriminator/dense1/BiasAdd:0", shape=(3, 10), dtype=float32)
Tensor("discriminator/dense1_1/BiasAdd:0", shape=(3, 10), dtype=float32)
Nothing else I tried works either. For example,
import tensorflow as tf

a = tf.placeholder(shape=(3, 10), dtype=tf.float32)

def func(a, reuse=False):
    if reuse:
        s = 'discriminator/'
    else:
        s = 'discriminator'
    with tf.variable_scope(s, reuse=reuse) as scope:
        if reuse:
            scope.reuse_variables()
        b = tf.layers.dense(a, 10, name='dense1', reuse=reuse)
        print(b)
        return b

print(tf.__version__)
b1 = func(a)
b2 = func(a, reuse=True)
This outputs:

Variable discriminator//dense1/kernel does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=tf.AUTO_REUSE in VarScope?
The same goes for this code:
import tensorflow as tf

a = tf.placeholder(shape=(3, 10), dtype=tf.float32)
s = None

def func(a, reuse=False):
    global s
    if reuse:
        sc = s.original_name_scope
    else:
        sc = 'discriminator'
    with tf.variable_scope(sc) as scope:
        s = scope
        b = tf.layers.dense(a, 10, name='dense1', reuse=reuse)
        print(b)
        return b

print(tf.__version__)
b1 = func(a)
b2 = func(a, reuse=True)
Tested with TensorFlow 1.10 and 1.8. Does this mean that a lot of the code on GitHub no longer works?
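For completeness, I also looked at the reuse=tf.AUTO_REUSE hinted at by the error message. In the sketch below (my own variant, not taken from any of the repositories above) it does share the variables without an explicit reuse flag, but as far as I can tell the ops of the second call still end up under discriminator_1/..., so the tensor names look just like in my first attempt:

import tensorflow as tf

a = tf.placeholder(shape=(3, 10), dtype=tf.float32)

def func(a):
    # AUTO_REUSE creates the variables on the first call and reuses them afterwards,
    # so no reuse flag needs to be threaded through the function.
    with tf.variable_scope('discriminator', reuse=tf.AUTO_REUSE):
        return tf.layers.dense(a, 10, name='dense1')

b1 = func(a)
b2 = func(a)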
Answer 0 (score: 0)
Answering my own question. According to {{3}}, the two calls really do share the same weights; only the tensors in the graph are separate:
import tensorflow as tf

a = tf.placeholder(shape=(3, 10), dtype=tf.float32)

def func(a, reuse=False):
    with tf.variable_scope('discriminator', reuse=reuse) as scope:
        b = tf.layers.dense(a, 10, name='dense1', reuse=reuse)
        print(b)
        return b

print(tf.__version__)
b1 = func(a)
b2 = func(a, reuse=True)
print([x.name for x in tf.global_variables()])
1.10.1
Tensor("discriminator/dense1/BiasAdd:0", shape=(3, 10), dtype=float32)
Tensor("discriminator_1/dense1/BiasAdd:0", shape=(3, 10), dtype=float32)
['discriminator/dense1/kernel:0', 'discriminator/dense1/bias:0']
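As a quick sanity check I added on top of the linked answer: feed the same input through both tensors and compare the results; identical outputs mean both dense ops read the same kernel and bias, even though their op names differ.

import numpy as np

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x = np.random.rand(3, 10).astype(np.float32)
    out1, out2 = sess.run([b1, b2], feed_dict={a: x})
    # Identical results => the two BiasAdd tensors are backed by the same variables.
    print(np.allclose(out1, out2))  # True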