Suppose I have two models, foo and bar, and assume bar is pretrained and loaded. I want to define the cost function of foo roughly as sketched in the code below (it is actually an autoencoder). Note that this is a minimal example to reproduce my problem, so the models make no mathematical sense.
import tensorflow as tf

def foo(X):
    with tf.variable_scope("foo"):
        A = tf.get_variable("A", shape=[1])
        return tf.add(X, A)

def bar(X):
    with tf.variable_scope("bar"):
        B = tf.get_variable("B", shape=[1])
        return tf.multiply(X, B)

X = tf.placeholder("float")
X_prime = foo(X)
Y = bar(X)
tf.get_variable_scope().reuse_variables()
Y_prime = bar(X_prime)
# foo(X) is manipulated with some other terms, but the point is foo is called again
cost = foo(X) + tf.pow(Y - Y_prime, 2)
optimizer = tf.train.AdamOptimizer(learning_rate=0.01).minimize(cost)
If I run the script (TF version 1.0), I get the following error:
ValueError: Variable foo/A/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
However, this does not happen with GradientDescentOptimizer. Any explanation and pointers would be greatly appreciated.
Answer 0 (score: 0)
Your ValueError is caused by creating new variables while variable_scope.reuse == True.

Adam creates variables when you call its minimize function, in order to store the momentum of each trainable variable in your graph. (GradientDescentOptimizer keeps no such per-variable state, which is why it does not hit this error.)

You called reuse_variables(), so the default variable scope now has variable_scope.reuse == True. Once reuse is set to True on a scope, it can never be switched back to False for the lifetime of that scope. Adam therefore tries to create its variables while reuse == True, which raises the error.
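To see the scope mechanics in isolation, here is a minimal sketch of my own (TF 1.x, not from the question): with reuse == True, tf.get_variable can only look up existing variables, never create new ones.

import tensorflow as tf

with tf.variable_scope("demo") as scope:
    v = tf.get_variable("v", shape=[1])    # reuse is False here: "demo/v" is created
    scope.reuse_variables()                # from now on, reuse == True inside "demo"
    v2 = tf.get_variable("v", shape=[1])   # OK: looks up the existing "demo/v"
    # tf.get_variable("w", shape=[1])      # would raise ValueError: Variable demo/w does not exist

This is exactly the situation Adam ends up in: it tries to create fresh slot variables such as foo/A/Adam after reuse has been switched on.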
The solution is to open a sub-scope under the graph's default scope and set variable_scope.reuse = True only inside it. The default scope's reuse then stays False, and Adam.minimize will work, as shown below:
import tensorflow as tf

def foo(X):
    with tf.variable_scope("foo"):
        A = tf.get_variable("A", shape=[1])
        return tf.add(X, A)

def bar(X):
    with tf.variable_scope("bar"):
        B = tf.get_variable("B", shape=[1])
        return tf.multiply(X, B)

X = tf.placeholder("float")

with tf.variable_scope("for_reuse_scope"):
    X_prime = foo(X)
    Y = bar(X)
    tf.get_variable_scope().reuse_variables()  # only "for_reuse_scope" gets reuse == True
    Y_prime = bar(X_prime)
    # foo(X) is manipulated with some other terms, but the point is foo is called again
    cost = foo(X) + tf.pow(Y - Y_prime, 2)

# Back in the default scope: reuse is still False here, so Adam can create its slot variables.
optimizer = tf.train.AdamOptimizer(learning_rate=0.01).minimize(cost)
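As an aside (not part of the answer above), another common TF 1.x pattern avoids flipping the reuse flag on the enclosing scope altogether: pass reuse explicitly to the scope of each call. A minimal sketch, where the reuse keyword arguments are my own addition:

import tensorflow as tf

def foo(X, reuse=None):
    with tf.variable_scope("foo", reuse=reuse):
        A = tf.get_variable("A", shape=[1])
        return tf.add(X, A)

def bar(X, reuse=None):
    with tf.variable_scope("bar", reuse=reuse):
        B = tf.get_variable("B", shape=[1])
        return tf.multiply(X, B)

X = tf.placeholder("float")
X_prime = foo(X)                     # creates foo/A
Y = bar(X)                           # creates bar/B
Y_prime = bar(X_prime, reuse=True)   # reuses bar/B
cost = foo(X, reuse=True) + tf.pow(Y - Y_prime, 2)  # reuses foo/A

# The default scope's reuse flag was never touched, so Adam's slot
# variables ("foo/A/Adam", etc.) can still be created here.
optimizer = tf.train.AdamOptimizer(learning_rate=0.01).minimize(cost)

Because reuse is scoped to each individual tf.variable_scope call, no scope in the graph is left permanently stuck at reuse == True.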