I'm using TensorFlow and running into a problem with this code:
def process_tree_tf(matrix, weights, idxs, name=None):
    with tf.name_scope(name, "process_tree", [matrix, weights, idxs]) as scope:
        loop_index = tf.sub(tf.shape(matrix)[0], 1)
        loop_vars = loop_index, matrix, idxs, weights
        def loop_condition(loop_idx, *_):
            return tf.greater(loop_idx, 0)
        def loop_body(loop_idx, mat, idxs, weights):
            x = mat[loop_idx]
            w = weights
            bias = tf.Variable(tf.constant(0.1, shape=[2], dtype=tf.float64)) # Here?
            ...
            return loop_idx - 1, mat, idxs, weights
        return tf.while_loop(loop_condition, loop_body, loop_vars, name=scope)[1]
I'm evaluating the function like this:
height = 2
width = 2
nodes = 4
matrix = np.ones((nodes, width + height))
weights = np.ones((width + height, width)) / 100
idxs = [0, 0, 1, 2]
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer()) # Error Here!
    r = process_tree_tf(matrix, weights, idxs)
    print(r.eval())
I get this error:
InvalidArgumentError: The node 'process_tree_tf/Variable/Assign' has inputs from different frames. The input 'process_tree_tf/Const_1' is in frame 'process_tree_tf/process_tree_tf/'. The input 'process_tree_tf/Variable' is in frame ''.
Strangely, if I restart the kernel in my Jupyter notebook and run it again, I get this error instead:
FailedPreconditionError (see above for traceback): Attempting to use uninitialized value bias
[[Node: bias/read = Identity[T=DT_FLOAT, _class=["loc:@bias"], _device="/job:localhost/replica:0/task:0/cpu:0"]]]
I tried using this instead:
bias = tf.get_variable("bias", shape=[2], initializer=tf.constant_initializer(0.1))
but that didn't work either.
I apologize if I'm missing something obvious, but I'd really appreciate it if someone could tell me where I'm going wrong.
Thanks very much!
Answer 0 (score: 8)
This is actually a subtle issue with tf.Variable objects inside a tf.while_loop(). TensorFlow gets confused because the tf.constant() you use to initialize the variable appears to be created inside the loop (even though it is clearly loop-invariant), whereas all variables are created outside the loop. The simplest solution is to move the creation of the variable outside the loop:
def process_tree_tf(matrix, weights, idxs, name=None):
    with tf.name_scope(name, "process_tree", [matrix, weights, idxs]) as scope:
        loop_index = tf.sub(tf.shape(matrix)[0], 1)
        loop_vars = loop_index, matrix, idxs, weights
        # Define the bias variable outside the loop to avoid problems.
        bias = tf.Variable(tf.constant(0.1, shape=[2], dtype=tf.float64))
        def loop_condition(loop_idx, *_):
            return tf.greater(loop_idx, 0)
        def loop_body(loop_idx, mat, idxs, weights):
            x = mat[loop_idx]
            w = weights
            # You can still refer to `bias` in here, and the loop body
            # will capture it appropriately.
            ...
            return loop_idx - 1, mat, idxs, weights
        return tf.while_loop(loop_condition, loop_body, loop_vars, name=scope)[1]
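As a side note, tf.global_variables_initializer() only initializes variables that already exist in the graph at the point the op is created, so the graph should be built before the initializer is run. A minimal sketch of driving the fixed function this way (assuming the elided ... in the loop body has been filled in):

import numpy as np
import tensorflow as tf

matrix = np.ones((4, 4))
weights = np.ones((4, 2)) / 100
idxs = [0, 0, 1, 2]

with tf.Session() as sess:
    r = process_tree_tf(matrix, weights, idxs)    # builds the graph and creates the bias variable
    sess.run(tf.global_variables_initializer())   # the init op now covers that variable
    print(sess.run(r))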
(Another possible solution is to use tf.constant_initializer() rather than tf.constant() when creating the variable.)
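For reference, a rough sketch of that alternative (the name "bias" here is just illustrative); the initial value then comes from the variable's initializer function rather than from a separate tf.constant() op:

# Create the variable with a constant initializer instead of an explicit tf.constant() tensor.
bias = tf.get_variable("bias", shape=[2], dtype=tf.float64,
                       initializer=tf.constant_initializer(0.1))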
Answer 1 (score: 0)
You can initialize biases inside loop_body, like this:
def loop_body(loop_idx, mat, idxs, weights):
    x = mat[loop_idx]
    w = weights
    bias = tf.get_variable("bias", dtype=tf.float64, shape=[2],
                           initializer=tf.constant_initializer(value=np.array([0.1, 0.1]),
                                                               dtype=tf.float64))
You said you had already tried tf.get_variable with tf.constant_initializer, but I wonder whether you found another solution?