When I run a TensorFlow session for the tf.metrics.mean_per_class_accuracy op of my RNN, my code raises a FailedPreconditionError, like:
FailedPreconditionError: Attempting to use uninitialized value mean_accuracy/count [[Node: mean_accuracy/count/read = Identity[T=DT_FLOAT, _class=["loc:@mean_accuracy/count"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](mean_accuracy/count)]]
I wonder whether I need to initialize the tensors again before evaluating the mean-accuracy op, but I worry that re-initializing would reset the weight and bias tensors away from their trained values.
My code is as follows:
import tensorflow as tf
import numpy as np
x_raw = np.random.rand(10,3,3)
y_raw = np.random.randint(0,high=2,size=10)
x_test = np.random.rand(1,3,3)
y_test = np.random.randint(0,high=2,size=1)
tf.reset_default_graph()
x = tf.placeholder(tf.float32, shape=[None, 3, 3])
y = tf.placeholder(tf.float32, shape=[None, 2])
def rnn(x):
    l = {'w': tf.Variable(tf.random_normal([5, 2])),
         'b': tf.Variable(tf.random_normal([2]))}
    x = tf.transpose(x, (1, 0, 2))
    x = tf.reshape(x, (-1, 3))
    x = tf.split(x, 3, axis=0)
    cell = tf.nn.rnn_cell.BasicLSTMCell(5)
    out, stt = tf.contrib.rnn.static_rnn(cell, x, dtype=tf.float32)
    o = tf.matmul(out[-1], l['w']) + l['b']
    return o
pr = rnn(x)
co = tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=pr)
om = tf.train.AdamOptimizer(learning_rate=0.01).minimize(co)
cr = tf.equal(tf.argmax(pr,1), tf.argmax(y,1))
ac0 = tf.reduce_mean(tf.cast(cr,tf.float32))
ac1 = tf.metrics.mean_per_class_accuracy(y,pr,2)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
c, _ = sess.run([co,om], feed_dict={x:x_raw[:5], y:tf.Session().run(tf.one_hot(y_raw[:5],2))})
print(sess.run(ac0, feed_dict={x:x_test, y:tf.Session().run(tf.one_hot(y_test,2))}))
print(sess.run(ac1, feed_dict={x:x_test, y:tf.Session().run(tf.one_hot(y_test,2))}))
The error is raised when execution reaches the line that evaluates the "ac1" tensor.
Answer 0 (score: 0)
tf.metrics.mean_per_class_accuracy creates its variables in tf.GraphKeys.LOCAL_VARIABLES, not in GLOBAL_VARIABLES. This means they are not initialized by your tf.global_variables_initializer().
Fortunately, there is a corresponding tf.local_variables_initializer() that handles them. Run it alongside your global initializer and you should be good to go:
sess.run([tf.global_variables_initializer(), tf.local_variables_initializer()])
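For intuition about what the metric computes once its local variables are initialized: mean per-class accuracy is the accuracy computed separately within each class, then averaged over the classes. A minimal NumPy sketch of that definition (made-up labels and predictions; this is not the TensorFlow implementation, which additionally accumulates running totals across calls to its update op):

```python
import numpy as np

# Hypothetical integer class labels and argmax predictions for a 2-class problem.
labels = np.array([0, 0, 0, 1, 1])
preds = np.array([0, 1, 0, 1, 0])

# Accuracy within each class: fraction of that class's samples predicted correctly.
per_class = [np.mean(preds[labels == c] == c) for c in (0, 1)]

# Mean per-class accuracy: the unweighted average of the per-class accuracies.
mean_per_class = float(np.mean(per_class))

print(per_class)       # class 0: 2/3 correct, class 1: 1/2 correct
print(mean_per_class)  # (2/3 + 1/2) / 2 ≈ 0.5833
```

Note that this averages over classes rather than over samples, so it differs from plain accuracy (like ac0 above) whenever the classes are imbalanced.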