Can someone explain the strange results I get from this simple code? Am I doing something wrong? Why do the input parameters a_C and a_G appear to change? Why are they different from the values that were passed in?
#-----------------------------
import tensorflow as tf

def dummy_function(a_C, a_G):
    diff = tf.subtract(a_C, a_G)
    sqr = tf.square(diff)
    return a_C, a_G, diff, sqr
#-----------------------------
tf.reset_default_graph()
with tf.Session() as test:
    tf.set_random_seed(1)
    a_C = tf.random_normal([1], mean=1, stddev=4)
    a_G = tf.random_normal([1], mean=1, stddev=4)
    a_C_returned, a_G_returned, diff, sqr = dummy_function(a_C, a_G)
    print("a_C = " + str(a_C.eval()))
    print("a_G = " + str(a_G.eval()))
    print("a_C_returned = " + str(a_C_returned.eval()))
    print("a_G_returned = " + str(a_G_returned.eval()))
    print("diff = " + str(diff.eval()))
    print("sqr = " + str(sqr.eval()))
#-----------------------------
# results
a_C = [-1.68344498]
a_G = [-0.39043474]
a_C_returned = [ 4.70364952]
a_G_returned = [ 0.84769011]
diff = [-9.30598831]
sqr = [ 25.68828583]
Thanks in advance for your help. Best regards, KASIA
Answer (score: 2)
Your a_C is not the result tensor of tf.random_normal! It is an operation that produces a fresh random number on every eval. This is the best demonstration of why you should not use .eval() this way.
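To see why this matters, here is a minimal plain-Python analogy (not TensorFlow itself): a graph node like a_C behaves like a function that draws a new random number each time it is evaluated, so two separate evaluations are not looking at the same value.

```python
import random

# Analogy only: `sample` plays the role of the tf.random_normal op.
random.seed(1)
sample = lambda: random.gauss(1, 4)

first_eval = sample()    # first "eval" of the node
second_eval = sample()   # second "eval": a brand-new draw
print(first_eval == second_eval)  # False: each evaluation re-runs the op

# Evaluating everything in one "run" means drawing once and reusing it:
a_C = sample()
a_G = sample()
diff = a_C - a_G
sqr = diff ** 2
print(sqr == (a_C - a_G) ** 2)  # True: all results share the same draws
```

The same idea applies to the TensorFlow graph: each .eval() call is a separate session run, so a_C.eval() and a_C_returned.eval() sample the random op independently.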
Instead, you need to evaluate all of these tensors in a single run, like this:
import tensorflow as tf
def dummy_function(a_C, a_G):
    diff = tf.subtract(a_C, a_G)
    sqr = tf.square(diff)
    return a_C, a_G, diff, sqr

with tf.Session() as sess:
    tf.set_random_seed(1)
    a_C = tf.random_normal([1], mean=1, stddev=4)
    a_G = tf.random_normal([1], mean=1, stddev=4)
    a_C_returned, a_G_returned, diff, sqr = dummy_function(a_C, a_G)
    a_C_, a_G_, a_C_returned_, a_G_returned_, diff_, sqr_ = sess.run(
        [a_C, a_G, a_C_returned, a_G_returned, diff, sqr])
    print("a_C = " + str(a_C_))
    print("a_G = " + str(a_G_))
    print("a_C_returned = " + str(a_C_returned_))
    print("a_G_returned = " + str(a_G_returned_))
    print("diff = " + str(diff_))
    print("sqr = " + str(sqr_))
This guarantees that all of the returned results are based on the same input nodes (i.e. a_C, a_G).