In the following example, the result is the same whether sess.run(init) is inside the for loop or not. Can someone help me understand why this happens? What does initialization actually do in TensorFlow?
==> main.py <==
#!/usr/bin/env python
# vim: set noexpandtab tabstop=2 shiftwidth=2 softtabstop=-1 fileencoding=utf-8:
import tensorflow as tf
x = tf.Variable(1)
init = tf.global_variables_initializer()
with tf.Session() as sess:
  sess.run(init)
  for i in xrange(5):
    x = x + 1
    print(x.eval())
==> main_rep.py <==
#!/usr/bin/env python
# vim: set noexpandtab tabstop=2 shiftwidth=2 softtabstop=-1 fileencoding=utf-8:
import tensorflow as tf
x = tf.Variable(1)
init = tf.global_variables_initializer()
with tf.Session() as sess:
  for i in xrange(5):
    sess.run(init)
    x = x + 1
    print(x.eval())
Answer 0 (score: 0)
The problem is not sess.run(init) but this statement:
x = x + 1
You are essentially creating a new tensor named x that shadows your variable x. To verify this, run the following code:
import tensorflow as tf
x = tf.Variable(1)
init = tf.global_variables_initializer()
with tf.Session() as sess:
  for i in xrange(5):
    sess.run(init)
    x = x + 1
    print(x.eval())
    print(x)
You will get the following output:
2
Tensor("add:0", shape=(), dtype=int32)
3
Tensor("add_1:0", shape=(), dtype=int32)
4
Tensor("add_2:0", shape=(), dtype=int32)
5
Tensor("add_3:0", shape=(), dtype=int32)
6
Tensor("add_4:0", shape=(), dtype=int32)
As you can see, a new tensor is created on every iteration. That is also why both of your scripts print the same numbers: re-running sess.run(init) only resets the underlying variable to 1, a value it still holds, because the add operations never write anything back to it. If you really want x to be reset to its initial value each time, you have to keep working with the variable itself instead of replacing it. One approach is to use the load operation. Consider this example:
y = tf.Variable(1)
init = tf.global_variables_initializer()
with tf.Session() as sess:
  for i in xrange(5):
    sess.run(init)
    y.load(y.eval() + 1)
    print(y.eval())
    print(y)
You will get the following output:
2
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
2
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
2
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
2
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
2
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
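For completeness, here is a minimal sketch (not part of the original answer, assuming the same TensorFlow 1.x API) of the more idiomatic way to change a variable's value in graph mode: an explicit assign op such as tf.assign_add. Because the increment is written back into the variable, the placement of sess.run(init) now actually matters, unlike in the x = x + 1 version.
#!/usr/bin/env python
# Sketch only: mutates the variable with tf.assign_add instead of rebinding the name x.
import tensorflow as tf

x = tf.Variable(1)
inc = tf.assign_add(x, 1)  # op that adds 1 to the variable in place and returns the new value
init = tf.global_variables_initializer()  # groups the assign ops that give each variable its initial value

with tf.Session() as sess:
  sess.run(init)  # x is set to its initial value, 1
  for i in xrange(5):
    print(sess.run(inc))  # prints 2, 3, 4, 5, 6
# If sess.run(init) were moved inside the loop, x would be reset to 1 on every
# iteration and each print would show 2, matching the behaviour of the load example.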