I am trying to evaluate aX^2 + bX + c as [a, b, c] * [X*X, X, 1] in TensorFlow.
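(For reference, a quick NumPy sanity check, not part of the original question, that the dot-product formulation matches the polynomial; the values are arbitrary:)

import numpy as np
a, b, c, X = 1.0, 2.0, 1.0, 3.0             # arbitrary example values
direct = a * X**2 + b * X + c               # 16.0
dot = np.dot([a, b, c], [X * X, X, 1.0])    # 16.0
assert direct == dot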
I tried the following code:
import tensorflow as tf
X = tf.placeholder(tf.float32, name="X")
W = tf.Variable([1,2,1], dtype=tf.float32, name="weights")
W=tf.reshape(W,[1,3])
F = tf.Variable([X*X,X,1.0], dtype=tf.float32, name="Filter")
F=tf.reshape(F,[3,1])
print(W.shape)
print(F.shape)
Y=tf.matmul(W,F)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        sess.run(Y, feed_dict={X: i})
        Y=sess.run(Y)
        print("Y:",Y)
However, the initializer is not happy:
(1, 3)
(3, 1)
...
tensorflow.python.framework.errors_impl.InvalidArgumentError: You must feed a value for placeholder tensor 'X' with dtype float
[[{{node X}}]]
During handling of the above exception, another exception occurred:
...
Caused by op 'X', defined at:
File "sample.py", line 2, in <module>
X = tf.placeholder(tf.float32, name="X")
...
Any other ideas?
Answer 0 (score: 2)
You only need to modify your code slightly. The value of a tf.Variable should not be a tf.placeholder, otherwise it causes an initialization error when you run sess.run(tf.global_variables_initializer()). You can use tf.stack instead.
Also, remember to feed the data when you run sess.run(Y).
import tensorflow as tf
X = tf.placeholder(tf.float32, name="X")
W = tf.Variable([1,2,1], dtype=tf.float32, name="weights")
W = tf.reshape(W,[1,3])
F = tf.stack([X*X,X,1.0])
F = tf.reshape(F,[3,1])
Y = tf.matmul(W,F)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        Y_val = sess.run(Y, feed_dict={X: i})
        print("Y:", Y_val)
Y: [[1.]]
Y: [[4.]]
Y: [[9.]]
Y: [[16.]]
Y: [[25.]]
Y: [[36.]]
Y: [[49.]]
Y: [[64.]]
Y: [[81.]]
Y: [[100.]]
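As a side note (a sketch, not part of the original answer): if the weights are declared with shape (1, 3) from the start, the first tf.reshape can be dropped and the rest of the graph stays the same:

import tensorflow as tf

X = tf.placeholder(tf.float32, name="X")
W = tf.Variable([[1.0, 2.0, 1.0]], name="weights")   # already shape (1, 3)
F = tf.reshape(tf.stack([X * X, X, 1.0]), [3, 1])    # shape (3, 1)
Y = tf.matmul(W, F)                                  # result of shape (1, 1)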
Answer 1 (score: 0)
I think that even though you can still initialize a variable that depends on a placeholder like this, W will be re-initialized on every iteration unless you add more code to initialize only the uninitialized variables (see the sketch after the output below). That is more effort.
Hopefully I haven't missed any other inefficiencies of this approach.
import tensorflow as tf
sess = tf.InteractiveSession()
X = tf.placeholder(tf.float32, name="X")
W = tf.Variable([1, 2, 1], dtype=tf.float32, name="weights")
W = tf.reshape(W, [1, 3])
var = tf.reshape([X*X,X,1],[3,1])
F = tf.get_variable('F', dtype=tf.float32, initializer=var)
init = tf.global_variables_initializer()
Y=tf.matmul(W,F)
for i in range(10):
    sess.run([init], feed_dict={X: i})
    print(sess.run(Y))
[[1.]]
[[4.]]
[[9.]]
[[16.]]
[[25.]]
[[36.]]
[[49.]]
[[64.]]
[[81.]]
[[100.]]
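Regarding initializing only the uninitialized variables mentioned above, a common pattern is sketched below (assuming TensorFlow 1.x; the helper name initialize_uninitialized is made up for illustration):

import tensorflow as tf

def initialize_uninitialized(sess):
    # Find variables that do not have a value yet and initialize only those.
    uninit_names = set(sess.run(tf.report_uninitialized_variables()))
    uninit_vars = [v for v in tf.global_variables()
                   if v.name.split(':')[0].encode() in uninit_names]
    sess.run(tf.variables_initializer(uninit_vars))

With this, W could be initialized once up front, while F's own initializer (F.initializer) would still have to be re-run with the feed on every iteration, since its value depends on the placeholder.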