TensorFlow restore and reuse

Date: 2018-04-17 09:54:44

Tags: tensorflow

I created a simple TensorFlow model:

import tensorflow as tf

tf.reset_default_graph()

x_data = [1,2,3]
y_data = [3,4,5]

X = tf.placeholder(tf.float32, name="X")
Y = tf.placeholder(tf.float32, name="Y")

W = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='W')
b = tf.Variable(tf.random_uniform([1], 0.0, 2.0), name='b')

hypothesis = tf.add(b, tf.multiply(X,W), name="op_restore")

saver = tf.train.Saver()
cost = tf.reduce_mean(tf.square(hypothesis - Y))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)

train_op = optimizer.minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tf.train.write_graph(sess.graph_def, '.', 'tfandroid.pbtxt')

    for step in range(100):
        _, cost_val = sess.run([train_op, cost], feed_dict={X: x_data, Y: y_data})
        print((step, cost_val, sess.run(W), sess.run(b)))

    saver.save(sess, './tfandroid.ckpt')

    print("\n == Test ==")
    print("X: 5, Y: ", sess.run(hypothesis, feed_dict={X:5}))
    print("X: 2.5, Y: ", sess.run(hypothesis, feed_dict={X:2.5}))

However, the result is not what I expected. I was expecting '6', but I got 'None'.

Below is the code that reuses the model. Can you tell me what is wrong with it?

import tensorflow as tf

tf.reset_default_graph()

with tf.Session() as sess:
    saver = tf.train.import_meta_graph('tfandroid.ckpt.meta')
    saver.restore(sess, tf.train.latest_checkpoint('./'))
    graph = tf.get_default_graph()

    W = graph.get_tensor_by_name("W:0")
    b = graph.get_tensor_by_name("b:0")
    X = graph.get_tensor_by_name("X:0")

    print('sess.run(W) = ', sess.run(W))
    print('sess.run(b) = ', sess.run(b))

    feed_dict = {X: 4.0}

    hypothesis = graph.get_operation_by_name("op_restore")
    print(hypothesis)

    print(sess.run(hypothesis, feed_dict))

1 Answer:

Answer 0: (score: 0)

You should restore the op the same way you restored W and b, i.e. with `get_tensor_by_name`. Your code fetches `graph.get_operation_by_name("op_restore")`, which returns an `Operation` object, and `sess.run` on an `Operation` always returns `None`. Fetch the op's output tensor `"op_restore:0"` instead (and if you don't need W and b, you don't have to restore them at all), then run it with a `feed_dict` as you already do.
