Restoring a TensorFlow model and running it on an input

Date: 2019-07-04 05:55:33

Tags: python, neural-network, tensor

I have a model that takes an int input x and produces the mean and variance of a vector of size x. I can save this model, but after restoring it I want to run it by passing in a value for x.

I can also restore it with the line below, but I don't know how to run it for different values of x.

Can I use feed_dict for this? Please help me with this.

saver.restore(sess, './mean_var.ckpt')

1 Answer:

Answer 0 (score: 0)

Use this to restore and predict:

with tf.Graph().as_default():
    with tf.Session() as sess:
        # Recreate the graph structure from the meta file, then load the checkpointed values
        saver = tf.train.import_meta_graph('./mean_var.ckpt.meta')
        saver.restore(sess, tf.train.latest_checkpoint('./'))
        graph = tf.get_default_graph()
        # Retrieve the placeholder by the name it was given when the graph was built
        x = graph.get_tensor_by_name("x:0")
        # mean_var is the graph-building function from the full code below;
        # it rebuilds the mean/variance ops on top of the restored placeholder
        output = mean_var(x)
        y_pred = sess.run(output, feed_dict={x: 4})
        print(y_pred)
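
To run it for different values of x (the original question), the same feed_dict mechanism works once the graph has been restored. A minimal sketch (the loop and the example sizes are my own, assuming the checkpoint from the answer exists and mean_var is defined as in the full code below):

import tensorflow as tf

with tf.Graph().as_default():
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph('./mean_var.ckpt.meta')
        saver.restore(sess, tf.train.latest_checkpoint('./'))
        x = tf.get_default_graph().get_tensor_by_name("x:0")
        output = mean_var(x)  # rebuilds the mean/variance ops on the restored placeholder
        for size in [3, 5, 10]:  # example sizes, chosen arbitrarily
            mean, variance = sess.run(output, feed_dict={x: size})
            print(size, mean, variance)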

One more thing: give the placeholder x a name, like this:

x = tf.placeholder(tf.int32, name="x")

Full code:

import tensorflow as tf

def mean_var(x):
    # Build a random vector of length x and compute its mean and variance
    vec = tf.random_normal([x])
    mean, variance = tf.nn.moments(vec, [0], keep_dims=True)
    return mean, variance

with tf.Graph().as_default():
    x = tf.placeholder(tf.int32, name="x")  # named so it can be retrieved after restoring
    output = mean_var(x)
    init = tf.global_variables_initializer()
    # tf.train.Saver refuses to save a graph with no variables, so add a dummy
    # variable; it is created after `init`, so it is initialized separately below.
    _ = tf.Variable(initial_value='fake_variable')
    saver = tf.train.Saver()


    with tf.Session() as sess:
        sess.run(init)
        sess.run(_.initializer)
        val = sess.run(output, feed_dict={x: 4})
        print(val[0], val[1])
        save_path = saver.save(sess, "./mean_var/mean_var.ckpt")

tf.reset_default_graph()

with tf.Graph().as_default():
    with tf.Session() as sess:
        # Recreate the graph structure from the meta file, then load the values
        saver = tf.train.import_meta_graph('./mean_var/mean_var.ckpt.meta')
        saver.restore(sess, tf.train.latest_checkpoint('./mean_var/'))
        # saver.restore(sess, './mean_var/mean_var.ckpt')  # equivalent explicit path
        graph = tf.get_default_graph()
        x = graph.get_tensor_by_name("x:0")  # the placeholder named above
        # Rebuild the mean/variance ops on top of the restored placeholder
        output = mean_var(x)
        y_pred = sess.run(output, feed_dict={x: 4})
        print(y_pred)
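
As a variation on the full code above (my own suggestion, not part of the original answer), the mean and variance tensors can also be given names when the graph is built, for example by wrapping them in tf.identity. After restoring, they can then be fetched by name, so mean_var does not have to be called a second time. The directory and tensor names below are my own choices:

import os
import tensorflow as tf

# Save side: wrap the outputs in tf.identity so they get stable, known names.
os.makedirs("./mean_var_named", exist_ok=True)
with tf.Graph().as_default():
    x = tf.placeholder(tf.int32, name="x")
    mean, variance = mean_var(x)                    # mean_var as defined above
    mean = tf.identity(mean, name="mean")
    variance = tf.identity(variance, name="variance")
    _ = tf.Variable(initial_value='fake_variable')  # dummy variable so Saver has something to save
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, "./mean_var_named/mean_var.ckpt")

# Restore side: fetch every tensor by name; mean_var is not needed here.
with tf.Graph().as_default():
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph('./mean_var_named/mean_var.ckpt.meta')
        saver.restore(sess, tf.train.latest_checkpoint('./mean_var_named/'))
        g = tf.get_default_graph()
        x = g.get_tensor_by_name("x:0")
        mean = g.get_tensor_by_name("mean:0")
        variance = g.get_tensor_by_name("variance:0")
        print(sess.run([mean, variance], feed_dict={x: 6}))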