Exporting and loading models

Date: 2017-07-12 08:00:37

Tags: python tensorflow

System information:

  • OS Platform and Distribution: macOS Sierra (10.12.5)
  • TensorFlow installed from: pip
  • TensorFlow version: 1.2.1

The Problem:

I'm trying to save and restore a model, trained in Python, to be used again from Python. I have the model saved in three .chkpt files (meta, index and data-000000-of-00001). I'm trying to read it into my session, save the model using add_meta_graph_and_variables, and then read it back using the loader: loader.load(session, [tf.saved_model.tag_constants.TRAINING], pathToSaveModel).

This is my code:

First, I restore the weights from the three files containing "data", "index" and "meta" (the metagraph and the weights) into my session using saver.restore:

import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder

with tf.Session(graph=tf.Graph()) as session:
    ##HERE IS THE CODE OF MY NETWORK (Very long)

    session.run(tf.global_variables_initializer())
    # Restore the trained weights from the three .chkpt files
    saver = tf.train.Saver()
    saver.restore(session, "newModel.chkpt")

    # Classify one image to check that the restored weights work
    features = loadFeatures(["cat2.jpg"])
    res = predictions.eval(
            feed_dict={
                x: features,
                keep_prob: 1.0, })
    print('Image {} has a prob {} '.format("cat2.jpg", res))

    # Export the session (graph + variables) as a SavedModel
    b = saved_model_builder.SavedModelBuilder(pathToSaveModel)
    b.add_meta_graph_and_variables(session, [tf.saved_model.tag_constants.TRAINING])
    b.save()

With this code I get a good classification and, at the end, a new folder containing the model saved with add_meta_graph_and_variables (screenshot: folder with the saved model).

Now, I want to use the saved model to classify the same image again. This time I use the loader instead of saver.restore:

with tf.Session(graph=tf.Graph()) as session:
    ##HERE IS THE CODE OF MY NETWORK (Very long)

    #session.run(tf.global_variables_initializer())
    # Load the SavedModel exported above instead of restoring the checkpoint
    from tensorflow.python.saved_model import loader
    loader.load(session, [tf.saved_model.tag_constants.TRAINING], pathToSaveModel)

    # Classify the same image again
    features = loadFeatures(["cat2.jpg"])
    res = predictions.eval(
            feed_dict={
                x: features,
                keep_prob: 1.0, })
    print('Image {} has a prob {} '.format("cat2.jpg", res))

And here comes the problem:

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value b_fcO
     [[Node: b_fcO/read = Identity[T=DT_FLOAT, _class=["loc:@b_fcO"], _device="/job:localhost/replica:0/task:0/cpu:0"](b_fcO)]]

If I add session.run(tf.global_variables_initializer()), it runs, but the classification is not valid. I think the weights are not being exported/imported correctly from the very beginning, and after testing many things I'm stuck here.
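To test this assumption, a small diagnostic I could run (sketch only; b_fcO is just the variable named in the error above) right after saver.restore in the first script and right after loader.load in the second is:

    import tensorflow as tf

    def dump_variable(session, name="b_fcO:0"):
        # Print the first few entries of a variable so its values can be
        # compared between the saver.restore run and the loader.load run.
        value = session.graph.get_tensor_by_name(name).eval(session=session)
        print(name, value.ravel()[:5])

If the values printed after loader.load differ from the ones printed after saver.restore (or look like fresh initializer output), the weights are not making it into the SavedModel.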

Any clues about what I'm doing wrong?

Thanks in advance.

Update: this is how the model is stored in three files at the beginning (screenshot: ckpt files).

1 answer:

Answer 0 (score: 0)

Some things you should check are:

  • What is pathToSaveModel?
  • Where are the checkpoint files?
  • Open the checkpoint file with a text editor: which folder does it point to?
  • Is the path to the weights correct?

By answering these questions I can always find the mistake I made. Hope it helps!
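A minimal sketch of these checks (the names pathToSaveModel and newModel.chkpt are taken from the question; adjust them to your setup):

    import os
    import tensorflow as tf

    path_to_saved_model = "pathToSaveModel"   # directory passed to SavedModelBuilder
    checkpoint_prefix = "newModel.chkpt"      # prefix passed to saver.restore

    # 1. The SavedModel directory should contain saved_model.pb and a variables/ folder.
    print(os.listdir(path_to_saved_model))

    # 2. Which checkpoint does the "checkpoint" file in the training directory point to?
    print(tf.train.latest_checkpoint(os.path.dirname(checkpoint_prefix) or "."))

    # 3. Which variables (and shapes) are actually stored in the checkpoint?
    reader = tf.train.NewCheckpointReader(checkpoint_prefix)
    print(reader.get_variable_to_shape_map())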