The optimize_for_inference library in TensorFlow no longer works

Time: 2018-09-17 06:27:40

Tags: python macos tensorflow object-detection

I have a frozen, trained object detection model that I have already tested on video. I want to optimize the model with the optimize_for_inference library, using the following code:

import tensorflow as tf

from tensorflow.python.tools import freeze_graph
from tensorflow.python.tools import optimize_for_inference_lib


# Load the frozen graph produced by the Object Detection API.
input_graph_def = tf.GraphDef()
with tf.gfile.Open("frozen_inference_graph.pb", "rb") as f:
    data = f.read()
    input_graph_def.ParseFromString(data)

# Strip training-only nodes, keeping only what is needed between the
# named input and output nodes.
output_graph_def = optimize_for_inference_lib.optimize_for_inference(
        input_graph_def,
        ['input'],  ## input node name
        ['y_'],     ## output node name
        tf.float32.as_datatype_enum)

# Write the optimized graph back to disk.
f = tf.gfile.FastGFile("./inference_graph/optimized_inference_graph.pb", "wb")
f.write(output_graph_def.SerializeToString())
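
As a quick sanity check (a minimal sketch, assuming the write above succeeded and the path is unchanged), the written file can be parsed straight back into a GraphDef:

check_def = tf.GraphDef()
with tf.gfile.GFile("./inference_graph/optimized_inference_graph.pb", "rb") as f:
    check_def.ParseFromString(f.read())
print(len(check_def.node), "nodes in the optimized graph")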

I tested the generated optimized_inference_graph.pb, but it raised this error during testing:

DecodeError                               Traceback (most recent call last)
 in ()
      5 with tf.gfile.GFile(PATH_TO_CKPT, 'rb') as fid:
      6     serialized_graph = fid.read()
----> 7     od_graph_def.ParseFromString(serialized_graph)
      8     tf.import_graph_def(od_graph_def, name='')
      9

DecodeError: Error parsing message
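
For context, the loading code that raises the error is roughly the standard Object Detection API loading snippet (a sketch; PATH_TO_CKPT is assumed here to point at the optimized .pb written above):

import tensorflow as tf

PATH_TO_CKPT = "./inference_graph/optimized_inference_graph.pb"  # assumed path

detection_graph = tf.Graph()
with detection_graph.as_default():
    od_graph_def = tf.GraphDef()
    with tf.gfile.GFile(PATH_TO_CKPT, 'rb') as fid:
        serialized_graph = fid.read()
        od_graph_def.ParseFromString(serialized_graph)   # DecodeError is raised here
        tf.import_graph_def(od_graph_def, name='')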

Is this library no longer usable? Has it been replaced by transform_graph?
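
For reference, the Graph Transform Tool mentioned above also has a Python entry point, tensorflow.tools.graph_transforms.TransformGraph. A minimal sketch (assuming TF 1.x and reusing the same, possibly incorrect, node names from the code above) would look like:

import tensorflow as tf
from tensorflow.tools.graph_transforms import TransformGraph

graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# strip_unused_nodes and fold_constants are standard transforms shipped with the tool.
transformed_def = TransformGraph(
    graph_def,
    ["input"],   # input node name (taken from the code above; may not exist in the graph)
    ["y_"],      # output node name (taken from the code above; may not exist in the graph)
    ["strip_unused_nodes", "fold_constants(ignore_errors=true)"])

with tf.gfile.GFile("transformed_graph.pb", "wb") as f:
    f.write(transformed_def.SerializeToString())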

0 answers:

No answers