TensorFlow - How do I freeze a .pb from a SavedModel for inference with TensorFlowInferenceInterface?

Date: 2017-10-31 06:22:46

Tags: android python tensorflow

According to this answer, I can extract the MetaGraph from the SavedModel, then extract the GraphDef from the MetaGraph, and then run the freeze_graph.py script on the GraphDef, so that I can use the resulting .pb with TensorFlowInferenceInterface in Android. My question: how exactly do I extract the GraphDef (and then the .pb)? After all, tf.saved_model.loader.load(sess, [tag_constants.SERVING], <model_path>) returns a MetaGraphDef, not a GraphDef.

1 answer:

Answer 0 (score: 2)

I figured it out. It turns out that after removing the TensorFlow version I had installed from conda and replacing it with the one from pip, I could do this:

from tensorflow.python.tools import freeze_graph
from tensorflow.python.saved_model import tag_constants

input_saved_model_dir = "F:/python_machine_learning_codes/estimator_exported_model/1509418513"
output_graph_filename = "frozen_model.pb"  # path for the frozen graph (choose your own)
output_node_names = "softmax_tensor"
saved_model_tags = tag_constants.SERVING

# These arguments are unused when freezing from a SavedModel directory,
# but freeze_graph() still expects them positionally.
input_graph_filename = None
input_saver_def_path = False
input_binary = False
checkpoint_path = None
restore_op_name = None
filename_tensor_name = None
clear_devices = False
input_meta_graph = False

freeze_graph.freeze_graph(input_graph_filename, input_saver_def_path,
                          input_binary, checkpoint_path, output_node_names,
                          restore_op_name, filename_tensor_name,
                          output_graph_filename, clear_devices, "", "", "",
                          input_meta_graph, input_saved_model_dir,
                          saved_model_tags)
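Conceptually, freezing does two things: it replaces each Variable node with a Const node holding the checkpointed value, and it prunes the graph to the nodes reachable from output_node_names (which is why choosing the right output name, here softmax_tensor, matters). A minimal pure-Python sketch of that idea, using toy node dicts rather than real GraphDef protos (all names here are illustrative):

```python
# Toy sketch of "freezing": variables become constants and the graph is
# pruned to what the output node needs. The node structure is hypothetical,
# not the real tf.GraphDef proto.

def freeze(nodes, checkpoint, output_node):
    """nodes: {name: {"op": str, "inputs": [names]}}; checkpoint: {name: value}."""
    frozen = {}
    for name, node in nodes.items():
        if node["op"] == "Variable":
            # Bake the checkpoint value into a constant.
            frozen[name] = {"op": "Const", "inputs": [], "value": checkpoint[name]}
        else:
            frozen[name] = dict(node)
    # Keep only nodes reachable from the output node (what freeze_graph's
    # output_node_names argument controls).
    keep, stack = set(), [output_node]
    while stack:
        n = stack.pop()
        if n not in keep:
            keep.add(n)
            stack.extend(frozen[n]["inputs"])
    return {n: frozen[n] for n in keep}

graph = {
    "x": {"op": "Placeholder", "inputs": []},
    "w": {"op": "Variable", "inputs": []},
    "matmul": {"op": "MatMul", "inputs": ["x", "w"]},
    "softmax_tensor": {"op": "Softmax", "inputs": ["matmul"]},
    "train_step": {"op": "ApplyGradientDescent", "inputs": ["w"]},
}
frozen = freeze(graph, {"w": 0.5}, "softmax_tensor")
print(sorted(frozen))      # training-only nodes are pruned away
print(frozen["w"]["op"])   # the Variable became a Const
```

Running this prints `['matmul', 'softmax_tensor', 'w', 'x']` and `Const`: the training node is gone and the weight is baked in, which is exactly why a frozen .pb needs no checkpoint files on the device.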

The version from conda-forge was incomplete; even after installing with pip, I had to copy freeze_graph.py and saved_model_utils from tensorflow-master. Also, the code above is mostly copied from freeze_graph_test.py.
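For reference, the manual route described in the question also works with the TF 1.x-era API: load the SavedModel, take the GraphDef off the returned MetaGraphDef, and convert variables to constants. A sketch only, not verified against a particular TensorFlow version; the model path and output node name are placeholders:

```
import tensorflow as tf
from tensorflow.python.saved_model import tag_constants

with tf.Session(graph=tf.Graph()) as sess:
    # load() returns a MetaGraphDef; its graph_def field is the GraphDef.
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tag_constants.SERVING], "<model_path>")
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, meta_graph_def.graph_def, ["softmax_tensor"])
    with open("frozen_model.pb", "wb") as f:
        f.write(frozen_graph_def.SerializeToString())
```

This avoids the freeze_graph.py script entirely, at the cost of doing the variable-to-constant conversion yourself.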