How do I save a TensorRT graph generated from a frozen inference graph?

Date: 2019-08-01 18:40:06

Tags: python tensorflow tensorrt

I use the following script to convert my frozen_inference_graph into a TensorRT-optimized one:

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

with tf.Session() as sess:
    # First deserialize your frozen graph:
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        frozen_graph = tf.GraphDef()
        frozen_graph.ParseFromString(f.read())
    # Now you can create a TensorRT inference graph from your
    # frozen graph:
    converter = trt.TrtGraphConverter(
        input_graph_def=frozen_graph,
        nodes_blacklist=['outputs/Softmax']) #output nodes
    trt_graph = converter.convert()
    # Import the TensorRT graph into a new graph and run:
    output_node = tf.import_graph_def(
        trt_graph,
        return_elements=['outputs/Softmax'])
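    # (Running the output node for actual inference would also require
    # feeding the graph's input placeholder(s) via feed_dict.)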
    sess.run(output_node)

My question is: how can I save the optimized graph to disk so that I can later load it and run inference with it?

1 answer:

Answer 0 (score: 0)

Yes, you just need to add these two lines:

saved_model_dir_trt = "./tensorrt_model.trt"
converter.save(saved_model_dir_trt)
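
One caveat: in TF 1.x, TrtGraphConverter.save() writes out a SavedModel directory, and it is only supported when the converter was created from a SavedModel (via input_saved_model_dir); with a frozen GraphDef as input, as in the question, it raises an error. A minimal sketch of an alternative, continuing the script from the question (so tf and trt_graph are already defined) and using the arbitrary filename 'trt_inference_graph.pb', is to serialize the converted GraphDef directly:

# Save the TensorRT-optimized GraphDef to disk
# ('trt_inference_graph.pb' is an arbitrary filename).
with tf.gfile.GFile('trt_inference_graph.pb', 'wb') as f:
    f.write(trt_graph.SerializeToString())

# Later, load it back the same way as the original frozen graph:
trt_graph = tf.GraphDef()
with tf.gfile.GFile('trt_inference_graph.pb', 'rb') as f:
    trt_graph.ParseFromString(f.read())
output_node = tf.import_graph_def(
    trt_graph,
    return_elements=['outputs/Softmax'])

This keeps the result in the same frozen-graph (.pb) format you started from, so your existing inference code can load it unchanged.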