I am using TensorFlow v2.3.0. I want to convert a model saved as HDF5 (with a .h5 extension) into a protocol buffer (.pb file). I tried existing solutions, but they cannot be used directly because they were written for TensorFlow v1.x.x. So I tried to edit the code to make it compatible with TensorFlow v2.3.0 and ended up with the following:
import tensorflow as tf
from tensorflow.keras.models import load_model
from tensorflow.compat.v1.keras.backend import get_session
from tensorflow.python.platform import gfile
from tensorflow.compat.v1 import global_variables
from tensorflow.compat.v1.graph_util import convert_variables_to_constants as c_to_c
model = load_model('models/model-v2.h5')
# print(model.summary())
Model: "functional_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 150, 150, 3)] 0
_________________________________________________________________
conv2d (Conv2D) (None, 148, 148, 16) 448
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 74, 74, 16) 0
_________________________________________________________________
conv2d_1 (Conv2D) (None, 72, 72, 32) 4640
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 36, 36, 32) 0
_________________________________________________________________
conv2d_2 (Conv2D) (None, 34, 34, 64) 18496
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 17, 17, 64) 0
_________________________________________________________________
flatten (Flatten) (None, 18496) 0
_________________________________________________________________
dense (Dense) (None, 512) 9470464
_________________________________________________________________
dropout (Dropout) (None, 512) 0
_________________________________________________________________
dense_1 (Dense) (None, 1) 513
=================================================================
Total params: 9,494,561
Trainable params: 9,494,561
Non-trainable params: 0
# print(model.outputs)
# [<tf.Tensor 'dense_1/Sigmoid:0' shape=(None, 1) dtype=float32>]
# print(model.inputs)
# [<tf.Tensor 'input_1:0' shape=(None, 150, 150, 3) dtype=float32>]
def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in global_variables()).difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in global_variables()]
        # Graph -> GraphDef ProtoBuf
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            print(input_graph_def.node)
            for node in input_graph_def.node:
                print('Node', node)
                node.device = ""
        frozen_graph = c_to_c(session, input_graph_def, output_names, freeze_var_names)
        return frozen_graph
frozen_graph = freeze_session(tf.compat.v1.Session(),
output_names=[out.op.name for out in model.outputs])
# Save to model/model.pb
tf.io.write_graph(frozen_graph, "models", "model_v2.pb", as_text=False)
However, I get this error: AssertionError: dense_1/Sigmoid is not in graph. Apparently graph_def returns an empty node list, because nothing is printed when I loop over the nodes. I would like to know how to fix this code, or whether there is a better way to convert a .h5 model to .pb.

Answer 0 (score: 0):
Try using tf.saved_model.save in compatibility mode:
tf.keras.Model instances constructed from inputs and outputs already have a signature, so they do not require a @tf.function decorator or a signatures argument. If neither is specified, the model's forward pass is exported.
x = tf.keras.Input((4,), name="x")
y = tf.keras.layers.Dense(5, name="out")(x)
model = tf.keras.Model(x, y)
tf.compat.v1.saved_model.save(model, '/tmp/saved_model/')
# The exported SavedModel takes "x" with shape [None, 4] and returns "out"
# with shape [None, 5]
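Applied to the model from the question, a minimal sketch might look like the following (this is not from the original answer; the SavedModel directory 'models/saved_model/' is just an assumed path):
# Minimal sketch (assumption, untested): export the question's Keras model
# as a SavedModel via the TF1-compatible saver.
import tensorflow as tf
from tensorflow.keras.models import load_model

model = load_model('models/model-v2.h5')
# 'models/saved_model/' is an arbitrary output directory.
tf.compat.v1.saved_model.save(model, 'models/saved_model/')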
P.S. I have not had a chance to test this; if you try it, please let me know the result.
Answer 1 (score: 0):
After a lot of searching, I found the correct (TensorFlow 2.x.x compatible) code here: https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Convert Keras model to ConcreteFunction
full_model = tf.function(lambda x: model(x))
full_model = full_model.get_concrete_function(
    x=tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))
# Get frozen ConcreteFunction
frozen_func = convert_variables_to_constants_v2(full_model)
frozen_func.graph.as_graph_def()
layers = [op.name for op in frozen_func.graph.get_operations()]
print("-" * 50)
print("Frozen model layers: ")
for layer in layers:
print(layer)
print("-" * 50)
print("Frozen model inputs: ")
print(frozen_func.inputs)
print("Frozen model outputs: ")
print(frozen_func.outputs)
# Save frozen graph from frozen ConcreteFunction to hard drive
tf.io.write_graph(graph_or_graph_def=frozen_func.graph,
logdir="./frozen_models",
name="simple_frozen_graph.pb",
as_text=False)
The code used here comes from this file: https://github.com/leimao/Frozen_Graph_TensorFlow/blob/master/TensorFlow_v2/example_1.py
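For completeness, the referenced blog also covers loading the frozen graph back for inference. Below is a minimal sketch, assuming the input and output tensor names are "x:0" and "Identity:0"; these names are assumptions and should be checked against the "Frozen model inputs/outputs" printed above:
# Sketch: load the frozen .pb and run inference on it.
import numpy as np
import tensorflow as tf

def wrap_frozen_graph(graph_def, inputs, outputs):
    # Import the GraphDef into a wrapped TF1-style function and prune it
    # down to the requested input/output tensors.
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped_import = tf.compat.v1.wrap_function(_imports_graph_def, [])
    import_graph = wrapped_import.graph
    return wrapped_import.prune(
        tf.nest.map_structure(import_graph.as_graph_element, inputs),
        tf.nest.map_structure(import_graph.as_graph_element, outputs))

# Read the frozen graph written by tf.io.write_graph above.
with tf.io.gfile.GFile("./frozen_models/simple_frozen_graph.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

# "x:0" and "Identity:0" are assumed tensor names; verify them first.
frozen_func = wrap_frozen_graph(graph_def, inputs=["x:0"], outputs=["Identity:0"])

# Run a dummy image through the frozen graph (input shape from the question's model).
pred = frozen_func(tf.constant(np.zeros((1, 150, 150, 3), dtype=np.float32)))
print(pred)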