I'm borrowing the example from the sharing variables tutorial:
def my_image_filter(input_images):
    with tf.variable_scope("conv1"):
        # Variables created here will be named "conv1/weights", "conv1/biases".
        relu1 = conv_relu(input_images, [5, 5, 32, 32], [32])
    with tf.variable_scope("conv2"):
        # Variables created here will be named "conv2/weights", "conv2/biases".
        return conv_relu(relu1, [5, 5, 32, 32], [32])
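The snippet above depends on a conv_relu helper that the same tutorial defines; for completeness, a minimal sketch of it (written against the TF1-style API through tf.compat.v1 so it also runs on current TensorFlow releases — the original tutorial uses tf.get_variable directly):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

def conv_relu(input, kernel_shape, bias_shape):
    # Create a variable named "weights" inside the current variable scope.
    weights = tf1.get_variable("weights", kernel_shape,
                               initializer=tf1.random_normal_initializer())
    # Create a variable named "biases" inside the current variable scope.
    biases = tf1.get_variable("biases", bias_shape,
                              initializer=tf1.constant_initializer(0.0))
    conv = tf1.nn.conv2d(input, weights,
                         strides=[1, 1, 1, 1], padding="SAME")
    return tf1.nn.relu(conv + biases)
```

Because the variables are created with tf.get_variable, their full names pick up whatever variable scope is active at call time, which is exactly what causes the naming issue described below.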
Say I trained these variables and saved all four of them (the weights and biases from the conv1 and conv2 layers) by passing a var_list to tf.train.Saver. Now I want to restore them and use them twice:
with tf.variable_scope("image_filters") as scope:
    result1 = my_image_filter(image1)
    scope.reuse_variables()
    result2 = my_image_filter(image2)

But the variables' names now have image_filters as a prefix, i.e. image_filters/conv1/weights, so the saver cannot restore them.
How can I restore all the trained variables and reuse them multiple times?
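For context on the mismatch: tf.train.Saver also accepts a dict mapping checkpoint names to in-graph variables, so restoring under a new scope amounts to stripping the scope prefix from each variable's name. The renaming itself can be sketched in plain Python (the prefix and variable names below are taken from the question, not from a real checkpoint):

```python
def checkpoint_name_map(variable_names, prefix="image_filters/"):
    """Map in-graph variable names back to their checkpoint names
    by stripping the scope prefix that was added at restore time.

    Returns a dict {checkpoint_name: in_graph_name}, the shape of
    mapping that tf.train.Saver accepts as its var_list argument
    (with variables in place of the name strings).
    """
    mapping = {}
    for name in variable_names:
        ckpt_name = name[len(prefix):] if name.startswith(prefix) else name
        mapping[ckpt_name] = name
    return mapping

# In-graph names after wrapping my_image_filter in the "image_filters" scope:
graph_names = [
    "image_filters/conv1/weights",
    "image_filters/conv1/biases",
    "image_filters/conv2/weights",
    "image_filters/conv2/biases",
]
mapping = checkpoint_name_map(graph_names)
```

This is only a sketch of the bookkeeping; in real code the dict values would be the variable objects themselves rather than their name strings.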