I have a saved graph definition that was imported with tf.train.import_meta_graph. The graph contains a py_func op, which is not serializable. Can I define and assign the Python function for this op without rebuilding the graph?
Answer 0 (score: 5)
This is possible, but may be a bit fragile. In particular, the py_funcs need to be re-defined in the same order in which they were defined in the original graph (so that they get the same identifiers in the FuncRegistry).
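The identifier in question is just a string token stored as an attribute on the PyFunc op; only that token is serialized, while the Python callable lives in a process-local registry. A minimal sketch (assuming the TF 1.x graph-mode API) of what is actually kept in the graph:

import tensorflow as tf

x = tf.constant(1.0)
y = tf.py_func(lambda v: 2. * v, [x], Tout=tf.float32)

# The graph stores only a registry token (e.g. b'pyfunc_0'); the Python
# function itself is never written to the metagraph.
print(y.op.node_def.attr["token"].s)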
As an example, we can define a graph that contains a py_func:

import tensorflow as tf

def my_py_func(x):
  return 13. * x + 2.

def train_model():
  with tf.Graph().as_default():
    some_input = tf.constant([[1., 2., 3., 4.],
                              [5., 6., 7., 8.]])
    after_py_func = tf.py_func(my_py_func, [some_input], Tout=tf.float32,
                               name="my_py_func")
    coefficient = tf.get_variable(
        "coefficient",
        shape=[])
    bias = tf.get_variable(
        "bias",
        shape=[])
    loss = tf.reduce_sum((coefficient * some_input + bias - after_py_func) ** 2)
    global_step = tf.contrib.framework.get_or_create_global_step()
    train_op = tf.group(tf.train.AdamOptimizer(0.1).minimize(loss),
                        tf.assign_add(global_step, 1))
    # Make it easy to retrieve things we care about when the metagraph is reloaded.
    tf.add_to_collection('useful_ops', bias)
    tf.add_to_collection('useful_ops', coefficient)
    tf.add_to_collection('useful_ops', loss)
    tf.add_to_collection('useful_ops', train_op)
    tf.add_to_collection('useful_ops', global_step)
    tf.add_to_collection('useful_ops', some_input)
    init_op = tf.global_variables_initializer()
    saver = tf.train.Saver()
    with tf.Session() as session:
      session.run(init_op)
      for i in range(5000):
        (_, evaled_loss, evaled_coefficient, evaled_bias,
         evaled_global_step) = session.run(
             [train_op, loss, coefficient, bias, global_step])
        if i % 1000 == 0:
          print(evaled_global_step, evaled_loss, evaled_coefficient,
                evaled_bias)
      saver.save(session, "./trained_pyfunc_model", global_step=global_step)
This does some basic training (fitting the linear function inside the py_func):
1 37350.4 -0.0934748 0.193026
1001 19.2717 12.3749 5.40368
2001 0.108373 12.9532 2.2548
3001 8.28227e-06 12.9996 2.00222
4001 3.77258e-09 13.0 2.00004
If we try to load the metagraph in a new Python session without re-defining the py_func, we get an error:

def load_model():
  with tf.Graph().as_default():
    saver = tf.train.import_meta_graph("./trained_pyfunc_model-5000.meta")
    bias, coefficient, loss, train_op, global_step, some_input = tf.get_collection('useful_ops')
    # after_py_func = tf.py_func(my_py_func, [some_input], Tout=tf.float32,
    #                            name="my_py_func")
    with tf.Session() as session:
      saver.restore(session, "./trained_pyfunc_model-5000")
      (_, evaled_loss, evaled_coefficient, evaled_bias,
       evaled_global_step) = session.run(
           [train_op, loss, coefficient, bias, global_step])
      print("Restored: ", evaled_global_step, evaled_loss, evaled_coefficient, evaled_bias)
UnknownError (see above for traceback): KeyError: 'pyfunc_0'
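To see which registry tokens the imported graph expects (and therefore how many py_funcs have to be re-registered, and in what order), one can inspect the PyFunc nodes after import_meta_graph. A rough diagnostic sketch, reusing the checkpoint path from above:

with tf.Graph().as_default() as g:
  tf.train.import_meta_graph("./trained_pyfunc_model-5000.meta")
  # List each PyFunc op and the FuncRegistry token it refers to; re-defining
  # py_funcs in the original order is what re-creates these tokens.
  for op in g.get_operations():
    if op.type in ("PyFunc", "PyFuncStateless"):
      print(op.name, op.node_def.attr["token"].s)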
However, as long as the py_funcs are defined in the same order and with the same implementations, we're fine:

def load_model():
  with tf.Graph().as_default():
    saver = tf.train.import_meta_graph("./trained_pyfunc_model-5000.meta")
    bias, coefficient, loss, train_op, global_step, some_input = tf.get_collection('useful_ops')
    after_py_func = tf.py_func(my_py_func, [some_input], Tout=tf.float32,
                               name="my_py_func")
    with tf.Session() as session:
      saver.restore(session, "./trained_pyfunc_model-5000")
      (_, evaled_loss, evaled_coefficient, evaled_bias,
       evaled_global_step) = session.run(
           [train_op, loss, coefficient, bias, global_step])
      print("Restored: ", evaled_global_step, evaled_loss, evaled_coefficient, evaled_bias)
This lets us continue training, or do whatever else we want with the restored model:
Restored: 5001 1.77897e-09 13.0 2.00003
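For instance, continuing training is just a matter of running the restored train_op again. A small sketch of what could be added inside load_model after saver.restore (using the same names as above):

      # Run the restored train_op some more; global_step and loss keep
      # advancing from their restored values.
      for _ in range(1000):
        session.run(train_op)
      print("Continued: ", session.run([global_step, loss, coefficient, bias]))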
Note that stateful py_funcs will be trickier to deal with: TensorFlow does not save any Python variables that may be associated with them!
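For example (a hypothetical sketch, not part of the training code above), any Python-side state lives entirely outside the graph, so it is never written to the checkpoint or the metagraph and would have to be rebuilt by hand, in the same registration order, before using the restored graph:

import numpy as np
import tensorflow as tf

# Hypothetical stateful py_func: the counter is ordinary Python state and is
# not saved anywhere by TensorFlow.
call_count = {"n": 0}

def counting_fn(x):
  call_count["n"] += 1
  return x + np.float32(call_count["n"])

y = tf.py_func(counting_fn, [tf.constant(1.0)], Tout=tf.float32, stateful=True)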