TensorFlow Serving - SavedModel format and passing information into it

Date: 2018-12-01 17:50:21

Tags: python tensorflow machine-learning tensorflow-serving

I really don't get this! I'm trying to save a TensorFlow model in the SavedModel format, ready for use with TensorFlow Serving. To do this, I need to define the model's inputs and outputs before saving it.

I'm loading a previously saved model (a checkpoint) and want to take a numpy array as input. It's a text document that has been converted into a feature array like this:

array([0., 4., 0., ..., 0., 0., 0.])
<class 'numpy.ndarray'>
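
For context, the feature array comes from a simple per-word count against a lexicon, roughly like the sketch below (the lexicon and the document_to_features name here are just illustrative stand-ins, not my real pipeline):

import numpy as np

# Hypothetical illustration of turning a document into a count-based
# feature vector; the lexicon is a stand-in for the real vocabulary.
lexicon = ['the', 'quick', 'brown', 'fox']        # one slot per known word

def document_to_features(text, lexicon):
    features = np.zeros(len(lexicon))
    for word in text.lower().split():
        if word in lexicon:
            features[lexicon.index(word)] += 1.0  # count occurrences
    return features

features = document_to_features('The quick fox', lexicon)
print(features, type(features))  # [1. 1. 0. 1.] <class 'numpy.ndarray'>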

As my output, I want the model's evaluation of that feature array.

Here is my model:

hidden_layer_1 = {'weights': tf.Variable(tf.truncated_normal([len(train_x[0]), n_nodes_hl1], stddev=0.1)),
                  'biases': tf.Variable(tf.constant(0.1, shape=[n_nodes_hl1]))}
hidden_layer_2 = {'weights': tf.Variable(tf.truncated_normal([n_nodes_hl1, n_nodes_hl2], stddev=0.1)),
                  'biases': tf.Variable(tf.constant(0.1, shape=[n_nodes_hl2]))}
hidden_layer_3 = {'weights': tf.Variable(tf.truncated_normal([n_nodes_hl2, n_nodes_hl3], stddev=0.1)),
                  'biases': tf.Variable(tf.constant(0.1, shape=[n_nodes_hl3]))}
output_layer = {'weights': tf.Variable(tf.truncated_normal([n_nodes_hl3, n_classes], stddev=0.1)),
                'biases': tf.Variable(tf.constant(0.1, shape=[n_classes]))}

def neural_network_model(data):
    # Three fully connected hidden layers with ReLU activations
    l1 = tf.add(tf.matmul(data, hidden_layer_1['weights']), hidden_layer_1['biases'])
    l1 = tf.nn.relu(l1)
    l2 = tf.add(tf.matmul(l1, hidden_layer_2['weights']), hidden_layer_2['biases'])
    l2 = tf.nn.relu(l2)
    l3 = tf.add(tf.matmul(l2, hidden_layer_3['weights']), hidden_layer_3['biases'])
    l3 = tf.nn.relu(l3)
    # Linear output layer, named so the tensor can be found again later
    output = tf.add(tf.matmul(l3, output_layer['weights']), output_layer['biases'], name='output')
    return output

This pretrained model is then loaded, with a placeholder x to feed it:

x = tf.placeholder(tf.float32, [None, len(train_x[0])])
prediction = neural_network_model(x)
saver = tf.train.Saver()  # the saver that restores the checkpoint's variables

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print("variables initialised")
    saver.restore(sess, '/path_to_model.ckpt')

    result_tf = sess.run(tf.argmax(prediction.eval(feed_dict={x: [???]}), 1))

The '???' is the part I don't get. I don't know how to get a prediction without a feed_dict, and a placeholder can't be evaluated against itself, so I'm not sure where to go from here!
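
From what I've read, the usual TF 1.x pattern seems to be not to run the prediction at export time at all: the argmax is defined as a graph tensor with the placeholder left unfed, and TensorFlow Serving fills in x per request. Is this the right direction? A minimal sketch of what I think is meant (the class_index name is my own, not from my code above):

# Define the output as a graph tensor; nothing is fed or run here.
# x stays a free placeholder that the serving runtime fills per request.
class_index = tf.argmax(prediction, 1, name='class_index')

# At export time, this *tensor* (not a numpy result from sess.run)
# would be what gets wrapped:
# tensor_info_output = tf.saved_model.utils.build_tensor_info(class_index)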

The rest of the code is as follows:

#  Exports the model
export_path_base = FLAGS.export_model_dir
export_path = os.path.join(
    tf.compat.as_bytes(export_path_base),
    tf.compat.as_bytes(str(FLAGS.model_version)))
print('Exporting trained model to', export_path)
builder = tf.saved_model.builder.SavedModelBuilder(export_path)

# Creates the TensorInfo protobuf objects that encapsulate the input/output tensors
tensor_info_x = tf.saved_model.utils.build_tensor_info(x)
tensor_info_output = tf.saved_model.utils.build_tensor_info(result_tf)

prediction_signature = (
    tf.saved_model.signature_def_utils.build_signature_def(
        inputs={'input': tensor_info_x},
        outputs={'output': tensor_info_output},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

# This must run while the session that restored the variables is still open
builder.add_meta_graph_and_variables(
    sess, [tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        'classify_documents':
            prediction_signature,
    })
# export the model
builder.save(as_text=True)
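
To sanity-check the export, I can at least load it back in a fresh session and look at the signature; a minimal sketch, reusing the export_path from above:

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    # Load the SavedModel back under the same tag it was exported with
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_path)
    # 'classify_documents' is the key used in signature_def_map above
    signature = meta_graph.signature_def['classify_documents']
    print(signature.inputs)   # should show the 'input' placeholder
    print(signature.outputs)  # should show the 'output' tensor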

0 Answers:

There are no answers yet.