How to save a TensorFlow model using tf.estimator

Asked: 2019-03-11 05:52:17

Tags: tensorflow mnist

I have the following sample code that trains and evaluates a CNN MNIST model using TensorFlow's Estimator API:

    def model_fn(features, labels, mode):
        images = tf.reshape(features, [-1, 28, 28, 1])
        model = Model()
        logits = model(images)

        predicted_logit = tf.argmax(input=logits, axis=1, output_type=tf.int32)

        if mode == tf.estimator.ModeKeys.PREDICT:
            probabilities = tf.nn.softmax(logits)

            predictions = {
                'predicted_logit': predicted_logit,
                'probabilities': probabilities
            }
            return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

        else:
            ...

    def mnist_train_and_eval(_):
        train_data, train_labels, eval_data, eval_labels, val_data, val_labels = get_mnist_data()

        # Create an input function to train
        train_input_fn = tf.estimator.inputs.numpy_input_fn(
            x=train_data,
            y=train_labels,
            batch_size=_BATCH_SIZE,
            num_epochs=1,
            shuffle=True)

        # Create an input function to eval
        eval_input_fn = tf.estimator.inputs.numpy_input_fn(
            x=eval_data,
            y=eval_labels,
            batch_size=_BATCH_SIZE,
            num_epochs=1,
            shuffle=False)

        # Create an estimator with model_fn
        image_classifier = tf.estimator.Estimator(model_fn=model_fn, model_dir=_MODEL_DIR)

        # Finally, train and evaluate the model after each epoch
        for _ in range(_NUM_EPOCHS):
            image_classifier.train(input_fn=train_input_fn)
            metrics = image_classifier.evaluate(input_fn=eval_input_fn)

How can I use estimator.export_savedmodel to save the trained model for later inference? How should I write the serving_input_receiver_fn?

Thank you very much for your help!

1 answer:

Answer 0 (score: 0)

Create a function that builds a dict of input features. The placeholder should match the shape of your images, with batch_size as the first dimension.

def serving_input_receiver_fn():
  # `Shape` stands for the image dimensions your model expects (e.g. height, width, channels).
  x = tf.placeholder(tf.float32, [None, Shape])
  inputs = {'x': x}
  return tf.estimator.export.ServingInputReceiver(features=inputs, receiver_tensors=inputs)

Alternatively, you can use TensorServingInputReceiver, which does not require the dict mapping:

inputs = tf.placeholder(tf.float32, [None, 32*32*3])
tf.estimator.export.TensorServingInputReceiver(inputs, inputs)

This function returns a new ServingInputReceiver instance, which is passed to export_savedmodel or tf.estimator.FinalExporter.
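For the model_fn in the question, which reshapes a plain features tensor with tf.reshape(features, [-1, 28, 28, 1]), TensorServingInputReceiver is likely the closer fit, since it hands the tensor to model_fn directly instead of wrapping it in a dict. A minimal sketch follows; the 28x28 input shape and the placeholder name 'images' are assumptions, not part of the original answer:

def serving_input_receiver_fn():
    # Batch of raw 28x28 float images; model_fn reshapes them to [-1, 28, 28, 1].
    images = tf.placeholder(tf.float32, shape=[None, 28, 28], name='images')
    return tf.estimator.export.TensorServingInputReceiver(
        features=images, receiver_tensors=images)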

...
image_classifier.export_savedmodel(saved_dir, serving_input_receiver_fn)
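Once exported, one way to run inference in TF 1.x is tf.contrib.predictor. A minimal sketch, assuming the dict-based receiver above that exposes a single input named 'x'; the exact input and output keys of your export can be inspected with the saved_model_cli tool shipped with TensorFlow:

from tensorflow.contrib import predictor

# export_savedmodel returns the path to the timestamped export directory
# (on some versions it is returned as bytes, hence the decode).
export_dir = image_classifier.export_savedmodel(saved_dir, serving_input_receiver_fn)
if isinstance(export_dir, bytes):
    export_dir = export_dir.decode('utf-8')

predict_fn = predictor.from_saved_model(export_dir)
# Input keys correspond to the receiver_tensors of the serving function;
# output keys correspond to the predictions dict returned by model_fn in PREDICT mode.
result = predict_fn({'x': eval_data[:10]})
print(result['predicted_logit'], result['probabilities'])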