Unable to run command-line predictions against a saved TensorFlow model

Asked: 2017-09-12 15:13:07

Tags: python tensorflow

I'm unable to export a simple example and then run it from the command line. The example just trains on random numbers 0-99, labeling a value 0 if it is less than 50 and 1 otherwise. I want to be able to save the model and then use it to generate predictions for values entered on the command line.

Here is the rough guide I am using for model saving and the CLI.

The environment is the TensorFlow Docker image.

Here is the program - it takes about 10 seconds to run in my environment.

"""Create a simple model that can be run from the CLI."""
import math
import os
import numpy as np
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # Suppress TF's C++ log output (e.g. CPU instruction warnings)
import tensorflow as tf  # pylint: disable=import-error


def generate_value_list(size=100):
    """Return an array of random values 0-99."""
    value_list = np.random.rand(size)
    value_list = [math.trunc(x * 100) for x in value_list]

    return value_list


def generate_label_list(value_list):
    """Return an array of labels for the passed values."""
    label_list = [0 if x < 50 else 1 for x in value_list]

    return label_list


def input_fn():
    """Generate input for training or evaluating."""
    value_list = generate_value_list()
    label_list = generate_label_list(value_list)
    features = {"value_list": value_list}

    return features, label_list


def main():
    """Execute the main program."""
    content = tf.feature_column.numeric_column(key='value_list')
    columns = [content]
    estimator = tf.estimator.LinearClassifier(
        feature_columns=columns,
        optimizer=tf.train.FtrlOptimizer(
            learning_rate=0.1,
            l1_regularization_strength=0.001))

    estimator.train(input_fn=input_fn, steps=500)
    result = estimator.evaluate(input_fn=input_fn, steps=5)
    print(result)

    feature_spec = tf.feature_column.make_parse_example_spec(columns)
    export_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
    estimator.export_savedmodel('/notebooks/model', export_input_fn)


if __name__ == "__main__":
    main()
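
As a side note, export_savedmodel writes each export into a new timestamped subdirectory under the base path I pass in, so the 1505225506 in the paths below is just the timestamp my particular run produced. A minimal sketch of how I pick up the newest export programmatically (plain os calls, nothing TensorFlow-specific):

import os

export_base = '/notebooks/model'
# export_savedmodel creates one numeric (timestamp) subdirectory per export
latest = max((d for d in os.listdir(export_base) if d.isdigit()), key=int)
print(os.path.join(export_base, latest))  # e.g. /notebooks/model/1505225506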

After running the program, I have a saved model in the model directory. To see the model's interface, I run the following command:

saved_model_cli show --dir /notebooks/model/1505225506 --all

I get this:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['None']:
The given SavedModel SignatureDef contains the following input(s):
inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
outputs['classes'] tensor_info:
    dtype: DT_STRING
    shape: (-1, 2)
    name: linear/head/Tile:0
outputs['scores'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: linear/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify

signature_def['classification']:
The given SavedModel SignatureDef contains the following input(s):
inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
outputs['classes'] tensor_info:
    dtype: DT_STRING
    shape: (-1, 2)
    name: linear/head/Tile:0
outputs['scores'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: linear/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify

signature_def['regression']:
The given SavedModel SignatureDef contains the following input(s):
inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
outputs['outputs'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 1)
    name: linear/head/predictions/logistic:0
Method name is: tensorflow/serving/regress

signature_def['serving_default']:
The given SavedModel SignatureDef contains the following input(s):
inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
outputs['classes'] tensor_info:
    dtype: DT_STRING
    shape: (-1, 2)
    name: linear/head/Tile:0
outputs['scores'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: linear/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify
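
If I am reading the output above correctly, every signature takes a single DT_STRING input (input_example_tensor:0), which I believe means the export produced by build_parsing_serving_input_receiver_fn expects serialized tf.Example protos rather than raw numbers. A minimal sketch of what I think such an input would look like for my value_list feature (my own assumption, not something from the guide):

import tensorflow as tf

# A tf.Example carrying the single numeric feature the model was trained on.
# I am assuming the feature key has to match the column name, 'value_list'.
example = tf.train.Example(features=tf.train.Features(feature={
    'value_list': tf.train.Feature(float_list=tf.train.FloatList(value=[32.0])),
}))
serialized_example = example.SerializeToString()  # bytes for the DT_STRING input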

Next I run this to get a prediction for a numeric input:

saved_model_cli run --dir /notebooks/model/1505225506 --tag_set serve --signature_def serving_default --input_exprs 'inputs=[32]'

And I get this:

Traceback (most recent call last):
  File "/usr/local/bin/saved_model_cli", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/tools/saved_model_cli.py", line 649, in main
    args.func(args)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/tools/saved_model_cli.py", line 529, in run
    args.overwrite, tf_debug=args.tf_debug)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/tools/saved_model_cli.py", line 298, in run_saved_model_with_feed_dict
    outputs = sess.run(output_tensor_names_sorted, feed_dict=inputs_feed_dict)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 895, in run
    run_metadata_ptr)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1124, in _run
    feed_dict_tensor, options, run_metadata)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1321, in _do_run
    options, run_metadata)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1340, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InternalError: Unable to get element from the feed as bytes.

I have tried many variations of the input to the model, and this one seems to get the furthest through the code before erroring out.
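
For reference, here is a sketch of how I would expect to feed a serialized tf.Example to the export from Python, using the tensor names that saved_model_cli show reported above (again, my own sketch based on my reading of the signatures, not something taken from the guide):

import tensorflow as tf

export_dir = '/notebooks/model/1505225506'

# Serialized tf.Example for a single value, matching the 'value_list' column.
example = tf.train.Example(features=tf.train.Features(feature={
    'value_list': tf.train.Feature(float_list=tf.train.FloatList(value=[32.0])),
}))

with tf.Session(graph=tf.Graph()) as sess:
    # Load the graph that export_savedmodel tagged with 'serve'.
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    # Feed the serialized proto into the DT_STRING input and read the probabilities.
    scores = sess.run(
        'linear/head/predictions/probabilities:0',
        feed_dict={'input_example_tensor:0': [example.SerializeToString()]})
    print(scores)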

Any idea what I am doing wrong? Is there a better way to accomplish something like this?

0 Answers:

There are no answers yet.