TensorFlow: how do I export an Estimator that uses a TensorFlow Hub module?

Date: 2018-07-23 15:47:38

Tags: tensorflow tensorflow-serving tensorflow-estimator

I have an Estimator that uses a TensorFlow Hub text_embedding column, like so:

import pandas
import tensorflow as tf
import tensorflow_hub as hub

my_dataframe = pandas.DataFrame(columns=["title"])
# populate data
labels = []
# populate labels with 0|1

# Embed the raw title strings with a pre-trained TF Hub text embedding module.
embedded_text_feature_column = hub.text_embedding_column(
    key="title",
    module_spec="https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1")

estimator = tf.estimator.LinearClassifier(
    feature_columns=[embedded_text_feature_column],
    optimizer=tf.train.FtrlOptimizer(
        learning_rate=0.1,
        l1_regularization_strength=1.0),
    model_dir=model_dir)  # model_dir is defined elsewhere

estimator.train(
    input_fn=tf.estimator.inputs.pandas_input_fn(
        x=my_dataframe,
        y=labels,
        batch_size=128,
        num_epochs=None,
        shuffle=True,
        num_threads=5),
    steps=5000)

export(estimator, "/tmp/my_model")

How do I export and serve the model so that it accepts raw strings as input for prediction? I have a serving_input_receiver_fn, shown below, and have tried many variations, but I am confused about what it should look like so that I can exercise the export (e.g. with saved_model_cli) and call it with a title string (or a simple JSON structure) as input.

def export(estimator, dir_path):
    def serving_input_receiver_fn():
        # make_parse_example_spec expects a list of feature columns.
        feature_spec = tf.feature_column.make_parse_example_spec(
            [hub.text_embedding_column(
                key="title",
                module_spec="https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1")])
        return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

    estimator.export_savedmodel(
        export_dir_base=dir_path,
        serving_input_receiver_fn=serving_input_receiver_fn())

1 Answer:

Answer 0 (score: 4):

If you want to feed raw strings, consider using a raw serving input receiver. This code:

# Accept a batch of raw title strings directly, without tf.Example parsing.
feature_placeholder = {'title': tf.placeholder(tf.string, [None], name='title_placeholder')}
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_placeholder)

estimator.export_savedmodel(dir_path, serving_input_fn)

will give you a SavedModel with the following input specification according to the SavedModel CLI:

saved_model_cli show --dir ./ --tag_set serve --signature_def serving_default

The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: title_placeholder_1:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
    dtype: DT_STRING
    shape: (-1, 2)
    name: linear/head/Tile:0
  outputs['scores'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: linear/head/predictions/probabilities:0
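
Note that the input key is inputs rather than title: the classification signature that Estimators export as serving_default uses that fixed key. If you want a quick sanity check from Python before standing up a server, something along these lines should work (a minimal sketch, assuming TF 1.x with tf.contrib available; dir_path and serving_input_fn are the names from the export snippet above):

import tensorflow as tf

# export_savedmodel returns the path of the timestamped export directory.
export_dir = estimator.export_savedmodel(dir_path, serving_input_fn)

# Load the serving_default signature; its single input key is "inputs",
# matching the saved_model_cli output above.
predict_fn = tf.contrib.predictor.from_saved_model(export_dir)
print(predict_fn({"inputs": ["this is a test sentence"]}))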

You can pass a Python expression to the CLI to feed an input to the model and validate that it works:

saved_model_cli run --dir ./ --tag_set serve --signature_def \
serving_default --input_exprs "inputs=['this is a test sentence']"

Result for output key classes:
[[b'0' b'1']]
Result for output key scores:
[[0.5123377 0.4876624]]
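
Since the question also asks about calling the model with a simple JSON structure: once the SavedModel is loaded into TensorFlow Serving, it can be queried over the REST API. A hedged sketch, assuming tensorflow_model_server is running with --rest_api_port=8501 and --model_name=my_model (name and port are illustrative), and using the additional "predict" signature that Estimators export, whose input key matches the placeholder key ("title"):

import requests

# The "predict" signature exported alongside serving_default keys its input
# by the receiver tensor name ("title"), which maps cleanly onto JSON.
response = requests.post(
    "http://localhost:8501/v1/models/my_model:predict",
    json={
        "signature_name": "predict",
        "instances": [{"title": "this is a test sentence"}],
    })
print(response.json())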