AI Platform cloud prediction fails, but local prediction works

Asked: 2019-12-24 10:05:51

Tags: tensorflow machine-learning google-cloud-platform deep-learning

I have successfully trained a DNNLinearCombinedClassifier based on the ai-platform-samples template and run local prediction with it.

Running pip freeze | grep tensorflow on my local PC gives:

tensorflow==1.15.0
tensorflow-datasets==1.2.0
tensorflow-estimator==1.15.1
tensorflow-hub==0.6.0
tensorflow-io==0.8.0
tensorflow-metadata==0.15.1
tensorflow-model-analysis==0.15.4
tensorflow-probability==0.8.0
tensorflow-serving-api==1.15.0

Running saved_model_cli show on the saved model gives the following output:

The given SavedModel SignatureDef contains the following input(s):
  inputs['Sector'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_2:0
  inputs['announcement_type_simple'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_1:0
  inputs['market_cap'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: Placeholder_3:0
  inputs['sens_content'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['all_class_ids'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 3)
      name: head/predictions/Tile:0
  outputs['all_classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 3)
      name: head/predictions/Tile_1:0
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: head/predictions/ExpandDims_2:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: head/predictions/str_classes:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: head/predictions/probabilities:0
Method name is: tensorflow/serving/predict

The inputs match what I provide in the JSON file:

{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", "announcement_type_simple": "trade statement", "Sector": "Consumer, Non-cyclical","market_cap": 4377615219.88}
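As a sanity check before sending anything, the instance can be rebuilt and serialized from Python; gcloud's --json-instances flag expects newline-delimited JSON (one object per line), and the value types must match the SignatureDef above (strings for the three DT_STRING inputs, a float for market_cap). A minimal sketch, using the same field values as the question:

```python
import json
import os

# Instance matching the SavedModel's SignatureDef: three DT_STRING
# inputs and one DT_FLOAT input.
instance = {
    "sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group",
    "announcement_type_simple": "trade statement",
    "Sector": "Consumer, Non-cyclical",
    "market_cap": 4377615219.88,
}

# Basic type checks against the signature before sending anything.
assert all(isinstance(instance[k], str)
           for k in ("sens_content", "announcement_type_simple", "Sector"))
assert isinstance(instance["market_cap"], float)

# --json-instances expects one JSON object per line, no wrapping list.
os.makedirs("data", exist_ok=True)
with open("data/new-data.json", "w") as f:
    f.write(json.dumps(instance) + "\n")
```

If the file round-trips cleanly here but the cloud call still returns INVALID_ARGUMENT, the problem is more likely on the serving side than in the request encoding.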

gcloud ai-platform local predict runs inference with this model successfully.

When I run gcloud ai-platform predict --model=${MODEL_NAME} --version=${MODEL_VERSION} --json-instances=data/new-data.json --verbosity debug --log-http, it creates the following POST:

==== request start ====
uri: https://ml.googleapis.com/v1/projects/simon-teraflow-project/models/tensorflow_sens1/versions/v3:predict
method: POST
== headers start ==
Authorization: --- Token Redacted ---
Content-Type: application/json
user-agent: gcloud/270.0.0 command/gcloud.ai-platform.predict invocation-id/f01f2f4b8c494082abfc38e19499019b environment/GCE environment-version/None interactive/True from-script/False python/2.7.13 term/xterm (Linux 4.9.0-11-amd64)
== headers end ==
== body start ==
{"instances": [{"Sector": "Consumer, Non-cyclical", "announcement_type_simple": "trade statement", "market_cap": 4377615219.88, "sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group"}]}
== body end ==
==== request end ====

As you can see, the input matches what is required. Here is the response:

Traceback (most recent call last):
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 984, in Execute
    resources = calliope_command.Run(cli=self, args=args)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 798, in Run
    resources = command_instance.Run(args)
  File "/usr/lib/google-cloud-sdk/lib/surface/ai_platform/predict.py", line 110, in Run
    signature_name=args.signature_name)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/ml_engine/predict.py", line 77, in Predict
    response_body)
HttpRequestFailError: HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
}

ERROR: (gcloud.ai-platform.predict) HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
} 

I tried the same thing with "Test your model" on AI Platform, with the same result:

[Screenshot: predict on AI Platform GUI]

I checked that the runtime version is 1.15, matching local prediction, and that the Python version matches as well.

I have searched for similar cases without finding anything. Any suggestions would be greatly appreciated.

1 Answer:

Answer 0 (score: 0)

You can try the following:

1) Save the model locally; you can use the code snippet in [1] as an example, adapting it to your model.

2) Test serving the model with Docker.

3) Deploy the model to GCP and make requests against it using the gcloud command rather than the GCP UI, as in [2] (adapted to your model).

[1]

========Code snippet===============
MODEL_NAME = <MODEL NAME>
VERSION = <MODEL VERSION>
SERVE_PATH = './models/{}/{}'.format(MODEL_NAME, VERSION)

import tensorflow as tf
import tensorflow_hub as hub

use_model = "https://tfhub.dev/google/<MODEL NAME>/<MODEL VERSION>"

with tf.Graph().as_default():
  # Load the TF Hub module and wire a string placeholder into it.
  module = hub.Module(use_model, name=MODEL_NAME)
  text = tf.placeholder(tf.string, [None])
  embedding = module(text)

  init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])

  with tf.Session() as session:
    session.run(init_op)

    # Export a SavedModel with a single serving signature.
    tf.saved_model.simple_save(
        session,
        SERVE_PATH,
        inputs = {"text": text},
        outputs = {"embedding": embedding},
        legacy_init_op = tf.tables_initializer()
    )
========/ Code snippet===============
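Step 2 above (testing with Docker) could be sketched with the official TensorFlow Serving image; the model directory layout, model name, and example instance below are placeholders to adapt:

```shell
# Serve the exported SavedModel locally with TensorFlow Serving
# (source path and model name are placeholders).
docker run -t --rm -p 8501:8501 \
  --mount type=bind,source="$(pwd)/models/my_model",target=/models/my_model \
  -e MODEL_NAME=my_model tensorflow/serving:1.15.0 &

# Send the same instance that fails in the cloud to the local REST endpoint.
curl -d '{"instances": [{"text": "cat"}]}' \
  -X POST http://localhost:8501/v1/models/my_model:predict
```

If the model also fails here, the problem is in the SavedModel itself rather than in AI Platform.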

[2]

Replace <Project_name>, <model_name>, <bucket_name> and <model_version>

    $ gcloud ai-platform models create <model_name> --project <Project_name>
    $ gcloud beta ai-platform versions create v1 --project <Project_name> --model <model_name> --origin=/location/of/model/dir/<model_name>/<model_version> --staging-bucket gs://<bucket_name> --runtime-version=1.15 --machine-type=n1-standard-8
    $ echo '{"text": "cat"}' > instances.json
    $ gcloud ai-platform predict --project <Project_name> --model <model_name> --version v1 --json-instances=instances.json
    $ curl -X POST -v -k -H "Content-Type: application/json" -d '{"instances": [{"text": "cat"}]}'  -H "Authorization: Bearer `gcloud auth print-access-token`" "https://ml.googleapis.com/v1/projects/<Project_name>/models/<model_name>/versions/v1:predict"
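The same online-prediction request can also be sent from Python with the google-api-python-client library; building the resource name and body separately makes them easy to inspect before sending. The project, model, and version names here are placeholders:

```python
def build_predict_request(project, model, version, instances):
    """Return the (resource_name, request_body) pair for an
    AI Platform online-prediction call."""
    name = "projects/{}/models/{}/versions/{}".format(project, model, version)
    body = {"instances": instances}
    return name, body

name, body = build_predict_request(
    "my-project", "tensorflow_sens1", "v3",
    [{"text": "cat"}],
)

# Sending the request needs google-api-python-client and credentials:
#   from googleapiclient import discovery
#   service = discovery.build("ml", "v1")
#   response = service.projects().predict(name=name, body=body).execute()
```

This produces the same JSON body that gcloud logs with --log-http, so comparing the two is a quick way to rule out request-encoding differences.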