TFX Pusher deploying a model to Google AI Platform Models

Posted: 2020-06-17 01:36:34

Tags: python-3.x tensorflow2.0 google-api-client tensorflow-serving tfx

I have successfully deployed a model to Google AI Platform Models using tfx.extensions.google_cloud_ai_platform.pusher.

However, I am having trouble formatting the data input into the request JSON object for prediction. Any help and guidance is much appreciated.

There are two signature defs: "serving_default" takes a Base64-serialized data input, while "serving_raw" takes the raw, non-serialized inputs, of which there are two: cc and pt.

Calling prediction with "serving_default" via the function below works. However, I run into problems with "serving_raw". Is there anything missing in the JSON format when I switch to "serving_raw"?

Function for calling the Google AI Platform model

import googleapiclient.discovery

def predict_json(project, model, signature, instances, version=None):
    """Send a JSON prediction request to a model deployed on AI Platform."""
    # Build the AI Platform Prediction ('ml', 'v1') API client.
    service = googleapiclient.discovery.build('ml', 'v1')
    name = 'projects/{}/models/{}'.format(project, model)

    if version is not None:
        name += '/versions/{}'.format(version)

    response = service.projects().predict(
        name=name,
        body={
            'signature_name': signature,
            'instances': instances
        }
    ).execute()

    if 'error' in response:
        raise RuntimeError(response['error'])

    return response['predictions']
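
For reference, googleapiclient.discovery.build('ml', 'v1') authenticates through Application Default Credentials. A minimal sketch of pointing it at a service-account key before calling predict_json (the key path is an assumption, not from the question):

import os

# Assumption: authenticate via a service-account key file; the path is illustrative.
# discovery.build() picks this up through Application Default Credentials.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"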

"serving_raw" (fails)

predict_json(project="abc",
    model="abc",
    signature = 'serving_raw',
    instances=[
        {"cc":"egg",
        "pt":"def"}],
    version='vserving_model_dir')

RuntimeError: Prediction failed: Error during model execution: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.NOT_FOUND
    details = "/tmp/model/0001/assets/vocab_compute_and_apply_vocabulary_1_vocabulary; No such file or directory
     [[{{node transform_features_layer_1/transform/transform/compute_and_apply_vocabulary_1/apply_vocab/text_file_init/InitializeTableFromTextFileV2}}]]"
    debug_error_string = "{"created":"@1592357542.425468532","description":"Error received from peer ipv4:127.0.0.1:8081","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"/tmp/model/0001/assets/vocab_compute_and_apply_vocabulary_1_vocabulary; No such file or directory\n\t [[{{node transform_features_layer_1/transform/transform/compute_and_apply_vocabulary_1/apply_vocab/text_file_init/InitializeTableFromTextFileV2}}]]","grpc_status":5}"
>

"serving_default" (succeeds)

predict_json(project="abc",
    model="abc",
    signature = 'serving_default',
    instances=[
    {
       "examples":{"b64": "ChcmwSBgoECgJVUw==",
                   "b64": "Ch8KHQoLaXNpdGVz"}
    }],
    version='vserving_model_dir')
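
For context, the truncated b64 strings above are serialized tf.train.Example protos. A minimal sketch of producing one such instance, assuming the underlying schema uses the same raw features cc and pt as serving_raw (the feature names and helper are illustrative, not the actual pipeline code):

import base64
import tensorflow as tf

def make_b64_instance(cc, pt):
    # Build a tf.train.Example with the raw string features, serialize it,
    # and base64-encode it into the {"examples": {"b64": ...}} shape that the
    # DT_STRING "examples" input of serving_default expects.
    example = tf.train.Example(features=tf.train.Features(feature={
        "cc": tf.train.Feature(bytes_list=tf.train.BytesList(value=[cc.encode("utf-8")])),
        "pt": tf.train.Feature(bytes_list=tf.train.BytesList(value=[pt.encode("utf-8")])),
    }))
    return {"examples": {"b64": base64.b64encode(example.SerializeToString()).decode("utf-8")}}

# e.g. instances=[make_b64_instance("egg", "def")]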

Signature defs

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_default_examples:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict

signature_def['serving_raw']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['cc'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_raw_country_code:0
    inputs['pt'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_raw_project_type:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall_1:0
  Method name is: tensorflow/serving/predict
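
For reference, signatures like these are commonly exported from a TFX trainer together with a tf.Transform layer. A minimal sketch of what a serving_raw signature typically looks like, assuming a Keras model and the TFT output from the pipeline (function names and feature handling are illustrative, not the asker's actual code):

import tensorflow as tf
import tensorflow_transform as tft

def _get_serve_raw_fn(model, tf_transform_output: tft.TFTransformOutput):
    # Keep a reference to the Transform layer so its assets (e.g. the
    # compute_and_apply_vocabulary vocabulary files) are tracked and copied
    # into the SavedModel's assets/ directory when the model is saved.
    model.tft_layer = tf_transform_output.transform_features_layer()

    @tf.function(input_signature=[
        tf.TensorSpec(shape=[None], dtype=tf.string, name='cc'),
        tf.TensorSpec(shape=[None], dtype=tf.string, name='pt'),
    ])
    def serve_raw_fn(cc, pt):
        # Apply the same preprocessing as training, then run the model.
        transformed_features = model.tft_layer({'cc': cc, 'pt': pt})
        return model(transformed_features)

    return serve_raw_fn

# Exported alongside serving_default when saving, e.g.:
# model.save(serving_model_dir, save_format='tf', signatures={
#     'serving_default': serve_tf_examples_fn,   # parses serialized tf.Examples
#     'serving_raw': _get_serve_raw_fn(model, tf_transform_output),
# })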

0 Answers:

There are no answers yet.