Error when calling predict with base64 input

Time: 2018-11-20 01:08:13

Tags: tensorflow tensorflow-serving tensorflow-hub

I am using the TensorFlow Hub image retraining example (https://github.com/tensorflow/hub/blob/master/examples/image_retraining/retrain.py) to export a saved_model for use with TensorFlow Serving in Docker.

I just followed some instructions I found on the internet and modified export_model as shown below:

def export_model(module_spec, class_count, saved_model_dir):
  """Exports model for serving.

  Args:
    module_spec: The hub.ModuleSpec for the image module being used.
    class_count: The number of classes.
    saved_model_dir: Directory in which to save exported model and variables.
  """
  # The SavedModel should hold the eval graph.
  sess, in_image, _, _, _, _ = build_eval_session(module_spec, class_count)

  # Shape of [None] means we can have a batch of images.
  image = tf.placeholder(shape=[None], dtype=tf.string)

  with sess.graph.as_default() as graph:
    tf.saved_model.simple_save(
        sess,
        saved_model_dir,
        #inputs={'image': in_image},
        inputs = {'image_bytes': image},
        outputs={'prediction': graph.get_tensor_by_name('final_result:0')},
        legacy_init_op=tf.group(tf.tables_initializer(), name='legacy_init_op')
    )

The problem is that when I try to call the API using Postman, I get this error:

{
    "error": "Tensor Placeholder_1:0, specified in either feed_devices or fetch_devices was not found in the Graph"
}
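
For context, the request I send from Postman is roughly the following (the host, port and model name are placeholders for my setup; the b64 wrapping is the TensorFlow Serving REST convention for binary values):

POST http://localhost:8501/v1/models/MODEL_NAME:predict

{
    "instances": [
        {"image_bytes": {"b64": "<base64-encoded JPEG bytes>"}}
    ]
}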

Do I need to modify the retraining process so that it can accept base64 input?
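
For reference, here is a minimal sketch of the kind of decoding pipeline I think would be needed, created inside the graph that gets exported (i.e. within with sess.graph.as_default(): in export_model). The helper name and the 299x299x3 size are my assumptions and would have to match the hub module that retrain.py was run with:

import tensorflow as tf

def build_jpeg_decoding(input_height=299, input_width=299, input_depth=3):
  """Turns a batch of JPEG byte strings into a float image batch.

  Must be called inside the graph that will be exported, i.e. within
  `with sess.graph.as_default():` in export_model, so that the placeholder
  actually exists in the saved MetaGraph.
  """
  # TensorFlow Serving's REST API base64-decodes each {"b64": ...} value
  # and feeds the raw JPEG bytes into this string tensor.
  image_bytes = tf.placeholder(tf.string, shape=[None], name='image_bytes')

  def decode_one(jpeg_bytes):
    # Decode a single JPEG, convert to float32 in [0, 1] and resize to the
    # input size the hub module expects (assumed here to be 299x299).
    image = tf.image.decode_jpeg(jpeg_bytes, channels=input_depth)
    image = tf.image.convert_image_dtype(image, tf.float32)
    return tf.image.resize_images(image, [input_height, input_width])

  decoded_images = tf.map_fn(decode_one, image_bytes, dtype=tf.float32)
  return image_bytes, decoded_images

As far as I understand, the decoded batch would still have to be wired into the module's image input (for example by re-importing the eval graph with tf.import_graph_def(..., input_map={in_image.name: decoded_images}) into a fresh graph) before simple_save is pointed at image_bytes and final_result:0, but I am not sure whether that is the intended approach or whether the retraining step itself has to change.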

0 Answers:

No answers