TF Serving with NMT

Time: 2019-01-16 07:25:01

Tags: tensorflow-serving

I am trying to export a translation model so that it can be served with TF-Serving. I have described these issues at the following link: https://github.com/tensorflow/serving/issues/712

The served model always seems to give the same result, regardless of the input it receives. I am using the following code.

def export(self):
  infer_model = self._create_infer_model()

  with tf.Session(graph=infer_model.graph,
                  config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    feature_config = {
      'input': tf.FixedLenSequenceFeature(dtype=tf.string, shape=[], allow_missing=True),
    }

    #serialized_example = tf.placeholder(dtype=tf.string, name="tf_example")
    #tf_example = tf.parse_example(serialized_example, feature_config)
    tf_example = ['This is created just for export']
    inference_input = tf.identity(tf_example, name="inference_input")
    #batch_size_placeholder = tf.constant(1, shape=[1,], dtype=tf.int64)

    saver = infer_model.model.saver
    saver.restore(sess, self._ckpt_path)

    # initialize lookup tables before running the iterator
    sess.run(tf.tables_initializer())
    sess.run(
      infer_model.iterator.initializer,
      feed_dict={
        infer_model.src_placeholder: inference_input.eval()
      })

    # get the outputs of the model
    inference_outputs, _ = infer_model.model.decode(sess=sess)
    #inference_outputs = infer_model.model.sample_words
    # take the first of the outputs as the result of inference
    inference_output = inference_outputs[0]

    # create the signature def
    # the key `seq_input` in the `inputs` dict can be anything you like,
    # but the client must use the same key when it makes an inference request;
    # the key `seq_output` in the `outputs` dict works the same way
    inference_signature = tf.saved_model.signature_def_utils.predict_signature_def(
      inputs={
        'seq_input': infer_model.src_placeholder
      },
      outputs={
        'seq_output': tf.convert_to_tensor(inference_output)
      }
    )
    legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')

    builder = tf.saved_model.builder.SavedModelBuilder(self._export_dir)
    # the key `tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY`
    # (which is actually `serving_default`) in signature_def_map can also be
    # changed, but the client must stay consistent with it when making an
    # inference request
    builder.add_meta_graph_and_variables(
      sess, [tf.saved_model.tag_constants.SERVING],
      signature_def_map={
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: inference_signature,
      },
      legacy_init_op=legacy_init_op,
      clear_devices=True,
      assets_collection=tf.get_collection(tf.GraphKeys.ASSET_FILEPATHS))
    builder.save(as_text=True)
    print("Done!")

In this case, I always get the output for "This is created just for export".
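
My suspicion is that this comes from the line `'seq_output': tf.convert_to_tensor(inference_output)`: `decode()` returns concrete values rather than graph tensors, so `tf.convert_to_tensor` freezes them into the exported graph as a constant. A minimal sketch of that effect (illustrative values only):

import numpy as np
import tensorflow as tf

# decode() hands back concrete numpy values, not graph tensors
decoded = np.array([b'This is created just for export'])

# converting a concrete value produces a tf.constant, so the value is
# baked into the exported graph and returned for every request
frozen = tf.convert_to_tensor(decoded)

with tf.Session() as sess:
    print(sess.run(frozen))  # always the same, regardless of any feed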

Any help would be great.

Thanks, Sujith.

0 Answers:

No answers