No batch_size when running inference with the BERT model

Time: 2019-07-02 06:06:35

Tags: python tensorflow machine-learning deep-learning text-classification

I am working on a binary classification problem with the TensorFlow BERT language model. Here is the link to the Google Colab. After saving and loading the model, I get an error when making predictions.

Saving the model

def serving_input_receiver_fn():
  feature_spec = {
      "input_ids" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "input_mask" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "segment_ids" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "label_ids" :  tf.FixedLenFeature([], tf.int64)
  }
  serialized_tf_example = tf.placeholder(dtype=tf.string,
                                         shape=[None],
                                         name='input_example_tensor')
  print(serialized_tf_example.shape)
  receiver_tensors = {'example': serialized_tf_example}
  features = tf.parse_example(serialized_tf_example, feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

export_path = '/content/drive/My Drive/binary_class/bert/'
estimator._export_to_tpu = False  # this is important
estimator.export_saved_model(export_dir_base=export_path,serving_input_receiver_fn=serving_input_receiver_fn)
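
For reference, a minimal sketch (assuming TF 1.x with tf.contrib.predictor, and a hypothetical timestamped export subdirectory) of calling the exported signature directly with serialized tf.train.Example protos that match the feature_spec above; the features are the ones produced by run_classifier.convert_examples_to_features in the prediction code below:

from tensorflow.contrib import predictor

# export_saved_model writes into a timestamped subdirectory; the name below is hypothetical
predict_fn = predictor.from_saved_model(export_path + '1562050000')

def serialize_feature(feature):
  # Pack one InputFeatures object into a serialized tf.train.Example,
  # mirroring the feature_spec declared in serving_input_receiver_fn
  example = tf.train.Example(features=tf.train.Features(feature={
      "input_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=feature.input_ids)),
      "input_mask": tf.train.Feature(int64_list=tf.train.Int64List(value=feature.input_mask)),
      "segment_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=feature.segment_ids)),
      "label_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=[feature.label_id])),
  }))
  return example.SerializeToString()

# 'example' matches the receiver_tensors key above
result = predict_fn({'example': [serialize_feature(f) for f in input_features]})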

Predicting on dummy text

pred_sentences = [
  "A novel, simple method to get insights from reviews"
]

def getPrediction1(in_sentences):
  labels = ["Irrelevant", "Relevant"]
  input_examples = [run_classifier.InputExample(guid="", text_a = x, text_b = None, label = 0) for x in in_sentences] # here, "" is just a dummy label
  input_features = run_classifier.convert_examples_to_features(input_examples, label_list, MAX_SEQ_LENGTH, tokenizer)
  predict_input_fn = run_classifier.input_fn_builder(features=input_features, seq_length=MAX_SEQ_LENGTH, is_training=False, drop_remainder=False)
  predictions = est.predict(predict_input_fn)
  print(predictions)
  return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]

est = tf.contrib.estimator.SavedModelEstimator(MODEL_FILE_PATH)
predictions = getPrediction1(pred_sentences)
predictions

Error

W0702 05:44:17.551325 139812812932992 estimator.py:1811] Using temporary folder as model directory: /tmp/tmpzeiaa6q8
W0702 05:44:17.605536 139812812932992 saved_model_estimator.py:170] train mode not found in SavedModel.
W0702 05:44:17.608479 139812812932992 saved_model_estimator.py:170] eval mode not found in SavedModel.
<generator object Estimator.predict at 0x7f27fa721eb8>
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-28-56ea95428bf4> in <module>()
     21 # Relevant "Nanoparticulate drug delivery is a promising drug delivery system to a range of molecules to desired site specific action in the body. In this present work nanoparticles are prepared with positive group of amino group of chitosan with varying concentration based nanoparticles are loaded with anastrazole were prepared by with negative group of sodium tripolyphosphate by ionotropic gelation method. All these formulated nanoparticles are characterized for its particle size ,zeta potential ,drug entrapment efficacy and in-vitro release kinetics .The particle size of all these formulations were found to be 200,365,420,428 And 483.zeta potential of all formulations are-16.3±2.1 ,28.2±4.3,-10.38±3.6,-24.31±3.2 and 21.38±5.2.respectively. FT-IR studies indicated that there was no chemical interaction between drug and polymer and stability of drug. The in-vitro release behaviour from all the drug loaded batches was found to be zero order and provided sustained release over a period of 12 h by diffusion and swelling mechanism and The values of n and r 2 for coated batch was 0.731 and 0.979.Since the values of slope (n) lies in between 0.5 and 1 it was concluded that the mechanism by which drug is being released is a Non-Fickian anomalous solute diffusion mechanism, "
     22 
---> 23 predictions = getPrediction1(pred_sentences[0:2])
     24 predictions
     25 

5 frames
<ipython-input-28-56ea95428bf4> in getPrediction1(in_sentences)
     14   predictions = est.predict(predict_input_fn)
     15   print(predictions)
---> 16   return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]
     17 
     18 

<ipython-input-28-56ea95428bf4> in <listcomp>(.0)
     14   predictions = est.predict(predict_input_fn)
     15   print(predictions)
---> 16   return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]
     17 
     18 

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in predict(self, input_fn, predict_keys, hooks, checkpoint_path, yield_single_examples)
    615         self._create_and_assert_global_step(g)
    616         features, input_hooks = self._get_features_from_input_fn(
--> 617             input_fn, ModeKeys.PREDICT)
    618         estimator_spec = self._call_model_fn(
    619             features, None, ModeKeys.PREDICT, self.config)

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _get_features_from_input_fn(self, input_fn, mode)
    991   def _get_features_from_input_fn(self, input_fn, mode):
    992     """Extracts the `features` from return values of `input_fn`."""
--> 993     result = self._call_input_fn(input_fn, mode)
    994     result, _, hooks = estimator_util.parse_input_fn_result(result)
    995     self._validate_features_in_predict_input(result)

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _call_input_fn(self, input_fn, mode, input_context)
   1111       kwargs['input_context'] = input_context
   1112     with ops.device('/cpu:0'):
-> 1113       return input_fn(**kwargs)
   1114 
   1115   def _call_model_fn(self, features, labels, mode, config):

/usr/local/lib/python3.6/dist-packages/bert/run_classifier.py in input_fn(params)
    727   def input_fn(params):
    728     """The actual input function."""
--> 729     batch_size = params["batch_size"]
    730 
    731     num_examples = len(features)

KeyError: 'batch_size'

The batch_size parameter exists in the estimator's params, but not in the params of the loaded model.

estimator.params['batch_size'] # 32

est.params['batch_size'] # KeyError: 'batch_size'

2 Answers:

Answer 0 (score: 4)

You are using SavedModelEstimator, which does not provide an option to pass in RunConfig or params arguments, because the model function graph is defined statically in the SavedModel.

Since SavedModelEstimator is a subclass of Estimator, params is merely a dictionary that stores hyperparameters. I think you can work around this by adding the required (key, value) pair to params before calling getPrediction1. For example:

est = tf.contrib.estimator.SavedModelEstimator(MODEL_FILE_PATH)
est.params['batch_size'] = 1
predictions = getPrediction1(pred_sentences)
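
If mutating est.params does not take effect (Estimator.params may return a copy of the underlying dict), a minimal alternative sketch, assuming the same run_classifier input_fn as above, is to wrap the input_fn so that batch_size is injected before run_classifier reads it; the wrapper name and the batch size of 1 are assumptions:

predict_input_fn = run_classifier.input_fn_builder(
    features=input_features, seq_length=MAX_SEQ_LENGTH,
    is_training=False, drop_remainder=False)

def predict_input_fn_with_batch_size(params):
  # Estimator passes its params dict here; fill in batch_size if it is missing
  params = dict(params or {})
  params.setdefault('batch_size', 1)  # assumed inference batch size
  return predict_input_fn(params)

predictions = est.predict(predict_input_fn_with_batch_size)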

Answer 1 (score: 0)

Put the batch size into the params.

estimator2 = tf.estimator.Estimator(
    model_fn, model_dir="/Models", config=None, params={'batch_size': 32}, warm_start_from=None
)
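
A brief usage sketch, under the assumption that model_fn is the one built earlier with run_classifier.model_fn_builder and that a trained checkpoint lives in /Models; the Estimator forwards this params dict to the input_fn, so params["batch_size"] now resolves and the KeyError disappears:

# predict_input_fn built as in getPrediction1 above
predictions = estimator2.predict(input_fn=predict_input_fn)
for p in predictions:
  print(p['probabilities'])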