LSTM PTB model in TensorFlow always returns the same word

Asked: 2016-06-21 15:48:58

Tags: python tensorflow lstm

I applied the same approach described in Predicting the next word using the LSTM ptb model tensorflow example to use the TensorFlow LSTM and predict the next word in my test document. However, the LSTM predicts the same word for every sequence, every time I run it.

More specifically, I added the following lines:

class PTBModel(object):
  """The PTB model."""

  def __init__(self, is_training, config):
    # General definition of LSTM (unrolled)
    # identical to tensorflow example ...     
    # omitted for brevity ...
    outputs = []
    state = self._initial_state
    with tf.variable_scope("RNN"):
        for time_step in range(num_steps):
            if time_step > 0: tf.get_variable_scope().reuse_variables()
            (cell_output, state) = cell(inputs[:, time_step, :], state)
            outputs.append(cell_output)

    output = tf.reshape(tf.concat(1, outputs), [-1, size])
    softmax_w = tf.get_variable("softmax_w", [size, vocab_size])
    softmax_b = tf.get_variable("softmax_b", [vocab_size])
    logits = tf.matmul(output, softmax_w) + softmax_b

    # Store the softmax probabilities and raw logits so they can be
    # fetched with session.run() at evaluation time.
    self.probabilities = probabilities = tf.nn.softmax(logits)
    self.logits = logits
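
Each row of probabilities is a distribution over the vocabulary, so the argmax of a row is a word index. To turn such an index back into a word, the vocabulary built by the example's reader module can be inverted. A minimal sketch, assuming the tutorial's default data path and reusing reader's (private) _build_vocab helper:

import reader  # the reader.py that ships with the PTB example

# Assumption: the PTB data files sit at the tutorial's default location.
word_to_id = reader._build_vocab("simple-examples/data/ptb.train.txt")
# Invert the mapping so a predicted index can be printed as a word.
id_to_word = {i: w for w, i in word_to_id.items()}

# Then, instead of printing the raw index inside run_epoch:
# print(id_to_word[chosen_word[-1]])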

Then I changed run_epoch as follows:

def run_epoch(session, m, data, eval_op, verbose=True, is_training = True):
  """Runs the model on the given data."""
  # first part of function unchanged from example

  for step, (x, y) in enumerate(reader.ptb_iterator(data, m.batch_size,
                                                    m.num_steps)):
    # evaluate probability and logit tensors too:
    cost, state, probs, logits, _ = session.run(
        [m.cost, m.final_state, m.probabilities, m.logits, eval_op],
        {m.input_data: x,
         m.targets: y,
         m.initial_state: state})
    costs += cost
    iters += m.num_steps

    if not is_training:
        # Pick the highest-probability word at each unrolled time step;
        # the last entry is the predicted next word for this batch.
        chosen_word = np.argmax(probs, 1)
        print(chosen_word[-1])


  return np.exp(costs / iters)
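
For completeness, the evaluation call in main() would then look roughly like this (a sketch following the example's own naming, where mtest is the test-configuration PTBModel and tf.no_op() replaces the training op during evaluation):

# Sketch: evaluate on the test set with is_training=False so the
# predicted word indices get printed for each batch.
test_perplexity = run_epoch(session, mtest, test_data, tf.no_op(),
                            is_training=False)
print("Test Perplexity: %.3f" % test_perplexity)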

I want to predict the next word in the test dataset. When I run the program, it always returns the same index (most of the time the index for <eos>). Any help is appreciated.

1 answer:

Answer 0 (score: 0)

Maybe the softmax temperature is too cold?
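
To unpack that: np.argmax always picks the single most likely word, so if the model assigns its highest probability to a frequent token such as <eos>, every step prints the same index. Sampling from a temperature-scaled distribution instead gives varied predictions. A minimal NumPy sketch (the temperature value here is an illustrative free parameter, not part of the example code):

import numpy as np

def sample_with_temperature(logits, temperature=1.0):
  """Sample one word index from temperature-scaled logits.

  temperature < 1.0 sharpens the distribution (closer to argmax);
  temperature > 1.0 flattens it (more varied samples).
  """
  scaled = logits / temperature
  scaled -= np.max(scaled)  # subtract max for numerical stability
  probs = np.exp(scaled) / np.sum(np.exp(scaled))
  return np.random.choice(len(probs), p=probs)

# e.g. instead of np.argmax(probs, 1), sample from the last time step:
# chosen_word = sample_with_temperature(logits[-1], temperature=0.8)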