Checking the perplexity of a language model

Date: 2018-11-28 08:56:22

Tags: keras nlp lstm language-model perplexity

I have built a language model with a Keras LSTM, and now I want to evaluate how good it is, so I would like to compute its perplexity.

What is the best way to compute the perplexity of the model in Python?

1 answer:

Answer 0 (score: 0):

I came up with two versions and attached their corresponding sources; feel free to check out the links.

from keras import backend as K

def perplexity_raw(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    This version computes the categorical cross-entropy by hand and
    assumes y_true is one-hot encoded.
    """
    # Clip predictions away from 0 so the log stays finite.
    y_pred = K.clip(y_pred, K.epsilon(), 1.0 - K.epsilon())
    cross_entropy = -K.sum(y_true * K.log(y_pred), axis=-1)
    # Perplexity is the exponential of the cross-entropy; Keras averages
    # the per-sample values over each batch.
    perplexity = K.exp(cross_entropy)
    return perplexity

def perplexity(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity
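
For reference, here is a minimal sketch of how either metric can be attached to a model. This is not part of the original answer; vocab_size, seq_len, the layer sizes, and the training arrays are placeholders for whatever your setup actually uses.

from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

vocab_size, seq_len = 10000, 50  # placeholder values

# Next-word prediction: x is a window of word ids of length seq_len,
# y is the integer id of the following word.
model = Sequential([
    Embedding(vocab_size, 128, input_length=seq_len),
    LSTM(256),
    Dense(vocab_size, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=[perplexity])
# model.fit(x_train, y_train, ...) then reports perplexity after each epoch.

Note that Keras averages the metric over samples, so it reports mean(exp(cross-entropy)) per batch rather than exp(mean(cross-entropy)); for a corpus-level perplexity it is usually better to exponentiate the average test loss returned by model.evaluate.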