The traditional Viterbi algorithm used with HMMs takes a start-probability matrix (see the Viterbi algorithm Wikipedia article), but TensorFlow's viterbi_decode only takes a transition probability matrix and an emission probability matrix as parameters. How should this be understood?
def viterbi_decode(score, transition_params):
  """Decode the highest scoring sequence of tags outside of TensorFlow.

  This should only be used at test time.

  Args:
    score: A [seq_len, num_tags] matrix of unary potentials.
    transition_params: A [num_tags, num_tags] matrix of binary potentials.

  Returns:
    viterbi: A [seq_len] list of integers containing the highest scoring tag
        indices.
    viterbi_score: A float containing the score for the Viterbi sequence.
  """
Answer 0 (score: 1)
Answer 1 (score: 1)
I created a complete, detailed tutorial with a worked example of the Viterbi algorithm in TensorFlow; you can check it out here:
Suppose your data looks like this:
# logits : A [batch_size, max_seq_len, num_tags] tensor of unary potentials to use as input to the CRF layer.
# labels_a : A [batch_size, max_seq_len] matrix of tag indices for which we compute the log-likelihood.
# sequence_len : A [batch_size] vector of true sequence lengths.
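For concreteness, here is a minimal sketch of how tensors with these shapes might be produced. The LSTM-plus-dense scorer, the embedding size, and the names word_embeddings, num_tags, max_seq_len, embedding_dim are my assumptions for illustration, not part of the original tutorial; any model that emits per-token tag scores works.

import tensorflow as tf

num_tags = 5         # size of the tag set (assumed)
max_seq_len = 20     # padded sequence length (assumed)
embedding_dim = 100  # per-token feature size (assumed)

# Inputs for a typical sequence-tagging setup.
word_embeddings = tf.placeholder(tf.float32, [None, max_seq_len, embedding_dim])
labels_a = tf.placeholder(tf.int32, [None, max_seq_len])
sequence_len = tf.placeholder(tf.int32, [None])

# Any per-token scorer works; an LSTM followed by a dense projection is common.
rnn_cell = tf.nn.rnn_cell.LSTMCell(64)
rnn_out, _ = tf.nn.dynamic_rnn(rnn_cell, word_embeddings,
                               sequence_length=sequence_len, dtype=tf.float32)
logits = tf.layers.dense(rnn_out, num_tags)  # [batch_size, max_seq_len, num_tags]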
Then:
log_likelihood, transition_params = tf.contrib.crf.crf_log_likelihood(logits, labels_a, sequence_len)
# Return values of crf_log_likelihood:
# log_likelihood: A scalar containing the log-likelihood of the given sequence of tag indices.
# transition_params: A [num_tags, num_tags] transition matrix.
# This is either provided by the caller or created in this function.
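If you also want to train the model, the usual step (an assumption on my part, not shown in the original answer) is to minimize the negative log-likelihood; the optimizer and learning rate below are illustrative:

# crf_log_likelihood returns one log-likelihood per batch element,
# so the training loss is the mean of their negations.
loss = tf.reduce_mean(-log_likelihood)
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss)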
Now we can compute the Viterbi score (a sketch of the decoding call follows the two parameter descriptions below):
# score: A [seq_len, num_tags] matrix of unary potentials.
# transition_params: A [num_tags, num_tags] matrix of binary potentials.
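Here is a sketch of the decoding step itself. viterbi_decode runs outside the graph on NumPy arrays, one sequence at a time, so we first fetch the logits and the learned transition matrix; the feed values batch_x and batch_lens are hypothetical placeholders for your own batches.

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Fetch per-token scores and the learned transition matrix as NumPy arrays.
    tf_logits, tf_transition_params = sess.run(
        [logits, transition_params],
        feed_dict={word_embeddings: batch_x, sequence_len: batch_lens})

    for logit, length in zip(tf_logits, batch_lens):
        score = logit[:length]  # [seq_len, num_tags] unary potentials, padding removed
        viterbi, viterbi_score = tf.contrib.crf.viterbi_decode(score,
                                                               tf_transition_params)
        print(viterbi, viterbi_score)

Each call returns the best tag sequence for one example together with its score, matching the viterbi_decode docstring quoted in the question.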