How do I find the "alignments" and "alignment_history" arguments when using tf.contrib.seq2seq.AttentionWrapperState?

Time: 2018-11-03 07:36:10

Tags: python tensorflow

How should tf.contrib.seq2seq.AttentionWrapperState be used? Specifically, where do I get the "alignments" and "alignment_history" arguments?

(link to the tf.contrib.seq2seq.AttentionWrapperState documentation)

Here is my code and the error message.

    with tf.variable_scope("attention"):
        attention = tf.contrib.seq2seq.BahdanauAttention(
            state_size, self.att_states, self.encoder_len)
        decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
            decoder_cell, attention, state_size * 2)
        wrapper_state = tf.contrib.seq2seq.AttentionWrapperState(
            self.init_state, self.prev_att, self.global_step,
            attention_state=self.att_states)

TypeError: __new__() missing 2 required positional arguments: 'alignments' and 'alignment_history'
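A note on the error: `AttentionWrapperState` is a namedtuple, so calling its constructor directly requires every field. The sketch below mimics its TF 1.x field list with a plain `collections.namedtuple` (the names `"init_state"`, `"prev_att"`, `"att_states"` are placeholder strings standing in for the question's tensors) to reproduce exactly this `TypeError`:

```python
from collections import namedtuple

# AttentionWrapperState is a namedtuple. In TF 1.x its fields are:
# (cell_state, attention, time, alignments, alignment_history, attention_state)
# Mimicking it with a plain namedtuple reproduces the error: every field
# must be supplied when the constructor is called directly.
AttentionWrapperState = namedtuple(
    "AttentionWrapperState",
    ["cell_state", "attention", "time",
     "alignments", "alignment_history", "attention_state"])

try:
    # Same call shape as in the question: three positional arguments plus
    # attention_state, with alignments and alignment_history left out.
    AttentionWrapperState("init_state", "prev_att", 0,
                          attention_state="att_states")
except TypeError as e:
    print(e)  # ... missing 2 required positional arguments:
              # 'alignments' and 'alignment_history'
```

In TF 1.x the usual way to obtain a fully populated state is not to call the constructor at all, but to ask the wrapped cell for a zero state and clone it, e.g. `decoder_cell.zero_state(batch_size, dtype).clone(cell_state=self.init_state)` (assuming `batch_size` and `dtype` are available in scope); `zero_state` fills in `alignments` and `alignment_history` automatically.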

0 Answers:

No answers