Replacement for tf.contrib.seq2seq.prepare_attention() in TensorFlow 1.8

Asked: 2018-05-21 18:54:13

Tags: python tensorflow deep-learning chatbot seq2seq

AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'

I know prepare_attention() is deprecated. What is the alternative? Please also specify the syntax.

The function I am using is:

def decoding_layer_train(encoder_state, dec_cell, dec_embed_input, sequence_length,
                         decoding_scope, output_fn, keep_prob, batch_size):
    '''Decode the training data'''
    attention_states = tf.zeros([batch_size, 1, dec_cell.output_size])

    att_keys, att_vals, att_score_fn, att_construct_fn = tf.contrib.seq2seq.prepare_attention(
        attention_states,
        attention_option="bahdanau",
        num_units=dec_cell.output_size)

    train_decoder_fn = tf.contrib.seq2seq.attention_decoder_fn_train(
        encoder_state[0],
        att_keys,
        att_vals,
        att_score_fn,
        att_construct_fn,
        name="attn_dec_train")

    train_pred, _, _ = tf.contrib.seq2seq.dynamic_rnn_decoder(
        dec_cell,
        train_decoder_fn,
        dec_embed_input,
        sequence_length,
        scope=decoding_scope)

    train_pred_drop = tf.nn.dropout(train_pred, keep_prob)
    return output_fn(train_pred_drop)
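
For reference, prepare_attention(), attention_decoder_fn_train(), and dynamic_rnn_decoder() were all removed when the contrib seq2seq API was rewritten in TensorFlow 1.x; their roles are now covered by BahdanauAttention, AttentionWrapper, TrainingHelper, BasicDecoder, and dynamic_decode. Below is a minimal sketch of the function above ported to that API. It assumes the encoder's full output tensor is available (the hypothetical encoder_outputs parameter, replacing the zero-filled attention_states), that output_layer is a tf.layers.Dense projection standing in for output_fn, and that the same sequence_length serves for both the encoder memory and the decoder targets, which a real chatbot would usually keep separate.

import tensorflow as tf

def decoding_layer_train(encoder_outputs, encoder_state, dec_cell,
                         dec_embed_input, sequence_length, decoding_scope,
                         output_layer, keep_prob, batch_size):
    '''Decode the training data with the post-1.0 attention API (sketch).'''
    # Bahdanau attention over the encoder outputs replaces prepare_attention().
    attn_mech = tf.contrib.seq2seq.BahdanauAttention(
        num_units=dec_cell.output_size,
        memory=encoder_outputs,
        memory_sequence_length=sequence_length)

    # AttentionWrapper replaces the att_keys/att_vals/att_score_fn plumbing.
    attn_cell = tf.contrib.seq2seq.AttentionWrapper(
        dec_cell, attn_mech, attention_layer_size=dec_cell.output_size)

    # Seed the wrapped cell's state with the encoder's final state.
    initial_state = attn_cell.zero_state(batch_size, tf.float32).clone(
        cell_state=encoder_state)

    # TrainingHelper + BasicDecoder replace attention_decoder_fn_train.
    helper = tf.contrib.seq2seq.TrainingHelper(
        inputs=dec_embed_input, sequence_length=sequence_length)
    decoder = tf.contrib.seq2seq.BasicDecoder(
        attn_cell, helper, initial_state, output_layer=output_layer)

    # dynamic_decode replaces dynamic_rnn_decoder.
    outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(
        decoder, impute_finished=True, scope=decoding_scope)

    # Mirror the original dropout on the decoder predictions.
    return tf.nn.dropout(outputs.rnn_output, keep_prob)

A typical output_layer here would be tf.layers.Dense(vocab_size); because BasicDecoder applies it internally, there is no longer a separate output_fn call after decoding.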

0 Answers:

No answers