Problem getting attention to work in a spell-checker program

Time: 2018-11-09 09:54:31

Tags: python tensorflow

https://github.com/Currie32/Spell-Checker

In the code at the link above, I get an error saying that DynamicAttentionWrapper is not defined. I am using TensorFlow version 1.2 and cannot get past this error. Please help me.

1 answer:

Answer 0 (score: 0)

The problem is with DynamicAttentionWrapper in your version of TensorFlow.

Try changing DynamicAttentionWrapper to AttentionWrapper, or downgrade to TensorFlow 1.1.
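If you stay on TensorFlow 1.2, where DynamicAttentionWrapper was renamed to AttentionWrapper, the attention cell might be built roughly as below. This is only a sketch; the variable names (rnn_size, enc_output, input_length, dec_cell) are assumptions modeled on the linked repository, not verified against it:

import tensorflow as tf

# Attention mechanism over the encoder outputs (variable names are assumptions).
attn_mech = tf.contrib.seq2seq.BahdanauAttention(
    num_units=rnn_size,
    memory=enc_output,
    memory_sequence_length=input_length)

# TF 1.2 uses AttentionWrapper where TF 1.1 used DynamicAttentionWrapper.
dec_cell = tf.contrib.seq2seq.AttentionWrapper(
    cell=dec_cell,
    attention_mechanism=attn_mech,
    attention_layer_size=rnn_size)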

For your TensorFlow version, try the following changes to initial_state, inference_logits, and training_logits:

# Build the decoder's initial state from the wrapper's zero state,
# carrying over the encoder's final state.
initial_state = dec_cell.zero_state(batch_size=batch_size, dtype=tf.float32).clone(cell_state=enc_state)

# dynamic_decode here returns three values; keep only the decoder outputs.
inference_logits, _, _ = tf.contrib.seq2seq.dynamic_decode(
    inference_decoder,
    output_time_major=False,
    impute_finished=True,
    maximum_iterations=max_target_length)

training_logits, _, _ = tf.contrib.seq2seq.dynamic_decode(
    training_decoder,
    output_time_major=False,
    impute_finished=True,
    maximum_iterations=max_target_length)
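The clone(cell_state=enc_state) call is needed because AttentionWrapper keeps its own state structure (the inner cell state plus attention and alignment fields), so the encoder's final state cannot be passed to the decoder directly; instead, the wrapper's zero state is created and only its cell_state field is replaced with enc_state.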