My Seq2Seq RNN code does not work properly

Time: 2019-08-26 14:53:36

Tags: tensorflow recurrent-neural-network seq2seq

I am trying to use a dynamic RNN. However, when I run the code, it reports:

    ERROR: Inconsistent shapes: saw (40, 50) but expected (1, 50) (and infer_shape=True)

I have reviewed the code several times but cannot find the cause of the problem. Any help is appreciated!

Environment: Python 3.7.3, TensorFlow 1.14

    ...
    # Encoder part
    e_cell = tf.nn.rnn_cell.GRUCell(num_units=12)
    X_train = X_train.transpose((1, 0, 2))  # shape: (40, 2361, 4)

    init_state = Mcell.zero_state(40, dtype=tf.float32)
    encoder_output, encoder_state = tf.nn.dynamic_rnn(e_cell, X_train,
                                                      initial_state=init_state)

    # Decoder part
    # /helper
    inference = False
    seq_len = np.array([20])  # it seems to be wrong here?
    helper = tf.contrib.seq2seq.TrainingHelper(encoder_output,
                                               sequence_length=seq_len)

    # /attention
    attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
        num_units=12, memory=encoder_output)
    d_cell = tf.nn.rnn_cell.GRUCell(num_units=12)
    decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
        d_cell, attention_mechanism, attention_layer_size=12)
    out_cell = tf.contrib.rnn.OutputProjectionWrapper(decoder_cell,
                                                      output_size=50)
    decoder_state = decoder_cell.zero_state(batch_size=40, dtype=tf.float32)

    with tf.variable_scope('decoder'):
        decoder = tf.contrib.seq2seq.BasicDecoder(out_cell, helper, decoder_state,
                                                  tf.layers.Dense(50))

    # /dynamic decoder
    dec_outputs, final_state, final_sequence_lengths = \
        tf.contrib.seq2seq.dynamic_decode(decoder, swap_memory=True)
    ...

Question 1: How can I fix the error:

saw (40, 50) but expected (1, 50) (and infer_shape=True)
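For what it is worth, I suspect (not confirmed) that this comes from the `sequence_length` argument: `TrainingHelper` infers the batch size from it, so a length-1 array makes the decoder expect batch size 1 while the inputs have batch size 40. A minimal NumPy sketch of the shape that would match my batch (`batch_size` and `dec_steps` are my own names; 20 steps is what I intended above):

```python
import numpy as np

batch_size = 40   # batch size used by the encoder above
dec_steps = 20    # intended number of decoder time steps (assumed)

# sequence_length needs one entry per batch element,
# i.e. shape (40,), not a single-element array of shape (1,):
seq_len = np.full(batch_size, dec_steps, dtype=np.int32)
print(seq_len.shape)  # (40,)
```

Is this the right way to build `sequence_length`, or is the real cause elsewhere?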

Question 2: I am not using an embedding layer. Is `TrainingHelper` the right choice here, or should I use `ScheduledOutputTrainingHelper`?

If the code above is not enough to diagnose the problem, I can add more information. Thanks!

0 Answers