The output of tf.nn.bidirectional_dynamic_rnn does not match expectations

Date: 2019-04-13 00:56:14

Tags: python tensorflow

I ran into a strange problem: the output of tf.nn.bidirectional_dynamic_rnn() is not the same when the input is a single sample versus a padded batch of samples. This is very strange!

Code

import tensorflow as tf
import tensorflow.contrib as tc

inputs = tf.placeholder(tf.int32, [None, None])
length = tf.placeholder(tf.int32, [None])
with tf.device('/cpu:0'), tf.variable_scope('word_embedding'):
    self.word_embeddings = tf.get_variable(
        'word_embeddings',
        shape=(10000, 100),
        initializer=tf.constant_initializer(self.vocab.embeddings),
        trainable=True
    )
    self.emb = tf.nn.embedding_lookup(self.word_embeddings, inputs)
# Note: bidirectional_dynamic_rnn takes cell_fw first, then cell_bw.
cell_fw = tc.rnn.LSTMCell(num_units=hidden_size, state_is_tuple=True)
cell_bw = tc.rnn.LSTMCell(num_units=hidden_size, state_is_tuple=True)
outputs, states = tf.nn.bidirectional_dynamic_rnn(
    cell_fw, cell_bw, self.emb, sequence_length=length, dtype=tf.float32
)

I feed inputs to the model in two ways.

First: a single sample

inputs = [[5017, 5, 4436, 4570]]
length = [4]

Second: multiple samples with padding

inputs = [[5017, 5, 4436, 4570, 0], [5017, 10, 100, 4570, 1111]]
# 0 is padding
length = [4, 5]

The model parameters are unchanged, yet outputs[0][0:4] differs between the two ways. In my view, padding should not affect the network's output.
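To illustrate why trailing padding should be invisible when sequence_length is passed, here is a minimal NumPy sketch (a plain tanh RNN with toy random weights, not the actual LSTM above, and the sizes are made up): if the recurrence only steps length times, the padded positions are never read, so the outputs over the real tokens are identical with or without padding.

```python
import numpy as np

rng = np.random.default_rng(0)
E = rng.normal(size=(10000, 8))          # toy embedding table
Wx = rng.normal(size=(8, 8))             # input-to-hidden weights
Wh = rng.normal(size=(8, 8))             # hidden-to-hidden weights

def rnn(tokens, length):
    # Step only over the first `length` tokens, mimicking sequence_length:
    # positions beyond `length` are never looked up or fed to the cell.
    h = np.zeros(8)
    outs = []
    for t in range(length):
        h = np.tanh(E[tokens[t]] @ Wx + h @ Wh)
        outs.append(h)
    return np.stack(outs)

a = rnn([5017, 5, 4436, 4570], 4)        # single sample, no padding
b = rnn([5017, 5, 4436, 4570, 0], 4)     # trailing 0 is padding, never read
assert np.allclose(a, b)                 # outputs over real tokens match
```

With sequence_length set, TF's bidirectional wrapper behaves the same way for the forward direction, and for the backward direction it reverses only the first length steps of each row, so padding should not leak into either output.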

0 Answers:

There are no answers yet.