Using the same LSTMCell in two different RNNs - TensorFlow

Date: 2017-05-15 13:56:26

Tags: tensorflow lstm recurrent-neural-network

I'm trying to create a Siamese LSTM network, meaning two separate inputs go through RNNs that use the same LSTM weights.

Here is what I have so far:

def run_through_siamese_lstm(left_input, right_input, left_length, right_length, lstm_cell):
    left_output, _ = tf.nn.dynamic_rnn(lstm_cell, left_input, left_length, dtype=tf.float32)
    right_output, _ = tf.nn.dynamic_rnn(lstm_cell, right_input, right_length, dtype=tf.float32)

    # Extract the last output
    left_output = tf.transpose(left_output, [1, 0, 2])
    left_last_output = tf.gather(left_output, int(left_output.get_shape()[0]) - 1)

    right_output = tf.transpose(right_output, [1, 0, 2])
    right_last_output = tf.gather(right_output, int(right_output.get_shape()[0]) - 1)

    return tf.exp(-tf.reduce_sum(tf.abs(tf.subtract(left_last_output, right_last_output)), axis=1))

graph = tf.Graph()
with graph.as_default():
    lstm_cell = tf.contrib.rnn.LSTMCell(n_hidden, initializer=tf.truncated_normal_initializer(),
                                        forget_bias=forget_bias)
    left_batch = tf.placeholder(tf.float32, [None, max_seq_length, embedding_size])
    left_seq_length = tf.placeholder(tf.int32, [None])

    right_batch = tf.placeholder(tf.float32, [None, max_seq_length, embedding_size])
    right_seq_length = tf.placeholder(tf.int32, [None])



    similarity = run_through_siamese_lstm(left_batch, right_batch, left_seq_length, right_seq_length, lstm_cell)
    ...

The code throws an exception with the following output:

    ValueError: Variable rnn/lstm_cell/weights already exists, disallowed. Did you mean to set reuse=True in VarScope?

I tried setting reuse=True in run_through_siamese_lstm but it didn't work... Any ideas how to solve this?
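(For reference: the usual way around this error in TF 1.x is to run both inputs through the same cell inside one variable scope and mark its variables as reusable before the second `dynamic_rnn` call. A minimal sketch, not the asker's exact code; `n_hidden` and the input sizes below are made-up stand-ins, and `tf.nn.rnn_cell.LSTMCell` stands in for the `tf.contrib.rnn.LSTMCell` used above:)

```python
import tensorflow.compat.v1 as tf  # on TF 1.x, plain `import tensorflow as tf`

tf.disable_eager_execution()

# Stand-in hyperparameters for illustration only.
n_hidden, max_seq_length, embedding_size = 8, 5, 4

graph = tf.Graph()
with graph.as_default():
    lstm_cell = tf.nn.rnn_cell.LSTMCell(n_hidden)
    left_batch = tf.placeholder(tf.float32, [None, max_seq_length, embedding_size])
    right_batch = tf.placeholder(tf.float32, [None, max_seq_length, embedding_size])

    with tf.variable_scope("siamese") as scope:
        # First call creates the siamese/rnn/lstm_cell/* variables.
        left_output, _ = tf.nn.dynamic_rnn(lstm_cell, left_batch, dtype=tf.float32)
        # Let the second call reuse those same weights instead of raising
        # "Variable ... already exists, disallowed".
        scope.reuse_variables()
        right_output, _ = tf.nn.dynamic_rnn(lstm_cell, right_batch, dtype=tf.float32)
```

On TF >= 1.4 the same can be written without the explicit `scope.reuse_variables()` call, using `with tf.variable_scope("siamese", reuse=tf.AUTO_REUSE):`.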

0 Answers:

There are no answers yet.