TensorFlow RNN variable_scope error

Date: 2017-08-03 05:40:19

Tags: tensorflow scope rnn

I am trying to build a multi-layer RNN without using MultiRNNCell, because I want to update each layer independently. For the same reason I am not using tf.dynamic_rnn.

with tf.variable_scope("cell"):
  with tf.variable_scope("cell_1", reuse=True):
    cell_1 = tf.contrib.rnn.BasicLSTMCell(n_hidden)
    states_1 = cell_1.zero_state(batch_size, tf.float32)

  with tf.variable_scope("cell_2", reuse=True):
    cell_2 = tf.contrib.rnn.BasicLSTMCell(n_hidden)
    states_2 = cell_2.zero_state(batch_size, tf.float32)

  with tf.variable_scope("cell_3", reuse=True):
    cell_3 = tf.contrib.rnn.BasicLSTMCell(n_hidden)
    states_3 = cell_3.zero_state(batch_size, tf.float32)

outputs_1=[]
outputs_2=[]
outputs_3=[]

with tf.variable_scope("architecture"):
  for i in range(n_step):
    output_1, states_1 = cell_1(X[:, i], states_1)
    output_2, states_2 = cell_2(output_1, states_2)
    output_3, states_3 = cell_3(output_2, states_3)
    outputs_3.append(output_3)

Then I get the following error:

ValueError: Variable architecture/basic_lstm_cell/kernel already exists, disallowed. Did you mean to set reuse=True in VarScope?

So it seems that declaring multiple cells in TensorFlow without MultiRNNCell is impossible. How can I solve this problem?

1 Answer:

Answer 0 (score: 1)

I solved the problem myself and am sharing the answer. The key is to wrap each cell's call (not just its construction) in its own variable_scope, so every cell creates its weights under a distinct name and there is no collision:

# Build three independent LSTM cells and their initial states.
cell = tf.contrib.rnn.BasicLSTMCell(n_hidden)
cell2 = tf.contrib.rnn.BasicLSTMCell(n_hidden)
cell3 = tf.contrib.rnn.BasicLSTMCell(n_hidden)

states = cell.zero_state(batch_size, tf.float32)
states2 = cell2.zero_state(batch_size, tf.float32)
states3 = cell3.zero_state(batch_size, tf.float32)

outputs = []
for i in range(n_step):
  # Calling each cell inside its own variable scope gives its weights a
  # unique name (e.g. cell1/basic_lstm_cell/kernel); the cell then reuses
  # those same weights on every later time step.
  with tf.variable_scope("cell1"):
    output, states = cell(X[:, i], states)
  with tf.variable_scope("cell2"):
    output2, states2 = cell2(output, states2)
  with tf.variable_scope("cell3"):
    output3, states3 = cell3(output2, states3)

  outputs.append(output3)
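For context, here is a minimal, self-contained sketch of how the surrounding graph could be set up and how the collected outputs might be used. The names and sizes below (X, n_input, n_hidden, batch_size, n_step, and the layer1/layer2 scope names) are assumptions for illustration only, not taken from the original post, and only two layers are shown to keep it short. It targets the same TensorFlow 1.x contrib API used above.

import tensorflow as tf

# Assumed hyperparameters, for illustration only.
n_step, n_input, n_hidden, batch_size = 28, 28, 128, 64

# One feature vector per time step.
X = tf.placeholder(tf.float32, [batch_size, n_step, n_input])

cell1 = tf.contrib.rnn.BasicLSTMCell(n_hidden)
cell2 = tf.contrib.rnn.BasicLSTMCell(n_hidden)
states1 = cell1.zero_state(batch_size, tf.float32)
states2 = cell2.zero_state(batch_size, tf.float32)

outputs = []
for i in range(n_step):
  # Distinct scopes keep the two cells' kernels from colliding.
  with tf.variable_scope("layer1"):
    out1, states1 = cell1(X[:, i], states1)
  with tf.variable_scope("layer2"):
    out2, states2 = cell2(out1, states2)
  outputs.append(out2)

# Stack the per-step outputs into a [batch_size, n_step, n_hidden] tensor.
outputs_tensor = tf.transpose(tf.stack(outputs), [1, 0, 2])

# Listing the variables confirms that each layer owns its own weights,
# e.g. layer1/basic_lstm_cell/kernel and layer2/basic_lstm_cell/kernel.
for v in tf.global_variables():
  print(v.name)

The same pattern generalizes to more layers: each additional cell just needs its call wrapped in a differently named variable_scope.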