How to save and feed back the hidden state of tensorflow dynamic_rnn

Date: 2018-10-22 00:07:10

Tags: python tensorflow machine-learning lstm rnn

I have a TensorFlow multi-layer RNN cell like this:

def MakeLSTMCell(self):
    cells = []
    for n in self.numUnits:
        cell = tf.nn.rnn_cell.LSTMCell(n)
        dropout = tf.nn.rnn_cell.DropoutWrapper(cell,
                                                input_keep_prob=self.keep_prob,
                                                output_keep_prob=self.keep_prob)
        cells.append(dropout)
    stackedRNNCell = tf.nn.rnn_cell.MultiRNNCell(cells)
    return stackedRNNCell

def BuildGraph(self):
    """
    Build the Graph of the recurrent reinforcement neural network.
    """
    with self.graph.as_default():
        with tf.variable_scope(self.scope):
            self.inputSeq = tf.placeholder(tf.float32, [None, None, self.observationDim], name='input_seq')
            self.batch_size = tf.shape(self.inputSeq)[0]
            self.seqLength = tf.shape(self.inputSeq)[1]
            self.cell = self.MakeLSTMCell()

            with tf.name_scope("LSTM_layers"):
                self.zeroState = self.cell.zero_state(self.batch_size, tf.float32)
                self.cellState = self.zeroState

                self.outputs, self.outputState = tf.nn.dynamic_rnn(self.cell,
                                                         self.inputSeq,
                                                         initial_state=self.cellState,
                                                         swap_memory=True)
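For reference (a sketch of mine, not from the post), the state object that `zero_state` returns here, and the form `dynamic_rnn` expects back as `initial_state`, is one `LSTMStateTuple(c, h)` per layer. The unit sizes below are made up, and the snippet goes through `tf.compat.v1` so it also runs under TensorFlow 2.x:

```python
import tensorflow as tf

v1 = tf.compat.v1
v1.disable_eager_execution()  # graph mode, matching the TF 1.x code above

# Hypothetical layer sizes standing in for self.numUnits
cells = [v1.nn.rnn_cell.LSTMCell(n) for n in (8, 16)]
stacked = v1.nn.rnn_cell.MultiRNNCell(cells)

# state_size is a tuple with one LSTMStateTuple(c=n, h=n) entry per layer;
# any saved state must keep exactly this nesting to be fed back in.
print(stacked.state_size)
```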

However, this self.cellState is not something I can assign to. I would like to know how to save the LSTM hidden state (keeping the same form, so that I can feed it back to the RNN at any time) and reuse it later as the initial_state?

I have already tried the accepted answer to this question: Tensorflow, best way to save state in RNNs? However, a dynamic batch size is not allowed when creating a tf variable.
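One way around that limitation (a sketch of my own, not from the post or its linked answer) is to keep the state out of the graph entirely: build a `[None, n]` placeholder for each layer's `c` and `h`, feed the previous run's final state back in through `feed_dict`, and start from `zero_state` the first time. The layer sizes and input dimension below are made up, and the code goes through `tf.compat.v1` so it also runs under TensorFlow 2.x:

```python
import numpy as np
import tensorflow as tf

v1 = tf.compat.v1
v1.disable_eager_execution()  # graph mode, matching the TF 1.x code in the question

num_units = [8, 16]   # hypothetical stand-in for self.numUnits
obs_dim = 4           # hypothetical stand-in for self.observationDim

input_seq = v1.placeholder(tf.float32, [None, None, obs_dim], name='input_seq')

cells = [v1.nn.rnn_cell.LSTMCell(n) for n in num_units]
cell = v1.nn.rnn_cell.MultiRNNCell(cells)

# One (c, h) placeholder pair per layer; shape [None, n] keeps the batch size dynamic.
state_in = tuple(
    v1.nn.rnn_cell.LSTMStateTuple(
        v1.placeholder(tf.float32, [None, n], name='c%d' % i),
        v1.placeholder(tf.float32, [None, n], name='h%d' % i))
    for i, n in enumerate(num_units))

outputs, state_out = v1.nn.dynamic_rnn(cell, input_seq,
                                       initial_state=state_in,
                                       swap_memory=True)

def feed_state(feed, state):
    """Map a saved numpy state (tuple of LSTMStateTuples) onto the placeholders."""
    for ph, st in zip(state_in, state):
        feed[ph.c] = st.c
        feed[ph.h] = st.h
    return feed

with v1.Session() as sess:
    sess.run(v1.global_variables_initializer())
    batch = np.zeros((3, 5, obs_dim), np.float32)  # batch_size=3, seq_len=5
    # First call: start from an all-zeros state of the right shape.
    state = sess.run(cell.zero_state(3, tf.float32))
    # Later calls: the saved numpy state goes straight back in as initial_state.
    for _ in range(2):
        out, state = sess.run([outputs, state_out],
                              feed_state({input_seq: batch}, state))
    print(out.shape)  # (3, 5, 16): [batch, time, last-layer units]
```

Because the saved `state` is plain numpy arrays, it can be pickled to disk or held between calls, and any batch size can be fed as long as the state arrays match it.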

Any help would be greatly appreciated.

0 Answers:

No answers yet