Initial state as a placeholder in dynamic_rnn

Date: 2018-11-15 22:42:40

Tags: python tensorflow machine-learning lstm recurrent-neural-network

I want to feed a tensor to the LSTM's initial_state through a placeholder, so that I can assign it a value myself later on during training.

I wrote the following:

import tensorflow as tf

class PGNetwork:
    def __init__(self, inpsize, name='PGNetwork'):
        tf.reset_default_graph()
        self.inpsize = inpsize

        with tf.variable_scope(name):
            # We create the placeholders
            self.inputs_vec = tf.placeholder(tf.float32, [None, self.inpsize],
                                             name="inputs_vec")
            self.in_state = tf.placeholder(tf.float32, [1, 16], name="in_state")

            self.lstm_layer = tf.contrib.rnn.BasicLSTMCell(16, forget_bias=1)
            # self.in_state = self.lstm_layer.zero_state(1, dtype=tf.float32)  # By uncommenting this line the error is no longer there.
            self.out_rnn, self.rnn_state = tf.nn.dynamic_rnn(
                self.lstm_layer,
                tf.expand_dims(self.inputs_vec, 1),
                initial_state=self.in_state)

            self.output = tf.layers.dense(inputs=self.out_rnn,
                                          kernel_initializer=tf.contrib.layers.xavier_initializer(),
                                          units=5,
                                          activation=None,
                                          name="output")

            self.action_distribution = tf.nn.softmax(self.output, name="softmax")

PGNetwork = PGNetwork(8)

I get this error: Tensor objects are not iterable when eager execution is not enabled. To iterate over this tensor use tf.map_fn.

The error actually comes from the initial_state argument of tf.nn.dynamic_rnn(), since the tensor I pass there is apparently not treated as a placeholder.

Is it possible to convert the placeholder into something that dynamic_rnn accepts?
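For reference, tf.contrib.rnn.BasicLSTMCell (with the default state_is_tuple=True) keeps its state as an LSTMStateTuple of (c, h), so the cell tries to unpack the single [1, 16] placeholder and raises the iteration error. Below is a minimal sketch of one way the state could be fed instead, assuming two separate placeholders for c and h; the names c_in and h_in are mine, not from the question:

import tensorflow as tf

# Sketch only: feed the cell state (c) and hidden state (h) through two
# placeholders and wrap them in the LSTMStateTuple that BasicLSTMCell
# expects as its initial_state.
c_in = tf.placeholder(tf.float32, [1, 16], name="c_in")
h_in = tf.placeholder(tf.float32, [1, 16], name="h_in")
in_state = tf.nn.rnn_cell.LSTMStateTuple(c_in, h_in)

inputs_vec = tf.placeholder(tf.float32, [None, 8], name="inputs_vec")
lstm_layer = tf.contrib.rnn.BasicLSTMCell(16, forget_bias=1)
out_rnn, rnn_state = tf.nn.dynamic_rnn(lstm_layer,
                                       tf.expand_dims(inputs_vec, 1),
                                       initial_state=in_state)

At run time, c_in and h_in would then be fed through feed_dict like any other placeholder.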

0 Answers:

There are no answers yet.