How to get all intermediate LSTM outputs via dynamic_rnn in TensorFlow

Time: 2017-05-14 07:53:33

Tags: tensorflow lstm recurrent-neural-network

I am new to tensorflow and have recently been reading about LSTMs on various blogs, such as Colah's "Understanding LSTM Networks" and Karpathy's "The Unreasonable Effectiveness of Recurrent Neural Networks".

I found this code online:

import numpy as np
import tensorflow as tf

def length(sequence):
    used = tf.sign(tf.reduce_max(tf.abs(sequence), reduction_indices=2))
    length = tf.reduce_sum(used, reduction_indices=1)
    length = tf.cast(length, tf.int32)
    return length

num_neurons = 10
num_layers = 3
max_length = 8
frame_size = 5

# dropout = tf.placeholder(tf.float32)
# Build a fresh cell per layer; reusing one cell object would make the layers share weights.
cells = [tf.contrib.rnn.LSTMCell(num_neurons, state_is_tuple=True)
         for _ in range(num_layers)]
# cells = [tf.contrib.rnn.DropoutWrapper(c, output_keep_prob=dropout) for c in cells]
cell = tf.contrib.rnn.MultiRNNCell(cells)

sequence = tf.placeholder(tf.float32, [None, max_length, frame_size])
output, state = tf.nn.dynamic_rnn(
    cell,
    sequence,
    dtype=tf.float32,
    sequence_length=length(sequence),
)

if __name__ == '__main__':
    sample = np.random.random((8, max_length, frame_size)) + 0.1
    # sample[np.ix_([0, 1], range(50, max_length))] = 0
    # drop = 0.2
    with tf.Session() as sess:
        init_op = tf.global_variables_initializer()
        sess.run(init_op)
        o, s = sess.run([output, state], feed_dict={sequence: sample})
        # print("Output shape is", o.shape)
        print("Output is", o)
        print("State is", s)
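As an aside, the length() helper at the top computes per-sequence lengths from zero padding. A plain-numpy analogue (the names length_np and batch are my own, for illustration) may make its logic clearer:

```python
import numpy as np

# Numpy analogue of the length() helper above: a frame counts as "used"
# if any of its features is non-zero, and a sequence's length is its
# number of used frames.
def length_np(sequence):
    used = np.sign(np.max(np.abs(sequence), axis=2))
    return np.sum(used, axis=1).astype(np.int32)

batch = np.zeros((2, 4, 3))
batch[0, :3, :] = 1.0   # first sequence: 3 real frames, 1 padding frame
batch[1, :1, :] = 1.0   # second sequence: 1 real frame
print(length_np(batch))  # [3 1]
```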

I have some doubts about the above code with state_is_tuple=True.

Q. What, in simple terms, do the output and state returned by tf.nn.dynamic_rnn mean?

I have read online that output is the output of the last layer at every time step, and state is the final state.

My intermediate doubt is: what do we mean by "the output of the last layer at every time step"?
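As I understand it (a shape sketch only, using the question's dimensions; the arrays here are zero placeholders, not real LSTM values):

```python
import numpy as np

# Dimensions taken from the question's code.
batch_size, max_length, num_neurons, num_layers = 8, 8, 10, 3

# `output` from dynamic_rnn: the TOP layer's hidden state h at EVERY
# time step, zero-padded past each sequence's length.
output = np.zeros((batch_size, max_length, num_neurons))

# `state` with state_is_tuple=True: one (c, h) pair per layer, each
# taken at the LAST valid time step only.
state = tuple((np.zeros((batch_size, num_neurons)),   # c of layer i
               np.zeros((batch_size, num_neurons)))   # h of layer i
              for _ in range(num_layers))

print(output.shape)        # (8, 8, 10)
print(len(state))          # 3, one (c, h) per layer
print(state[-1][1].shape)  # (8, 10): the top layer's final h
```

So "the output of the last layer at every time step" means output keeps the full time axis, but only for the top layer; the lower layers' per-step outputs are not exposed.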

I went through the dynamic_rnn source, since my main task depends on it (https://github.com/tensorflow/tensorflow/blob/r1.1/tensorflow/python/ops/rnn.py).

Q. *** I want all the intermediate outputs of the LSTM while calling dynamic_rnn in the same way as the code above. How can I do that?

I also saw that dynamic_rnn internally calls _dynamic_rnn, and this _dynamic_rnn returns final_output and final_state. Besides final_output, I want all the intermediate outputs.
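The idea I have in mind, sketched in plain numpy rather than TensorFlow (toy_rnn_layer is a hypothetical stand-in for one recurrent layer, not a real LSTM): run one layer at a time, keep each layer's full output sequence, and feed it into the next layer. In TensorFlow this would correspond to calling tf.nn.dynamic_rnn once per layer inside separate variable scopes instead of using one MultiRNNCell.

```python
import numpy as np

def toy_rnn_layer(inputs, num_units, seed):
    # Hypothetical stand-in for one recurrent layer: a simple tanh RNN
    # run step by step, returning the hidden state at every time step
    # (the per-layer analogue of dynamic_rnn's `output`).
    rng = np.random.RandomState(seed)
    batch, time, dim = inputs.shape
    Wx = rng.randn(dim, num_units) * 0.1
    Wh = rng.randn(num_units, num_units) * 0.1
    h = np.zeros((batch, num_units))
    outs = []
    for t in range(time):
        h = np.tanh(inputs[:, t, :] @ Wx + h @ Wh)
        outs.append(h)
    return np.stack(outs, axis=1)  # [batch, time, num_units]

batch, max_length, frame_size = 8, 8, 5
num_neurons, num_layers = 10, 3
sequence = np.random.random((batch, max_length, frame_size))

# Instead of one stacked cell (which exposes only the top layer's
# outputs), run layer by layer and keep every layer's full sequence.
layer_input = sequence
all_layer_outputs = []
for i in range(num_layers):
    layer_output = toy_rnn_layer(layer_input, num_neurons, seed=i)
    all_layer_outputs.append(layer_output)
    layer_input = layer_output  # feed this layer's outputs to the next

print(len(all_layer_outputs))  # 3: one [batch, time, units] array per layer
```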

My intention is to write a custom _dynamic_rnn, as defined in https://github.com/tensorflow/tensorflow/blob/r1.1/tensorflow/python/ops/rnn.py. Please help.

0 Answers:

No answers yet.