Restoring a trained LSTM and getting predictions

Date: 2019-01-06 22:37:07

Tags: python tensorflow recurrent-neural-network

The task is to train an RNN on a dataset of roughly 700,000 samples. The model is saved with tf.train.Saver() and then restored, together with a session, in a separate program run, so that predictions for new samples can be made from the model stored on disk.
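For reference, this is the TF 1.x checkpoint round trip the question relies on, reduced to a minimal sketch; the variable and path names here are illustrative, not taken from the code below:

import tensorflow as tf

# First program run: build a graph, initialize it, and write a checkpoint.
v = tf.get_variable("v", shape=[1], initializer=tf.zeros_initializer())
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, "model_files/model.ckpt")  # writes the .meta, .index and .data files

# Second program run: rebuild the graph from the .meta file, then load the weights.
tf.reset_default_graph()
saver = tf.train.import_meta_graph("model_files/model.ckpt.meta")
with tf.Session() as sess:
    saver.restore(sess, "model_files/model.ckpt")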

def prepareNeuralNet(self):

    self.X = tf.placeholder(tf.float32, [None, self.n_steps, self.n_inputs], name="X")
    self.y = tf.placeholder(tf.float32, [None, self.n_steps, self.n_outputs], name="y")

    layers = [
        tf.contrib.rnn.LSTMCell(
            num_units=self.n_neurons,
            activation=tf.nn.tanh,
            use_peepholes=True,
            initializer=tf.contrib.layers.xavier_initializer())
        for layer in range(2)]

    cell = tf.contrib.rnn.OutputProjectionWrapper(
        tf.contrib.rnn.MultiRNNCell(layers),
        output_size=self.n_outputs)

    outputs, states = tf.nn.dynamic_rnn(cell, self.X, dtype=tf.float32)

    learning_rate = 0.001

    # Loss on the last time step of each sequence
    self.loss = tf.reduce_mean(tf.square(outputs[:, -1, :] - self.y[:, -1, :]))
    # The collection key must match the one used with tf.get_collection() after restoring
    tf.add_to_collection('outputs', outputs)

    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
    self.training_op = optimizer.minimize(self.loss)

    self.init = tf.global_variables_initializer()

    self.saver = tf.train.Saver()

    self.mse_summary = tf.summary.scalar('MSE', self.loss)


def learnNeuralNet(self):

    self.prepareNeuralNet()

    with tf.Session() as sess:

        totalRes = []
        totalSal = []
        self.init.run()
        for iteration, index in enumerate(self.shuffledIndexes):
            X_batch, y_batch, batchScaler = self.next_batch()
            # Run one optimization step on the current batch
            sess.run(self.training_op, feed_dict={self.X: X_batch, self.y: y_batch})
            if iteration % 1000 == 0:
                # Log the MSE summary every 1000 iterations
                mse = self.loss.eval(feed_dict={self.X: X_batch, self.y: y_batch})
                summary_str = self.mse_summary.eval(feed_dict={self.X: X_batch, self.y: y_batch})
                self.fileWriter.add_summary(summary_str, iteration * (self.n_steps + self.batch_size))

        self.saver.save(sess, "model_files/model.ckpt")
        self.fileWriter.close()

def makePrediction(self, sess):
    try:
        d = DataManager.data.iloc[-self.n_steps:].copy().values
        X_batch = np.array(d).reshape(1, self.n_steps, self.n_inputs)
        y_batch = X_batch[0][:, self.price_close_idx].reshape(1, self.n_steps, self.n_outputs)
        # Retrieve the placeholders and the RNN outputs from the restored graph
        X = tf.get_default_graph().get_tensor_by_name("X:0")
        y = tf.get_default_graph().get_tensor_by_name("y:0")
        outputs = tf.get_collection('outputs')[0]
        result = sess.run(outputs, feed_dict={X: X_batch, y: y_batch})
        return result
    except Exception as ex:
        print(ex.__repr__())

if __name__ == '__main__':

    saver = tf.train.import_meta_graph('model_files/model.ckpt.meta')
    with tf.Session() as sess:
        saver.restore(sess, "model_files/model.ckpt")
        p = Predictor()
        p.makePrediction(sess)

This is how I am doing it. I save the X and y tensors so that I can later feed data into those placeholders via feed_dict, but what I actually want from makePrediction(sess) is the predicted values, i.e. I want some way to look at the "outputs" of dynamic_rnn. How can I achieve this? I tried tf.add_to_collection during training and then tf.get_collection after restoring. Is this the correct way to restore an LSTM network so that it can be used for prediction? Note that the network was trained in a previous program run by calling p.learnNeuralNet().
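For illustration, a minimal sketch of how the restored graph could be queried for the dynamic_rnn output, assuming the collection key used at save time is 'outputs' (matching the get_collection call above); the dimensions below are placeholders, not the real model's:

import numpy as np
import tensorflow as tf

n_steps, n_inputs = 30, 5                      # illustrative dimensions only
X_batch = np.zeros((1, n_steps, n_inputs), dtype=np.float32)

tf.reset_default_graph()
saver = tf.train.import_meta_graph("model_files/model.ckpt.meta")
with tf.Session() as sess:
    saver.restore(sess, "model_files/model.ckpt")

    X = tf.get_default_graph().get_tensor_by_name("X:0")

    # Option 1: fetch the tensor stored with tf.add_to_collection('outputs', outputs)
    outputs = tf.get_collection('outputs')[0]
    # Option 2: fetch it by name, if it was wrapped before saving,
    # e.g. tf.identity(outputs, name="rnn_outputs") -> "rnn_outputs:0"

    # The y placeholder is not needed just to compute the RNN outputs
    prediction = sess.run(outputs, feed_dict={X: X_batch})
    print(prediction.shape)                    # (1, n_steps, n_outputs)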

0 Answers:

No answers yet.