Simple recurrent neural network from scratch in TensorFlow

Asked: 2017-06-02 20:17:28

Tags: tensorflow time-series recurrent-neural-network

I have built a simple recurrent neural network with one hidden layer of 4 nodes. Here is my code:

import tensorflow as tf

# hyper parameters
learning_rate = 0.0001
number_of_epochs = 10000

# Computation Graph

W1 = tf.Variable([[1.0, 1.0, 1.0, 1.0]], dtype=tf.float32, name = 'W1')
W2 = tf.Variable([[1.0], [1.0], [1.0], [1.0]], dtype=tf.float32, name = 'W2')
WR = tf.Variable([[1.0, 1.0, 1.0, 1.0]], dtype=tf.float32, name = 'WR')
# b = tf.Variable([[0], [0], [0], [0]], dtype=tf.float32)
prev_val = [[0.0]]

X = tf.placeholder(tf.float32, [None, None], name = 'X')
labels = tf.placeholder(tf.float32, [None, 1], name = 'labels')

sess = tf.Session()
sess.run(tf.initialize_all_variables())

z = tf.matmul(X, W1) + tf.matmul(prev_val, WR)# - b 
prev_val = z

predict = tf.matmul(z, W2)

error = tf.reduce_mean((labels - predict)**2)
train = tf.train.GradientDescentOptimizer(learning_rate).minimize(error)

time_series = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
lbsx = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]

for i in range(number_of_epochs):
    for j in range(len(time_series)):
        curr_X = time_series[j]
        lbs = lbsx[j]

        sess.run(train, feed_dict={X: [[curr_X]], labels: [[lbs]]})

print(sess.run(predict, feed_dict={X: [[0]]}))
print(sess.run(predict, feed_dict={X: [[1]]}))

I got the following output:

[[ 0.]]
[[  3.12420416e-05]]

For an input of 1 the output is roughly 0, and vice versa. I am also confused about 'prev_val': should it be a placeholder? I would really appreciate any help in fixing this code.
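To make the question concrete, here is my guess at what a fix might look like: feed the hidden state through a placeholder (called prev_state below) and carry its value forward between sess.run calls. The names prev_state and hidden_size, the 4x4 shape for WR, and the state-reset logic are my assumptions, not something I have verified; this is only a sketch of the idea.

import tensorflow as tf
import numpy as np

hidden_size = 4
learning_rate = 0.0001

# Weights: input -> hidden, hidden -> hidden (recurrent), hidden -> output.
# The 4x4 recurrent weight shape is my assumption.
W1 = tf.Variable(tf.ones([1, hidden_size]), name='W1')
WR = tf.Variable(tf.ones([hidden_size, hidden_size]), name='WR')
W2 = tf.Variable(tf.ones([hidden_size, 1]), name='W2')

X = tf.placeholder(tf.float32, [None, 1], name='X')
# Placeholder for the previous time step's hidden state (hypothetical fix).
prev_state = tf.placeholder(tf.float32, [None, hidden_size], name='prev_state')
labels = tf.placeholder(tf.float32, [None, 1], name='labels')

# New hidden state depends on the current input and the previous state.
state = tf.matmul(X, W1) + tf.matmul(prev_state, WR)
predict = tf.matmul(state, W2)

error = tf.reduce_mean((labels - predict) ** 2)
train = tf.train.GradientDescentOptimizer(learning_rate).minimize(error)

time_series = [1, 0, 1, 0, 1, 0, 1, 0]
lbsx = [0, 1, 0, 1, 0, 1, 0, 1]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(1000):
        # Reset the hidden state at the start of each pass over the series.
        state_val = np.zeros((1, hidden_size))
        for x, y in zip(time_series, lbsx):
            # Carry the hidden state forward by feeding the previous run's value.
            _, state_val = sess.run(
                [train, state],
                feed_dict={X: [[x]], labels: [[y]], prev_state: state_val})

One limitation I can see with this approach: because the state is passed back in through a placeholder, gradients do not flow across time steps, so it effectively trains with backpropagation truncated to a single step. Is that the intended way to do it, or should the recurrence be built into the graph itself?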

0 Answers:

There are no answers yet.