Regression neural network with one dependent and one independent variable

Posted: 2019-07-11 09:15:12

Tags: tensorflow neural-network regression

I am trying to create a simple neural network with one dependent variable and one independent variable. Could you point me to a tutorial, or help me implement such a network? I have the code below so far, but even though the loss decreases, my predictions are still poor. Should I scale X and Y, or is there some other error?

Thanks in advance.

import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

x=[(i*i)+0.2 for i in range(1000)]
y=[i for i in range(1000)]

x_train=np.reshape(x,(-1,1))
y_train=np.reshape(y,(-1,1))
x_test=x_train[-10:]  # take the last 10 rows; x_train[:,-10:] slices the wrong axis
y_test=y_train[-10:]
plt.scatter(x_train,y_train)
plt.show()


X=tf.placeholder(tf.float32,[None,1])
Y=tf.placeholder(tf.float32,[None,1])

n_inputs=1
n_hidden_1=20
n_hidden_2=20
n_outputs=1

weights={
    "h1": tf.Variable(tf.random_normal([n_inputs,n_hidden_1])),
    "h2": tf.Variable(tf.random_normal([n_hidden_1,n_hidden_2])),
    "out": tf.Variable(tf.random_normal([n_hidden_2,n_outputs]))
}

biases={
    "b1": tf.Variable(tf.random_normal([n_hidden_1])),
    "b2": tf.Variable(tf.random_normal([n_hidden_2])),
    "out": tf.Variable(tf.random_normal([n_outputs]))
}

def neural_net(x):
    layer_1=tf.add(tf.matmul(x,weights["h1"]),biases["b1"])
    layer_1=tf.nn.relu(layer_1)
    layer_2=tf.add(tf.matmul(layer_1,weights["h2"]),biases["b2"])
    layer_2=tf.nn.relu(layer_2)
    layer_3=tf.matmul(layer_2,weights["out"])+biases["out"]
    return layer_3

Y_pred=neural_net(X)

loss=tf.losses.mean_squared_error(Y,Y_pred)  # compare predictions to the targets Y, not the inputs X
optimizer=tf.train.AdamOptimizer(learning_rate=0.01)
train_op=optimizer.minimize(loss)

epochs=1000
init=tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    for i in range(epochs):
        sess.run(train_op,feed_dict={X:x_train,Y:y_train})
        loss_op=sess.run(loss,feed_dict={X:x_train,Y:y_train})
        if i%10==0:
            print("Epoch "+str(i)+" loss "+str(loss_op))
    pred=sess.run(Y_pred,feed_dict={X:x_test})
    plt.plot(pred,color="red")
    plt.plot(y_test,color="blue")
    plt.show()
    plt.scatter(pred,y_test)
    plt.show()
    for i in range(len(pred)):
        print(str(pred[i])+" "+str(y_test[i]))
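On the scaling question: with inputs running from 0.2 up to roughly 998001, unscaled data will make training with random-normal initial weights very unstable, so min-max scaling both X and Y before feeding them is a reasonable first step. A minimal NumPy sketch (the `min_max_scale` helper is illustrative and not part of the original code; predictions would need to be mapped back with the inverse transform):

```python
import numpy as np

def min_max_scale(a):
    # Scale an array linearly into the [0, 1] range.
    lo, hi = a.min(), a.max()
    return (a - lo) / (hi - lo)

# Same data as in the question above.
x = np.reshape([(i * i) + 0.2 for i in range(1000)], (-1, 1))
y = np.reshape([i for i in range(1000)], (-1, 1))

x_scaled = min_max_scale(x)
y_scaled = min_max_scale(y)
```

The scaled arrays would then replace `x_train` and `y_train` in the `feed_dict`.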


0 Answers:

There are no answers yet.