TensorFlow regression model gives the same prediction every time

Asked: 2017-08-04 10:12:18

Tags: python regression

  
    
import tensorflow as tf

x = tf.placeholder(tf.float32, [None,4])    # input vector    

w1 = tf.Variable(tf.random_normal([4,2]))   # weights between first and second layers

b1 = tf.Variable(tf.zeros([2]))             # biases added to hidden layer

w2 = tf.Variable(tf.random_normal([2,1]))   # weights between second and third layer

b2 = tf.Variable(tf.zeros([1]))             # biases added to third (output) layer

def feedForward(x, w, b):                   # function for forward propagation
    Input = tf.add(tf.matmul(x, w), b)
    Output = tf.sigmoid(Input)
    return Output


Out1 = feedForward(x, w1, b1)              # output of first layer

Out2 = feedForward(Out1, w2, b2)           # output of second layer

MHat = 50*Out2                             # final prediction is in the range (0,50)


M = tf.placeholder(tf.float32, [None,1])   # placeholder for the actual (target) value of marks

J = tf.reduce_mean(tf.square(MHat - M))    # cost function -- mean squared error

train_step = tf.train.GradientDescentOptimizer(0.05).minimize(J)     # minimize J using gradient descent

sess = tf.InteractiveSession()             # create interactive session

tf.global_variables_initializer().run()    # initialize all weight and bias variables

xs = [[1,3,9,7],
      [7,9,8,2],                           # x training data
      [2,4,6,5]]

Ms = [[47],
      [43],                                # M training data
      [39]]

for _ in range(1000):                      # run the training step on the training data 1000 times
    sess.run(train_step, feed_dict={x: xs, M: Ms})


print(sess.run(MHat, feed_dict={x: [[1,3,9,7]]}))
# [[ 50.]]

print(sess.run(MHat, feed_dict={x: [[1,15,9,7]]}))
# [[ 50.]]

print(sess.run(tf.transpose(MHat), feed_dict={x: [[1,15,9,7]]}))
# [[ 50.]]

In this code, I am trying to predict a student's marks M (out of 50) given how many hours he/she slept, studied, used electronic devices, and played. These 4 features make up the input feature vector x.

To solve this regression problem, I am using a deep neural network with an input layer of 4 perceptrons (the input features), a hidden layer of two perceptrons, and an output layer of one perceptron. I have used sigmoid as the activation function. However, I get exactly the same prediction ([[50.0]]) for every input vector I feed in. Can someone please tell me what is wrong with the code above? I greatly appreciate your help! (Thanks in advance)
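One quick way to see why every prediction pins at [[ 50.]]: with unscaled inputs like [7,9,8,2] and standard-normal weights, the pre-activation tf.matmul(x,w)+b easily has a magnitude of several units, and sigmoid saturates near 0 or 1 there, so 50*sigmoid(...) sits at the ceiling of 50. A minimal NumPy sketch of this saturation effect (illustrative only, not the TensorFlow graph above):

```python
import numpy as np

def sigmoid(z):
    # plain logistic function
    return 1.0 / (1.0 + np.exp(-z))

# pre-activations of modest magnitude already drive sigmoid toward its rails
for z in [2.0, 5.0, 10.0]:
    print(z, 50 * sigmoid(z))

# at z = 10 the scaled output is within 0.01 of the ceiling of 50
print(abs(50 - 50 * sigmoid(10.0)) < 0.01)
```

Once the sigmoid saturates, its gradient is nearly zero, so gradient descent barely moves the weights and the output stays stuck.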

1 answer:

Answer 0 (score: 0)

You need to modify your feedForward() function. You should not apply sigmoid() at the last layer (just return the raw activation there), and you should not multiply the output of this function by 50.

def feedForward(X, W1, b1, W2, b2):
    Z = tf.sigmoid(tf.matmul(X, W1) + b1)   # hidden layer with sigmoid activation
    return tf.matmul(Z, W2) + b2            # linear output layer (no sigmoid, no *50)

MHat = feedForward(x, w1, b1, w2, b2)
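As a sanity check, here is a NumPy sketch of the corrected forward pass with arbitrary, hypothetical weights (sigmoid only on the hidden layer, linear output). Unlike the original 50*sigmoid version, it no longer maps different inputs to the same pinned value:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(X, W1, b1, W2, b2):
    # sigmoid only on the hidden layer; the output layer stays linear
    Z = sigmoid(X @ W1 + b1)
    return Z @ W2 + b2

rng = np.random.default_rng(0)
W1 = 0.1 * rng.normal(size=(4, 2))   # small weights to keep the hidden layer unsaturated
b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1))
b2 = np.zeros(1)

y1 = feed_forward(np.array([[1., 3., 9., 7.]]), W1, b1, W2, b2)
y2 = feed_forward(np.array([[1., 15., 9., 7.]]), W1, b1, W2, b2)
print(y1, y2)
print(bool(not np.allclose(y1, y2)))   # the two inputs now give different outputs
```

Note the 0.1 scaling of W1: with raw inputs this large, keeping the hidden pre-activations small (or, better, normalizing the input features) is what lets the sigmoid stay out of its flat saturated region during training.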

Hope this helps!


If it solves your problem, don't forget to let us know :)