TensorFlow linear regression model not working

Posted: 2018-12-30 23:53:35

Tags: python tensorflow deep-learning linear-regression

I am building a linear regression model that should map a numpy array of ones to a numpy array of fives:

[1.0, 1.0, 1.0, 1.0] ---> [5.0, 5.0, 5.0, 5.0]

My network is shown below; the x placeholder corresponds to the inputs and the y placeholder to the targets. However, my model just converges to 1.0s:

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense

g= tf.Graph()
with g.as_default():

    x = tf.placeholder(dtype=tf.float32, shape = (None,4))
    y = tf.placeholder(dtype=tf.float32, shape = (None,4))

    model = tf.keras.Sequential([
        Dense(units=4, activation=tf.nn.relu),
        Dense(units=4, activation=tf.nn.sigmoid)
    ])


    pred = model(x)    
    loss = tf.reduce_mean(tf.square(pred - y))

    train_op = tf.train.AdamOptimizer().minimize(loss)

    init_op = tf.group(tf.global_variables_initializer(),
                         tf.local_variables_initializer())



with tf.Session(graph=g) as sess:
    sess.run(init_op)
    for step in range(1000):
        _, lossy, predicted = sess.run([train_op, loss, pred],
                                       feed_dict={x: np.ones(shape=(1, 4)),
                                                  y: 5 * np.ones(shape=(1, 4))})

        print(predicted)

Unfortunately, the predictions converge to a numpy array of ones rather than fives.

1 Answer:

Answer 0 (score: 1)

You should not use a sigmoid activation on the output layer; use ReLU instead. The sigmoid function squashes its output into the range (0, 1), so the final layer can never produce the target value of 5.
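The bound on sigmoid can be checked with plain numpy (this snippet is an illustration, not part of the original answer):

```python
import numpy as np

def sigmoid(z):
    # Standard logistic sigmoid: output is strictly between 0 and 1.
    return 1.0 / (1.0 + np.exp(-z))

# Even extreme pre-activations stay inside (0, 1), far from the target 5.0.
z = np.array([-10.0, 0.0, 10.0])
print(sigmoid(z))
```

Since every sigmoid output is below 1, the mean-squared-error loss pushes the pre-activations toward saturation and the predictions plateau near 1.0, which matches the behavior in the question.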

model = tf.keras.Sequential([
            Dense(units=4, activation=tf.nn.relu),
            Dense(units=4, activation=tf.nn.relu)
        ])