Neural network for univariate regression with numpy gives only linear results

Asked: 2017-02-16 14:30:07

Tags: python numpy neural-network regression non-linear-regression

My goal is to create a neural network with a single hidden layer (with ReLU activation) that can approximate a simple univariate square-root function. I have implemented the network in numpy and also performed gradient checking, and everything seems fine except for the result: for some reason I only get a linear approximation, like this: [image: noisy sqrt approx]

I have tried changing the hyperparameters, without any success. Any ideas?

import numpy as np

step_size = 1e-6
input_size, output_size = 1, 1
h_size = 10
train_size = 500
x_train = np.abs(np.random.randn(train_size, 1) * 1000)
y_train = np.sqrt(x_train) + np.random.randn(train_size, 1) * 0.5

#initialize weights and biases
Wxh = np.random.randn(input_size, h_size) * 0.01
bh = np.zeros((1, h_size))
Why = np.random.randn(h_size, output_size) * 0.01
by = np.zeros((1, output_size))

for i in range(300000):
    #forward pass
    h = np.maximum(0, np.dot(x_train, Wxh) + bh)
    y_est = np.dot(h, Why) + by

    loss = np.sum((y_est - y_train)**2) / train_size
    dy = 2 * (y_est - y_train) / train_size

    print("loss: ",loss)

    #backprop at output
    dWhy = np.dot(h.T, dy)
    dby = np.sum(dy, axis=0, keepdims=True)
    dh = np.dot(dy, Why.T)

    #backprop ReLU non-linearity    
    dh[h <= 0] = 0

    #backprop Wxh, and bh
    dWxh = np.dot(x_train.T, dh)
    dbh = np.sum(dh, axis=0, keepdims=True)

    Wxh += -step_size * dWxh
    bh += -step_size * dbh
    Why += -step_size * dWhy
    by += -step_size * dby
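Since the question mentions gradient checking, here is a minimal sketch of how such a check could look for this network. This is my own illustration, not code from the post; the helper name net_loss and the entry (0, 0) are arbitrary choices. It recomputes the analytic gradient at the current parameters and compares one entry of dWxh against a central finite difference:

def net_loss():
    # full-batch loss at the current parameters
    h = np.maximum(0, np.dot(x_train, Wxh) + bh)
    y_est = np.dot(h, Why) + by
    return np.sum((y_est - y_train)**2) / train_size

# analytic gradient of the loss w.r.t. Wxh at the current parameters
h = np.maximum(0, np.dot(x_train, Wxh) + bh)
y_est = np.dot(h, Why) + by
dy = 2 * (y_est - y_train) / train_size
dh = np.dot(dy, Why.T)
dh[h <= 0] = 0
dWxh = np.dot(x_train.T, dh)

# central finite difference on a single weight entry
eps = 1e-5
i, j = 0, 0
Wxh[i, j] += eps
loss_plus = net_loss()
Wxh[i, j] -= 2 * eps
loss_minus = net_loss()
Wxh[i, j] += eps  # restore the original weight
numeric = (loss_plus - loss_minus) / (2 * eps)
print("numeric:", numeric, "analytic:", dWxh[i, j])

If the two values agree to several significant digits, the backward pass is consistent with the forward pass, which matches what the question reports.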

EDIT: It seems the problem was the lack of normalization and the fact that the data was not zero-centered. After applying these transformations to the training data, I managed to get the following result: [image: noisy sqrt2]

1 Answer:

Answer 0 (score: 1):

I can get your code to produce a piecewise-linear approximation:

[image: Piecewise-linear approximation to $y=\sqrt{x}$]

if I zero-centre and normalise the input and output ranges:

# normalise range and domain
x_train -= x_train.mean()
x_train /= x_train.std()
y_train -= y_train.mean()
y_train /= y_train.std()
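One practical follow-up (my own addition, not part of the answer): because the arrays are normalised in place, it helps to save the statistics first so that predictions can be mapped back to the original square-root scale. A variant sketch, with illustrative names like y_mean that do not appear in the answer:

# compute the statistics before the in-place normalisation
x_mean, x_std = x_train.mean(), x_train.std()
y_mean, y_std = y_train.mean(), y_train.std()
x_train = (x_train - x_mean) / x_std
y_train = (y_train - y_mean) / y_std

# ... train as above ...
# a prediction y_est in normalised units is then undone with:
# y_sqrt_scale = y_est * y_std + y_mean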

The plot was produced like this:

import matplotlib.pyplot as plt

# evaluate the trained network on a dense grid over the training range
x = np.linspace(x_train.min(), x_train.max(), 3000)
y = np.dot(np.maximum(0, np.dot(x[:, None], Wxh) + bh), Why) + by
plt.plot(x, y)
plt.show()
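To judge the fit visually, one could also overlay the (normalised) training points on the same axes; this is my own addition, assuming x_train and y_train are the normalised arrays from above:

plt.scatter(x_train, y_train, s=2, alpha=0.3)  # noisy training data
plt.plot(x, y)                                 # network output
plt.show()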