Gradient descent for linear regression

Time: 2020-09-25 10:19:48

Tags: python linear-regression

I am trying to find the intercept and coefficient of a regression via gradient descent. I am using simple simulated x and y for this. However, the results I get are far from the actual values.

import random
import numpy as np
import matplotlib.pyplot as plt

# Simulated data: y = 5*x + 10 plus a little uniform noise
x = [10*random.random() for _ in range(100)]
y = [5*elem + 1*random.random() + 10 for elem in x]

plt.scatter(x, y)
x = np.asarray(x, dtype=np.float64)
y = np.asarray(y, dtype=np.float64)

# Building the model
m = 0
c = 0

L = 0.0001  # The learning Rate
epochs = 1000  # The number of iterations to perform gradient descent

n = float(len(x)) # Number of elements in X
# Performing Gradient Descent 
for i in range(epochs): 
   Y_pred = m*x + c  # The current predicted value of Y
   D_m = (-2/n) * sum(x * (y - Y_pred))  # Derivative wrt m
   D_c = (-2/n) * sum(y - Y_pred)  # Derivative wrt c
   m = m - L * D_m  # Update m
   c = c - L * D_c  # Update c

print(m, c)

I get m and c values of about 6.4 and 1, but they should be roughly 5 and 10 respectively. How can I fix this?
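For reference, a quick sanity check I ran: a closed-form least-squares fit on the same x and y (using np.polyfit, not my gradient-descent code) does recover approximately the expected values, so the data itself seems fine:

# Baseline: numpy's least-squares polynomial fit of degree 1
slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)  # roughly 5 and 10.5 (the intercept absorbs the ~0.5 mean of the noise)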

0 Answers:

No answers yet