I wrote a linear-regression program, shown below. The file data.csv has two columns: X and Y.
Here is my code.
The final output after 1000 iterations:
Iteration: 1000  Cost: 56.014846105  theta: [[1.1395461] [1.45709467]]
From this I read the fitted line as y = 1.139 + 1.457X.
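To spell out how I read that theta: since a column of ones is the first column of x (as in the code below), theta[0] is the intercept and theta[1] is the slope. A quick check with the printed values:

# theta values copied from the output above; first entry = intercept, second = slope
theta = [[1.1395461], [1.45709467]]
print('y = {} + {} * x'.format(theta[0][0], theta[1][0]))  # y = 1.1395461 + 1.45709467 * x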
The next code, on the same dataset:
import numpy as np

def gradientDescent(x, y, theta, alpha, m, numIterations):
    xTrans = x.T
    for i in range(numIterations):
        print('Iteration : ', i + 1)
        hypo = np.dot(x, theta)                    # predictions for the current theta
        cost = np.sum((hypo - y) ** 2) / (2 * m)   # squared-error cost
        print('Cost : ', cost)
        gradient = np.dot(xTrans, (hypo - y)) / m  # gradient of the cost w.r.t. theta
        theta = theta - alpha * gradient           # gradient-descent update
        print('theta : ', theta)
    return theta

data = np.loadtxt('/Users/Nikesh/Downloads/linear_regression_live-master/data.csv', delimiter=',')
x = data[:, 0:1]
y = data[:, 1:2]

a = np.ones((100, 1))        # column of ones for the intercept term (the dataset has 100 rows)
x = np.append(a, x, axis=1)  # design matrix [1, X]

m, n = np.shape(x)
numIterations = 1000
alpha = 0.0005
theta = np.ones(n)
theta = theta[:, np.newaxis]  # shape (2, 1) to match y

theta = gradientDescent(x, y, theta, alpha, m, numIterations)
The output is:
Coefficients: [[1.32243102]]  Intercept: [7.99102099]  Regression line: y = [7.99102099] + [[1.32243102]] * X
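That Coefficients/Intercept wording is the format scikit-learn's LinearRegression reports, so as a rough sketch (my assumption of how such numbers are obtained, not necessarily the exact code behind the output above):

import numpy as np
from sklearn.linear_model import LinearRegression

data = np.loadtxt('data.csv', delimiter=',')  # the same two-column file: X, Y
X = data[:, 0:1]
y = data[:, 1:2]

reg = LinearRegression().fit(X, y)
print('Coefficients:', reg.coef_)    # a (1, 1) array, like the [[...]] above
print('Intercept:', reg.intercept_)  # a length-1 array, like the [...] above
print('Regression line: y =', reg.intercept_, '+', reg.coef_, '* X')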
The last one is something I found online: applying linear regression (the gradient descent algorithm) to the same dataset gives yet another answer (take a look at this image).
Can someone help me figure out where I went wrong?
Here is the link to the dataset: https://github.com/llSourcell/linear_regression_live/blob/master/data.csv
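For reference, a minimal sketch of the exact least-squares solution for the same design matrix (assuming data.csv is in the working directory), which either of the runs above can be compared against:

import numpy as np

data = np.loadtxt('data.csv', delimiter=',')                # two columns: X, Y
X = np.hstack([np.ones((data.shape[0], 1)), data[:, 0:1]])  # design matrix [1, X]
y = data[:, 1:2]

theta_exact, *_ = np.linalg.lstsq(X, y, rcond=None)         # closed-form least squares
print('Closed-form theta (intercept, slope):', theta_exact.ravel())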