How do I get the coefficient errors from a linear regression?

Posted: 2019-09-24 14:45:53

Tags: python python-3.x linear-regression

I have been able to compute the coefficients of a linear regression. But is there a way to get the errors on those coefficients? My code is shown below.

import numpy as np

x = np.array([4, 12, 56, 58.6, 67, 89])
y = np.array([5, 6, 7, 16, 18, 19])

degrees = [0, 1]                                       # powers of x to include (intercept and linear term)
matrix = np.stack([x**d for d in degrees], axis=-1)    # design matrix with columns [1, x]
coeff = np.linalg.lstsq(matrix, y, rcond=None)[0]      # least-squares coefficients [a, b] for y = a + b*x
print("Coefficients", coeff)
fit = np.dot(matrix, coeff)                            # fitted values at each x
print("Linear regression", fit)
p1 = np.polyfit(x, y, 1)                               # same fit via polyfit (coefficients highest degree first)

Output:

Coefficients for y = a + bx [3.70720668 0.17012128]
Linear fit [ 4.38769182  5.74866209 13.23399857 13.67631391 15.10533269 18.84800093]

The errors are not shown! How can I compute them?

2 answers:

Answer 0 (score: 1)

You can generate "predicted" values for y, let's call them y_pred, and then compare them with y to get the errors (residuals).

predicted_line = np.poly1d(coeff[::-1])   # poly1d expects the highest-degree coefficient first, so reverse [a, b]
y_pred = predicted_line(x)                # predicted y at each x
errors = y - y_pred                       # residuals between the data and the fit
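
If a single number is preferred, these residuals can be summarised, for example, as a root-mean-square error. A small sketch of my own (not part of the original answer), reusing the errors array from the snippet above:

rmse = np.sqrt(np.mean(errors**2))   # root-mean-square of the residuals
print("Root-mean-square error:", rmse)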

Answer 1 (score: 0)

Although I like David Moseler's solution, if you want a single error metric to judge how good the regression is, you can use the R2 score (based on squared errors), which is already implemented in sklearn:

from sklearn.linear_model import LinearRegression
import numpy as np

x = np.array([4, 12, 56, 58.6,67, 89]).reshape(-1, 1)
y = np.array([5, 6, 7, 16,18, 19])

reg = LinearRegression().fit(x, y)
reg.score(x, y) # R2 score
# 0.7481301984276703

If the R2 value is close to 1, the model fits the data well.
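
Neither snippet above reports the uncertainty of the coefficients themselves, which is what the question asks for. One common way to obtain them is to request the covariance matrix from np.polyfit; a minimal sketch (my own addition, not from either answer), assuming the same x and y as in the question:

import numpy as np

x = np.array([4, 12, 56, 58.6, 67, 89])
y = np.array([5, 6, 7, 16, 18, 19])

# cov=True also returns the covariance matrix of the fitted coefficients
coeffs, cov = np.polyfit(x, y, 1, cov=True)   # coeffs = [slope, intercept]
coeff_errors = np.sqrt(np.diag(cov))          # one-sigma uncertainty of each coefficient
print("slope     = %.4f +/- %.4f" % (coeffs[0], coeff_errors[0]))
print("intercept = %.4f +/- %.4f" % (coeffs[1], coeff_errors[1]))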