Difference between LinearRegression() and Ridge(alpha=0)

Date: 2016-11-13 03:48:47

标签: python machine-learning scikit-learn regression linear-regression

When the alpha parameter goes to zero, the Tikhonov (ridge) cost becomes equal to the least-squares cost. Everything in the scikit-learn docs about the subject says the same. So I expected

sklearn.linear_model.LinearRegression().fit(data, target)

to be equivalent to

sklearn.linear_model.Ridge(alpha=1e-100).fit(data, target)

But it's not. Why?
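
For reference, this is the comparison I have in mind, written as a quick check of my own with NumPy's closed-form normal equations (not code from the docs; note that sklearn also fits an unpenalized intercept by default, which this sketch ignores):

import numpy as np

# made-up design matrix and targets, purely for illustration
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 2.0])

# closed-form OLS:   w = (X'X)^-1 X'y
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# closed-form ridge: w = (X'X + alpha*I)^-1 X'y
alpha = 1e-100
w_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

print(w_ols, w_ridge)  # virtually identical when alpha vanishes

With the closed-form solutions the two results are indistinguishable, which is why I expected the sklearn estimators to agree as well.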

Updated with code:

import pandas as pd
from sklearn.linear_model import Ridge, LinearRegression
from sklearn.preprocessing import PolynomialFeatures
import matplotlib.pyplot as plt
%matplotlib inline

dataset = pd.read_csv('house_price_data.csv')
X = dataset['sqft_living'].reshape(-1, 1)
Y = dataset['price'].reshape(-1, 1)

polyX = PolynomialFeatures(degree=15).fit_transform(X)

model1 = LinearRegression().fit(polyX, Y)
model2 = Ridge(alpha=1e-100).fit(polyX, Y)

plt.plot(X, Y, '.',
         X, model1.predict(polyX), 'g-',
         X, model2.predict(polyX), 'r-')

Note: the plot looks the same for alpha=1e-8.

[figure: scatter of the housing data with the LinearRegression (green) and Ridge (red) polynomial fits]

1 Answer:

Answer 0 (score: 4)

According to the documentation, alpha must be a positive float. Your example passes alpha=0 as an integer. With a small positive alpha, the results of Ridge and LinearRegression appear to converge.

from sklearn.linear_model import Ridge, LinearRegression
data = [[0, 0], [1, 1], [2, 2]]
target = [0, 1, 2]

ridge_model = Ridge(alpha=1e-8).fit(data, target)
print("RIDGE COEFS: " + str(ridge_model.coef_))
ols = LinearRegression().fit(data,target)
print("OLS COEFS: " + str(ols.coef_))

# RIDGE COEFS: [ 0.49999999  0.50000001]
# OLS COEFS: [ 0.5  0.5]
#
# VS. with alpha=0:
# RIDGE COEFS: [  1.57009246e-16   1.00000000e+00]
# OLS COEFS: [ 0.5  0.5]

UPDATE: The problem with alpha=0 as an int above seems to be an issue only with a few toy problems like the example above.

For the housing data, the problem is one of scaling. The 15th-degree polynomial you are invoking causes numerical overflow. To produce identical results from LinearRegression and Ridge, try scaling your data first:

import pandas as pd
from sklearn.linear_model import Ridge, LinearRegression
from sklearn.preprocessing import PolynomialFeatures, scale

dataset = pd.read_csv('house_price_data.csv')

# scale the X data to prevent numerical errors.
X = scale(dataset['sqft_living'].reshape(-1, 1))
Y = dataset['price'].reshape(-1, 1)

polyX = PolynomialFeatures(degree=15).fit_transform(X)

model1 = LinearRegression().fit(polyX, Y)
model2 = Ridge(alpha=0).fit(polyX, Y)

print("OLS Coefs: " + str(model1.coef_[0]))
print("Ridge Coefs: " + str(model2.coef_[0]))

#OLS Coefs: [  0.00000000e+00   2.69625315e+04   3.20058010e+04  -8.23455994e+04
#  -7.67529485e+04   1.27831360e+05   9.61619464e+04  -8.47728622e+04
#  -5.67810971e+04   2.94638384e+04   1.60272961e+04  -5.71555266e+03
#  -2.10880344e+03   5.92090729e+02   1.03986456e+02  -2.55313741e+01]
#Ridge Coefs: [  0.00000000e+00   2.69625315e+04   3.20058010e+04  -8.23455994e+04
#  -7.67529485e+04   1.27831360e+05   9.61619464e+04  -8.47728622e+04
#  -5.67810971e+04   2.94638384e+04   1.60272961e+04  -5.71555266e+03
#  -2.10880344e+03   5.92090729e+02   1.03986456e+02  -2.55313741e+01]
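
As a quick sanity check (my own addition, reusing model1, model2, and polyX from the snippet above), you can verify numerically that the two fits coincide:

import numpy as np

# both should print True, given the identical coefficients shown above
print(np.allclose(model1.predict(polyX), model2.predict(polyX)))
print(np.allclose(model1.coef_, model2.coef_))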