I imported GridsearchCV from sklearn to do this. I don't know what values I should put in the array for the alpha parameter:
Parameters={'alpha':[array]}
Ridge_reg=GridsearchCV (ridge,parameters,scoring='neg mean squared error',cv=5)
Answer 0 (score: 0)
The code you posted has several syntax errors, for example GridsearchCV (the class is spelled GridSearchCV) and scoring='neg mean squared error' (the scorer name is 'neg_mean_squared_error').
The first input argument should be an estimator object (the model).
Use this:
from sklearn.linear_model import Ridge
import numpy as np
from sklearn.model_selection import GridSearchCV
n_samples, n_features = 10, 5
rng = np.random.RandomState(0)
y = rng.randn(n_samples)
X = rng.randn(n_samples, n_features)
parameters = {'alpha':[1, 10]}
# define the model / estimator
model = Ridge()
# define the grid search
Ridge_reg = GridSearchCV(model, parameters, scoring='neg_mean_squared_error', cv=5)
# fit the grid search
Ridge_reg.fit(X, y)
# best estimator found by the search
print(Ridge_reg.best_estimator_)
# refit the best model on the data
best_model = Ridge_reg.best_estimator_
best_model.fit(X, y)
...
...
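To answer the original question about what goes in the alpha array: any list of candidate values works, and a log-spaced grid is a common choice. A minimal sketch (the grid values here are my own illustrative choice, not from the answer above), reusing X and y from the snippet:

# any candidate alphas can go in the grid; a log-spaced range is a common choice
parameters = {'alpha': np.logspace(-3, 3, 13)}   # illustrative values, not from the original answer
Ridge_reg = GridSearchCV(Ridge(), parameters, scoring='neg_mean_squared_error', cv=5)
Ridge_reg.fit(X, y)
# which alpha the search picked, and its mean cross-validated score
print(Ridge_reg.best_params_)
print(Ridge_reg.best_score_)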
For visualization (Ridge coefficients as a function of the regularization):
import matplotlib.pyplot as plt
alphas = [1, 10]
coefs = []
for a in alphas:
    ridge = Ridge(alpha=a, fit_intercept=False)
    ridge.fit(X, y)
    coefs.append(ridge.coef_)
ax = plt.gca()
ax.plot(alphas, coefs)
ax.set_xscale('log')
ax.set_xlim(ax.get_xlim()[::-1]) # reverse axis
plt.xlabel('alpha')
plt.ylabel('weights')
plt.title('Ridge coefficients as a function of the regularization')
plt.axis('tight')
plt.show()
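With only two alphas the plotted path is a single segment; a denser, log-spaced grid (my assumption here, not part of the original answer) makes the shrinkage behaviour visible. Only the grid and the loop need to change; the plotting code stays the same:

alphas = np.logspace(-4, 4, 100)   # denser illustrative grid instead of [1, 10]
coefs = []
for a in alphas:
    ridge = Ridge(alpha=a, fit_intercept=False).fit(X, y)
    coefs.append(ridge.coef_)
# re-running the plotting block above now draws the full coefficient path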