I have a rather unusual problem: a multiple linear regression in which my goal is to find the intercept while ensuring that the coefficients sum to at most 1 and that no coefficient is negative. I spent a lot of time searching online and found a good answer here:
The code below shows how I override the regression coefficients with the output of the code shared in that answer. My question now is: given these custom coefficients, how do I compute the new intercept value?
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_boston

X, Y = load_boston(return_X_y=True)
X = X[:, :3]  # keep only the three features the model below uses
# Define the Model
model = lambda b, X: b[0] * X[:,0] + b[1] * X[:,1] + b[2] * X[:,2]
# The objective Function to minimize (least-squares regression)
obj = lambda b, Y, X: np.sum(np.abs(Y-model(b, X))**2)
# Bounds: b[0], b[1], b[2] >= 0
bnds = [(0, None), (0, None), (0, None)]
# Constraint: b[0] + b[1] + b[2] <= 1 (scipy "ineq" means fun(b) >= 0)
cons = [{"type": "ineq", "fun": lambda b: 1 - (b[0] + b[1] + b[2])}]
# Initial guess for b[0], b[1], b[2]:
xinit = np.array([0, 0, 1])
res = minimize(obj, args=(Y, X), x0=xinit, bounds=bnds, constraints=cons)
print(f"b1={res.x[0]}, b2={res.x[1]}, b3={res.x[2]}")
# Save the coefficients for further analysis of goodness of fit
beta1 = res.x[0]
beta2 = res.x[1]
beta3 = res.x[2]
from sklearn.linear_model import LinearRegression
model2 = LinearRegression()  # LinearRegression has no `nonnegative` argument; the default fit is unconstrained
model2.fit(X, Y)
print("Regression intecept = {}".format(model2.intercept_))
print("Regression coefficient(s) -> \n{}".format(model2.coef_))
r_sq_model2 = model2.score(X, y)
print("Regression R-squared = {}".format(r_sq_model2))
model2.coef_ = np.array([ beta1, beta2, beta3 ])
print("\n* Overriden Regression coefficient(s) -> \n{}".format(model2.coef_))
r_sq_model2 = model2.score(X, y)
print("Regression R-squared with adj coeff(s) = {}".format(r_sq_model2))
# HOW DO I FIND THE NEW INTERCEPT?
Thanks for your help.
Answer 0 (score: 1)
Add the intercept to the model definition, like this:
model = lambda b, X: b[3] + b[0] * X[:,0] + b[1] * X[:,1] + b[2] * X[:,2]

Now you can use b[3] directly as the intercept.
Hope this helps!
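
To make this concrete, here is a minimal sketch of the complete workflow with the intercept added as a fourth parameter b[3]. The answer only gives the extended model; leaving b[3] unbounded and excluding it from the sum constraint are my assumptions about the intended setup:

import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_boston  # removed in scikit-learn >= 1.2; substitute any regression dataset
from sklearn.linear_model import LinearRegression

X, Y = load_boston(return_X_y=True)
X = X[:, :3]  # same three features as in the question

# Model with an intercept term b[3]
model = lambda b, X: b[3] + b[0] * X[:,0] + b[1] * X[:,1] + b[2] * X[:,2]
obj = lambda b, Y, X: np.sum((Y - model(b, X)) ** 2)

# b[0], b[1], b[2] >= 0; the intercept b[3] is left unbounded (assumption)
bnds = [(0, None), (0, None), (0, None), (None, None)]
# b[0] + b[1] + b[2] <= 1; the intercept is excluded from the sum (assumption)
cons = [{"type": "ineq", "fun": lambda b: 1 - (b[0] + b[1] + b[2])}]

xinit = np.array([0, 0, 1, 0])
res = minimize(obj, args=(Y, X), x0=xinit, bounds=bnds, constraints=cons)

# Plug both the coefficients and the intercept into the sklearn model
model2 = LinearRegression().fit(X, Y)
model2.coef_ = res.x[:3]
model2.intercept_ = res.x[3]
print("R-squared with constrained fit = {}".format(model2.score(X, Y)))

Once coef_ and intercept_ are overwritten like this, predict and score use the constrained parameters.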
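
Alternatively, if you keep the three-parameter fit from the question and only need an intercept for coefficients that are already fixed, the squared-error-minimizing intercept has a closed form: the mean residual. This is standard least-squares algebra rather than something stated in the answer:

# Setting d/dc sum((Y - X @ beta - c)**2) = 0 gives c = mean(Y - X @ beta)
model2.intercept_ = (Y - X @ model2.coef_).mean()
print("New intercept = {}".format(model2.intercept_))
print("R-squared with adjusted intercept = {}".format(model2.score(X, Y)))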