Two-dimensional linear regression coefficients

Date: 2017-10-28 10:07:11

Tags: python math scikit-learn linear-algebra linear-regression

I am running a linear regression with two-dimensional variables:

 filtered[['p_tag_x', 'p_tag_y', 's_tag_x', 's_tag_y']].head()

     p_tag_x      p_tag_y            s_tag_x     s_tag_y
35    589.665646  1405.580171        517.5       1636.5
36    589.665646  1405.580171        679.5       1665.5
100   610.546851  2425.303250        569.5       2722.0
101   610.546851  2425.303250        728.0       2710.0
102   717.237730  1411.842428        820.0       1616.5



clt = linear_model.LinearRegression()
clt.fit(filtered[['p_tag_x', 'p_tag_y']], filtered[['s_tag_x', 's_tag_y']])

I got the following regression coefficients:

clt.coef_

array([[ 0.4529769 , -0.22406594],
       [-0.00859452, -0.00816968]])

And the residuals (for X_0 and Y_0):

clt.residues_
array([ 1452.97816371,    69.12754694])

How do I interpret the above coefficient matrix in terms of the regression line(s)?

1 Answer:

Answer 0 (score: 4):

As I already explained in the comments, you get the extra dimension in coef_ (and in intercept_) because you have 2 targets (y has shape (n_samples, n_targets)). In this case sklearn fits 2 independent regressors, one for each target.

You can then unpack these n regressors and treat each one on its own.
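As a quick sanity check (a minimal sketch on made-up random data, not the asker's dataframe), fitting both targets jointly yields exactly the same coefficients as fitting each target separately:

import numpy as np
from sklearn import linear_model

rng = np.random.RandomState(0)
X = rng.rand(10, 2)  # 10 samples, 2 features
Y = rng.rand(10, 2)  # 10 samples, 2 targets

joint = linear_model.LinearRegression().fit(X, Y)        # one fit, 2 targets
sep_0 = linear_model.LinearRegression().fit(X, Y[:, 0])  # target 0 alone
sep_1 = linear_model.LinearRegression().fit(X, Y[:, 1])  # target 1 alone

# row i of the joint coef_ matrix is the coefficient vector of regressor i
print(np.allclose(joint.coef_[0], sep_0.coef_))  # True
print(np.allclose(joint.coef_[1], sep_1.coef_))  # True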

The formula of the regression line is still:

y(w, x) = intercept_ + coef_[0] * x[0] + coef_[1] * x[1] ... 
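To make that concrete, here is a small check (reusing joint and X from the sketch above) that evaluating this formula by hand for target 0 agrees with predict():

x = X[0]  # first sample
manual = joint.intercept_[0] + joint.coef_[0, 0] * x[0] + joint.coef_[0, 1] * x[1]
print(np.isclose(manual, joint.predict(X[:1])[0, 0]))  # True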

Sadly, your example is a bit hard to visualize because of the dimensions involved.

Consider this demo, with lots of ugly hard-coding for this specific case (and bad example data!):

Code:

# Warning: ugly demo-like code using a lot of hard-coding!!!!!

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
from sklearn import linear_model

X = np.array([[589.665646,  1405.580171],
              [589.665646,  1405.580171],
              [610.546851,  2425.303250],
              [610.546851,  2425.303250],
              [717.237730,  1411.842428]])

y = np.array([[517.5,       1636.5],
              [679.5,       1665.5],
              [569.5,       2722.0],
              [728.0,       2710.0],
              [820.0,       1616.5]])

clt = linear_model.LinearRegression()
clt.fit(X, y)

print(clt.coef_)
print(clt.residues_)  # note: residues_ is deprecated since scikit-learn 0.19;
                      # equivalent: np.sum((clt.predict(X) - y) ** 2, axis=0)

def curve_0(x, y):  # target 0; single-point evaluation hardcoded for 2 features!
    return clt.intercept_[0] + x * clt.coef_[0, 0] + y * clt.coef_[0, 1]

def curve_1(x, y):  # target 1; single-point evaluation hardcoded for 2 features!
    return clt.intercept_[1] + x * clt.coef_[1, 0] + y * clt.coef_[1, 1]

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')

xs = [np.amin(X[:, 0]), np.amax(X[:, 0])]
ys = [np.amin(X[:, 1]), np.amax(X[:, 1])]

# regressor 0
ax.scatter(X[:, 0], X[:, 1], y[:, 0], c='blue')
ax.plot([xs[0], xs[1]], [ys[0], ys[1]], [curve_0(xs[0], ys[0]), curve_0(xs[1], ys[1])], c='cyan')

# regressor 1
ax.scatter(X[:, 0], X[:, 1], y[:, 1], c='red')
ax.plot([xs[0], xs[1]], [ys[0], ys[1]], [curve_1(xs[0], ys[0]), curve_1(xs[1], ys[1])], c='magenta')

ax.set_xlabel('X[:, 0] feature 0')
ax.set_ylabel('X[:, 1] feature 1')
ax.set_zlabel('Y')

plt.show()

Output:

[3D plot: the data points of both targets (blue/red) with their fitted regression lines (cyan/magenta)]

Remarks:

  • You don't have to calculate the formula yourself: clt.predict() will do that for you (see the snippet after this list)!
  • The lines of code involving ax.plot(...) use the assumption that our line is defined by just 2 points (as it is linear)!
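For instance (a hypothetical check, assuming the demo above has just been run), clt.predict() reproduces the hard-coded curve functions:

pred = clt.predict(X)  # shape (n_samples, 2): one column per target/regressor
print(np.allclose(pred[:, 0], [curve_0(a, b) for a, b in X]))  # True
print(np.allclose(pred[:, 1], [curve_1(a, b) for a, b in X]))  # True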