When the data is offset (not centered around zero), LinearSVC() and SVC(kernel='linear') produce very different results. (Edit: the problem may be that it does not handle non-normalized data.)
import matplotlib.pyplot as plot
plot.ioff()
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC, SVC

def plot_hyperplane(m, X):
    # Draw the separating line w0*x + w1*y + b = 0 over the range of the data.
    w = m.coef_[0]
    a = -w[0] / w[1]
    xx = np.linspace(np.min(X[:, 0]), np.max(X[:, 0]))
    yy = a * xx - (m.intercept_[0]) / w[1]
    plot.plot(xx, yy, 'k-')

# Two well-separated blobs, shifted far away from the origin.
X, y = make_blobs(n_samples=100, centers=2, n_features=2,
                  center_box=(0, 1))
X[y == 0] = X[y == 0] + 100
X[y == 1] = X[y == 1] + 110

for i, m in enumerate((LinearSVC(), SVC(kernel='linear'))):
    m.fit(X, y)
    plot.subplot(1, 2, i + 1)
    plot_hyperplane(m, X)
    plot.plot(X[y == 0, 0], X[y == 0, 1], 'r.')
    plot.plot(X[y == 1, 0], X[y == 1, 1], 'b.')
    # Color a grid of points by predicted class to show each decision region.
    xv, yv = np.meshgrid(np.linspace(98, 114, 10), np.linspace(98, 114, 10))
    _X = np.c_[xv.reshape((xv.size, 1)), yv.reshape((yv.size, 1))]
    _y = m.predict(_X)
    plot.plot(_X[_y == 0, 0], _X[_y == 0, 1], 'r.', alpha=0.4)
    plot.plot(_X[_y == 1, 0], _X[_y == 1, 1], 'b.', alpha=0.4)

plot.show()
Here is what I get (left = LinearSVC(), right = SVC(kernel='linear')):
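For completeness, a minimal sketch (reusing the X and y generated above) that prints the fitted parameters, so the difference can also be seen numerically rather than only in the plot:

for m in (LinearSVC(), SVC(kernel='linear')):
    m.fit(X, y)
    # Both estimators expose coef_ and intercept_ for a linear decision function.
    print(type(m).__name__, m.coef_[0], m.intercept_[0])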
sklearn.__version__ is 0.17, but I have also tested it on Ubuntu 14.04, which ships with 0.15.
I considered reporting this as a bug, but it seems too obvious to be one. What am I missing?
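As a rough check of the edit above (assuming the discrepancy really does come from the data not being centered), a minimal sketch that standardizes X with StandardScaler before fitting; with the offset removed, the two classifiers would be expected to agree much more closely:

from sklearn.preprocessing import StandardScaler

Xc = StandardScaler().fit_transform(X)  # remove the ~100/110 offset and rescale
for m in (LinearSVC(), SVC(kernel='linear')):
    m.fit(Xc, y)
    print(type(m).__name__, m.coef_[0], m.intercept_[0])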