scikit-learn SGDClassifier warm start is being ignored

Date: 2014-08-29 20:51:45

Tags: python machine-learning scikit-learn

I am trying to use SGDClassifier from scikit-learn 0.15.1. There does not seem to be any way to set a convergence criterion other than the number of iterations. So I want to do it by hand: check the error after each round of iterations, then warm-start further iterations until the improvement is small enough.

Unfortunately, neither the warm_start flag nor coef_init / intercept_init seems to actually warm-start the optimization - both appear to start from scratch.

What can I do? Without a real convergence criterion or a working warm start, the classifier is hard to use.

Note below how the bias jumps by a large amount on every restart, and how the loss spikes up and then decreases again over further iterations. After 250 iterations the bias is -3.44 and the average loss is 1.46.

from sklearn.linear_model import SGDClassifier

sgd = SGDClassifier(loss='log', alpha=alpha, verbose=1, shuffle=True,
                    warm_start=True)
print('INITIAL FIT')
sgd.fit(X, y, sample_weight=sample_weight)

sgd.n_iter = 1  # request a single additional epoch on the next fit
print('\nONE MORE ITERATION')
sgd.fit(X, y, sample_weight=sample_weight)

sgd.n_iter = 3  # request three additional epochs
print('\nTHREE MORE ITERATIONS')
sgd.fit(X, y, sample_weight=sample_weight)


INITIAL FIT
-- Epoch 1
Norm: 254.11, NNZs: 92299, Bias: -5.239955, T: 122956, Avg. loss: 28.103236
Total training time: 0.04 seconds.
-- Epoch 2
Norm: 138.81, NNZs: 92598, Bias: -5.180938, T: 245912, Avg. loss: 16.420537
Total training time: 0.08 seconds.
-- Epoch 3
Norm: 100.61, NNZs: 92598, Bias: -5.082776, T: 368868, Avg. loss: 12.240537
Total training time: 0.12 seconds.
-- Epoch 4
Norm: 74.18, NNZs: 92598, Bias: -5.076395, T: 491824, Avg. loss: 9.859404
Total training time: 0.17 seconds.
-- Epoch 5
Norm: 55.57, NNZs: 92598, Bias: -5.072369, T: 614780, Avg. loss: 8.280854
Total training time: 0.21 seconds.

ONE MORE ITERATION
-- Epoch 1
Norm: 243.07, NNZs: 92598, Bias: -11.271497, T: 122956, Avg. loss: 26.148746
Total training time: 0.04 seconds.

THREE MORE ITERATIONS
-- Epoch 1
Norm: 258.70, NNZs: 92598, Bias: -16.058395, T: 122956, Avg. loss: 29.666688
Total training time: 0.04 seconds.
-- Epoch 2
Norm: 142.24, NNZs: 92598, Bias: -15.809559, T: 245912, Avg. loss: 17.435114
Total training time: 0.08 seconds.
-- Epoch 3
Norm: 102.71, NNZs: 92598, Bias: -15.715853, T: 368868, Avg. loss: 12.731181
Total training time: 0.12 seconds.

1 Answer:

Answer 0 (score: 6)

warm_start=True will use the fitted coefficients as a starting point, but it restarts the learning rate schedule.
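To see this concretely, here is a minimal sketch on hypothetical toy data (X_toy, y_toy, and the other names below are illustrative, not from the question): a second call to fit resumes from the previously fitted coef_, but the sample counter T in the verbose output starts over, i.e. the learning-rate schedule is reset.

import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)
X_toy = rng.randn(200, 5)
y_toy = (X_toy[:, 0] > 0).astype(int)

clf = SGDClassifier(loss='log', warm_start=True, n_iter=1, verbose=1)
clf.fit(X_toy, y_toy)
coef_after_first_fit = clf.coef_.copy()  # kept as the starting point

# This second fit resumes from coef_after_first_fit (the coefficients are not
# re-zeroed), but the verbose "T:" counter starts again from the first sample,
# so the step sizes are large again.
clf.fit(X_toy, y_toy)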

If you want to check for convergence manually, I would suggest using partial_fit instead of fit, as @AdrienNK suggested:

import numpy as np
from sklearn.linear_model import SGDClassifier

sgd = SGDClassifier(loss='log', alpha=alpha, verbose=1, shuffle=True,
                    warm_start=True, n_iter=1)
# the first partial_fit call needs the complete set of class labels
sgd.partial_fit(X, y, classes=np.unique(y))
# after 1st iteration
sgd.partial_fit(X, y)
# after 2nd iteration
...
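Putting it together, a hedged sketch of a manual convergence loop around partial_fit could look like the following. X, y, alpha, and sample_weight are assumed to exist as in the question; tol and max_epochs are hypothetical names chosen here for illustration, and the log-loss is recomputed explicitly because partial_fit does not report it:

import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss

sgd = SGDClassifier(loss='log', alpha=alpha, shuffle=True)
classes = np.unique(y)

tol = 1e-4          # hypothetical: stop once the improvement falls below this
max_epochs = 250    # hypothetical safety cap
prev_loss = np.inf

for epoch in range(max_epochs):
    # one pass over the data; the learning-rate schedule keeps advancing
    sgd.partial_fit(X, y, classes=classes, sample_weight=sample_weight)
    cur_loss = log_loss(y, sgd.predict_proba(X))
    if prev_loss - cur_loss < tol:
        break
    prev_loss = cur_loss

Because partial_fit never resets the step-size counter, each epoch continues the same schedule that a single long fit would have used.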