sklearn is not respecting the n_iter param: more iterations are run than specified

Time: 2017-08-15 11:23:04

Tags: scikit-learn

The following is something that has puzzled me for a while now. If you have run into the same thing, I hope this helps.

I have the following simple code:

from sklearn.linear_model import Perceptron

with_model_analysis = Perceptron(n_iter=2, warm_start=True, verbose=1)

When I run the following code:

with_model_analysis.fit(X_train, Y_train)

I get the following verbose output:

-- Epoch 1
Norm: 2117.10, NNZs: 151491, Bias: -0.200000, T: 2438128, Avg. loss: 0.136197
Total training time: 1.57 seconds.
-- Epoch 2
Norm: 2152.62, NNZs: 152310, Bias: -0.210000, T: 4876256, Avg. loss: 0.138114
Total training time: 3.14 seconds.
-- Epoch 1
Norm: 2864.00, NNZs: 144626, Bias: -0.250000, T: 2438128, Avg. loss: 0.140278
Total training time: 1.57 seconds.
-- Epoch 2
Norm: 2908.83, NNZs: 145051, Bias: -0.240000, T: 4876256, Avg. loss: 0.141844
Total training time: 3.13 seconds.
-- Epoch 1
Norm: 996.64, NNZs: 55420, Bias: -0.160000, T: 2438128, Avg. loss: 0.012540
Total training time: 1.59 seconds.
-- Epoch 2
Norm: 1013.77, NNZs: 56011, Bias: -0.150000, T: 4876256, Avg. loss: 0.012728
Total training time: 3.18 seconds.
-- Epoch 1
Norm: 2850.54, NNZs: 176581, Bias: -0.270000, T: 2438128, Avg. loss: 0.209191
Total training time: 1.58 seconds.
-- Epoch 2
Norm: 2895.90, NNZs: 177293, Bias: -0.260000, T: 4876256, Avg. loss: 0.211221
Total training time: 3.18 seconds.
-- Epoch 1
Norm: 1489.41, NNZs: 80787, Bias: -0.270000, T: 2438128, Avg. loss: 0.029082
Total training time: 1.54 seconds.
-- Epoch 2
Norm: 1516.51, NNZs: 81432, Bias: -0.290000, T: 4876256, Avg. loss: 0.029050
Total training time: 3.06 seconds.
-- Epoch 1
Norm: 2718.56, NNZs: 191107, Bias: 0.190000, T: 2438128, Avg. loss: 0.178792
Total training time: 1.48 seconds.
-- Epoch 2
Norm: 2762.41, NNZs: 191638, Bias: 0.220000, T: 4876256, Avg. loss: 0.181443
Total training time: 2.99 seconds.
[Parallel(n_jobs=1)]: Done   6 out of   6 | elapsed:   28.5s finished

What does the last line, Done 6 out of 6, mean? And why does it run 6 * 2 epochs when only 2 iterations were requested?

1 answer:

Answer 0 (score: 1):

The 6 is the number of output classes. For multiclass targets, Perceptron trains one one-vs-rest decision boundary per class, so each class gets its own binary sub-problem and each of those sub-problems runs for n_iter epochs. That is why the log shows 6 blocks of 2 epochs, and the final joblib line Done 6 out of 6 simply reports that all 6 binary fits have finished.
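
To make this concrete, here is a minimal sketch. It uses synthetic data from make_classification rather than the asker's dataset, and keeps the older n_iter parameter from the question (newer scikit-learn releases use max_iter instead). It shows that one binary sub-problem is fit per class:

from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron

# Synthetic 6-class problem, standing in for the asker's data
X, y = make_classification(n_samples=1000, n_features=50, n_informative=10,
                           n_classes=6, random_state=0)

clf = Perceptron(n_iter=2, warm_start=True, verbose=1)
clf.fit(X, y)

# One weight vector per class confirms the one-vs-rest scheme
print(clf.coef_.shape)    # (6, 50): 6 binary classifiers, one per class
print(len(clf.classes_))  # 6

With verbose=1 this prints 6 blocks of 2 epochs each, matching the log above: the requested n_iter is respected per binary classifier, not in total.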