How can I get the loss of the best epoch when using early stopping in Keras?

Asked: 2019-11-20 13:30:55

Tags: python tensorflow keras neural-network

I managed to implement early stopping in my Keras model, but I am not sure how I can view the loss of the best epoch.

es = EarlyStopping(monitor='val_out_soft_loss',
                   mode='min',
                   restore_best_weights=True,
                   verbose=2,
                   patience=10)

model.fit(tr_x,
          tr_y,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1,
          callbacks=[es],
          validation_data=(val_x, val_y))
loss = model.history.history["val_out_soft_loss"][-1]
return model, loss

The way I have defined the loss score means that the returned score comes from the last epoch, not from the best epoch.

Example:

from sklearn.model_selection import train_test_split, KFold
import numpy as np  # needed for np.mean below

losses = []
models = []
for k in range(2):
    kfold = KFold(5, random_state = 42 + k, shuffle = True)
    for k_fold, (tr_inds, val_inds) in enumerate(kfold.split(train_y)):
        print("-----------")
        print("-----------")
        model, loss = get_model(64, 100)
        models.append(model)
        print(k_fold, loss)
        losses.append(loss)
print("-------")
print(losses)
print(np.mean(losses))

Epoch 23/100
18536/18536 [==============================] - 7s 362us/step - loss: 0.0116 - out_soft_loss: 0.0112 - out_reg_loss: 0.0393 - val_loss: 0.0131 - val_out_soft_loss: 0.0127 - val_out_reg_loss: 0.0381

Epoch 24/100
18536/18536 [==============================] - 7s 356us/step - loss: 0.0116 - out_soft_loss: 0.0112 - out_reg_loss: 0.0388 - val_loss: 0.0132 - val_out_soft_loss: 0.0127 - val_out_reg_loss: 0.0403

Restoring model weights from the end of the best epoch
Epoch 00024: early stopping
0 0.012735568918287754

So in this example, I would like to see the loss at Epoch 00014 (i.e. 0.0124).

I also have a separate question: how can I set the number of decimal places shown for the val_out_soft_loss score?

1 Answer:

Answer 0 (score: 1)

Assign the fit() call in Keras to a variable so that you can track the metrics across epochs.

history = model.fit(tr_x, ...

fit() returns a History object; its history attribute is a dictionary that you can access like this:

loss_hist = history.history['loss']

Then take the min(), max(), or whatever you need:

np.min(loss_hist)
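Putting both pieces together, here is a minimal sketch of extracting the best-epoch loss and controlling the decimal places. The history_dict below is a hypothetical stand-in for history.history (in practice it comes from history = model.fit(...)), and the values are illustrative, not from your run:

```python
import numpy as np

# Stand-in for history.history; in real code: history = model.fit(...)
history_dict = {
    "val_out_soft_loss": [0.0190, 0.0151, 0.0124, 0.0127, 0.0131],
}

# Loss at the best epoch (lowest validation loss), which is the epoch
# that restore_best_weights=True rolls the model back to
best_loss = float(np.min(history_dict["val_out_soft_loss"]))
best_epoch = int(np.argmin(history_dict["val_out_soft_loss"]))  # 0-indexed

# Limiting the displayed decimal places is plain string formatting
print(f"Best epoch {best_epoch + 1}: {best_loss:.4f}")  # Best epoch 3: 0.0124
```

In your k-fold loop you could then append best_loss instead of the last-epoch value before taking np.mean(losses).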