Early stopping not working in Keras code with TensorFlow

Asked: 2019-05-19 14:10:37

Tags: python, python-3.x, tensorflow, keras-2

When I use early stopping, the model trains for only one epoch, which is not what should happen.

Here is the example without early stopping:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras.callbacks import EarlyStopping

# split a univariate sequence into samples

def split_sequence(sequence, n_steps):
    X, y = list(), list()
    for i in range(len(sequence)):
        # find the end of this pattern
        end_ix = i + n_steps
        # check if we are beyond the sequence
        if end_ix > len(sequence)-1:
            break
        # gather input and output parts of the pattern
        seq_x, seq_y = sequence[i:end_ix], sequence[end_ix]
        X.append(seq_x)
        y.append(seq_y)
    return np.array(X), np.array(y)
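
For illustration (my own addition, not part of the original post), here is a quick check of what split_sequence returns for a short sequence:

# Sanity check of split_sequence (illustrative example, not from the original post).
# With n_steps = 3, each sample holds 3 consecutive values and the target is the next value.
demo = np.arange(10, 80, 10)          # [10 20 30 40 50 60 70]
X_demo, y_demo = split_sequence(demo, 3)
print(X_demo.shape, y_demo.shape)     # (4, 3) (4,)
print(X_demo[0], y_demo[0])           # [10 20 30] 40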

sequence = np.arange(10, 1000, 10)

n_steps = 3

X, y = split_sequence(sequence, n_steps)

n_features = 1
X = X.reshape((X.shape[0], X.shape[1], n_features))

model = Sequential()
model.add(LSTM(50, activation='relu', input_shape=(n_steps, n_features)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mean_absolute_percentage_error')


# early_stopping = EarlyStopping(monitor='val_loss', patience= 5)

hist = model.fit(X, y, validation_split=0.2,  batch_size = 16, epochs = 200)
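
As an aside (my addition, assuming the run above has finished), the loss curves shown in the screenshots can also be read back from the returned History object:

# Inspect the recorded losses programmatically (illustrative, not from the original post).
# With validation_split set, hist.history contains both 'loss' and 'val_loss' lists.
print(hist.history.keys())            # dict_keys(['loss', 'val_loss'])
print(len(hist.history['loss']))      # number of epochs actually run
print(min(hist.history['val_loss']))  # best validation loss observed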

As you can see from the screenshots below, the loss keeps decreasing for well over the first 15 epochs:

[screenshots of the training log]

Now, if I try to use early stopping, it stops after the first epoch:

hist = model.fit(X, y, validation_split=0.2,  callbacks = [EarlyStopping(patience=5)], batch_size = 16)

[screenshot of the training log]

What am I doing wrong, and how can I fix it?

1 Answer:

Answer 0 (score: 0)

You forgot to specify the number of epochs in this call, so it defaults to 1:

hist = model.fit(X, y, validation_split=0.2,  callbacks = [EarlyStopping(patience=5)], batch_size = 16)

Change it to:

hist = model.fit(X, y, validation_split=0.2,  callbacks=[EarlyStopping(patience=5)], batch_size=16, epochs=200)
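
Optionally (my own suggestion, not part of the original answer), you can make the callback's behaviour explicit by monitoring the validation loss and rolling back to the best weights; restore_best_weights is available in recent Keras 2 releases:

# More explicit variant (sketch): monitor val_loss and restore the best epoch's weights.
early_stopping = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)
hist = model.fit(X, y, validation_split=0.2, callbacks=[early_stopping], batch_size=16, epochs=200)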

Cheers