I'm training a CNN model on a text dataset (domain names) with 1 million entries. The model seems to converge very quickly: within a single epoch I'm already at 94% accuracy. Any ideas why this happens?
# Keras 1.x API (Convolution1D / border_mode); in Keras 2 these
# become Conv1D and padding='same'.
from keras.models import Sequential
from keras.layers import Embedding, Convolution1D, Flatten, Dense

def build_model(max_features, maxlen):
    """Build CNN model"""
    model = Sequential()
    # Map each character index to an 8-dimensional embedding
    model.add(Embedding(max_features, 8, input_length=maxlen))
    # Three stacked 1-D convolutions (no activation is specified,
    # so these layers are purely linear)
    model.add(Convolution1D(6, 4, border_mode='same'))
    model.add(Convolution1D(4, 4, border_mode='same'))
    model.add(Convolution1D(2, 4, border_mode='same'))
    model.add(Flatten())
    #model.add(Dropout(0.2))
    #model.add(Dense(2,activation='sigmoid'))
    #model.add(Dense(180,activation='sigmoid'))
    #model.add(Dropout(0.2))
    # Two-way softmax output (binary classification, one-hot labels)
    model.add(Dense(2, activation='softmax'))
    return model
Output from the first three epochs:
Epoch 1/3
950000/950000 [==============================] - 232s - loss: 0.1764 - categorical_accuracy: 0.9356 - fmeasure: 0.9356 - precision: 0.9356 - recall: 0.9356 - val_loss: 0.1579 - val_categorical_accuracy: 0.9418 - val_fmeasure: 0.9418 - val_precision: 0.9418 - val_recall: 0.9418
Epoch 2/3
950000/950000 [==============================] - 232s - loss: 0.1567 - categorical_accuracy: 0.9450 - fmeasure: 0.9450 - precision: 0.9450 - recall: 0.9450 - val_loss: 0.1518 - val_categorical_accuracy: 0.9489 - val_fmeasure: 0.9489 - val_precision: 0.9489 - val_recall: 0.9489
Epoch 3/3
950000/950000 [==============================] - 232s - loss: 0.1515 - categorical_accuracy: 0.9474 - fmeasure: 0.9474 - precision: 0.9474 - recall: 0.9474 - val_loss: 0.1474 - val_categorical_accuracy: 0.9472 - val_fmeasure: 0.9472 - val_precision: 0.9472 - val_recall: 0.9472
3392/3801 [=========================>....] - ETA: 0s[0.15151389103144666, 0.94817153352462946, 0.94817148384625149, 0.94817153391666209, 0.94817153391666209]
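One thing worth ruling out before reading the flat ~94% across all metrics as fast convergence: class imbalance. If roughly 94% of the domains belong to one class, a model that mostly predicts the majority class already scores ~94% accuracy. A minimal sketch of that baseline check, using a hypothetical `labels` list as a stand-in for the real dataset's labels (the 94/6 split here is an assumption for illustration):

```python
from collections import Counter

# Hypothetical labels standing in for the real dataset
# (0 = benign, 1 = malicious); the 94/6 split is assumed.
labels = [0] * 94 + [1] * 6

counts = Counter(labels)
# Share of the most frequent class = accuracy of always
# predicting that class; any model must beat this number.
majority_share = max(counts.values()) / len(labels)
print(f"majority-class baseline accuracy: {majority_share:.2f}")
```

If the baseline comes out near 0.94 on the actual labels, the model may simply be learning the majority class, and metrics such as per-class precision/recall or a confusion matrix would be more informative than overall accuracy.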