Relationship between training accuracy and validation accuracy

Time: 2019-12-04 01:06:40

Tags: neural-network cross-validation training-data

While training a CNN model, I noticed various behaviors between the training and validation accuracy. I understand that 'The training set is used to train the model, while the validation set is only used to evaluate the model's performance...', but I would like to know whether there is any relationship between the training and validation accuracy and, if so:

1) what exactly is happening when the training and validation accuracy change during training;

2) what the different behaviors mean.

For example, some say that if training accuracy > validation accuracy, there is an overfitting problem. What does it mean when one is greater than the other, and vice versa?

Here is the code:

import keras
from keras import layers
from keras.layers import Conv1D, Dense
from sklearn.model_selection import train_test_split

inputs_1 = keras.Input(shape=(10081, 1))

# Convolutional branch over the 10081-point 1-D signal
layer1 = Conv1D(64, 14)(inputs_1)
layer2 = layers.MaxPool1D(5)(layer1)
layer3 = Conv1D(64, 14)(layer2)
layer4 = layers.GlobalMaxPooling1D()(layer3)

# Second input: 104 extra features, concatenated with the pooled conv features
inputs_2 = keras.Input(shape=(104,))
layer5 = layers.concatenate([layer4, inputs_2])
layer6 = Dense(128, activation='relu')(layer5)
layer7 = Dense(2, activation='softmax')(layer6)


model_2 = keras.models.Model(inputs=[inputs_1, inputs_2], outputs=[layer7])  # Keras 2 API: 'outputs', not 'output' (see the warning in the output below)
model_2.summary()


# df: pandas DataFrame with 10185 feature columns (10081 signal + 104 extra)
X_train, X_test, y_train, y_test = train_test_split(df.iloc[:, 0:10185], df[['Result_cat', 'Result_cat1']].values, test_size=0.2)
X_train = X_train.to_numpy()
X_train = X_train.reshape([X_train.shape[0], X_train.shape[1], 1])
X_train_1 = X_train[:, 0:10081, :]                        # input to the conv branch
X_train_2 = X_train[:, 10081:10185, :].reshape(736, 104)  # 736 training samples


X_test = X_test.to_numpy()
X_test = X_test.reshape([X_test.shape[0], X_test.shape[1], 1])
X_test_1 = X_test[:, 0:10081, :]
X_test_2 = X_test[:, 10081:10185, :].reshape(185, 104)    # 185 test samples

adam = keras.optimizers.Adam(lr = 0.0005)
model_2.compile(loss = 'categorical_crossentropy', optimizer = adam, metrics = ['acc'])

history = model_2.fit([X_train_1,X_train_2], y_train, epochs = 120, batch_size = 256, validation_split = 0.2, callbacks = [keras.callbacks.EarlyStopping(monitor='val_loss', patience=20)])
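
To see how the two accuracies relate over time, the History object returned by fit can be plotted directly. A minimal sketch, assuming matplotlib is installed (the 'acc'/'val_acc' keys match the metric names shown in the log below):

import matplotlib.pyplot as plt

# history.history is a dict holding one list per tracked metric
plt.plot(history.history['acc'], label='training accuracy')
plt.plot(history.history['val_acc'], label='validation accuracy')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()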

Model summary:

/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:15: UserWarning: Update your `Model` call to the Keras 2 API: `Model(inputs=[<tf.Tenso..., outputs=[<tf.Tenso...)`
  from ipykernel import kernelapp as app
Model: "model_3"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_5 (InputLayer)            (None, 10081, 1)     0                                            
__________________________________________________________________________________________________
conv1d_5 (Conv1D)               (None, 10068, 64)    960         input_5[0][0]                    
__________________________________________________________________________________________________
max_pooling1d_3 (MaxPooling1D)  (None, 2013, 64)     0           conv1d_5[0][0]                   
__________________________________________________________________________________________________
conv1d_6 (Conv1D)               (None, 2000, 64)     57408       max_pooling1d_3[0][0]            
__________________________________________________________________________________________________
global_max_pooling1d_3 (GlobalM (None, 64)           0           conv1d_6[0][0]                   
__________________________________________________________________________________________________
input_6 (InputLayer)            (None, 104)          0                                            
__________________________________________________________________________________________________
concatenate_3 (Concatenate)     (None, 168)          0           global_max_pooling1d_3[0][0]     
                                                                 input_6[0][0]                    
__________________________________________________________________________________________________
dense_5 (Dense)                 (None, 128)          21632       concatenate_3[0][0]              
__________________________________________________________________________________________________
dense_6 (Dense)                 (None, 2)            258         dense_5[0][0]                    
==================================================================================================
Total params: 80,258
Trainable params: 80,258
Non-trainable params: 0

And the training process:

__________________________________________________________________________________________________
Train on 588 samples, validate on 148 samples
Epoch 1/120
588/588 [==============================] - 16s 26ms/step - loss: 5.6355 - acc: 0.4932 - val_loss: 4.1086 - val_acc: 0.6216
Epoch 2/120
588/588 [==============================] - 15s 25ms/step - loss: 4.5977 - acc: 0.5748 - val_loss: 3.8252 - val_acc: 0.4459
Epoch 3/120
588/588 [==============================] - 15s 25ms/step - loss: 4.3815 - acc: 0.4575 - val_loss: 2.4087 - val_acc: 0.6622
Epoch 4/120
588/588 [==============================] - 15s 25ms/step - loss: 3.7480 - acc: 0.6003 - val_loss: 2.0060 - val_acc: 0.6892
Epoch 5/120
588/588 [==============================] - 15s 25ms/step - loss: 3.3019 - acc: 0.5408 - val_loss: 2.3176 - val_acc: 0.5676
Epoch 6/120
588/588 [==============================] - 15s 25ms/step - loss: 3.1739 - acc: 0.5663 - val_loss: 2.2607 - val_acc: 0.6892
Epoch 7/120
588/588 [==============================] - 15s 25ms/step - loss: 3.2322 - acc: 0.6207 - val_loss: 1.8898 - val_acc: 0.7230
Epoch 8/120
588/588 [==============================] - 15s 25ms/step - loss: 2.9777 - acc: 0.6020 - val_loss: 1.8401 - val_acc: 0.7500
Epoch 9/120
588/588 [==============================] - 15s 25ms/step - loss: 2.8982 - acc: 0.6429 - val_loss: 1.8517 - val_acc: 0.7365
Epoch 10/120
588/588 [==============================] - 15s 25ms/step - loss: 2.8342 - acc: 0.6344 - val_loss: 1.7941 - val_acc: 0.7095
Epoch 11/120
588/588 [==============================] - 15s 25ms/step - loss: 2.7426 - acc: 0.6327 - val_loss: 1.8495 - val_acc: 0.7162
Epoch 12/120
588/588 [==============================] - 15s 25ms/step - loss: 2.7340 - acc: 0.6531 - val_loss: 1.7652 - val_acc: 0.7162
Epoch 13/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6680 - acc: 0.6616 - val_loss: 1.8097 - val_acc: 0.7365
Epoch 14/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6922 - acc: 0.6786 - val_loss: 1.7143 - val_acc: 0.7500
Epoch 15/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6161 - acc: 0.6786 - val_loss: 1.6960 - val_acc: 0.7568
Epoch 16/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6054 - acc: 0.6905 - val_loss: 1.6779 - val_acc: 0.7297
Epoch 17/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6072 - acc: 0.6684 - val_loss: 1.6750 - val_acc: 0.7703
Epoch 18/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5907 - acc: 0.6871 - val_loss: 1.6774 - val_acc: 0.7432
Epoch 19/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5854 - acc: 0.6718 - val_loss: 1.6609 - val_acc: 0.7770
Epoch 20/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5621 - acc: 0.6905 - val_loss: 1.6709 - val_acc: 0.7365
Epoch 21/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5515 - acc: 0.6854 - val_loss: 1.6904 - val_acc: 0.7703
Epoch 22/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5749 - acc: 0.6837 - val_loss: 1.6862 - val_acc: 0.7297
Epoch 23/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6212 - acc: 0.6514 - val_loss: 1.7215 - val_acc: 0.7568
Epoch 24/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6532 - acc: 0.6633 - val_loss: 1.7105 - val_acc: 0.7230
Epoch 25/120
588/588 [==============================] - 15s 25ms/step - loss: 2.7300 - acc: 0.6344 - val_loss: 1.6870 - val_acc: 0.7432
Epoch 26/120
588/588 [==============================] - 15s 25ms/step - loss: 2.7355 - acc: 0.6650 - val_loss: 1.6733 - val_acc: 0.7703
Epoch 27/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6336 - acc: 0.6650 - val_loss: 1.6572 - val_acc: 0.7297
Epoch 28/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6018 - acc: 0.6803 - val_loss: 1.7292 - val_acc: 0.7635
Epoch 29/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5448 - acc: 0.7143 - val_loss: 1.8065 - val_acc: 0.7095
Epoch 30/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5724 - acc: 0.6820 - val_loss: 1.8029 - val_acc: 0.7297
Epoch 31/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6622 - acc: 0.6650 - val_loss: 1.6594 - val_acc: 0.7568
Epoch 32/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6211 - acc: 0.6582 - val_loss: 1.6375 - val_acc: 0.7770
Epoch 33/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5911 - acc: 0.6854 - val_loss: 1.6964 - val_acc: 0.7500
Epoch 34/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5050 - acc: 0.7262 - val_loss: 1.8496 - val_acc: 0.6892
Epoch 35/120
588/588 [==============================] - 15s 25ms/step - loss: 2.6012 - acc: 0.6752 - val_loss: 1.7443 - val_acc: 0.7432
Epoch 36/120
588/588 [==============================] - 15s 25ms/step - loss: 2.5688 - acc: 0.6871 - val_loss: 1.6220 - val_acc: 0.7568
Epoch 37/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4843 - acc: 0.7279 - val_loss: 1.6166 - val_acc: 0.7905
Epoch 38/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4707 - acc: 0.7449 - val_loss: 1.6496 - val_acc: 0.7905
Epoch 39/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4683 - acc: 0.7109 - val_loss: 1.6641 - val_acc: 0.7432
Epoch 40/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4671 - acc: 0.7279 - val_loss: 1.6553 - val_acc: 0.7703
Epoch 41/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4479 - acc: 0.7347 - val_loss: 1.6302 - val_acc: 0.7973
Epoch 42/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4355 - acc: 0.7551 - val_loss: 1.6241 - val_acc: 0.7973
Epoch 43/120
588/588 [==============================] - 14s 25ms/step - loss: 2.4286 - acc: 0.7568 - val_loss: 1.6249 - val_acc: 0.7973
Epoch 44/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4250 - acc: 0.7585 - val_loss: 1.6248 - val_acc: 0.7770
Epoch 45/120
588/588 [==============================] - 14s 25ms/step - loss: 2.4198 - acc: 0.7517 - val_loss: 1.6212 - val_acc: 0.7703
Epoch 46/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4246 - acc: 0.7568 - val_loss: 1.6129 - val_acc: 0.7838
Epoch 47/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4237 - acc: 0.7517 - val_loss: 1.6166 - val_acc: 0.7973
Epoch 48/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4287 - acc: 0.7432 - val_loss: 1.6309 - val_acc: 0.8041
Epoch 49/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4179 - acc: 0.7381 - val_loss: 1.6271 - val_acc: 0.7838
Epoch 50/120
588/588 [==============================] - 15s 25ms/step - loss: 2.4164 - acc: 0.7381 - val_loss: 1.6258 - val_acc: 0.7973
Epoch 51/120
588/588 [==============================] - 14s 24ms/step - loss: 2.1996 - acc: 0.7398 - val_loss: 1.3612 - val_acc: 0.7973
Epoch 52/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1387 - acc: 0.8265 - val_loss: 1.4811 - val_acc: 0.7973
Epoch 53/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1607 - acc: 0.8078 - val_loss: 1.5060 - val_acc: 0.7838
Epoch 54/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1783 - acc: 0.8129 - val_loss: 1.4878 - val_acc: 0.8176
Epoch 55/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1745 - acc: 0.8197 - val_loss: 1.4762 - val_acc: 0.8108
Epoch 56/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1764 - acc: 0.8129 - val_loss: 1.4631 - val_acc: 0.7905
Epoch 57/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1637 - acc: 0.8078 - val_loss: 1.4615 - val_acc: 0.7770
Epoch 58/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1563 - acc: 0.8112 - val_loss: 1.4487 - val_acc: 0.7703
Epoch 59/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1396 - acc: 0.8146 - val_loss: 1.4362 - val_acc: 0.7905
Epoch 60/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1240 - acc: 0.8316 - val_loss: 1.4333 - val_acc: 0.8041
Epoch 61/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1173 - acc: 0.8333 - val_loss: 1.4369 - val_acc: 0.8041
Epoch 62/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1228 - acc: 0.8384 - val_loss: 1.4393 - val_acc: 0.8041
Epoch 63/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1113 - acc: 0.8316 - val_loss: 1.4380 - val_acc: 0.8041
Epoch 64/120
588/588 [==============================] - 15s 25ms/step - loss: 1.1102 - acc: 0.8452 - val_loss: 1.4217 - val_acc: 0.8041
Epoch 65/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0961 - acc: 0.8469 - val_loss: 1.4129 - val_acc: 0.7973
Epoch 66/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0903 - acc: 0.8537 - val_loss: 1.4019 - val_acc: 0.8041
Epoch 67/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0890 - acc: 0.8503 - val_loss: 1.3850 - val_acc: 0.8176
Epoch 68/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0878 - acc: 0.8520 - val_loss: 1.4035 - val_acc: 0.7635
Epoch 69/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0984 - acc: 0.8469 - val_loss: 1.4060 - val_acc: 0.8041
Epoch 70/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0893 - acc: 0.8418 - val_loss: 1.3981 - val_acc: 0.7973
Epoch 71/120
588/588 [==============================] - 15s 25ms/step - loss: 1.0876 - acc: 0.8605 - val_loss: 1.3951 - val_acc: 0.8041

Note that acc is lower than val_acc at first, but later becomes larger than val_acc. Could someone shed some light on what might be going on here? Thanks.

1 answer:

Answer 0 (score: 0)

acc and val_acc are measured to evaluate how well the model fits. When there is a significant difference between the two, your model is overfitting. For a well-fitted model, the validation accuracy (val_acc) should be equal to, or only slightly below, the training accuracy (acc).

Training should be stopped when the validation loss starts to increase (that is, when val_acc starts to decrease). However, if the two accuracies show a significant gap overall, you should consider making some changes to the model.
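
The code in the question already stops on rising validation loss via EarlyStopping(monitor='val_loss', patience=20). One refinement in that direction is to roll the model back to its best weights when training stops. A minimal sketch, assuming a Keras version that supports restore_best_weights (added around Keras 2.2.3):

# Stop once val_loss has not improved for 20 epochs, then restore
# the weights from the epoch with the lowest validation loss.
early_stop = keras.callbacks.EarlyStopping(monitor='val_loss',
                                           patience=20,
                                           restore_best_weights=True)
history = model_2.fit([X_train_1, X_train_2], y_train,
                      epochs=120, batch_size=256,
                      validation_split=0.2, callbacks=[early_stop])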