I am trying to do transfer learning on the PASCAL VOC 2012 dataset with a ResNet50 model pretrained on ImageNet weights. Since it is a multi-label dataset, I use a sigmoid activation in the last layer and binary_crossentropy as the loss. The metrics are precision, recall and accuracy. Below is the code I use to build the model for the 20 classes (PASCAL VOC has 20 classes).
from keras import applications
from keras.layers import GlobalAveragePooling2D, Dense, Dropout
from keras.models import Model
from keras.optimizers import Adam
from keras.callbacks import ModelCheckpoint, EarlyStopping
from sklearn.model_selection import train_test_split

# x_train, y, epochs, batch_size, precision_m and recall_m are defined elsewhere in my script

img_height, img_width = 128, 128
num_classes = 20

# If imagenet weights are being loaded,
# input must have a static square shape (one of (128, 128), (160, 160), (192, 192), or (224, 224))
base_model = applications.resnet50.ResNet50(weights='imagenet', include_top=False, input_shape=(img_height, img_width, 3))

x = base_model.output
x = GlobalAveragePooling2D()(x)
#x = Dropout(0.7)(x)
predictions = Dense(num_classes, activation='sigmoid')(x)
model = Model(inputs=base_model.input, outputs=predictions)

# Train only the new head; freeze the rest of the pretrained backbone
for layer in model.layers[-2:]:
    layer.trainable = True
for layer in model.layers[:-3]:
    layer.trainable = False

adam = Adam(lr=0.0001)
model.compile(optimizer=adam, loss='binary_crossentropy', metrics=['accuracy', precision_m, recall_m])
#print(model.summary())

X_train, X_test, Y_train, Y_test = train_test_split(x_train, y, random_state=42, test_size=0.2)

savingcheckpoint = ModelCheckpoint('ResnetTL.h5', monitor='val_loss', verbose=1, save_best_only=True, mode='min')
earlystopcheckpoint = EarlyStopping(monitor='val_loss', patience=10, verbose=1, mode='min', restore_best_weights=True)

model.fit(X_train, Y_train, epochs=epochs, validation_data=(X_test, Y_test), batch_size=batch_size, callbacks=[savingcheckpoint, earlystopcheckpoint], shuffle=True)
model.save_weights('ResnetTLweights.h5')
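(precision_m and recall_m are custom metric functions not shown above; a minimal sketch of the kind of Keras-backend definition I mean is below, assuming the usual batch-wise true-positive counting, so my exact code may differ slightly.)

from keras import backend as K

def precision_m(y_true, y_pred):
    # correctly predicted positives / all predicted positives
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    return true_positives / (predicted_positives + K.epsilon())

def recall_m(y_true, y_pred):
    # correctly predicted positives / all actual positives
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
    return true_positives / (possible_positives + K.epsilon())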
It ran for 35 epochs before early stopping kicked in, with the following metrics (without the Dropout layer):
loss: 0.1195 - accuracy: 0.9551 - precision_m: 0.8200 - recall_m: 0.5420 - val_loss: 0.3535 - val_accuracy: 0.8358 - val_precision_m: 0.0583 - val_recall_m: 0.0757
Even with the Dropout layer, I don't see much of a difference:
loss: 0.1584 - accuracy: 0.9428 - precision_m: 0.7212 - recall_m: 0.4333 - val_loss: 0.3508 - val_accuracy: 0.8783 - val_precision_m: 0.0595 - val_recall_m: 0.0403
With dropout, the model's validation precision and recall reached about 0.2 for a few epochs, but never went above that.
I see that precision and recall on the validation set are very low compared to the training set, both with and without the dropout layer. How should I interpret this? Does it mean the model is overfitting? If so, what can I do about it? So far the model's predictions are fairly random (completely wrong). The dataset size is 11,000 images.
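To illustrate what I mean by "random": this is roughly how I inspect the predictions, thresholding the sigmoid outputs at 0.5 and comparing them per class (a sketch using scikit-learn's classification_report; not part of the training code above):

from sklearn.metrics import classification_report

# X_test / Y_test are the multi-hot validation split from train_test_split above
probs = model.predict(X_test, batch_size=batch_size)
preds = (probs >= 0.5).astype(int)  # one 0/1 decision per class
print(classification_report(Y_test, preds))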
Answer 0 (score: 1)
Could you please modify the code as shown below and try running it?
From:
predictions = Dense(num_classes, activation= 'sigmoid')(x)
To:
predictions = Dense(num_classes, activation= 'softmax')(x)
From:
model.compile(optimizer= adam, loss='binary_crossentropy', metrics=['accuracy',precision_m,recall_m])
To:
model.compile(optimizer= adam, loss='categorical_crossentropy', metrics=['accuracy',precision_m,recall_m])
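Putting the two suggested edits together, the head from the question would look roughly like this (everything else unchanged; this is just a sketch combining both changes):

predictions = Dense(num_classes, activation='softmax')(x)
model = Model(inputs=base_model.input, outputs=predictions)
model.compile(optimizer=adam, loss='categorical_crossentropy', metrics=['accuracy', precision_m, recall_m])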