Could you help me with the code: saving the model (architecture and weights) at every epoch, and also how to continue training my model from the 5th checkpoint — for example, training for epochs 1 to 25 without starting over, resuming from the saved 5th-epoch model.
# Imports needed for the snippet below
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
from keras.optimizers import Adam

classifier = Sequential()
classifier.add(Conv2D(6, (3, 3), input_shape = (30, 30, 3), data_format="channels_last", activation = 'relu'))
classifier.add(MaxPooling2D(pool_size = (2, 2)))
classifier.add(Conv2D(6, (3, 3), activation = 'relu'))
classifier.add(MaxPooling2D(pool_size = (2, 2)))
classifier.add(Flatten())
classifier.add(Dense(units = 128, activation = 'relu'))
classifier.add(Dense(units = 64, activation = 'relu'))
classifier.add(Dense(units = 1, activation = 'sigmoid'))
opt = Adam(learning_rate = 0.001, beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-08, decay = 0.0)
# precision, recall and fmeasure are custom metric functions (defined elsewhere, not shown here)
classifier.compile(optimizer = opt, loss = 'binary_crossentropy', metrics = ['accuracy', precision, recall, fmeasure])
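The `precision`, `recall` and `fmeasure` metrics passed to `compile()` are not defined in the snippet; they were built into Keras 1.x but must be user-defined in Keras 2. As an illustration only, here is a NumPy sketch of what such functions compute (the 0.5 rounding threshold and the small epsilon are assumptions; in the actual model they would be written with `keras.backend` ops instead of NumPy):

```python
import numpy as np

def precision(y_true, y_pred):
    # round predictions to 0/1, then true positives / predicted positives
    y_pred = np.round(np.clip(y_pred, 0, 1))
    tp = np.sum(y_true * y_pred)
    predicted_positives = np.sum(y_pred)
    return tp / (predicted_positives + 1e-7)

def recall(y_true, y_pred):
    # true positives / actual positives
    y_pred = np.round(np.clip(y_pred, 0, 1))
    tp = np.sum(y_true * y_pred)
    actual_positives = np.sum(y_true)
    return tp / (actual_positives + 1e-7)

def fmeasure(y_true, y_pred):
    # harmonic mean of precision and recall
    p = precision(y_true, y_pred)
    r = recall(y_true, y_pred)
    return 2 * p * r / (p + r + 1e-7)
```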
from keras.preprocessing.image import ImageDataGenerator
train_datagen = ImageDataGenerator(rescale = 1./255,
horizontal_flip = True,
vertical_flip = True,
rotation_range = 180)
validation_datagen = ImageDataGenerator(rescale = 1./255)
training_set = train_datagen.flow_from_directory('/home/dataset/training_set',
target_size = (30, 30),
batch_size = 32,
class_mode = 'binary')
validation_set = validation_datagen.flow_from_directory('/home/dataset/validation_set',
target_size = (30, 30),
batch_size = 32,
class_mode = 'binary')
history = classifier.fit_generator(training_set,
steps_per_epoch = 208170,
epochs = 15,
validation_data = validation_set,
validation_steps = 89140)
Answer 0: (score: 0)
I assume you mean that you want to save the model and weights after every epoch, and then at a later stage load the model and weights that were saved after the fifth epoch.
In general you can use the SavedModel format in TensorFlow, like this:
classifier.save('path_to_save_to')
This saves the architecture, the weights, information about the optimizer, and the configuration you set in compile().
Since you are using fit_generator, you can save the model after every epoch with ModelCheckpoint(), like this:
from keras.callbacks import ModelCheckpoint
checkpoint = ModelCheckpoint(path_to_save_to, save_freq = 'epoch',
save_weights_only = False)
history = classifier.fit_generator(training_set,
steps_per_epoch = 208170,
epochs = 15,
validation_data = validation_set,
validation_steps = 89140,
callbacks = [checkpoint])
You can format the path so that the models are saved with epoch/loss details in the filename, like path_name + '-{epoch:02d}-{val_loss:.2f}.h5'
To load the fifth checkpoint, do the following:
from keras.models import load_model
classifier = load_model(path_to_fifth_checkpoint)  # path of the file saved after epoch 5
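To then continue training up to epoch 25 while keeping the original epoch numbering, you can pass initial_epoch to fit_generator. A sketch, reusing the generators and callback from above (path_to_fifth_checkpoint is a placeholder for your actual saved file, since the exact filename depends on the epoch/loss formatting):

```python
from keras.models import load_model

# Restore the full model (architecture, weights, optimizer state) saved after epoch 5
classifier = load_model(path_to_fifth_checkpoint)

# initial_epoch = 5 tells Keras that 5 epochs have already run,
# so training resumes at epoch 6 and stops after epoch 25
history = classifier.fit_generator(training_set,
                                   steps_per_epoch = 208170,
                                   epochs = 25,
                                   initial_epoch = 5,
                                   validation_data = validation_set,
                                   validation_steps = 89140,
                                   callbacks = [checkpoint])
```

Note that load_model restores the optimizer state as well, so there is no need to call compile() again before resuming.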