Keras takes a very long time to move to the next epoch

Time: 2019-07-20 11:50:37

Tags: performance tensorflow keras epoch

While training a CNN, each epoch itself finishes in 60-80 s, but after an epoch completes it takes nearly 5 minutes before the next epoch starts. I have included my code below; is there something I am missing?

#importing the libraries

from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense

#initializing the CNN

classifier = Sequential()

# Convolutional layer

classifier.add(Conv2D(64,(3,3),input_shape =(128, 128, 3), activation = 'relu'))

#pooling layer
classifier.add(MaxPooling2D(pool_size = (2,2)))

#second convolutional layer
classifier.add(Conv2D(128,(3,3), activation = 'relu'))
classifier.add(MaxPooling2D(pool_size = (2,2)))

# flatten
classifier.add(Flatten())

#full connection
classifier.add(Dense(output_dim = 128, activation = 'relu'))

classifier.add(Dense(output_dim = 1, activation = 'sigmoid'))

#compiling the cnn
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])

# create two generator instances: augmentation for training, only rescaling for testing
from keras.preprocessing.image import ImageDataGenerator


train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)

test_datagen = ImageDataGenerator(rescale = 1./255)

training_set = train_datagen.flow_from_directory('dataset/training_set',
                                                 target_size = (128, 128),
                                                 batch_size = 32,
                                                 class_mode = 'binary')

test_set = test_datagen.flow_from_directory('dataset/test_set',
                                            target_size = (128, 128),
                                            batch_size = 32,
                                            class_mode = 'binary')

classifier.fit_generator(training_set,
                         samples_per_epoch = 8000,
                         nb_epoch = 25,
                         validation_data = test_set,
                         nb_val_samples = 2000)

1 answer:

Answer 0 (score: 0)

If you use an ImageDataGenerator, there is no need to set samples_per_epoch or nb_val_samples, because the generator is a Sequence and contains its length internally (provided you are using a recent version of Keras, of course). The problem is that nb_val_samples is used as the validation_steps argument, and I think you have set it much higher than it should be.
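
As a rough illustration (assuming the dataset really contains 8000 training and 2000 test images, as the fit_generator arguments suggest), the generators returned by flow_from_directory already report how many batches make up one full pass over the data:

# len() of a flow_from_directory generator is the number of batches per pass,
# i.e. ceil(number_of_images / batch_size)
print(len(training_set))   # ceil(8000 / 32) = 250 batches per epoch
print(len(test_set))       # ceil(2000 / 32) = 63 validation batches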

If you do need them, set steps_per_epoch and validation_steps to the correct values: setting validation_steps to something larger than len(val_data) / batch_size effectively tells Keras to validate on more data than necessary, which slows down the validation step.
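
A minimal sketch of the corrected call with the Keras 2 argument names, assuming the generator lengths above (with Sequence inputs the two step arguments can also simply be omitted):

# steps are counted in batches, not in samples
classifier.fit_generator(training_set,
                         steps_per_epoch = len(training_set),   # 250, not 8000
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = len(test_set))      # 63, not 2000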