I am trying to train a convolutional neural network for dog breed classification. I am training on more than 20000 dog images; the training set contains 120 different breeds.
If I use a small number of breeds, e.g. 15, training works without problems. But when I use the full set (120 breeds), my computer freezes (I suspect it runs out of memory). Here is my code:
import numpy as np
from keras.models import Sequential
from keras.models import model_from_json
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Convolution2D, MaxPooling2D
from keras.utils import np_utils
from keras.utils import to_categorical
from matplotlib import pyplot as plt
from PIL import Image
import os
np.random.seed(123)
numberOfBreeds = len(os.listdir('Dogs/Images'))
imageList = []
breedList = []
counter = 0
for breed in os.listdir('Dogs/Images'):
    counter += 1
    for imageFile in os.listdir('Dogs/Images/' + breed):
        newImage = Image.open('Dogs/Images/' + breed + '/' + imageFile)
        imageList.append(np.array(newImage.resize((200, 200))))
        breedList.append(counter - 1)
imagesForTraining = np.stack(imageList, 0)
imagesForTraining = imagesForTraining.astype('float32')
imagesForTraining /= 255
breedsForTraining = to_categorical(breedList, numberOfBreeds)
model = Sequential()
model.add(Convolution2D(32, 3, 3, activation = 'relu', input_shape=(200, 200, 3)))
model.add(Convolution2D(32, 3, 3, activation = 'relu'))
model.add(MaxPooling2D(pool_size = (2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(numberOfBreeds, activation='softmax'))
model.compile(loss = 'categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
model.fit(imagesForTraining, breedsForTraining, batch_size = 32, epochs = 10, verbose = 1)
model_json = model.to_json()
with open('model.json', 'w') as json_file:
    json_file.write(model_json)
model.save_weights('model.h5')
I have 2 questions:
1 - How can I reduce memory usage while keeping the number of breeds at 120?
2 - Is there a way to calculate how much memory the program will use?
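For context on question 2, here is my rough back-of-the-envelope estimate for the input array alone (the 20000 figure is the approximate size of my data set; the model's own activations and weights would come on top of this):

```python
# Rough size of the float32 image array alone:
# ~20000 images, each 200 x 200 pixels with 3 color channels.
num_images = 20000
height, width, channels = 200, 200, 3
bytes_per_value = 4  # float32

array_bytes = num_images * height * width * channels * bytes_per_value
print(array_bytes / 1024**3, "GiB")  # roughly 9 GiB
```

If this estimate is right, the full array is close to (or beyond) my available RAM, which would explain the freeze, but I am not sure whether this is the correct way to reason about it.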