"Your input ran out of data" error pops up when I start training my dataset

Date: 2020-05-20 15:37:41

Tags: keras spyder

I was trying to run my training set, but whenever I try to fit the model it keeps failing with the error message: "Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least steps_per_epoch * epochs batches (in this case, 12000 batches). You may need to use the repeat() function when building your dataset."

# Convolutional Neural Network

# Importing the libraries
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
print(tf.__version__)

# Part 1 - Data Preprocessing

# Generating images for the Training set
train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)

# Generating images for the Test set
test_datagen = ImageDataGenerator(rescale = 1./255)

# Creating the Training set
training_set = train_datagen.flow_from_directory('dataset/train',
                                                 target_size = (64, 64),
                                                 batch_size = 32,
                                                 class_mode = 'binary')

# Creating the Test set
test_set = test_datagen.flow_from_directory('dataset/test',
                                            target_size = (64, 64),
                                            batch_size = 32,
                                            class_mode = 'binary')

# Part 2 - Building the CNN

# Initialising the CNN
cnn = tf.keras.models.Sequential()

# Step 1 - Convolution
cnn.add(tf.keras.layers.Conv2D(filters=32, kernel_size=3, padding="same", activation="relu", input_shape=[64, 64, 3]))

# Step 2 - Pooling
cnn.add(tf.keras.layers.MaxPool2D(pool_size=2, strides=2, padding='valid'))

# Adding a second convolutional layer
cnn.add(tf.keras.layers.Conv2D(filters=32, kernel_size=3, padding="same", activation="relu"))
cnn.add(tf.keras.layers.MaxPool2D(pool_size=2, strides=2, padding='valid'))

# Step 3 - Flattening
cnn.add(tf.keras.layers.Flatten())

# Step 4 - Full Connection
cnn.add(tf.keras.layers.Dense(units=128, activation='relu'))

# Step 5 - Output Layer
cnn.add(tf.keras.layers.Dense(units=1, activation='sigmoid'))

# Part 3 - Training the CNN

# Compiling the CNN
cnn.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])

# Training the CNN on the Training set and evaluating it on the Test set
cnn.fit_generator(training_set,
                  steps_per_epoch = 4000,
                  epochs = 3,
                  validation_data = test_set,
                  validation_steps = 2000)

1 Answer:

Answer 0 (score: 0)

steps_per_epoch states the number of batches in each epoch, so in fact you need to verify that your dataset contains at least steps_per_epoch * batch_size images (not steps_per_epoch * epochs). This holds for both the training and the validation datasets.
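For instance, with the batch_size of 32 used in the question and steps_per_epoch = 4000, Keras would need 4000 * 32 = 128,000 training images per epoch, and the 12,000 batches in the error message are steps_per_epoch * epochs = 4000 * 3. As a quick sanity check (a minimal sketch; the printed counts depend entirely on your own dataset), the generators themselves report how much data they hold:

# DirectoryIterator exposes the number of images and the batch size,
# and len() gives the number of batches it can yield per epoch.
print(training_set.samples, training_set.batch_size)
print(len(training_set))  # roughly ceil(samples / batch_size)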

A common approach is to set steps_per_epoch = floor(len(dataset) / batch_size). The default batch_size of ImageDataGenerator's flow methods is 32, and you can change it by passing the corresponding argument.
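Applied to the code in the question, a sketch of the fix could look like the following. This derives the step counts from the generators instead of hard-coding 4000 and 2000, and uses model.fit, which accepts generators directly in TF 2.x (fit_generator is deprecated there):

# Derive step counts that the generators can actually satisfy.
steps_per_epoch = training_set.samples // training_set.batch_size
validation_steps = test_set.samples // test_set.batch_size

cnn.fit(training_set,
        steps_per_epoch = steps_per_epoch,
        epochs = 3,
        validation_data = test_set,
        validation_steps = validation_steps)

Note that ImageDataGenerator iterators loop over the data indefinitely, so the repeat() hint in the error message is aimed at tf.data pipelines; with these generators, matching the step counts to the dataset size is the usual fix.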