How to feed a DenseNet in Keras without loading the whole dataset at once

Asked: 2019-05-20 03:03:40

Tags: keras neural-network deep-learning conv-neural-network resnet

I have a local dataset of images, about 750 MB on disk, and each image has shape [1280, 960, 3]. I want to feed it to a DenseNet neural network. I found repositories for DenseNet on GitHub, but they load the whole dataset into memory at once, and if I do the same I need a huge amount of RAM: when I tried it on Google Colab (12 GB of RAM) the notebook crashed.
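
For scale, here is a rough, purely illustrative calculation: the 750 MB is the size of the compressed files on disk, but once an image is decoded to an array it takes far more memory, which I assume is why Colab runs out of RAM.

# rough estimate of the decoded, in-memory size of one [1280, 960, 3] image
height, width, channels = 1280, 960, 3
uint8_bytes = height * width * channels      # ~3.7 MB per image as uint8
float32_bytes = uint8_bytes * 4              # ~14.7 MB per image as float32
print(uint8_bytes / 1e6, float32_bytes / 1e6)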

The repositories I found:

1- https://github.com/titu1994/DenseNet

2- https://github.com/tdeboissiere/DeepLearningImplementations/tree/master/DenseNet

The code I use to load the dataset, which eats up all the RAM, is:

import os
import numpy as np
from keras.preprocessing import image

PATH = os.getcwd()

train_path = PATH + '/data/train/'
train_batch = os.listdir(train_path)   # list of training image file names
x_train = []

# if data are in form of images
for sample in train_batch:
    img_path = train_path + sample
    x = image.load_img(img_path)       # loads the full-resolution image
    # preprocessing if required
    x_train.append(x)

test_path = PATH + '/data/test/'
test_batch = os.listdir(test_path)     # list of test image file names
x_test = []

for sample in test_batch:
    img_path = test_path + sample
    x = image.load_img(img_path)
    # preprocessing if required
    x_test.append(x)

# finally converting the lists into numpy arrays (this is where all the RAM goes)
x_train = np.array(x_train)
x_test = np.array(x_test)

So, how can I use a different approach that does not load the whole dataset into memory at once and still feed it to DenseNet?
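
What I have in mind as an alternative is something like a keras.utils.Sequence that reads and preprocesses only one batch of images per step, so the full dataset never sits in RAM. Below is a rough sketch of what I mean; the labels handling is just a placeholder, since I have not shown how my labels are stored. Is this the right direction, or is there a more standard way to do it with the DenseNet repositories above?

import os
import numpy as np
from keras.preprocessing import image
from keras.utils import Sequence

class ImageBatchGenerator(Sequence):
    def __init__(self, image_dir, labels, batch_size=8, target_size=(224, 224)):
        self.image_dir = image_dir
        self.filenames = sorted(os.listdir(image_dir))
        self.labels = labels              # placeholder: one label per file name
        self.batch_size = batch_size
        self.target_size = target_size    # resize on the fly to keep batches small

    def __len__(self):
        # number of batches per epoch
        return int(np.ceil(len(self.filenames) / self.batch_size))

    def __getitem__(self, idx):
        batch_files = self.filenames[idx * self.batch_size:(idx + 1) * self.batch_size]
        batch_x = []
        for name in batch_files:
            img = image.load_img(os.path.join(self.image_dir, name),
                                 target_size=self.target_size)
            batch_x.append(image.img_to_array(img) / 255.0)
        batch_y = self.labels[idx * self.batch_size:(idx + 1) * self.batch_size]
        return np.array(batch_x), np.array(batch_y)

# usage (train_labels is a placeholder for however the labels are stored):
# train_gen = ImageBatchGenerator(PATH + '/data/train/', train_labels, batch_size=8)
# model.fit_generator(train_gen, epochs=10)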

0 Answers:

No answers yet.