I am currently trying to train a Keras model with the following model.fit call:
history = model.fit(imgs, ground_truths, batch_size=16, epochs=30, shuffle=True,
                    validation_split=0.2,
                    callbacks=[model_checkpoint])
Both imgs and ground_truths have shape (2080, 256, 256, 3), which is the correct input shape for the model.
However, for some reason I keep getting the following error, even though I am passing two arguments:
ValueError: The model expects 2 input arrays, but only received one array. Found: array with shape (2080, 256, 256, 3)
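For reference, a Keras functional model that declares two Input layers expects one array per input in fit(), i.e. a call shaped roughly like the sketch below; second_input is only a placeholder for whatever would feed the model's second input, not a variable I actually have:
history = model.fit([imgs, second_input], ground_truths,
                    batch_size=16, epochs=30, shuffle=True,
                    validation_split=0.2,
                    callbacks=[model_checkpoint])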
This is how I preprocess the images:
import numpy as np
from skimage.transform import resize   # resize with preserve_range comes from scikit-image

def preprocess(imgs):
    # img_rows and img_cols are defined elsewhere in the script (256 here).
    imgs_p = np.ndarray((imgs.shape[0], img_rows, img_cols, 3), dtype=np.uint8)
    for i in range(imgs.shape[0]):
        arr = imgs[i]
        # Scale pixel values to [0, 1].
        arr = arr.astype('float')
        arr /= 255.
        # Resize to 256x256, keeping the value range unchanged.
        imgs_p[i] = resize(arr, (256, 256), preserve_range=True)
    return imgs_p
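As a quick sanity check before training, the preprocessed arrays can be verified like this (a sketch; raw_imgs and raw_ground_truths are assumed to be loaded elsewhere):
imgs = preprocess(raw_imgs)
ground_truths = preprocess(raw_ground_truths)
assert imgs.shape == ground_truths.shape == (2080, 256, 256, 3)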
After preprocessing, the preprocessed images are saved to numpy files:
np.save('imgs_train_preprocess.npy', imgs)
np.save('imgs_gt_train_preprocess.npy', ground_truths)
Before training, I load the numpy files like this:
imgs = np.load('imgs_cup_train_preprocess.npy')
ground_truths = np.load('imgs_orig_train_preprocess.npy')
This is my model.summary():
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
conv1_1 (InputLayer) (None, 256, 256, 3) 0
____________________________________________________________________________________________________
relu1_1 (Activation) (None, 256, 256, 3) 0 conv1_1[0][0]
____________________________________________________________________________________________________
conv1_2_zeropadding (ZeroPadding (None, 258, 258, 3) 0 relu1_1[0][0]
____________________________________________________________________________________________________
conv1_2 (Conv2D) (None, 256, 256, 64) 1792 conv1_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu1_2 (Activation) (None, 256, 256, 64) 0 conv1_2[0][0]
____________________________________________________________________________________________________
pool1 (MaxPooling2D) (None, 128, 128, 64) 0 relu1_2[0][0]
____________________________________________________________________________________________________
conv2_1_zeropadding (ZeroPadding (None, 130, 130, 64) 0 pool1[0][0]
____________________________________________________________________________________________________
conv2_1 (Conv2D) (None, 128, 128, 128) 73856 conv2_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu2_1 (Activation) (None, 128, 128, 128) 0 conv2_1[0][0]
____________________________________________________________________________________________________
conv2_2_zeropadding (ZeroPadding (None, 130, 130, 128) 0 relu2_1[0][0]
____________________________________________________________________________________________________
conv2_2 (Conv2D) (None, 128, 128, 128) 147584 conv2_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu2_2 (Activation) (None, 128, 128, 128) 0 conv2_2[0][0]
____________________________________________________________________________________________________
pool2 (MaxPooling2D) (None, 64, 64, 128) 0 relu2_2[0][0]
____________________________________________________________________________________________________
conv3_1_zeropadding (ZeroPadding (None, 66, 66, 128) 0 pool2[0][0]
____________________________________________________________________________________________________
conv3_1 (Conv2D) (None, 64, 64, 256) 295168 conv3_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu3_1 (Activation) (None, 64, 64, 256) 0 conv3_1[0][0]
____________________________________________________________________________________________________
conv3_2_zeropadding (ZeroPadding (None, 66, 66, 256) 0 relu3_1[0][0]
____________________________________________________________________________________________________
conv3_2 (Conv2D) (None, 64, 64, 256) 590080 conv3_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu3_2 (Activation) (None, 64, 64, 256) 0 conv3_2[0][0]
____________________________________________________________________________________________________
conv3_3_zeropadding (ZeroPadding (None, 66, 66, 256) 0 relu3_2[0][0]
____________________________________________________________________________________________________
conv3_3 (Conv2D) (None, 64, 64, 256) 590080 conv3_3_zeropadding[0][0]
____________________________________________________________________________________________________
relu3_3 (Activation) (None, 64, 64, 256) 0 conv3_3[0][0]
____________________________________________________________________________________________________
pool3 (MaxPooling2D) (None, 32, 32, 256) 0 relu3_3[0][0]
____________________________________________________________________________________________________
conv4_1_zeropadding (ZeroPadding (None, 34, 34, 256) 0 pool3[0][0]
____________________________________________________________________________________________________
conv4_1 (Conv2D) (None, 32, 32, 512) 1180160 conv4_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu4_1 (Activation) (None, 32, 32, 512) 0 conv4_1[0][0]
____________________________________________________________________________________________________
conv4_2_zeropadding (ZeroPadding (None, 34, 34, 512) 0 relu4_1[0][0]
____________________________________________________________________________________________________
conv4_2 (Conv2D) (None, 32, 32, 512) 2359808 conv4_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu4_2 (Activation) (None, 32, 32, 512) 0 conv4_2[0][0]
____________________________________________________________________________________________________
conv4_3_zeropadding (ZeroPadding (None, 34, 34, 512) 0 relu4_2[0][0]
____________________________________________________________________________________________________
conv4_3 (Conv2D) (None, 32, 32, 512) 2359808 conv4_3_zeropadding[0][0]
____________________________________________________________________________________________________
relu4_3 (Activation) (None, 32, 32, 512) 0 conv4_3[0][0]
____________________________________________________________________________________________________
pool4 (MaxPooling2D) (None, 16, 16, 512) 0 relu4_3[0][0]
____________________________________________________________________________________________________
conv5_1_zeropadding (ZeroPadding (None, 18, 18, 512) 0 pool4[0][0]
____________________________________________________________________________________________________
conv5_1 (Conv2D) (None, 16, 16, 512) 2359808 conv5_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu5_1 (Activation) (None, 16, 16, 512) 0 conv5_1[0][0]
____________________________________________________________________________________________________
conv5_2_zeropadding (ZeroPadding (None, 18, 18, 512) 0 relu5_1[0][0]
____________________________________________________________________________________________________
conv5_2 (Conv2D) (None, 16, 16, 512) 2359808 conv5_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu5_2 (Activation) (None, 16, 16, 512) 0 conv5_2[0][0]
____________________________________________________________________________________________________
conv5_3_zeropadding (ZeroPadding (None, 18, 18, 512) 0 relu5_2[0][0]
____________________________________________________________________________________________________
conv5_3 (Conv2D) (None, 16, 16, 512) 2359808 conv5_3_zeropadding[0][0]
____________________________________________________________________________________________________
conv2_2_16_zeropadding (ZeroPadd (None, 130, 130, 128) 0 relu2_2[0][0]
____________________________________________________________________________________________________
relu5_3 (Activation) (None, 16, 16, 512) 0 conv5_3[0][0]
____________________________________________________________________________________________________
conv2_2_16 (Conv2D) (None, 128, 128, 16) 18448 conv2_2_16_zeropadding[0][0]
____________________________________________________________________________________________________
conv3_3_16_zeropadding (ZeroPadd (None, 66, 66, 256) 0 relu3_3[0][0]
____________________________________________________________________________________________________
conv4_3_16_zeropadding (ZeroPadd (None, 34, 34, 512) 0 relu4_3[0][0]
____________________________________________________________________________________________________
conv5_3_16_zeropadding (ZeroPadd (None, 18, 18, 512) 0 relu5_3[0][0]
____________________________________________________________________________________________________
concat (InputLayer) (None, 256, 256, 3) 0
____________________________________________________________________________________________________
upsample2__zeropadding (ZeroPadd (None, 130, 130, 16) 0 conv2_2_16[0][0]
____________________________________________________________________________________________________
conv3_3_16 (Conv2D) (None, 64, 64, 16) 36880 conv3_3_16_zeropadding[0][0]
____________________________________________________________________________________________________
conv4_3_16 (Conv2D) (None, 32, 32, 16) 73744 conv4_3_16_zeropadding[0][0]
____________________________________________________________________________________________________
conv5_3_16 (Conv2D) (None, 16, 16, 16) 73744 conv5_3_16_zeropadding[0][0]
____________________________________________________________________________________________________
new-score-weighting (Conv2D) (None, 256, 256, 1) 4 concat[0][0]
____________________________________________________________________________________________________
upsample2_ (Conv2DTranspose) (None, 262, 262, 16) 4112 upsample2__zeropadding[0][0]
____________________________________________________________________________________________________
upsample4_ (Conv2DTranspose) (None, 260, 260, 16) 16400 conv3_3_16[0][0]
____________________________________________________________________________________________________
upsample8_ (Conv2DTranspose) (None, 264, 264, 16) 65552 conv4_3_16[0][0]
____________________________________________________________________________________________________
upsample16_ (Conv2DTranspose) (None, 272, 272, 16) 262160 conv5_3_16[0][0]
____________________________________________________________________________________________________
sigmoid-fuse (Activation) (None, 256, 256, 1) 0 new-score-weighting[0][0]
====================================================================================================
Total params: 15,228,804
Trainable params: 15,228,804
Non-trainable params: 0
____________________________________________________________________________________________________
The JSON definition of the model is at: https://pastebin.com/TE0Nda1p
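Note that the summary lists two InputLayer entries, conv1_1 and concat, so the model is built with two inputs. A heavily simplified, hypothetical sketch of that structure (not the real architecture from the pastebin) shows why fit() then has to receive a list of two arrays:
from keras.layers import Input, Conv2D, Concatenate
from keras.models import Model

# Two Input layers, named like the entries in the summary above.
conv1_1_in = Input(shape=(256, 256, 3), name='conv1_1')
concat_in = Input(shape=(256, 256, 3), name='concat')

# Toy body: one conv branch fused with the second input, then a 1-channel sigmoid head.
x = Conv2D(16, (3, 3), padding='same', activation='relu')(conv1_1_in)
x = Concatenate()([x, concat_in])
out = Conv2D(1, (1, 1), activation='sigmoid', name='sigmoid-fuse')(x)

toy_model = Model(inputs=[conv1_1_in, concat_in], outputs=[out])
# With two inputs declared, fit() must be given a list of two input arrays.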
Does anyone know how to fix this? Thanks!
Answer 0 (score: 0)
I ran into the same problem when I had output = [a, b]; changing it to output = [a] fixed it for me.
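In code, that change amounts to dropping one output tensor when building the Model; a minimal, hypothetical illustration (a and b stand in for whatever output tensors your model produces):
from keras.layers import Input, Dense
from keras.models import Model

inp = Input(shape=(8,))
a = Dense(1, name='a')(inp)
b = Dense(1, name='b')(inp)

# With two outputs, fit() expects two target arrays ...
two_output_model = Model(inputs=[inp], outputs=[a, b])
# ... with a single output it expects only one.
one_output_model = Model(inputs=[inp], outputs=[a])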