ValueError: Input 0 of layer conv2d_10 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 100, 100]

Asked: 2021-01-02 20:57:12

Tags: python tensorflow keras

So I've been following a tutorial on machine learning, and I've reached this point in the code:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense,Dropout,Activation, Flatten, Conv2D, MaxPooling2D
import pickle
import numpy as np

pickle_in = open("X.pickle","rb")
X = pickle.load(pickle_in)

pickle_in = open("y.pickle","rb")
y = pickle.load(pickle_in)

X=np.array(X/255.0)
y=np.array(y)

model = Sequential()
model.add(Conv2D(64, (3,3), input_shape = X.shape[1:]))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Flatten())
model.add(Dense(64))

model.add(Dense(1))
model.add(Activation("sigmoid"))

model.compile(loss="binary_crossentropy",
             optimizer="adam",
             metrics=["accuracy"])
model.fit(X,y, batch_size=32, validation_split=0.1)

When I execute this code, it gives me the following error:

ValueError: Input 0 of layer conv2d_10 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 100, 100]

I've looked at many posts about this, but none of them really helped. Can someone help? Thanks in advance! :)
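
For reference, the mismatch the error describes can be confirmed by printing the array's shape right after loading (a minimal check that reuses the question's own loading code; the concrete first dimension is whatever the pickle contains):

import pickle
import numpy as np

# Reuse the question's loading code and inspect the shape
X = np.array(pickle.load(open("X.pickle", "rb"))) / 255.0
print(X.shape)   # e.g. (N, 100, 100): three dimensions, no channel axis
print(X.ndim)    # 3, while Conv2D expects ndim=4 (batch, height, width, channels)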

2 Answers:

Answer 0 (score: 1)

Add a Reshape layer, because Conv2D expects input of shape (batch, x, y, channels) (ndim=4), but you are only feeding it (batch, x, y) (ndim=3). Just reshape it to (batch, x, y, 1). (An equivalent fix that reshapes the NumPy array itself is sketched after the model summary below.)

The error says Full shape received: [None, 100, 100]; what the layer expects is a 4D array, [None, 100, 100, 1]:

from tensorflow.keras.layers import Reshape  # needed in addition to the question's imports

model = Sequential()
model.add(Reshape((100, 100, 1), input_shape=X.shape[1:]))
model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Flatten())
model.add(Dense(64))

model.add(Dense(1))
model.add(Activation("sigmoid"))

model.compile(loss="binary_crossentropy",
             optimizer="adam",
             metrics=["accuracy"])


model.summary()
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
reshape_5 (Reshape)          (None, 100, 100, 1)       0         
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 98, 98, 64)        640       
_________________________________________________________________
activation_9 (Activation)    (None, 98, 98, 64)        0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 49, 49, 64)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 47, 47, 64)        36928     
_________________________________________________________________
activation_10 (Activation)   (None, 47, 47, 64)        0         
_________________________________________________________________
max_pooling2d_7 (MaxPooling2 (None, 23, 23, 64)        0         
_________________________________________________________________
flatten_3 (Flatten)          (None, 33856)             0         
_________________________________________________________________
dense_6 (Dense)              (None, 64)                2166848   
_________________________________________________________________
dense_7 (Dense)              (None, 1)                 65        
_________________________________________________________________
activation_11 (Activation)   (None, 1)                 0         
=================================================================
Total params: 2,204,481
Trainable params: 2,204,481
Non-trainable params: 0
_________________________________________________________________
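
As an alternative to the Reshape layer above (not part of the original answer, just a sketch of the same fix applied to the data): append the missing channel axis to X with NumPy before fitting, and the question's model then works unchanged.

import numpy as np

# (N, 100, 100) -> (N, 100, 100, 1): append a trailing channel axis
X = np.expand_dims(X, axis=-1)   # equivalently: X = X.reshape(-1, 100, 100, 1)

# X.shape[1:] is now (100, 100, 1), so Conv2D(64, (3,3), input_shape=X.shape[1:])
# receives the 4D input it expects.
model.fit(X, y, batch_size=32, validation_split=0.1)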

Answer 1 (score: 0)

I had the same problem. My X_train.shape=(175, 30, 126)

from tensorflow import keras
from tensorflow.keras import layers, regularizers
from tensorflow.keras.models import Sequential

data_augmentation = keras.Sequential(
  [
    layers.experimental.preprocessing.RandomFlip("horizontal", input_shape=(30,126,3)),
    layers.experimental.preprocessing.RandomRotation(0.1),
    layers.experimental.preprocessing.RandomZoom(0.1),
  ]
)

model = Sequential([
    data_augmentation,
    layers.LSTM(64, kernel_regularizer=regularizers.l1(0.001), return_sequences=True, activation='relu', input_shape=(1,30,30,126)),
    layers.LSTM(128, kernel_regularizer=regularizers.l1(0.001), return_sequences=True, activation='relu'),
    layers.LSTM(64, kernel_regularizer=regularizers.l1(0.001), return_sequences=False, activation='relu'),
    layers.Dense(64, kernel_regularizer=regularizers.l1(0.001), activation='relu'),
    layers.Dense(32, kernel_regularizer=regularizers.l1(0.001), activation='relu'),
    layers.Dense(actions.shape[0], activation='softmax')
])
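
For completeness (this is not part of the original answer), a model like the one above would still need to be compiled and fit. A rough sketch, where actions, X_train, and y_train come from the answerer's own setup and the categorical_crossentropy loss assumes one-hot encoded labels:

model.compile(optimizer="adam",
              loss="categorical_crossentropy",   # assumes one-hot encoded labels
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=50, batch_size=32, validation_split=0.1)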