A target array with shape (6400, 1) was passed for an output of shape (None, 2) while using `binary_crossentropy` as loss

Time: 2020-06-10 16:19:53

Tags: python tensorflow keras cnn

I am writing CNN code, but I am running into an error with it. I am using an Anaconda Python environment in PyCharm 2020.1. The code is:

import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

train_img, test_img, train_label, test_label = train_test_split(training_data, id, test_size=0.20, random_state=2)
train_img = np.expand_dims(train_img, axis=3)
#train_img = np.reshape(train_img , (6400,300,300,1))
print('train img', train_img.shape)
print('test img', test_img.shape)
train_label = np.array(train_label)
test_label = np.array(test_label)

print('train label',train_label.shape)
print('test label',test_label.shape)

# Applying CNN
# layer 1
my_model = tf.keras.models.Sequential()
my_model.add(tf.keras.layers.Conv2D(3, kernel_size=3, strides=1, padding='valid', input_shape=(300, 300, 1)))
# layer 2
my_model.add(tf.keras.layers.Conv2D(3, kernel_size=3, strides=1, padding='valid'))
# layer 3
my_model.add(tf.keras.layers.MaxPooling2D(pool_size=(2, 2)))
# layer 4
my_model.add(tf.keras.layers.Conv2D(5, kernel_size=3, strides=1, padding='valid'))
# layer 5
my_model.add(tf.keras.layers.ReLU())
# layer 6
my_model.add(tf.keras.layers.MaxPooling2D(pool_size=(2, 2)))
# fully connected layer
my_model.add(tf.keras.layers.Flatten())
my_model.add(tf.keras.layers.Dense(1024, activation="relu"))
my_model.add(tf.keras.layers.Dense(2, activation="relu"))
my_model.summary()
# For compilation, training and making new predictions:
my_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=['acc'])
my_model.fit(train_img, train_label, batch_size=5, epochs=10)
my_predictions = my_model.predict(test_img)
print(my_predictions)

The error is: ValueError: A target array with shape (6400, 1) was passed for an output of shape (None, 2) while using as loss "binary_crossentropy". This loss expects targets to have the same shape as the output.


I looked at ways of changing the array shape, but there is no reshape that turns (6400, 1) into (6400, 2), only the other way around. Please suggest how I can resolve this error.
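(Note: while a plain reshape cannot do this, a (6400, 1) array of integer class labels can be expanded to (6400, 2) by one-hot encoding. A minimal sketch, assuming train_label and test_label hold integer 0/1 class ids; the *_1hot names are illustrative:)

# One-hot encode the integer labels so the target shape matches a
# two-neuron output; (6400, 1) becomes (6400, 2).
train_label_1hot = tf.keras.utils.to_categorical(train_label, num_classes=2)
test_label_1hot = tf.keras.utils.to_categorical(test_label, num_classes=2)
print(train_label_1hot.shape)  # (6400, 2)

With one-hot targets, the two-neuron head would also need a softmax activation and a categorical_crossentropy loss; the answer below gives the simpler one-neuron fix.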

1 Answer:

Answer 0 (score: 0)

Binary classification requires the last Dense layer to have a single neuron with a sigmoid activation:

my_model.add(tf.keras.layers.Dense(1, activation="sigmoid"))
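
With that one-neuron sigmoid head, the existing binary_crossentropy loss and the (6400, 1) integer targets match the (None, 1) output directly. Alternatively (not part of the original answer), the two-neuron output can be kept if the activation and loss are changed to match the integer labels; a minimal sketch:

# Alternative sketch: keep two output neurons with softmax, and use a
# loss that accepts integer class labels directly (no target reshaping).
my_model.add(tf.keras.layers.Dense(2, activation="softmax"))
my_model.compile(optimizer="adam",
                 loss="sparse_categorical_crossentropy",
                 metrics=['acc'])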